Categories
Provide log of rows read and written for Data Flows

Organization Name
Performance Architects
Description
When running a data flow, a success message is displayed when it completes. It would be helpful to also see statistics showing, for each data source, the number of rows read and the number of rows stored. Put these in a popup box with text that can be copied. Also, include those counts on the Inspect -> History tab when you click on a data flow run event. It would be even better if the log showed rows in/rows out for each step (maybe controlled by a flag on the data flow for level of detail... why, you could even call it LOGLEVEL!).
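The per-step rows in/rows out idea could be sketched roughly like this. This is a minimal illustration, not the product's actual internals; the LOGLEVEL values, step names, and `run_flow`/`run_step` helpers are all hypothetical.

```python
import logging

logging.basicConfig(level=logging.INFO, format="%(message)s")
log = logging.getLogger("dataflow")

# Hypothetical per-flow LOGLEVEL flag: "NONE", "SUMMARY", or "DETAIL"
LOGLEVEL = "DETAIL"

def run_step(name, rows, transform):
    """Apply one data-flow step; log rows in/out when LOGLEVEL is DETAIL."""
    rows_out = transform(rows)
    if LOGLEVEL == "DETAIL":
        log.info("step %-8s rows in: %4d  rows out: %4d", name, len(rows), len(rows_out))
    return rows_out

def run_flow(rows, steps):
    """Run each step in order; log a one-line summary at SUMMARY or DETAIL."""
    rows_read = len(rows)
    for name, transform in steps:
        rows = run_step(name, rows, transform)
    if LOGLEVEL in ("SUMMARY", "DETAIL"):
        log.info("flow complete: %d rows read, %d rows stored", rows_read, len(rows))
    return rows

# Toy flow: filter out non-positive amounts, then deduplicate by id.
steps = [
    ("filter", lambda rs: [r for r in rs if r["amount"] > 0]),
    ("dedupe", lambda rs: list({r["id"]: r for r in rs}.values())),
]
result = run_flow(
    [{"id": 1, "amount": 5}, {"id": 1, "amount": 5}, {"id": 2, "amount": -1}],
    steps,
)
```

Because each step reports its own counts, a join or aggregate that silently drops or multiplies rows shows up immediately in the log.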
Use Case and Business Need
Helps in debugging data flows, especially when joins and aggregates are included in the steps.
More details
Logging is pretty much standard functionality for data integration tools. Would help data flows be more enterprise ready.
Original Idea Number: 641b69981c
Comments
-
What default level of logging do you think would be appropriate? Also, how important is each step?
0 -
Default could be "No log" with options for "Summary Log" and "Detail Log".
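The three proposed levels could be modeled as an ordered enum, with "No log" as the default so current behavior is unchanged. The names below are hypothetical, matching the comment's wording.

```python
from enum import IntEnum

class LogLevel(IntEnum):
    NONE = 0     # default: today's behavior, nothing logged
    SUMMARY = 1  # one line per run: total rows read / rows stored
    DETAIL = 2   # one line per step: rows in / rows out

def log_summary(level: LogLevel) -> bool:
    """Summary line is emitted at SUMMARY and above."""
    return level >= LogLevel.SUMMARY

def log_step(level: LogLevel) -> bool:
    """Per-step lines are emitted only at DETAIL."""
    return level >= LogLevel.DETAIL

DEFAULT_LEVEL = LogLevel.NONE
```

Using an ordered `IntEnum` means each higher level simply includes everything the lower levels log.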
0 -
Ran into an issue with this yesterday that would have been helped by data flow logging. We had an incremental data flow that was supposed to load only new rows. It returned a success status when we ran it, but no new rows were added. There was no way to debug the data flow to see what was happening behind the scenes. Without any logging/debugging capability, the use cases are limited.
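The "success but zero rows appended" case described above is exactly what a count in the log would catch. A minimal sketch, with a hypothetical `incremental_load` helper, of surfacing the silent no-op instead of a bare success message:

```python
def incremental_load(existing_ids, incoming):
    """Append only rows whose id is not already loaded, and report counts."""
    new_rows = [r for r in incoming if r["id"] not in existing_ids]
    print(f"incremental load: {len(incoming)} rows read, {len(new_rows)} rows appended")
    if not new_rows:
        # Make the no-op visible instead of reporting plain "success".
        print("WARNING: 0 new rows appended -- check the incremental key/filter")
    return new_rows
```

With counts in the run history, "success with 0 rows appended" stands out immediately rather than requiring a manual row-count comparison in the target table.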
0 -
It would be nice to have an alerting system similar to OBIEE alerts, which pop up in the tool when a data flow fails. It would be even more helpful if we could enable it to send an email on failure.
P.S. It shouldn't flood us with spam emails.
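The "alert on failure, but don't spam" request amounts to throttling: send at most one alert per flow per cooldown window. A minimal sketch (the `FailureAlerter` class and its parameters are hypothetical; a real sender would use SMTP instead of `print`):

```python
import time

class FailureAlerter:
    """Send a failure alert, but at most once per flow per cooldown window."""

    def __init__(self, cooldown_seconds=3600, send=print):
        self.cooldown = cooldown_seconds
        self.send = send      # swap in a real email sender in practice
        self.last_sent = {}   # flow name -> timestamp of last alert

    def on_failure(self, flow_name, error, now=None):
        now = time.time() if now is None else now
        last = self.last_sent.get(flow_name)
        if last is not None and now - last < self.cooldown:
            return False      # suppressed: still inside the cooldown window
        self.last_sent[flow_name] = now
        self.send(f"data flow '{flow_name}' failed: {error}")
        return True

# Repeated failures of the same flow within the window send only one email.
sent = []
alerter = FailureAlerter(cooldown_seconds=3600, send=sent.append)
first = alerter.on_failure("daily_sales", "join key missing", now=1000.0)
second = alerter.on_failure("daily_sales", "join key missing", now=1500.0)
```

Keying the cooldown by flow name means one noisy flow can't drown out alerts from other flows.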
0 -
It would also be great to be able to see the data flow log under Inspect -> History and sort the log by date. Right now there is no way to sort the history of a data flow.