
Stage File (Read File in Segments) for large file fails with "Read timed out"


Summary: A large CSV file with 5,000+ records takes over 300 seconds to read and write to an ATP database.

Content (required): I have an integration in which I need to save the contents of a CSV file to an ATP table. The integration uses Stage File (read in segments) to avoid issues with large files. However, even when the file is only 3 MB, if it has over 14,000 rows it takes more than 300 seconds to finish. As a result, any integration calling this integration fails after 300 seconds with a "Read timed out" error.

What can be done to avoid this timeout? As I understand it, Stage File only reads 200 lines at a time, and this cannot be increased. The 300-second timeout also cannot be increased. How is one supposed to deal with CSV files that have a large number of rows?
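For illustration only, the segment-then-batch-insert pattern the question describes can be sketched outside OIC in plain Python (this is not OIC or ATP code; SQLite stands in for the ATP table, and the function and table names are hypothetical). The point of the sketch is that each segment should cost one database round trip, not one per row, so total overhead grows with the number of segments rather than the number of rows:

```python
import csv
import io
import sqlite3

def process_in_segments(csv_text, conn, segment_size=200):
    """Read CSV rows in fixed-size segments (mirroring Stage File's
    200-line segments) and batch-insert each segment in one call."""
    reader = csv.reader(io.StringIO(csv_text))
    next(reader)  # skip the header row
    batch, total = [], 0
    for row in reader:
        batch.append(row)
        if len(batch) == segment_size:
            # One executemany per segment: a single round trip for 200 rows
            conn.executemany("INSERT INTO t (a, b) VALUES (?, ?)", batch)
            total += len(batch)
            batch = []
    if batch:  # flush the final partial segment
        conn.executemany("INSERT INTO t (a, b) VALUES (?, ?)", batch)
        total += len(batch)
    conn.commit()
    return total
```

If each row were inserted individually, a 14,000-row file would pay 14,000 round trips; batching per segment reduces that to 70, which is usually the difference between finishing within the timeout and not.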
