Apologies if this question is posted in the wrong category; I hope you can help me with this issue.
I have a scheduled batch job that runs every week on Saturday at 9:00 AM. Its purpose is to collect data that arrives in table A from the previous Saturday through Friday, based on specific conditions.
A few columns from that table are extracted into a flat file, which is placed in a Unix directory for my downstream system to pick up.
The issue is that the job completes only when the record count is below 800,000. On a few occasions when the count exceeded 800,000, it failed.
I checked with my database administrator, and he said the space and other resources allocated to this job can handle only 800,000 records. I don't have permission to change any infrastructure-related settings right now.
As a temporary workaround, when the job fails due to the large volume, I split the data by date into sets of fewer than 800,000 records each and run the job separately for each set.
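To illustrate the date-based splitting I'm doing manually (a rough sketch only; the dates, counts, and the 800,000 limit below are placeholders based on my situation, not real data):

```python
# Sketch: greedily group consecutive dates into batches whose combined
# record counts stay under the job's limit. Illustrative only.
LIMIT = 800_000

def split_into_batches(daily_counts, limit=LIMIT):
    """daily_counts: list of (date, count) pairs in chronological order.
    Returns a list of batches, each a list of (date, count) pairs.
    Note: a single date whose count alone exceeds the limit still forms
    its own batch and would need splitting on a finer key."""
    batches, current, current_total = [], [], 0
    for date, count in daily_counts:
        # Start a new batch when adding this date would exceed the limit.
        if current and current_total + count > limit:
            batches.append(current)
            current, current_total = [], 0
        current.append((date, count))
        current_total += count
    if current:
        batches.append(current)
    return batches

if __name__ == "__main__":
    week = [("2024-06-01", 300_000), ("2024-06-02", 450_000),
            ("2024-06-03", 200_000), ("2024-06-04", 500_000)]
    for i, batch in enumerate(split_into_batches(week), 1):
        total = sum(c for _, c in batch)
        print(f"Batch {i}: {[d for d, _ in batch]} -> {total} records")
```

Today I compute these groupings by hand and rerun the job once per batch, which is what I'd like to avoid.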
Is there any way to overcome this? A further constraint: my downstream system can pick up only one file at a time. After picking up a file, it deletes it from the server, and only then can I place the next file.
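Because of that constraint, any multi-file approach would have to place files sequentially. A sketch of what I have in mind (the paths, polling interval, and function name are my own assumptions, not anything my system provides):

```python
# Sketch: place batch files one at a time, waiting until the downstream
# system has picked up (i.e. deleted) the previous file before placing
# the next. Hypothetical paths and timings.
import os
import shutil
import time

def place_files_sequentially(batch_files, drop_path,
                             poll_seconds=30, timeout_seconds=3600):
    """Copy each file in batch_files to drop_path, but only after the
    previous file at drop_path has been deleted by the consumer."""
    for src in batch_files:
        waited = 0
        # Poll until the previous file is gone from the drop location.
        while os.path.exists(drop_path):
            if waited >= timeout_seconds:
                raise TimeoutError(f"Downstream never picked up {drop_path}")
            time.sleep(poll_seconds)
            waited += poll_seconds
        shutil.copy(src, drop_path)
```

I'm not sure whether polling like this is the right approach, or whether there is a cleaner way to hand off multiple batches given the one-file-at-a-time restriction.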
Please suggest a fix for this issue.