An alternative approach would be to create a staging table, upload the data there, and once the upload is complete, call a DBMS job to transfer the data from the staging table to the original table.
Once the transfer is complete, delete or truncate the staging table. Otherwise, as the table grows day by day, somewhere down the line you are forced into performance problems.
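A rough sketch of what I mean is below; the table, procedure, and job names (staging_tbl, target_tbl, move_staged_rows) are just placeholders for your own objects:

CREATE OR REPLACE PROCEDURE move_staged_rows AS
BEGIN
  -- copy the fully uploaded rows into the real table
  INSERT INTO target_tbl
  SELECT * FROM staging_tbl;
  COMMIT;
  -- TRUNCATE is DDL, so it has to go through dynamic SQL
  EXECUTE IMMEDIATE 'TRUNCATE TABLE staging_tbl';
END;
/

-- run the transfer in the background so the APEX session returns quickly;
-- enabled => TRUE with no start_date runs the job once, right away,
-- and auto_drop cleans it up afterwards
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name   => 'MOVE_STAGED_ROWS_JOB',
    job_type   => 'STORED_PROCEDURE',
    job_action => 'MOVE_STAGED_ROWS',
    enabled    => TRUE,
    auto_drop  => TRUE);
END;
/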
Hope this resolves your issue.
Thank you for responding. In a sense, that is close to what we are doing: we truncate the table, upload our .csv into it, and then process the data into our other tables.
The table does not have a particularly large number of rows (usually 60,000 - 70,000 rows), but does have 112 columns.
It all seems to work fine in our test environment, but that environment does not have anywhere near the number of users that production does. We have been looking at the SGA parameters and seem to make headway, but then a customer tries an upload and it slows way down in production.
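For context, the kind of checks we have been trying look roughly like this (just the standard V$ views, nothing exotic):

-- current SGA component sizes
SELECT name, ROUND(bytes / 1024 / 1024) AS mb
FROM   v$sgainfo
ORDER  BY bytes DESC;

-- advisor estimate of DB time at other SGA sizes
-- (only populated when SGA_TARGET is set)
SELECT sga_size, sga_size_factor, estd_db_time
FROM   v$sga_target_advice;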
Our customers really like the functionality we have given them with APEX, but I am running out of ideas for what to look at.