carmac wrote:It would be the same as with a small file, but you can tweak some of the loader's settings. You can go for a Direct Path Load instead of a Conventional Load. You could also modify the frequency at which commits are issued; consider going for a larger value.
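To make that concrete, here is a rough sketch of both options (the control file, table, and credentials are made-up placeholders, not from the thread). With a conventional path load, the ROWS parameter sets how many rows are loaded per commit, so raising it reduces commit frequency; with direct=true, SQL*Loader switches to a direct path load and ROWS instead controls how often data saves happen:

```shell
# load.ctl -- hypothetical control file
#   LOAD DATA
#     INFILE 'report.txt'
#     APPEND INTO TABLE big_table
#     FIELDS TERMINATED BY ','
#     (col1, col2, col3)

# Conventional path: larger ROWS = fewer commits; BINDSIZE sizes the bind array
sqlldr scott/tiger control=load.ctl rows=50000 bindsize=20000000

# Direct path: formats blocks directly, bypassing much of the SQL engine
sqlldr scott/tiger control=load.ctl direct=true
```

Exact parameter names and defaults vary by Oracle version, so check your own release's SQL*Loader reference before tuning.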
How do I load such a large file into an Oracle table using sqlldr?
2. Load this single file with direct=true into an empty table in APPEND mode: *6 minutes*.
->du -sh report.txt
9.9G    report.txt
->wc -l report.txt
22287833 report.txt
Bottom line: no matter what everyone tells you, you have to test it yourself on your own hardware. It took me maybe 30 minutes to put this whole test together and write this post. Oh, and BTW, I ran this locally on the box that has my DB, so it didn't go over SQL*Net.
Table truncated.
Elapsed: 00:00:00.13
External Table created.
Elapsed: 00:00:00.13
-- insert in progress
22287833 rows created.
Elapsed: 00:01:03.21
Commit complete.
Elapsed: 00:00:00.05
External Table dropped.
Elapsed: 00:00:00.06
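For reference, the external-table version of that test looks roughly like this (directory name, column list, and table names are my own made-up placeholders; the thread doesn't show the actual DDL):

```sql
-- Hypothetical names throughout; adjust to your own schema and paths
CREATE OR REPLACE DIRECTORY load_dir AS '/u01/load';

CREATE TABLE report_ext (
  col1 NUMBER,
  col2 VARCHAR2(100)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY load_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY ','
  )
  LOCATION ('report.txt')
);

-- Direct-path insert from the external table into the target
INSERT /*+ APPEND */ INTO big_table SELECT * FROM report_ext;
COMMIT;

DROP TABLE report_ext;
```

The APPEND hint requests a direct-path insert, which is what makes this approach comparable to sqlldr with direct=true.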