I am getting an OutOfMemoryError while returning more than 20K records from the backend.
I have a Java frontend that calls a Java backend (using EJB) containing the business logic. For security reasons I can't make database calls from the frontend.
Currently I am passing 20K student ids from the Java frontend to the backend in an ArrayList. In the backend I make a DB call, fetch the details of all 20K students (50 columns per student), store them in an ArrayList, and return that ArrayList to the frontend. It works fine for a smaller number of records (e.g. 2-5K) but fails for larger ones (20K).
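Roughly, the flow looks like this (class, table, and column names are illustrative, not my exact code):

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.util.ArrayList;
    import java.util.List;
    import javax.sql.DataSource;

    public class StudentDao {
        private static final int COLUMN_COUNT = 50;
        private final DataSource dataSource;

        public StudentDao(DataSource dataSource) {
            this.dataSource = dataSource;
        }

        // Fetches every student in one shot and buffers all rows in an
        // ArrayList before returning it to the frontend.
        public List<Object[]> getDetails(List<Long> studentIds) throws SQLException {
            List<Object[]> result = new ArrayList<Object[]>();
            try (Connection con = dataSource.getConnection();
                 PreparedStatement ps = con.prepareStatement(
                         "SELECT * FROM student WHERE student_id = ?")) {
                for (Long id : studentIds) {              // ~20K ids
                    ps.setLong(1, id);
                    try (ResultSet rs = ps.executeQuery()) {
                        while (rs.next()) {
                            Object[] row = new Object[COLUMN_COUNT];
                            for (int i = 0; i < COLUMN_COUNT; i++) {
                                row[i] = rs.getObject(i + 1);
                            }
                            result.add(row);              // all rows held in memory
                        }
                    }
                }
            }
            return result;  // 20K x 50 values serialized back to the frontend
        }
    }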
I have to process that data on the frontend, so I have to return the ArrayList.
Could anyone please help me resolve this issue? Let me know in case any more details are required.
Not sure if you are actually trying to display 20,000 records on the client page; I would say it might have performance issues because of the amount of data passed over the network. It also would not be user friendly.
I would suggest pagination.
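A minimal sketch of what that could look like, assuming a hypothetical paged interface between frontend and backend (all names are placeholders):

    import java.util.List;

    // Hypothetical paged API: the frontend pulls one page at a time
    // instead of one 20K-row ArrayList.
    interface StudentService {
        /** Returns at most pageSize rows, starting at page * pageSize. */
        List<Object[]> getStudentPage(List<Long> studentIds, int page, int pageSize);
    }

    public class StudentReportClient {
        void processAll(StudentService service, List<Long> ids) {
            final int pageSize = 500;   // tune to what the frontend can hold
            int page = 0;
            List<Object[]> rows;
            while (!(rows = service.getStudentPage(ids, page, pageSize)).isEmpty()) {
                processPage(rows);      // work on this page only
                page++;                 // the previous page becomes garbage
            }
        }

        void processPage(List<Object[]> rows) {
            // display or aggregate one page of students here
        }
    }

On the backend each page maps to a bounded query, e.g. an ORDER BY with LIMIT/OFFSET or ROWNUM, depending on the database. Only one page is alive in either JVM at a time, so the heap usage stays flat regardless of the total record count.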
I suggest using pagination. Also try increasing the JVM memory: "-Xms128m -Xmx512m -XX:MaxPermSize=256m". You can additionally suggest a garbage collection to the JVM with System.gc();
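For example, when launching the JVM that runs out of memory (the main class is a placeholder):

    java -Xms128m -Xmx512m -XX:MaxPermSize=256m com.example.StudentFrontend

Note that -XX:MaxPermSize applies only to Java 7 and earlier (the permanent generation was removed in Java 8), and System.gc() is merely a hint to the JVM; it won't help if the live data genuinely doesn't fit in the heap.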
As several people have suggested, use pagination; no one in the real world can analyze a 20K-record report :)
Check your code for places where memory isn't being freed and, as others suggest, check your -Xmx setting.
My report process can handle over 20K records and spit out a few-hundred-page PDF just fine, and I can run with -Xmx512m locally as well. It's all raw data, not too many images.
My 20K+ records come from a ResultSet and are processed using a CachedRowSet. I never move any data into "holding" objects other than the CachedRowSet itself. That's something you might not be able to do here.
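A minimal sketch of that pattern, using the standard javax.sql.rowset.CachedRowSet (table and column names are illustrative):

    import java.sql.Connection;
    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Statement;
    import javax.sql.rowset.CachedRowSet;
    import javax.sql.rowset.RowSetProvider;

    public class StudentReport {
        // Populate a CachedRowSet straight from the ResultSet and iterate it,
        // without copying rows into extra "holding" objects.
        public void run(Connection con) throws SQLException {
            CachedRowSet crs = RowSetProvider.newFactory().createCachedRowSet();
            try (Statement stmt = con.createStatement();
                 ResultSet rs = stmt.executeQuery(
                         "SELECT student_id, name FROM student")) {
                crs.populate(rs);   // disconnected copy; the connection can close
            }
            while (crs.next()) {
                writeReportRow(crs.getLong("student_id"), crs.getString("name"));
            }
            crs.close();
        }

        private void writeReportRow(long id, String name) {
            // emit one PDF row here
        }
    }

Note that a CachedRowSet still keeps every row in memory; the saving is in avoiding a second copy of the data in separate bean objects.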