2 Replies Latest reply on Mar 19, 2013 6:53 PM by SreeC

    Out of memory during data loading and Transaction Rungraph?

      Hi all,

      I have looked through the forum, and the advice I have seen so far for dealing with out-of-memory errors during data loading in the Integrator is essentially to cut down on the processes that consume memory on the system. However, there is a limit to how far memory use can be cut this way. With my data size growing over time, I believe I will eventually reach a point where I encounter these errors again whenever I try to load all my data in a single graph.

      Hence, what I would like to ask is: is there a way to run multiple graphs sequentially, where each graph loads perhaps 1 million records from my database, and where the resources utilised by each graph are released and made available to the next graph after it finishes, so that I won't hit the out-of-memory issue? Finally, for anyone who is using the Transaction Rungraph connector: is the connector able to meet the requirements I have described above?
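      To make the pattern I am describing concrete, here is a minimal sketch in Python, with sqlite3 standing in for my real database and a simple sum standing in for the work each graph would do. The function name, table, and batch size are all illustrative, not part of any Integrator API; the point is only that one batch at a time is held in memory and released before the next "graph run" starts.

```python
import sqlite3

BATCH_SIZE = 3  # stand-in for the ~1 million records each graph run would load


def run_graphs_sequentially(conn, table):
    """Process the table in fixed-size batches, one 'graph run' per batch.

    Only one batch is held in memory at a time; once a batch has been
    processed, its rows go out of scope and can be reclaimed before the
    next run starts.
    """
    cur = conn.cursor()
    cur.execute(f"SELECT id, value FROM {table} ORDER BY id")
    totals = []
    while True:
        batch = cur.fetchmany(BATCH_SIZE)
        if not batch:
            break
        # Stand-in for one graph execution: aggregate this batch only.
        totals.append(sum(value for _id, value in batch))
    return totals


# Demo: a small table of 7 rows, processed in batches of 3.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY, value INTEGER)")
conn.executemany("INSERT INTO t (id, value) VALUES (?, ?)",
                 [(i, i) for i in range(1, 8)])
print(run_graphs_sequentially(conn, "t"))  # prints [6, 15, 7]
```

      What I am hoping the Transaction Rungraph connector can do is the equivalent of the loop above: kick off each graph in turn and fully release its resources before the next one runs.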

      Thank you!