
    Oracle Data Integrator Memory Usage Paradigm/Problem?

    SuddhasatwaBhaumik

      Hello All !!

       

      Recently, I migrated an old legacy ETL design to an Oracle Data Integrator based solution, using procedures and interfaces wherever applicable. All possible improvements to the performance and monitoring aspects of the solution are already in place. Earlier, this solution ran on hand-coded PL/SQL written by my team and its predecessors, and at that time the source database sat in the same data center as the target database. As of now, the source database has moved to the cloud, so we export data out of AWS in the form of CSV files; once they are transferred into our data center, we use Oracle Data Integrator to load these files into the respective target tables.

      However, we see memory and slowness issues from time to time, even when running the solution on very good hardware. By good hardware, I mean servers with 32 GB of available RAM and at least 16 CPU cores. Many times the standalone agent hangs, particularly during parallel executions launched with the OdiStartScen command line option via the startscen.sh script that ships with the ODI 11g installation. The same memory leakage occurs when I run the solution on a private cloud owned by my company in its own data-center premises. How do I know it is a memory leakage issue? Because almost every day I see a heap error that causes running processes to dump threads into hs_err_.log files in the directory from which my standalone agent runs.
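      For context on what I mean by agent memory settings, below is a minimal sketch of the heap sizing I am adjusting. It assumes the stock ODI 11g odiparams.sh variables ODI_INIT_HEAP and ODI_MAX_HEAP (the shipped defaults are very small); please verify the exact variable names and values in your own odiparams.sh, as they are an assumption here, not taken from my actual environment.

      # Sketch only: heap sizing for the ODI 11g standalone agent.
      # ODI_INIT_HEAP / ODI_MAX_HEAP are the standard odiparams.sh variables;
      # the values below are illustrative assumptions, not my real settings.
      ODI_INIT_HEAP=2048m   # initial JVM heap for the agent
      ODI_MAX_HEAP=8192m    # maximum JVM heap; leave OS headroom for the
                            # SQL*Loader processes spawned on the same host
      export ODI_INIT_HEAP ODI_MAX_HEAP

      One thing I am aware of: each startscen.sh / OdiStartScen invocation starts its own JVM, so parallel scenario launches multiply the total footprint beyond whatever a single agent is sized for.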

      As background on the solution: I am running ODI 11.1.1.6 on Solaris and RHEL, used interchangeably based on performance. The source data I am loading is a set of flat files with comma-separated data. My target database is Oracle 11.2.0.4. On average, I receive 10 files per hour to be loaded into 10 specific tables. For the file-to-table loads I am using a SQLLDR KM and the Merge IKM: the SQLLDR KM loads the CSV files from local disk into temporary I$xxx tables created at run time, and from those I$xxx tables the data is loaded into the target tables using Oracle MERGE statements generated by the Merge IKM. In these cases as well, appropriate database hints such as APPEND and INDEX() are used wherever applicable.
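      To illustrate the second step, here is roughly the shape of the statement the Merge IKM produces from the I$ work table. The table and column names (CUSTOMER, I$_CUSTOMER, CUST_ID, and so on) are hypothetical placeholders, not my actual model; it is only meant to show how the unique key drives the matched/not-matched branches and where the APPEND hint sits.

      -- Illustrative sketch only; names are placeholders, not the real model.
      MERGE /*+ APPEND */ INTO CUSTOMER T
      USING I$_CUSTOMER S
         ON (T.CUST_ID = S.CUST_ID)          -- unique/primary key of the target
       WHEN MATCHED THEN
         UPDATE SET T.CUST_NAME = S.CUST_NAME,
                    T.LOAD_DATE = S.LOAD_DATE
       WHEN NOT MATCHED THEN
         INSERT (CUST_ID, CUST_NAME, LOAD_DATE)
         VALUES (S.CUST_ID, S.CUST_NAME, S.LOAD_DATE);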

      This activity is repeated for 9 different sets of data files, while the target table(s) remain the same. The latter is not an issue, as we have proper MERGE statements keyed on specific, unique columns that act as the primary keys of the target tables. I have already tuned garbage collection and am currently using the Concurrent Mark Sweep (CMS) collector.
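      To make the GC behaviour visible, I am also planning to enable GC logging and heap dumps on OutOfMemoryError, along the lines of the sketch below. ODI_JAVA_OPTIONS is an assumed variable name; the flags would need to be passed through whatever mechanism your odiparams.sh or agent startup script actually uses for extra JVM options.

      # Sketch only: GC diagnostics for the agent JVM (CMS-era HotSpot flags).
      # ODI_JAVA_OPTIONS is an assumption; adapt to your own startup script.
      ODI_JAVA_OPTIONS="$ODI_JAVA_OPTIONS \
        -XX:+UseConcMarkSweepGC \
        -XX:+PrintGCDetails -XX:+PrintGCDateStamps \
        -Xloggc:/path/to/odi_agent_gc.log \
        -XX:+HeapDumpOnOutOfMemoryError \
        -XX:HeapDumpPath=/path/to/heapdumps"
      export ODI_JAVA_OPTIONS

      My understanding is that if the old generation keeps climbing after every full GC in that log, it points to a genuine leak, whereas a sawtooth that returns to a stable baseline points to an undersized heap rather than a leak.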

      I would appreciate your collective expertise in tuning this further. If you have any questions, please let me know!

      Thanks,

      Suddhasatwa