What is the maximum file size that can be processed in one instance/run in SOA 11g?

905784 Member Posts: 21

Hi,

We are trying to process some 16 XML files (combined file size of about 6 GB). We are seeing some odd behaviour:

1. The integration fetches and converts these 24 files into a single file of around 2 GB and sends it to the target folder.

2. When we run this integration after a server reboot, the instance completes in 90 minutes.

3. The next day it takes 2 hours, on subsequent days it takes a little longer each time, and after a week the instance does not complete at all and gets stuck.

We can see the stuck thread on the server. What would be the right way to keep this run working? Should we restart the server regularly (which does not feel like the right solution), release the stuck thread every day, or is there some other option?

P.S.: The approach cannot be changed to fetch from a database, as the legacy system does not support that.

Please let me know if you require any more details.

Answers

  • Raj__K Member Posts: 412 Silver Badge

    Some things you could try:

    - The server will mark even a thread that is still working as stuck once it exceeds the configured Stuck Thread Max Time. You need to set an appropriate value, since you are processing a large volume of data (see the WLST sketch after this list).

    - Avoid using XSLT on the full payload. It can lead to out-of-memory errors because it has to hold and traverse the whole document (see the streaming sketch further below).

    - You could use scope-local variables, which are released from memory once the scope completes and are not dehydrated.

    - Have proper garbage collection and purging mechanisms in place.
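
    On the stuck-thread setting: a minimal WLST sketch along these lines can raise Stuck Thread Max Time for the SOA managed server. The admin URL, credentials, server name, and the 7200/600 values are placeholders, and the exact MBean attribute names should be verified against your WebLogic release.

        # Run with the Oracle middleware WLST shell (wlst.sh script.py)
        connect('weblogic', '<password>', 't3://adminhost:7001')  # placeholder credentials and admin URL
        edit()
        startEdit()
        cd('/Servers/soa_server1')              # placeholder managed-server name
        cmo.setStuckThreadMaxTime(7200)         # seconds a thread may keep working before it is flagged stuck (default 600)
        cmo.setStuckThreadTimerInterval(600)    # how often the server scans for stuck threads
        save()
        activate()
        disconnect()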

    As the data to be processed is huge (6 GB; not sure over what time period), if this is a data-sync kind of job, ODI batch processing could be considered.
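
    To make the XSLT point concrete: the real issue is whole-document processing versus streaming. The sketch below is only a generic Python illustration of the streaming idea, not something that runs inside the SOA engine; the file names and the <record> element are hypothetical placeholders.

        import xml.etree.ElementTree as ET

        SOURCE_FILES = ['input_01.xml', 'input_02.xml']   # hypothetical input files
        OUTPUT_FILE = 'merged_output.xml'                 # hypothetical output file

        with open(OUTPUT_FILE, 'wb') as out:
            out.write(b'<?xml version="1.0" encoding="UTF-8"?>\n<records>\n')
            for path in SOURCE_FILES:
                # iterparse streams each file; clearing every processed element
                # keeps the in-memory tree small instead of holding the whole document.
                for event, elem in ET.iterparse(path, events=('end',)):
                    if elem.tag == 'record':              # hypothetical payload element
                        out.write(ET.tostring(elem))
                        elem.clear()
            out.write(b'</records>\n')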

    Raj__K
This discussion has been closed.