1 Reply Latest reply on Sep 6, 2012 4:44 PM by Jason_(A_Non)

    Generating large amounts of XML without running out of memory

      Hi there,
      I need some advice from the experienced XDB users around here. I'm trying to map large amounts of data inside the DB (11.2g), and by large I mean up to several GB. I compared the "low level" mapping via PL/SQL in combination with ExtractValue/XMLQuery against the more elegant XML view mapping, and the view mapping using the XMLTABLE XQuery PATH constructs gave the best performance. So now I have a view over several binary XMLType columns for the mapping, and another view on top of that mapping view which constructs the nested XML result document via XMLELEMENT(), XMLAGG(), etc.

      Now all I want to do is materialize this document by inserting it into an XMLType table/column. Sounds pretty easy, but I can't get it to work: the DB seems to load a full DOM representation into RAM every time I perform an INSERT INTO or use the xmlgen tool. How can I get the result document into the table without memory exhaustion? I thought the DB would be smart enough to generate some kind of serialization/data stream to perform this task without loading everything into RAM.

      Best regards

      Edited by: 957051 on 05.09.2012 03:15
        • 1. Re: Generating large amounts of XML without running out of memory
          Not an answer, but maybe a start.

          This post is better suited for the {forum:id=34} forum. When you post over there, include your full version
          select * from v$version

          as well as as much representative code as you can to demonstrate the situation. I'm assuming you mean a single XML file could be in the GB range, so you will want to state that clearly as well.

          It seems Marco (http://www.liberidu.com/blog/) may have touched on this topic on his blog in the past, but my quick search did not turn it up. I could also be confusing it with another topic of his.