This discussion is archived
3 Replies. Latest reply: Nov 21, 2012 11:45 PM by 703519

Import Large XML File to Table

965088 Newbie
I have a large (819 MB) XML file that I'm trying to import into a matching table.

I tried importing it with xmlsequence(...).extract(...) and ran into a "number of nodes exceeds maximum" error.

I tried importing it with XMLTable(... PASSING XMLTYPE(bfilename('DIR_OBJ','large_819mb_file.xml'), nls_charset_id('UTF8'))) and gave up after it ran for 15+ hours (a COLLECTION ITERATOR PICKLER FETCH issue).

I tried importing it with DBMS_XMLStore:

insCtx := DBMS_XMLStore.newContext('schemaname.tablename');
ROWS := DBMS_XMLStore.insertXML(insCtx, XMLTYPE(bfilename('DIR_OBJ','large_819mb_file.xml'), nls_charset_id('UTF8')));

and ran into ORA-04030: out of process memory when trying to allocate 1032 bytes (qmxlu subheap,qmemNextBuf:alloc).
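For context, here is the DBMS_XMLStore attempt above written out as a complete anonymous block; this is a sketch of the failing approach, with 'schemaname.tablename' and DIR_OBJ kept as the placeholders from the post:

```sql
-- Sketch of the DBMS_XMLStore attempt as a self-contained PL/SQL block.
-- This is the approach that hit ORA-04030, because the whole document
-- is materialized in process memory.
DECLARE
  insCtx  DBMS_XMLSTORE.ctxType;
  v_rows  NUMBER;
BEGIN
  insCtx := DBMS_XMLSTORE.newContext('schemaname.tablename');
  v_rows := DBMS_XMLSTORE.insertXML(
              insCtx,
              XMLTYPE(bfilename('DIR_OBJ','large_819mb_file.xml'),
                      nls_charset_id('UTF8')));
  DBMS_XMLSTORE.closeContext(insCtx);  -- always release the context
  DBMS_OUTPUT.PUT_LINE(v_rows || ' rows inserted');
END;
/
```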

All I need to do is read the XML file and move the data into a matching table in a reasonable time. Once I have the data in the database, I no longer need the XML file.

What would be the best way to import large XML files?

Oracle Database 11g Release - 64bit Production
PL/SQL Release - Production
"CORE     Production"
TNS for Linux: Version - Production
NLSRTL Version - Production
  • 1. Re: Import Large XML File to Table
    Jason_(A_Non) Expert
    This (rough) approach should work for you.
    CREATE TABLE holds_xml
      (xml_col XMLTYPE)
      XMLTYPE xml_col STORE AS SECUREFILE BINARY XML;

    INSERT INTO holds_xml (xml_col)
    VALUES (xmltype(bfilename('DIR_OBJ','large_819mb_file.xml'), nls_charset_id('UTF8')));
    -- Should be using AL32UTF8 for the DB character set with XML

    SELECT ...
      FROM holds_xml HX,
           XMLTABLE(... PASSING HX.xml_col ...)
    How this differs from your approach:

    By using the HOLDS_XML table with SECUREFILE BINARY XML storage (which became the default XMLType storage in 11.2.0.2), we give Oracle a place to store a parsed version of the XML. The XML then lives on disk instead of in memory, and Oracle can stream just the pieces it needs from disk rather than holding the whole document in memory and re-parsing it repeatedly to find the required information. That repeated in-memory work is what COLLECTION ITERATOR PICKLER FETCH refers to; you can search on that term to learn more about it if needed.

    The XMLTable approach then simply reads this XML from disk and should be able to parse it without issue. You also have the option of adding XML indexes, but since you are reading everything once and then discarding the file, they would most likely gain you nothing.
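    Putting the steps above together, an end-to-end sketch might look like the following. The target table, element names, XPath, and columns are hypothetical placeholders, since the original post does not show the XML structure:

    ```sql
    -- 1. Staging table with binary XML storage, so the parsed XML lives on disk.
    CREATE TABLE holds_xml (xml_col XMLTYPE)
      XMLTYPE xml_col STORE AS SECUREFILE BINARY XML;

    -- 2. Load the file once from the directory object.
    INSERT INTO holds_xml (xml_col)
    VALUES (XMLTYPE(bfilename('DIR_OBJ','large_819mb_file.xml'),
                    nls_charset_id('UTF8')));
    COMMIT;

    -- 3. Shred into the relational target, streaming from disk.
    --    '/rows/row', target_table, and the columns are hypothetical.
    INSERT INTO target_table (id, name)
    SELECT x.id, x.name
      FROM holds_xml HX,
           XMLTABLE('/rows/row' PASSING HX.xml_col
                    COLUMNS id   NUMBER        PATH 'id',
                            name VARCHAR2(100) PATH 'name') x;
    COMMIT;

    -- 4. The staging copy is no longer needed once the data is relational.
    DROP TABLE holds_xml;
    ```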
  • 2. Re: Import Large XML File to Table
    965088 Newbie
    Thank you... that did the trick in under 3 minutes!
  • 3. Re: Import Large XML File to Table
    703519 Newbie
    Can something like this be done for a file of about 2 GB on Oracle 10.2? I have asked a question in a new thread but have no answer yet. Can someone please help?

