This discussion is archived
1 Reply Latest reply: Jan 4, 2013 10:48 PM by odie_63

Performance of insertion of binary XML content into table

917785 Newbie
Hi, all -

I'm using Oracle 11g Enterprise Edition and having trouble optimizing an insert query. It's taking about 48 seconds to insert 50K records from a binary XML table (with an associated schema) into another table's columns. My index doesn't seem to help: the query execution plan shows TABLE ACCESS FULL on the XML table instead of an index access.
PARAMETERS ('PATHS (INCLUDE (/employees/employee
                    NAMESPACE MAPPING 
In another thread, A_non mentioned: "If you are wanting to put the XML in different columns/tables based on the XML schema it is associated with, then Structured may be what you want." Is it possible the poor performance is because I created the table as binary XML?
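For context, the storage model is picked when the XMLType table is created. A rough sketch of the two options follows; the table names and schema URL are placeholders, not the poster's actual objects:

```sql
-- Binary XML storage (post-parse binary format; what the poster describes):
CREATE TABLE emp_xml_binary OF XMLType
  XMLTYPE STORE AS BINARY XML;

-- Object-relational ("structured") storage shreds the document into
-- relational columns, but requires a registered XML schema first
-- (e.g. via DBMS_XMLSCHEMA.registerSchema):
CREATE TABLE emp_xml_structured OF XMLType
  XMLTYPE STORE AS OBJECT RELATIONAL
  XMLSCHEMA "employees.xsd" ELEMENT "employees";
```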

Here is the insert query I'm using in case there's something obviously off about it.
INSERT INTO employees
from employees_xml x,
'for $i in //employee
return element r {
passing x.XML_DOCUMENT
columns "empid", "first_name", "last_name", "gender", "age"
) y
I'll keep digging, testing and tweaking, but wanted to throw this out there in case I come up empty. Thanks!
  • 1. Re: Performance of insertion of binary XML content into table
    odie_63 Guru
    It's taking about 48 seconds to insert 50K records
    As in "logical" XML records from a single XML document, or really 50k rows from EMPLOYEES_XML table?

    Before trying to tweak, try to keep things simple.
    I guess that's not your real query and structure since what you've posted is not valid.

    Here's a similar test case derived from the HR.EMPLOYEES table, with no schema and no XML index:
    -- target table
    create table employees as 
    select employee_id, first_name, last_name, hire_date, job_id
    from hr.employees
    where 1 = 0 ;
    -- XML table
    create table employees_xml (xml_document xmltype) ;
    -- populating EMPLOYEES_XML with a single document of 107k "records"
    insert into employees_xml (xml_document)
    select xmlelement("employees",
             xmlagg(
               xmlelement("employee",
                 xmlforest(
                   employee_id as "empid"
                 , first_name  as "first_name"
                 , last_name   as "last_name"
                 , hire_date   as "hire_date"
                 , job_id      as "job_id"
                 )
               )
             )
           )
    from hr.employees
         cross join ( select 1 from dual connect by level <= 1000 );
    -- stats on EMPLOYEES_XML
    call dbms_stats.gather_table_stats(user, 'EMPLOYEES_XML');
    Then:
    SQL> set timing on
    SQL> insert into employees
      2  (
      3    employee_id,
      4    first_name,
      5    last_name,
      6    hire_date,
      7    job_id
      8  )
      9  select y.*
     10  from employees_xml x
     11     , xmltable(
     12        '/employees/employee'
     13        passing x.xml_document
     14        columns "empid"      number(6)
     15              , "first_name" varchar2(20)
     16              , "last_name"  varchar2(25)
     17              , "hire_date"  date
     18              , "job_id"     varchar2(10)
     19       ) y ;
    107000 rows created.
    Elapsed: 00:00:12.85
    Note the XQuery expression: rebuilding a row element (like <r> in your example) is unnecessary work.

    The index is not appropriate either. If an index really is necessary, what you need in this case is an XMLIndex with a structured component.
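    A structured-component XMLIndex along those lines might look like the following. The index and content-table names are invented for illustration, and the paths assume the /employees/employee shape used above:

    ```sql
    CREATE INDEX employees_xml_sxi ON employees_xml (xml_document)
      INDEXTYPE IS XDB.XMLINDEX
      PARAMETERS ('XMLTABLE employees_idx_tab
                     ''/employees/employee''
                     COLUMNS empid      NUMBER(6)    PATH ''empid''
                           , last_name  VARCHAR2(25) PATH ''last_name''');
    ```

    The structured component materializes the projected paths into the relational content table (employees_idx_tab here), which is what lets the optimizer avoid a full scan of the XMLType column for queries over those paths.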

