7 Replies Latest reply: Mar 1, 2012 9:03 AM by Andy Klock

    data pump export single tables with specific criteria with the API

    920249
      Hello, I'm trying to export some table data from a schema using the dbms_datapump API, but it is giving me problems and takes far too long to run, and I don't understand why. I want to export only the data into a dmp file, which I will later import into another schema. I'm using Oracle 10g R2.
      I want to export data from TABLE1 and TABLE2, and ONLY the first 10 rows (this is just for testing for now). I used a data_filter to supply the subquery that filters the rows, and NAME_LIST to filter the table names. I set INCLUDE_METADATA to 0 so that no metadata is exported. But the job takes 10 minutes to run, and the output log says there was an error (after 10 minutes??!):

      content of : table_dump.log
      Starting "MYSCHEMANAME"."SYS_EXPORT_TABLE_02":  
      Estimate in progress using BLOCKS method...
      Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
      Total estimation using BLOCKS method: 0 KB
      ORA-39166: Object IN ('TABLE1' was not found.
      ORA-39166: Object 'TABLE2') was not found.
      ORA-31655: no data or metadata objects selected for job
      Job "MYSCHEMANAME"."SYS_EXPORT_TABLE_02" completed with 3 error(s) at 15:58:47
      This is the code I use:
      DECLARE
        handle NUMBER;
        status VARCHAR2(20);
      BEGIN
        handle := DBMS_DATAPUMP.OPEN ('EXPORT', 'TABLE');
        dbms_datapump.add_file(handle => handle,filename => 'table_dump.log',directory => 'DATAPUMP_DIR',filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
        dbms_datapump.add_file(handle => handle,filename => 'table_dump.dmp',directory => 'DATAPUMP_DIR',filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
        dbms_datapump.metadata_filter(handle, 'SCHEMA_EXPR', 'IN (''MYSCHEMANAME'')');
        dbms_datapump.metadata_filter (handle, 'NAME_LIST', 'IN (''TABLE1'',''TABLE2'')');
        dbms_datapump.data_filter(handle, 'SUBQUERY', 'WHERE rownum <= 10', 'TABLE1', 'MYSCHEMANAME');
        dbms_datapump.data_filter(handle, 'SUBQUERY', 'WHERE rownum <= 10', 'TABLE2', 'MYSCHEMANAME');
        dbms_datapump.set_parameter(HANDLE => handle,NAME => 'INCLUDE_METADATA', VALUE => 0) ;
        dbms_datapump.START_JOB(handle);
        dbms_datapump.WAIT_FOR_JOB(handle, status);
      END;
      /
      Edited by: user10396517 on 27-feb-2012 9.17 - added the code formatting
        • 1. Re: data pump export single tables with specific criteria with the API
          Andy Klock
          Welcome to the forums. When pasting code use the {  code  } tags for better readability. See the FAQ for other details. And always feel free to include all the ddl/dml for your test cases so we don't have to do much more than run your code to reproduce it ourselves.

          I've had very little luck with NAME_LIST. You can do something similar with NAME_EXPR, though:
          SQL> create table table1 as select * from dba_tables where rownum <= 20;
          
          Table created.
          
          SQL> create table table2 as select * from dba_tables where rownum <= 20;
          
          Table created.
          
          -- note 20 rows each.
          
          SQL> DECLARE
                 handle NUMBER;
                 status VARCHAR2(20);
               BEGIN
                 handle := DBMS_DATAPUMP.OPEN ('EXPORT', 'TABLE');
                 dbms_datapump.add_file(handle => handle,filename => 'table_dump.log',directory => 'DATA_PUMP_DIR',filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);
                 dbms_datapump.add_file(handle => handle,filename => 'table_dump.dmp',directory => 'DATA_PUMP_DIR',filetype => DBMS_DATAPUMP.KU$_FILE_TYPE_DUMP_FILE);
                 dbms_datapump.metadata_filter(handle, 'SCHEMA_EXPR', 'IN (''ANDY'')');
                 dbms_datapump.metadata_filter(handle, name => 'NAME_EXPR', value => 'IN (''TABLE1'',''TABLE2'')');
                 dbms_datapump.data_filter(handle, 'SUBQUERY', 'WHERE rownum <= 10', 'TABLE1', 'ANDY');
                 dbms_datapump.data_filter(handle, 'SUBQUERY', 'WHERE rownum <= 10', 'TABLE2', 'ANDY');
                 dbms_datapump.set_parameter(HANDLE => handle, NAME => 'INCLUDE_METADATA', VALUE => 0);
                 dbms_datapump.START_JOB(handle);
                 dbms_datapump.WAIT_FOR_JOB(handle, status);
               END;
               /
          
          PL/SQL procedure successfully completed.
          
          SQL> !cat /u01/app/oracle/admin/test1/dpdump/table_dump.log
          Starting "SYS"."SYS_EXPORT_TABLE_18":
          Estimate in progress using BLOCKS method...
          Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
          Total estimation using BLOCKS method: 128 KB
          . . exported "ANDY"."TABLE1"                             29.32 KB      10 rows
          . . exported "ANDY"."TABLE2"                             29.32 KB      10 rows
          Master table "SYS"."SYS_EXPORT_TABLE_18" successfully loaded/unloaded
          ******************************************************************************
          Dump file set for SYS.SYS_EXPORT_TABLE_18 is:
            /u01/app/oracle/admin/test1/dpdump/table_dump.dmp
          Job "SYS"."SYS_EXPORT_TABLE_18" successfully completed at 17:01:41
          As for your timing issue, nothing prevents you from looking at what your Data Pump session is actually doing: whether it is waiting on something, doing a full scan of something, etc.
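
          For example, something along these lines (an untested sketch; it assumes you have SELECT access to the DBA and V$ views, and the job name is whatever dba_datapump_jobs reports) maps the job's sessions to their current wait events while the job runs:

          SELECT s.sid, s.serial#, s.event, s.wait_class, s.seconds_in_wait
            FROM dba_datapump_sessions dps
            JOIN v$session s ON s.saddr = dps.saddr
           WHERE dps.job_name = 'SYS_EXPORT_TABLE_02';

          If the worker session shows a non-idle wait there, that is usually the first clue.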

          Good luck.
          • 2. Re: data pump export single tables with specific criteria with the API
            920249
            Hi Andy, thank you for your reply. I've launched the same anonymous block, changing NAME_LIST to NAME_EXPR.
            I've been waiting for 20 minutes and it hasn't finished yet. I will post tomorrow, when I'm back at work. The strange thing is that it takes so long. I hope that at least it will finish correctly this time.

            If I run
             select * from dba_datapump_jobs;
            I can see my datapump job, and its state is "DEFINING"...
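
            A second session can also attach to a running job and read its state through the API. A rough sketch (the job and owner names are whichever ones dba_datapump_jobs shows, and serveroutput must be on to see the line):

            DECLARE
              h  NUMBER;
              js VARCHAR2(30);
              st sys.ku$_Status;
            BEGIN
              h := DBMS_DATAPUMP.ATTACH(job_name => 'SYS_EXPORT_TABLE_08', job_owner => 'MYSCHEMANAME');
              DBMS_DATAPUMP.GET_STATUS(h, DBMS_DATAPUMP.KU$_STATUS_JOB_STATUS, 0, js, st);
              DBMS_OUTPUT.PUT_LINE('Job state: ' || js);
              DBMS_DATAPUMP.DETACH(h);
            END;
            /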

            This is the current log file:
            table_dump.log
            Starting "MYSCHEMANAME"."SYS_EXPORT_TABLE_08":  
            Estimate in progress using BLOCKS method...
            Processing object type TABLE_EXPORT/TABLE/TABLE_DATA
            • 3. Re: data pump export single tables with specific criteria with the API
              Andy Klock
              The best time to diagnose what something is doing is while it is doing it :)

              Hopefully you get your dump file, but a 20-row dump should not take that long, so something is not right. Perhaps you are writing to NFS and you've stumbled onto this:

              Datapump Hanging When Exporting To NFS Location [ID 434508.1]

              But regardless, next time you kick this off take a look at v$session_wait and v$sesstat for this session to see what it is actually doing.
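
              Something like this (a sketch; :dp_sid is a bind placeholder for the Data Pump session's sid, which you can get from dba_datapump_sessions joined to v$session):

              SELECT event, wait_class, state, seconds_in_wait
                FROM v$session_wait
               WHERE sid = :dp_sid;

              SELECT n.name, ss.value
                FROM v$sesstat ss
                JOIN v$statname n ON n.statistic# = ss.statistic#
               WHERE ss.sid = :dp_sid
                 AND ss.value > 0
               ORDER BY ss.value DESC;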

              Or better yet, use Tanel Poder's Snapper :)

              http://tech.e2sn.com/oracle-scripts-and-tools/session-snapper

              Good luck.
              • 4. Re: data pump export single tables with specific criteria with the API
                920249
                Hi Andy,
                thank you again for your reply.
                Yesterday the job took at least 40 minutes; the output was ok, but again for only 20 rows in total.
                I have just run the same anonymous block to export the 20 rows and looked at the v$session_wait view. I found the sid of the job, and these are the values of some of the columns:
                Message: wait for unread message on broadcast channel
                Wait_Class: Idle
                Status: Waiting

                I searched for a solution and found this link. It says the cause can be a "100% full tablespace", but I don't know how to check or fix that.
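
                For reference, one way to check whether a tablespace is full on 10g (a sketch; it assumes access to the DBA views, otherwise the DBA can run it):

                SELECT tablespace_name, ROUND(used_percent, 1) AS used_pct
                  FROM dba_tablespace_usage_metrics
                 ORDER BY used_percent DESC;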
                • 5. Re: data pump export single tables with specific criteria with the API
                  Andy Klock
                  No worries at all, I live for this stuff. Do you have access to Oracle Support? There are some notes in MOS regarding this idle wait event and datapump exports. I strongly doubt it is an issue with a full tablespace, but you should be able to verify that.

                  I take it you are also not writing to NFS or some other type of network storage?
                  • 6. Re: data pump export single tables with specific criteria with the API
                    920249
                    Hi Andy,
                    no, it's not writing to NFS or any other type of network storage. I'm waiting for the DBA to get more information about this.
                    If I have any news I'll post it here.
                    • 7. Re: data pump export single tables with specific criteria with the API
                      Andy Klock
                      Sounds good. I thought you were the DBA :) Regardless, I'd open an SR. This seems pretty buggy to me.

                      Good luck.