This discussion is archived
2 Replies. Latest reply: Sep 5, 2013 6:15 AM by gsmith

datapump - expdp.open creates tables in schema

gsmith Newbie

Hi

I am using Data Pump in Oracle 10g to archive old partitions from the main schema into another schema.

I notice that when DBMS_DATAPUMP.OPEN is called, a new table is created by DBMS_DATAPUMP for internal purposes. This is confirmed in the Oracle documentation:

http://docs.oracle.com/cd/B12037_01/appdev.101/b10802/d_datpmp.htm#997383

 

Usage Notes

  • When the job is created, a master table is created for the job under the caller's schema within the caller's default tablespace. A handle referencing the job is returned that attaches the current session to the job. Once attached, the handle remains valid until either an explicit or implicit detach occurs. The handle is only valid in the caller's session. Other handles can be attached to the same job from a different session by using the ATTACH procedure.

 

 

Does anybody know whether this table can be removed by a "cleanup" DBMS_DATAPUMP call, or whether it has to be cleaned up manually?
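
For reference, this is how I have been spotting the leftover table (a quick sketch; the master table name matches the job name passed to OPEN, and SIS_MAIN is just my schema):

    -- Jobs whose master table is still lying around show up as NOT RUNNING
    SELECT owner_name, job_name, operation, job_mode, state
    FROM   dba_datapump_jobs
    WHERE  state = 'NOT RUNNING';

    -- The master table itself is an ordinary table in the job owner's schema, named after the job
    SELECT owner, table_name
    FROM   dba_tables
    WHERE  (owner, table_name) IN (SELECT owner_name, job_name FROM dba_datapump_jobs);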

  • 1. Re: datapump - expdp.open creates tables in schema
    gsmith Newbie

    I think I have found the answer.

    You have to call DBMS_DATAPUMP.STOP_JOB even if the job has already stopped, and set the keep_master parameter to 0.
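
    For an orphaned job that you are no longer attached to, the same cleanup should work after re-attaching. A minimal sketch (the job name and owner here are placeholders, not from my actual code):

        DECLARE
          v_handle NUMBER;
        BEGIN
          -- Re-attach to the stopped job; the master table has the same name as the job
          v_handle := DBMS_DATAPUMP.ATTACH('MY_OLD_EXPORT_JOB', 'SIS_MAIN');
          -- immediate => 0, keep_master => 0: stopping with keep_master = 0 drops the master table
          DBMS_DATAPUMP.STOP_JOB(v_handle, 0, 0);
        END;
        /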

  • 2. Re: datapump - expdp.open creates tables in schema
    gsmith Newbie

    I can confirm that's what you do. Here is the relevant part of my procedure:


    v_job_handle := DBMS_DATAPUMP.OPEN('EXPORT', 'TABLE', NULL, v_job_name);

    -- Set parallelism to 1 and add the dump file
    DBMS_DATAPUMP.SET_PARALLEL(v_job_handle, 1);
    DBMS_DATAPUMP.ADD_FILE(v_job_handle, v_job_name || '_' || v_partition.partition_name || '.dmp', 'PARTITION_DUMPS');

    -- Restrict the job to the schema and table being archived...
    DBMS_DATAPUMP.METADATA_FILTER(v_job_handle, 'SCHEMA_EXPR', 'IN (''SIS_MAIN'')');
    DBMS_DATAPUMP.METADATA_FILTER(v_job_handle, 'NAME_EXPR', 'LIKE ''' || t_archive_list(i) || '''');

    -- ...and to the single partition being exported
    DBMS_DATAPUMP.DATA_FILTER(v_job_handle, 'PARTITION_EXPR', 'IN (''' || v_partition.partition_name || ''')', t_archive_list(i), 'SIS_MAIN');

    -- Use statistics (rather than blocks) for the size estimate
    DBMS_DATAPUMP.SET_PARAMETER(v_job_handle, 'ESTIMATE', 'STATISTICS');

    -- Start the job. An exception is raised if something is not set up properly.
    DBMS_DATAPUMP.START_JOB(v_job_handle);

    -- The export job should now be running. Loop until it is finished.
    v_percent   := 0;
    v_job_state := 'UNDEFINED';

    WHILE (v_job_state != 'COMPLETED') AND (v_job_state != 'STOPPED') LOOP
      DBMS_DATAPUMP.GET_STATUS(v_job_handle,
                               DBMS_DATAPUMP.ku$_status_job_error +
                               DBMS_DATAPUMP.ku$_status_job_status +
                               DBMS_DATAPUMP.ku$_status_wip,
                               -1, v_job_state, sts);
      js := sts.job_status;

      -- Track the percent-done value as it changes
      IF js.percent_done != v_percent THEN
        v_percent := js.percent_done;
      END IF;
    END LOOP;

    -- When the job finishes, log the status before detaching from the job
    PRC_LOG(f1, t_archive_list(i) || ': Export complete with status: ' || v_job_state);

    -- DBMS_DATAPUMP.DETACH(v_job_handle);
    -- Use STOP_JOB instead of DETACH, otherwise the master table created by OPEN is not removed.
    -- The third argument (keep_master = 0) is what drops it.
    DBMS_DATAPUMP.STOP_JOB(v_job_handle, 0, 0);
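
    For completeness, the snippet assumes declarations roughly like the following elsewhere in the procedure (just a sketch; t_archive_list, v_partition, f1 and PRC_LOG are my own cursor and logging plumbing, not part of DBMS_DATAPUMP):

        v_job_handle NUMBER;          -- handle returned by DBMS_DATAPUMP.OPEN
        v_job_name   VARCHAR2(30);    -- job name, which is also the master table name
        v_job_state  VARCHAR2(30);
        v_percent    NUMBER;
        sts          ku$_Status;      -- status object populated by GET_STATUS
        js           ku$_JobStatus;   -- the job_status member of sts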
