9 Replies - Latest reply: Nov 20, 2012 8:41 AM by khemmerl

how to clean dba_datapump_jobs

755353 Newbie
Hi
I'm receiving the errors below in the export log files:

ORA-39127: unexpected error from call to export_string :=SYS.DBMS_AQ_EXP_QUEUE_TABLES.instance_info_exp('T_MIE_ERROR_CHANNEL_TAB','DEV_AUD_0809',1,1,'11.01.00.00.00',newblock)
ORA-01403: no data found
ORA-01403: no data found
ORA-06512: at "SYS.DBMS_AQ_SYS_EXP_ACTIONS", line 90
ORA-06512: at "SYS.DBMS_AQ_EXP_QUEUE_TABLES", line 20
ORA-06512: at line 1

I have searched Metalink and found the document below:

ORA-39127 when executing concurrent expdp sessions. [ID 742789.1]

Workaround:
Avoid running concurrent expdp jobs.

Fix:
The issue is caused by Bug 7112024, which is fixed in 11.1.0.7.0.


I don't remember running multiple concurrent exports, but I can see some jobs hanging in dba_datapump_jobs:

SQL> select JOB_NAME,STATE from dba_datapump_jobs;

JOB_NAME STATE
------------------------------ ------------------------------
EXPDEV NOT RUNNING
SYS_EXPORT_SCHEMA_01 NOT RUNNING
IMPORT_USER000002 NOT RUNNING

Once I tried running the export as EXPDEV and it came back saying the table already exists...

How can I clear them before re-running the export?

The export has completed successfully, so do I really need to re-run it?
  • 1. Re: how to clean dba_datapump_jobs
    Srini Chavali-Oracle Oracle ACE Director
    Please post the full versions of your OS and database. Please see MOS Doc 336014.1 (How To Cleanup Orphaned DataPump Jobs In DBA_DATAPUMP_JOBS?); a rough sketch of the steps is below.

    HTH
    Srini
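
    (A minimal sketch of the kind of cleanup MOS Doc 336014.1 describes; the queries assume a DBA-privileged connection, and the drop of the leftover master table comes later in this thread:)

    SQL> -- stopped Data Pump jobs, together with the status of any leftover master table
    SQL> SELECT j.owner_name, j.job_name, j.state, o.status AS master_table_status
           FROM dba_datapump_jobs j
           LEFT OUTER JOIN dba_objects o
             ON o.owner = j.owner_name
            AND o.object_name = j.job_name
          WHERE j.state = 'NOT RUNNING';

    SQL> -- anything listed here is genuinely attached; NOT RUNNING jobs with no session are orphaned
    SQL> SELECT * FROM dba_datapump_sessions;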
  • 2. Re: how to clean dba_datapump_jobs
    755353 Newbie
    Linux1.com 2.6.9-34.ELsmp #1 SMP Fri Feb 24 16:54:53 EST 2006 i686 i686 i386 GNU/Linux

    Red Hat Enterprise Linux AS release 4 (Nahant Update 1)

    SQL> select * from v$version;

    BANNER
    --------------------------------------------------------------------------------
    Oracle Database 11g Enterprise Edition Release 11.1.0.6.0 - Production
    PL/SQL Release 11.1.0.6.0 - Production
    CORE 11.1.0.6.0 Production
    TNS for Linux: Version 11.1.0.6.0 - Production
    NLSRTL Version 11.1.0.6.0 - Production


    I have followed your document and cleaned the queue, but the problem hasn't gone away.
    I think the cause is not the queue, because the doc I mentioned (ORA-39127 when executing concurrent expdp sessions [ID 742789.1])
    doesn't have the same error description; it's just the error number that matches.

  • 3. Re: how to clean dba_datapump_jobs
    Srini Chavali-Oracle Oracle ACE Director
    Have you applied the patch that MOS Doc 742789.1 mentions?

    HTH
    Srini
  • 4. Re: how to clean dba_datapump_jobs
    755353 Newbie
    Not yet; I was trying to use the workaround technique, "Avoid running concurrent expdp jobs", as the first priority.

    I also have this error, covered in ORA-31693 / ORA-1466 Errors During Datapump Export Using Flashback [ID 1072871.1]:
    ORA-01466: unable to read data - table definition has changed
    ORA-31693: Table data object "DEV_PS_XP"."SCHEDULER$_PROGRAM_ARG" failed to load/unload and is being skipped due to error:

    So I wondered: could doing a startup restrict, instead of using flashback time to achieve a consistent export, resolve the issue? (A rough sketch of that approach is at the end of this post.)

    If not, I'll be left with no other option but to apply the patch fix.
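
    (A rough sketch of the restricted-startup alternative mentioned above, assuming a maintenance window is acceptable; the password, directory object, and dump/log file names are placeholders:)

    SQL> SHUTDOWN IMMEDIATE
    SQL> STARTUP RESTRICT    -- only users with the RESTRICTED SESSION privilege can connect

    $ expdp system/<password> schemas=DEV_PS_XP directory=DATA_PUMP_DIR dumpfile=dev_ps_xp.dmp logfile=dev_ps_xp.log
      (no FLASHBACK_TIME / FLASHBACK_SCN needed, since nothing else can modify the tables)

    SQL> ALTER SYSTEM DISABLE RESTRICTED SESSION;    -- return the database to normal use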
  • 5. Re: how to clean dba_datapump_jobs
    755353 Newbie
    The patch mentioned in MOS Doc 742789.1 only has an option to select HP-UX, but my OS is Linux. Is there a similar one for Linux?

    thank you
  • 6. Re: how to clean dba_datapump_jobs
    722368 Explorer
    Hi,

    Find the jobs which are in 'NOT RUNNING' state. For every Data Pump job, Oracle creates a master table; in this example you can find the table sys.SYS_EXPORT_TABLE_01. Make sure there are no jobs running, using v$datapump_session, then drop that table (a minimal sketch of the drop is at the end of this reply). Then you can try to run expdp/impdp again.

    SQL> select * from dba_datapump_jobs ;

    OWNER_NAME  JOB_NAME             OPERATION  JOB_MODE  STATE        DEGREE  ATTACHED_SESSIONS  DATAPUMP_SESSIONS
    ----------  -------------------  ---------  --------  -----------  ------  -----------------  -----------------
    SYS         SYS_EXPORT_TABLE_01  EXPORT     TABLE     NOT RUNNING       0                  0                  0


    1 row selected.

    SQL> desc sys.SYS_EXPORT_TABLE_01

    SQL> select * from v$datapump_session ;

    no rows selected

    Thanks,
    Siva
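
    (A minimal sketch of the drop step described above, to be run only after v$datapump_session / dba_datapump_sessions confirms nothing is attached to the job:)

    SQL> -- Data Pump recreates the master table on the next run; PURGE keeps the
    SQL> -- dropped table out of the recycle bin
    SQL> DROP TABLE sys.SYS_EXPORT_TABLE_01 PURGE;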
  • 7. Re: how to clean dba_datapump_jobs
    755353 Newbie
    Hi Siva,
    I have already done this and re-ran the expdp, but it's come back with the same error.
  • 8. Re: how to clean dba_datapump_jobs
    755353 Newbie
    I just had a closer look at the error message, and I can see the schema name it's complaining about in the error log:

    ORA-39127: unexpected error from call to export_string :=SYS.DBMS_AQ_EXP_QUEUE_TABLES.instance_info_exp('T_MIE_ERROR_CHANNEL_TAB','DEV_AUD_0809',1,1,'11.01.00.00.00',newblock)


    That's DEV_AUD_0809; I am exporting this schema using the old exp tool! Assuming that export works for this schema, can I use a combination of the dump files generated by expdp and imp for the import, or do I need to retake the expdp because of the error?

    thanks
  • 9. Re: how to clean dba_datapump_jobs
    khemmerl Newbie
    I also have run into the problem with datapump exports failing with the following errors:
    ORA-31693: Table data object "<SCHEMA>"."SCHEDULER$_JOB_ARG" failed to load/unload
    and is being skipped due to error:
    ORA-01466: unable to read data - table definition has changed
    In my case, SCHEDULER$_JOB_ARG is not a normal application table and actually does not exist when I try to look for it, so I assume that DBMS_SCHEDULER is creating and dropping this table as needed. Since I don't need this table as part of the export of my application, I was able to add the following to the Data Pump export parameter file to fix the expdp:

    exclude=TABLE:"='SCHEDULER$_JOB_ARG'"
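
    (For context, a minimal parameter file sketch showing where this exclusion sits; the schema, directory, and file names are placeholders:)

    schemas=MY_SCHEMA
    directory=DATA_PUMP_DIR
    dumpfile=my_schema.dmp
    logfile=my_schema.log
    exclude=TABLE:"='SCHEDULER$_JOB_ARG'"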
