3 Replies Latest reply: Dec 17, 2013 4:29 PM by AAlsofyani

    data pump issue.

    AAlsofyani

      Dears,

      I used the Data Pump utility to import a dump file through the TOAD wizard, but I cannot find the imported tables under the schema I used (sys).

      Note: the job completed successfully.

      Parameter file:

      DUMPFILE="reg_sas1.dmp"
      LOGFILE="imp_reg_sas1.log"
      DIRECTORY=DATA_PUMP_DIR
      SQLFILE=test.sql
      CONTENT=ALL
      JOB_NAME='test'
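
      For reference, a parameter file like this would normally be handed to impdp on the command line. A minimal sketch, assuming the parfile is saved as reg_sas1.par and using placeholder credentials (the post ran the import through the TOAD wizard instead):

      # hypothetical invocation; the parfile name and credentials are assumptions
      impdp system/password PARFILE=reg_sas1.par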

        • 1. Re: data pump issue.
          TSharma-Oracle

          Your tables were not imported because you used the SQLFILE parameter. You will find the SQL file, containing all the DDL statements, in the directory you specified. If you want to import the tables, you need to take the SQLFILE parameter out of the par file. (I have never used this utility through TOAD, though.)

          SQLFILE=file_name: The file_name specifies where the import job will write the DDL that would be executed during the job. The SQL is not actually executed, and the target system remains unchanged. The file is written to the directory object specified in the DIRECTORY parameter, unless another directory_object is explicitly specified here. (Source: Oracle documentation)


          Check the DATA_PUMP_DIR directory to see the contents of the SQL file. Again, you need to remove the SQLFILE parameter if you want your tables to be imported.
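
          For illustration, a corrected parameter file might look like the sketch below; it simply copies the values from the original post with the SQLFILE line removed:

          # same parfile as the original, minus SQLFILE, so the import actually runs
          DUMPFILE="reg_sas1.dmp"
          LOGFILE="imp_reg_sas1.log"
          DIRECTORY=DATA_PUMP_DIR
          CONTENT=ALL
          JOB_NAME='test'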

          • 2. Re: data pump issue.
            AAlsofyani

            Thanks, that helped!

            I faced another error: do I need to create a user with the same name as the schema in the dump file?

            • 3. Re: data pump issue.
              TSharma-Oracle

              It depends on what is in the dump file.

              If it contains just tables, then yes, the user must already exist in the database, or you can use the REMAP_SCHEMA parameter to import those tables into another user's schema, as in the sketch below. You can check the Data Pump documentation for these parameters.
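
              A minimal sketch of the second option, assuming the dump was exported from a schema named reg_sas and should be imported into a schema named new_owner (both schema names and the credentials are illustrative assumptions):

              # remap the source schema onto a different target schema (hypothetical names and credentials)
              impdp system/password DUMPFILE=reg_sas1.dmp DIRECTORY=DATA_PUMP_DIR REMAP_SCHEMA=reg_sas:new_owner LOGFILE=imp_remap.log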