24 Replies Latest reply: Mar 4, 2010 1:04 PM by sb92075
      • 15. Re: expdp and impdp  doubt
        678145
        Then first, as Justin already recommended, check that the file was copied to the right place on the source db server.
        Check the directory DATA_PUMP_DIR (select * from dba_directories).
        Go to that path and check that the file exists and the permissions are OK.

        Check the dump file size on the source and target servers. They should be equal.
        Check the checksums of the files - they should be equal as well.
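The checks above can be sketched as a shell snippet. The directory path and file name below are illustrative stand-ins, not values from this thread; in practice substitute the path returned by dba_directories:

```shell
# Stand-in for the path shown by:
#   select directory_path from dba_directories
#   where directory_name = 'DATA_PUMP_DIR';
DUMP_DIR=/tmp/demo_dpdump
mkdir -p "$DUMP_DIR"
printf 'demo dump contents' > "$DUMP_DIR/expdat.dmp"   # stand-in dump file

# File exists and is readable by the OS user running impdp?
[ -r "$DUMP_DIR/expdat.dmp" ] && echo "file readable"

# Record the byte size; run the same command on the other server and
# compare -- the two sizes should be identical.
stat -c '%s bytes' "$DUMP_DIR/expdat.dmp"
```

Running the same two commands on both servers and comparing the outputs is the quickest way to spot a truncated copy.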
        • 16. Re: expdp and impdp  doubt
          693720
          Thanks,
          Yes Justin and Laura, I have placed the dump file in DATA_PUMP_DIR. I have checked the location in (select * from dba_directories), no doubt.


          The permissions are OK.

          As for the file size, there is a little variation between the source file and the target file.

          this is my logfile entries :

          ;;;
          Import: Release 10.2.0.3.0 - 64bit Production on Monday, 30 March, 2009 15:44:14

          Copyright (c) 2003, 2005, Oracle. All rights reserved.
          ;;;
          Connected to: Oracle Database 10g Enterprise Edition Release 10.2.0.3.0 - 64bit Production
          With the Partitioning, OLAP and Data Mining options
          ORA-31694: master table "SYSTEM"."SYS_IMPORT_FULL_01" failed to load/unload
          ORA-02354: error in exporting/importing data
          ORA-02368: file is not valid for this load operation

          Note: if the file is corrupted, would we get this kind of error?
          • 17. Re: expdp and impdp  doubt
            678145
            akbar_bcait@yahoo.co.in wrote:
            Thanks,
            Yes Justin and Laura, I have placed the dump file in DATA_PUMP_DIR. I have checked the location in (select * from dba_directories), no doubt.


            The permissions are OK.

            As for the file size, there is a little variation between the source file and the target file.
            This is suspicious. Why is the file size different?
            What about the checksum - is it different as well?
            Then you don't have a complete dump file.
            • 18. Re: expdp and impdp  doubt
              693720
              Dear Laura,

              Can you explain more about the checksum?

              How do I check the file's checksum?

              thanks
              • 19. Re: expdp and impdp  doubt
                678145
                You can use the md5sum utility on Linux (md5sum - compute and check MD5 message digest).
                If the file is big it will take some time to compute, but check it anyway.
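A minimal sketch of the comparison. The files here are local stand-ins so the snippet is self-contained; in practice you run md5sum against the dump file on each server and compare the printed digests:

```shell
# Create a stand-in "source" file and copy it as the "target".
printf 'dummy dump data' > /tmp/expdat_source.dmp
cp /tmp/expdat_source.dmp /tmp/expdat_target.dmp

# Compute the MD5 digest of each copy (first field of md5sum output).
src=$(md5sum /tmp/expdat_source.dmp | awk '{print $1}')
tgt=$(md5sum /tmp/expdat_target.dmp | awk '{print $1}')

# Equal digests mean the copy is byte-identical; a mismatch means the
# transfer was incomplete or corrupted (e.g. FTP in ASCII mode).
if [ "$src" = "$tgt" ]; then
  echo "checksums match"
else
  echo "checksums differ - recopy the dump file in binary mode"
fi
```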
                • 20. Re: expdp and impdp  doubt
                  jgarry
                  $ oerr ora 2368
                  02368, 00000, "file %s is not valid for this load operation\n"
                  // *Cause: The specified file could not be used for this load because the
                  // internal header and/or table metadata of this file were not
                  // consistent with those of the first file listed in the DUMPFILE
                  // clause.
                  // *Action: Verify all the files listed in the DUMPFILE clause are from
                  // the same unload operation.


                  Check to be sure you haven't copied files from more than one export, forgetting to get rid of the old ones.
                  • 21. Re: expdp and impdp  doubt
                    693795
                    Have you tried a network-mode import?

                    http://download.oracle.com/docs/cd/B28359_01/server.111/b28319/dp_import.htm#i1006584
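A network-mode import pulls the data over a database link, so no dump file (and hence no file corruption) is involved. A minimal sketch, assuming a database link named SOURCE_DB and the scott schema - both illustrative names, not from this thread:

```
# net_imp.par -- illustrative Data Pump parameter file
NETWORK_LINK=SOURCE_DB
SCHEMAS=scott
LOGFILE=net_imp.log
```

Run it with impdp system/password PARFILE=net_imp.par on the target database.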
                    • 22. Dump File Validation: imp SHOW=Y   -->   impdp ????
                      user8124770
                      With imp, you could issue SHOW=Y to validate the dump file. (i.e. Attempt to use dump, but don't actually load it.)

                      Is there an equivalent command that can be used with impdp to validate a dump file without actually loading it in the database?
                      • 23. Re: expdp and impdp  doubt
                        Marcus2014
                        With the legacy imp/exp, this type of error was usually indicative of using the wrong executable from a different Oracle Home. Make sure you are running impdp from the correct home: cd to the <oracle home>\bin directory of the DB you are importing into.
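That check can be sketched in the shell. The ORACLE_HOME below is a throwaway stand-in with a fake impdp binary, created only so the snippet is self-contained; with a real installation you would just inspect `command -v impdp` against the intended home:

```shell
# Stand-in ORACLE_HOME with a fake impdp binary, so the path logic can
# be demonstrated without a real Oracle installation.
ORACLE_HOME=/tmp/demo_oracle_home
mkdir -p "$ORACLE_HOME/bin"
: > "$ORACLE_HOME/bin/impdp" && chmod +x "$ORACLE_HOME/bin/impdp"
PATH="$ORACLE_HOME/bin:$PATH"

# Does the impdp the shell would run live under this ORACLE_HOME?
case "$(command -v impdp)" in
  "$ORACLE_HOME"/*) echo "impdp resolves under ORACLE_HOME" ;;
  *) echo "wrong home - invoke \$ORACLE_HOME/bin/impdp explicitly" ;;
esac
```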
                        • 24. Re: Dump File Validation: imp SHOW=Y   -->   impdp ????
                          sb92075
                          Is there an equivalent command that can be used with impdp to validate a dump file without actually loading it in the database?
                          impdp help=yes
                          
                          Import: Release 10.2.0.1.0 - Production on Thursday, 04 March, 2010 11:01:55
                          
                          Copyright (c) 2003, 2005, Oracle.  All rights reserved.
                          
                          
                          The Data Pump Import utility provides a mechanism for transferring data objects
                          between Oracle databases. The utility is invoked with the following command:
                          
                               Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
                          
                          You can control how Import runs by entering the 'impdp' command followed
                          by various parameters. To specify parameters, you use keywords:
                          
                               Format:  impdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
                               Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
                          
                          USERID must be the first parameter on the command line.
                          
                          Keyword               Description (Default)
                          ------------------------------------------------------------------------------
                          ATTACH                Attach to existing job, e.g. ATTACH [=job name].
                          CONTENT               Specifies data to load where the valid keywords are:
                                                (ALL), DATA_ONLY, and METADATA_ONLY.
                          DIRECTORY             Directory object to be used for dump, log, and sql files.
                          DUMPFILE              List of dumpfiles to import from (expdat.dmp),
                                                e.g. DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
                          ENCRYPTION_PASSWORD   Password key for accessing encrypted column data.
                                                This parameter is not valid for network import jobs.
                          ESTIMATE              Calculate job estimates where the valid keywords are:
                                                (BLOCKS) and STATISTICS.
                          EXCLUDE               Exclude specific object types, e.g. EXCLUDE=TABLE:EMP.
                          FLASHBACK_SCN         SCN used to set session snapshot back to.
                          FLASHBACK_TIME        Time used to get the SCN closest to the specified time.
                          FULL                  Import everything from source (Y).
                          HELP                  Display help messages (N).
                          INCLUDE               Include specific object types, e.g. INCLUDE=TABLE_DATA.
                          JOB_NAME              Name of import job to create.
                          LOGFILE               Log file name (import.log).
                          NETWORK_LINK          Name of remote database link to the source system.
                          NOLOGFILE             Do not write logfile.
                          PARALLEL              Change the number of active workers for current job.
                          PARFILE               Specify parameter file.
                          QUERY                 Predicate clause used to import a subset of a table.
                          REMAP_DATAFILE        Redefine datafile references in all DDL statements.
                          REMAP_SCHEMA          Objects from one schema are loaded into another schema.
                          REMAP_TABLESPACE      Tablespace object are remapped to another tablespace.
                          REUSE_DATAFILES       Tablespace will be initialized if it already exists (N).
                          SCHEMAS               List of schemas to import.
                          SKIP_UNUSABLE_INDEXES Skip indexes that were set to the Index Unusable state.
                          SQLFILE               Write all the SQL DDL to a specified file.
                          STATUS                Frequency (secs) job status is to be monitored where
                                                the default (0) will show new status when available.
                          STREAMS_CONFIGURATION Enable the loading of Streams metadata
                          TABLE_EXISTS_ACTION   Action to take if imported object already exists.
                                                Valid keywords: (SKIP), APPEND, REPLACE and TRUNCATE.
                          TABLES                Identifies a list of tables to import.
                          TABLESPACES           Identifies a list of tablespaces to import.
                          TRANSFORM             Metadata transform to apply to applicable objects.
                                                Valid transform keywords: SEGMENT_ATTRIBUTES, STORAGE
                                                OID, and PCTSPACE.
                          TRANSPORT_DATAFILES   List of datafiles to be imported by transportable mode.
                          TRANSPORT_FULL_CHECK  Verify storage segments of all tables (N).
                          TRANSPORT_TABLESPACES List of tablespaces from which metadata will be loaded.
                                                Only valid in NETWORK_LINK mode import operations.
                          VERSION               Version of objects to export where valid keywords are:
                                                (COMPATIBLE), LATEST, or any valid database version.
                                                Only valid for NETWORK_LINK and SQLFILE.
                          
                          The following commands are valid while in interactive mode.
                          Note: abbreviations are allowed
                          
                          Command               Description (Default)
                          ------------------------------------------------------------------------------
                          CONTINUE_CLIENT       Return to logging mode. Job will be re-started if idle.
                          EXIT_CLIENT           Quit client session and leave job running.
                          HELP                  Summarize interactive commands.
                          KILL_JOB              Detach and delete job.
                          PARALLEL              Change the number of active workers for current job.
                                                PARALLEL=<number of workers>.
                          START_JOB             Start/resume current job.
                                                START_JOB=SKIP_CURRENT will start the job after skipping
                                                any action which was in progress when job was stopped. 
                          STATUS                Frequency (secs) job status is to be monitored where
                                                the default (0) will show new status when available.
                                                STATUS[=interval]
                          STOP_JOB              Orderly shutdown of job execution and exits the client.
                                                STOP_JOB=IMMEDIATE performs an immediate shutdown of the
                                                Data Pump job.
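Of the keywords in the help text above, SQLFILE is the closest impdp analogue to the legacy imp SHOW=Y: it reads the dump file and writes the DDL it would execute to a file, without loading anything into the database. A hypothetical invocation (the names are examples, not from this thread):

```
impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp SQLFILE=ddl.sql
```

Because this has to read the dump's master table, a corrupt or truncated dump should fail here too, so it can serve as a rough validity check for the file.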