5 Replies. Latest reply: Feb 6, 2013 3:56 PM by Dean Gagne

Export and Import data or metadata only

TheHades0210 Newbie
Aloha,

I need to update an instance with data only, using a full dump. What parameter should I use on import? And how can I drop the schema without touching the metadata and database objects?

Thanks in advance.

Hades
  • 1. Re: Export and Import data or metadata only
    asahide Expert
    Hi,

    What do you mean by "update"?

    Or do you mean imp ROWS=N or impdp CONTENT=METADATA_ONLY? See the sketches below the links.

    <<http://docs.oracle.com/cd/E11882_01/server.112/e22490/original_import.htm#autoId27>>
    <<http://docs.oracle.com/cd/E11882_01/server.112/e22490/dp_import.htm>>
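
    For reference, a sketch of each (scott/tiger, dmpdir, and expdat.dmp are placeholders for your own credentials, directory object, and dump file):

    # imp ROWS=N loads no table rows, i.e. metadata only
    imp scott/tiger FILE=expdat.dmp FULL=Y ROWS=N

    # Data Pump: CONTENT controls whether metadata, data, or both are loaded
    impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=expdat.dmp CONTENT=METADATA_ONLY
    impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=expdat.dmp CONTENT=DATA_ONLY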

    Regards,
  • 2. Re: Export and Import data or metadata only
    TheHades0210 Newbie
    Hi,

    Yes, I will be importing from a full export dump, but I need to import the data only; the existing database objects and metadata on the instance should not be touched.

    Regards,

    Hades
  • 3. Re: Export and Import data or metadata only
    sb92075 Guru
    TheHades0210 wrote:
    Hi,

    Yes, I will be importing from a full export dump, but I need to import the data only; the existing database objects and metadata on the instance should not be touched.

    Regards,

    Hades
    easy as pie
    [oracle@localhost ~]$ impdp help=yes
    
    Import: Release 11.2.0.2.0 - Production on Mon Feb 4 19:57:10 2013
    
    Copyright (c) 1982, 2009, Oracle and/or its affiliates.  All rights reserved.
    
    
    The Data Pump Import utility provides a mechanism for transferring data objects
    between Oracle databases. The utility is invoked with the following command:
    
         Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
    
    You can control how Import runs by entering the 'impdp' command followed
    by various parameters. To specify parameters, you use keywords:
    
         Format:  impdp KEYWORD=value or KEYWORD=(value1,value2,...,valueN)
         Example: impdp scott/tiger DIRECTORY=dmpdir DUMPFILE=scott.dmp
    
    USERID must be the first parameter on the command line.
    
    ------------------------------------------------------------------------------
    
    The available keywords and their descriptions follow. Default values are listed within square brackets.
    
    ATTACH
    Attach to an existing job.
    For example, ATTACH=job_name.
    
    CLUSTER
    Utilize cluster resources and distribute workers across the Oracle RAC.
    Valid keyword values are: [Y] and N.
    
    CONTENT
    Specifies data to load.
    Valid keywords are: [ALL], DATA_ONLY and METADATA_ONLY.
    
    DATA_OPTIONS
    Data layer option flags.
    Valid keywords are: SKIP_CONSTRAINT_ERRORS.
    
    DIRECTORY
    Directory object to be used for dump, log and SQL files.
    
    DUMPFILE
    List of dump files to import from [expdat.dmp].
    For example, DUMPFILE=scott1.dmp, scott2.dmp, dmpdir:scott3.dmp.
    
    ENCRYPTION_PASSWORD
    Password key for accessing encrypted data within a dump file.
    Not valid for network import jobs.
    
    ESTIMATE
    Calculate job estimates.
    Valid keywords are: [BLOCKS] and STATISTICS.
    
    EXCLUDE
    Exclude specific object types.
    For example, EXCLUDE=SCHEMA:"='HR'".
    
    FLASHBACK_SCN
    SCN used to reset session snapshot.
    
    FLASHBACK_TIME
    Time used to find the closest corresponding SCN value.
    
    FULL
    Import everything from source [Y].
    
    HELP
    Display help messages [N].
    
    INCLUDE
    Include specific object types.
    For example, INCLUDE=TABLE_DATA.
    
    JOB_NAME
    Name of import job to create.
    
    LOGFILE
    Log file name [import.log].
    
    NETWORK_LINK
    Name of remote database link to the source system.
    
    NOLOGFILE
    Do not write log file [N].
    
    PARALLEL
    Change the number of active workers for current job.
    
    PARFILE
    Specify parameter file.
    
    PARTITION_OPTIONS
    Specify how partitions should be transformed.
    Valid keywords are: DEPARTITION, MERGE and [NONE].
    
    QUERY
    Predicate clause used to import a subset of a table.
    For example, QUERY=employees:"WHERE department_id > 10".
    
    REMAP_DATA
    Specify a data conversion function.
    For example, REMAP_DATA=EMP.EMPNO:REMAPPKG.EMPNO.
    
    REMAP_DATAFILE
    Redefine data file references in all DDL statements.
    
    REMAP_SCHEMA
    Objects from one schema are loaded into another schema.
    
    REMAP_TABLE
    Table names are remapped to another table.
    For example, REMAP_TABLE=HR.EMPLOYEES:EMPS.
    
    REMAP_TABLESPACE
    Tablespace objects are remapped to another tablespace.
    
    REUSE_DATAFILES
    Tablespace will be initialized if it already exists [N].
    
    SCHEMAS
    List of schemas to import.
    
    SERVICE_NAME
    Name of an active Service and associated resource group to constrain Oracle RAC resources.
    
    SKIP_UNUSABLE_INDEXES
    Skip indexes that were set to the Index Unusable state.
    
    SOURCE_EDITION
    Edition to be used for extracting metadata.
    
    SQLFILE
    Write all the SQL DDL to a specified file.
    
    STATUS
    Frequency (secs) job status is to be monitored where
    the default [0] will show new status when available.
    
    STREAMS_CONFIGURATION
    Enable the loading of Streams metadata
    
    TABLE_EXISTS_ACTION
    Action to take if imported object already exists.
    Valid keywords are: APPEND, REPLACE, [SKIP] and TRUNCATE.
    
    TABLES
    Identifies a list of tables to import.
    For example, TABLES=HR.EMPLOYEES,SH.SALES:SALES_1995.
    
    TABLESPACES
    Identifies a list of tablespaces to import.
    
    TARGET_EDITION
    Edition to be used for loading metadata.
    
    TRANSFORM
    Metadata transform to apply to applicable objects.
    Valid keywords are: OID, PCTSPACE, SEGMENT_ATTRIBUTES and STORAGE.
    
    TRANSPORTABLE
    Options for choosing transportable data movement.
    Valid keywords are: ALWAYS and [NEVER].
    Only valid in NETWORK_LINK mode import operations.
    
    TRANSPORT_DATAFILES
    List of data files to be imported by transportable mode.
    
    TRANSPORT_FULL_CHECK
    Verify storage segments of all tables [N].
    
    TRANSPORT_TABLESPACES
    List of tablespaces from which metadata will be loaded.
    Only valid in NETWORK_LINK mode import operations.
    
    VERSION
    Version of objects to import.
    Valid keywords are: [COMPATIBLE], LATEST or any valid database version.
    Only valid for NETWORK_LINK and SQLFILE.
    
    ------------------------------------------------------------------------------
    
    The following commands are valid while in interactive mode.
    Note: abbreviations are allowed.
    
    CONTINUE_CLIENT
    Return to logging mode. Job will be restarted if idle.
    
    EXIT_CLIENT
    Quit client session and leave job running.
    
    HELP
    Summarize interactive commands.
    
    KILL_JOB
    Detach and delete job.
    
    PARALLEL
    Change the number of active workers for current job.
    
    START_JOB
    Start or resume current job.
    Valid keywords are: SKIP_CURRENT.
    
    STATUS
    Frequency (secs) job status is to be monitored where
    the default [0] will show new status when available.
    
    STOP_JOB
    Orderly shutdown of job execution and exits the client.
    Valid keywords are: IMMEDIATE.
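
    For what you asked (load the data only and leave the existing objects and metadata alone), the relevant keywords above are CONTENT and TABLE_EXISTS_ACTION. A sketch, with placeholder credentials, directory object, and dump file name:

    # DATA_ONLY loads rows without touching object definitions;
    # TRUNCATE clears existing rows before loading, APPEND would keep them
    impdp system/manager DIRECTORY=dpump_dir DUMPFILE=full.dmp FULL=Y CONTENT=DATA_ONLY TABLE_EXISTS_ACTION=TRUNCATE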
  • 4. Re: Export and Import data or metadata only
    TheHades0210 Newbie
    Hi,

    Before a metadata-only import, do I need to drop the schema? If so, what DROP USER command do I need to use? Is "drop user username cascade;" enough?

    Regards,
  • 5. Re: Export and Import data or metadata only
    Dean Gagne Expert
    If you want to drop the schema, then yes, you can use:

    drop user <your_schema_here> cascade;

    You need to make sure you are a privileged user. If an unprivileged user exported their objects, then the "user" object is not part of the dump file; you need to be privileged to get that object. Also, only Data Pump exports the user object in a schema-mode export.

    ex: expdp unprived_user/pwd directory=dpump_dir dumpfile=mydump.dmp

    This would not get the user object, so you need to create the user before running the import.
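
    In that case, pre-creating the user might look like this (hypothetical name, password, and privileges; adjust to your environment):

    -- hypothetical account; grant whatever the imported objects actually need
    create user unprived_user identified by pwd default tablespace users;
    grant create session, create table to unprived_user;
    alter user unprived_user quota unlimited on users;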

    ex: expdp prived_user/pwd directory=dpump_dir dumpfile=user1.dmp schemas=user1

    This would export the user1 user object, so during import you do not have to pre-create the user; it will be created as part of the import.
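
    The corresponding import, reusing the same placeholder names, would be along these lines:

    ex: impdp prived_user/pwd directory=dpump_dir dumpfile=user1.dmp schemas=user1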

    Hope this helps.

    Dean
