This discussion is archived
3 Replies · Latest reply: Sep 10, 2013 6:22 AM by Dean Gagne

DATAPUMP for migration?

Nikoline Newbie

Hi to all,

 

We need to migrate our 10gR2 single-instance database on a conventional file system to a two-node 11gR2 RAC on ASM (on the same Windows Server platform…).

 

Could you please help me with this? How can I migrate my production database using Data Pump?

 

I have a full Data Pump export from the source, but I don't know how to import it: schema by schema, or as a full import? Do I need to manually create the tablespaces on the destination first? Should I exclude indexes, constraints, statistics…??

  • 1. Re: DATAPUMP for migration?
    TSharma-Oracle Guru

    You can do a full import. It might be easier for you to create the tablespaces first, but you don't have to. If your file structure is different (which it will be in your case), you can use the REMAP_DATAFILE parameter. You do not want to exclude constraints and indexes. For statistics, you should test this in your test environment.

     

    ORACLE-BASE - Oracle Data Pump (expdp and impdp) in Oracle Database 10g
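    A rough sketch of a full import with datafile remapping, as suggested above. The connect string, directory object, dump file name, and source path are all made-up placeholders; the only real parameters are FULL, DIRECTORY, DUMPFILE, LOGFILE, and REMAP_DATAFILE.

    ```shell
    :: Hypothetical example: adjust names and paths for your environment.
    :: REMAP_DATAFILE maps each source datafile path to a target location;
    :: pointing it at the ASM disk group (+DATA) lets Oracle generate the file.
    impdp system@rac11g full=y directory=DATA_PUMP_DIR ^
      dumpfile=full_prod.dmp logfile=full_imp.log ^
      remap_datafile='C:\oradata\prod\users01.dbf':'+DATA'
    ```

    One REMAP_DATAFILE entry is needed per source datafile, so pre-creating the tablespaces is often less typing when there are many files.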

  • 2. Re: DATAPUMP for migration?
    DK2010 Guru

    Hi,

     

    >> schema by schema, or as a full import? Do I need to manually create the tablespaces on the destination first? Should I exclude indexes, constraints, statistics…??

    1. You have both options; importing schema data may be easier. Add the parameter SCHEMAS=<list of schemas>.

    2. If you want, you can create the tablespaces on the target with the same names and sizes (each should be greater than the used size on the source), or you can use the REMAP_TABLESPACE option.

    3. If your database is big, you can exclude indexes and create them manually afterwards using the PARALLEL and NOLOGGING options for faster results.

    4. You can also exclude statistics; it is better to gather fresh stats after importing the data.

    Use CLUSTER=N during import for a RAC database.
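    Putting those steps together, a schema-mode import might look like the sketch below. The schema names, tablespace names, and connect string are assumptions, not taken from the thread.

    ```shell
    :: Hypothetical example: schema and tablespace names are placeholders.
    :: ^ is the Windows cmd line-continuation character.
    impdp system@rac11g directory=DATA_PUMP_DIR ^
      dumpfile=full_prod.dmp logfile=imp_schemas.log ^
      schemas=HR,SCOTT ^
      remap_tablespace=PROD_DATA:RAC_DATA ^
      exclude=index,statistics ^
      cluster=n
    ```

    With EXCLUDE=INDEX you would then build the indexes yourself afterwards (e.g. from a SQLFILE extraction of the index DDL) with PARALLEL and NOLOGGING, as step 3 suggests.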

    You can take command help from the link provided by TSharma.

    HTH

  • 3. Re: DATAPUMP for migration?
    Dean Gagne Expert

    Since you are changing to ASM, I would create the tablespaces before running import. Trying to do that within an import job can be difficult, especially since you are now going into ASM.
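    Pre-creating a tablespace on ASM can be a one-liner, since with Oracle Managed Files you only name the disk group. The disk group name, tablespace name, and size below are assumptions for illustration.

    ```shell
    :: Hypothetical example: run on the target node with the environment
    :: (ORACLE_HOME, ORACLE_SID) already set. +DATA is an assumed disk group.
    :: ASM generates the datafile name automatically.
    echo CREATE TABLESPACE users_data DATAFILE '+DATA' SIZE 10G AUTOEXTEND ON MAXSIZE UNLIMITED; | sqlplus / as sysdba
    ```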

     

    Everything else could be imported by running import full=y exclude=tablespaces ...

     

    The statistics can take a long time to import (depending on what you have in the dump file), so exclude=statistics is often worth considering. You can always regather them once the import is done.

     

    Indexes - I'm not sure why people say to skip them. If you run your job with parallel=x, Data Pump will create the indexes serially, but it will use multiple PQ slaves to build each index in parallel, up to the value 'x' supplied on the import command.
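    The full-import approach described above might be sketched like this; file and connect-string names are placeholders, and the tablespaces are assumed to exist already.

    ```shell
    :: Hypothetical example: tablespaces were pre-created, so they are excluded.
    :: parallel=4 lets Data Pump use up to 4 PQ slaves when building each index.
    impdp system@rac11g full=y directory=DATA_PUMP_DIR ^
      dumpfile=full_prod.dmp logfile=full_imp.log ^
      exclude=tablespace,statistics parallel=4
    ```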

     

    Good Luck

     

    Dean
