If you are moving only one schema out of many, you can use the Data Pump utility to move the data. Make sure you use the PARALLEL parameter while doing the export and import. Data Pump is much faster than conventional export or import.
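For example, a parameter file for a parallel schema-mode export might look like this (the schema name, directory object, and degree of parallelism below are made up for illustration; %U makes Data Pump write a separate file per worker):

```
# Run as: expdp system PARFILE=exp_schema.par
SCHEMAS=SCOTT
DIRECTORY=dp_dir
DUMPFILE=scott_%U.dmp
LOGFILE=scott_exp.log
PARALLEL=4
FILESIZE=50G
```

Note that a PARALLEL degree greater than 1 is only honored on Enterprise Edition, and you generally want multiple dump files (the %U wildcard) so the workers do not serialize on a single file.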
Here is the link for your help.
Is your testing environment on the same machine or a different one?
If it is a different machine, is it set up like the production machine (same OS, same Oracle version)?
Basically, Data Pump is a fairly slow method at this size. (For 2 TB, it will take almost a day to export and import. Correct me if I'm wrong.)
Is there only one schema on the production machine, or multiple?
If multiple, is the total of all schemas 2000 GB, or is any one schema 2000 GB by itself?
Do you want to copy the data online or offline?
Thanks for the quick responses and great links.
I have to copy one schema of 2000 GB on the same machine, and I can do it overnight (which means it can be done offline).
Actually, I have no experience with a schema of this size, so I have to estimate the time required and find the best technique.
1. Do you know how to estimate the time required?
2. Is there any other technique to copy a schema?
3. If I have to copy the schema to another machine (identical HW and SW), is there another technique?
1. We can't estimate the exact time for any method (it depends on various factors such as network speed and HW/SW configuration), but Data Pump will take a day or more to export and import 2000 GB of data.
2, 3. Yes: Transportable Tablespaces. With this method you export only the metadata from the source schema, and the rest of the data (the tablespace datafiles) is moved in a plug-and-play fashion.
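A rough outline of the transportable-tablespace flow (the tablespace name, directory object, and datafile path here are made up for illustration):

```
-- 1. Make the tablespace read-only on the source
ALTER TABLESPACE users_ts READ ONLY;

-- 2. Export only the metadata:
--    expdp system DIRECTORY=dp_dir DUMPFILE=ts_meta.dmp \
--          TRANSPORT_TABLESPACES=users_ts

-- 3. Copy the datafiles and the dump file to the target, then plug in:
--    impdp system DIRECTORY=dp_dir DUMPFILE=ts_meta.dmp \
--          TRANSPORT_DATAFILES='/u02/oradata/users01.dbf'

-- 4. Make the tablespace read-write again (on both sides)
ALTER TABLESPACE users_ts READ WRITE;
```

Because only metadata goes through Data Pump and the datafiles are copied at the OS level, the time is dominated by the file copy rather than by row-by-row export and import.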
As mentioned, export and import of 2000 GB will take around one day. Does that mean 4000 GB will take two days? Is it linear?
From your experience, do parallel export and import improve performance?
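It is not strictly linear (index builds and LOBs tend to make it worse than linear), but as a first-order estimate you can extrapolate from a measured throughput. A sketch, using the 1400 GB in 4.5 hours figure reported elsewhere in this thread as the example measurement:

```shell
# Rough, first-order estimate: assume throughput stays constant.
measured_gb=1400      # example: one reported expdp run
measured_hours=4.5
target_gb=2000

awk -v gb="$measured_gb" -v h="$measured_hours" -v t="$target_gb" \
    'BEGIN { printf "throughput: %.0f GB/h, estimate for %d GB: %.1f hours\n",
             gb/h, t, t*h/gb }'
# → throughput: 311 GB/h, estimate for 2000 GB: 6.4 hours
```

Treat the result as a lower bound; your own storage and CPU will give a different throughput, so measure with a small test export first.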
Does anybody have a link to a benchmark?
Hi, it really depends on the H/W configuration. Obviously, if you parallelize, the workload is shared and you get the output much faster. I can only tell you that we have a Sun M4000, the database size is 1400 GB, and Data Pump takes 4.5 hours to export those 1400 GB. When you import, avoid importing the indexes; create all indexes after the import with the PARALLEL and NOLOGGING options.
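A sketch of that skip-the-indexes approach (table, index, and column names are invented for the example):

```
-- On import, leave the indexes out:
--   impdp system PARFILE=imp_schema.par EXCLUDE=INDEX

-- Then build each index in parallel without redo logging:
CREATE INDEX orders_cust_ix ON orders (customer_id)
  PARALLEL 8 NOLOGGING;

-- Reset the attributes afterwards so normal DML is logged and serial:
ALTER INDEX orders_cust_ix NOPARALLEL LOGGING;
```

Remember that NOLOGGING builds are not recoverable from the redo stream, so take a backup once the import is finished.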
Does this schema you want to export have a dedicated/exclusive tablespace (i.e. not shared with any other schemas) ? If so, and you are using EE version of the database, transportable tablespace could be another option.