Assuming you are going to use the offline method for data migration, you need to figure out how much time it will take to offload the data from SQL Server and stage it before loading it into Oracle. For loading data into Oracle, you will have to test the SQL*Loader scripts or create external tables. Most important of all is the target server configuration (CPU/storage/memory) and the loading mechanism. Setting those tasks aside, the pure run time for loading 50 GB should be an hour or two at most on a decent server. You also have to account for statistics collection after the data load, index rebuilds, etc.
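For the external-table route, a minimal sketch might look like the following. The table, directory object, and file names here are hypothetical, and the date mask is just an example of matching whatever format the SQL Server export produces:

```sql
-- Directory object pointing at the staging area on the database server
-- (hypothetical path and file exported from SQL Server).
CREATE DIRECTORY stage_dir AS '/u01/stage';

CREATE TABLE customers_ext (
  customer_id   NUMBER,
  customer_name VARCHAR2(100),
  created_at    DATE
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY stage_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    FIELDS TERMINATED BY '|'
    (customer_id, customer_name, created_at DATE 'Mon dd YYYY HH:mi:ssAM')
  )
  LOCATION ('customers.dat')
);

-- Load the real table with a direct-path insert.
INSERT /*+ APPEND */ INTO customers SELECT * FROM customers_ext;
```

The advantage of external tables over SQL*Loader is that the load is plain SQL, so it runs entirely on the database server and can be parallelized; the catch is that the .dat files must be accessible from the server's file system.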
Hope this helps.
Thanks for your reply, it is very helpful. I have one more question. As per my understanding, using SQL Developer we have to do the following:
1. Create and associate a migration repository
2. Capture the source database
3. Convert the captured model to the Oracle model
4. Generate and run DDL statements to create the Oracle database objects
5. Migrate the data
Now, I am actually working on an Oracle product, and we will receive the DB scripts that create the database objects for the application. Since I already have those scripts for the Oracle database, I can run them directly to create the objects and skip steps 1 to 4 above, right? After that I would somehow need to extract the source data and load it into the Oracle database. Is that possible, or am I making things more complicated?
Please share your ideas.
Thanks in advance.
Correct. If you are getting the DDL scripts for Oracle from somebody else, then you don't need to go through the migration process. Simply use the offline data migration scripts to move the data from SQL Server to Oracle.
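For reference, the offline scripts that SQL Developer generates boil down to a bcp export on the SQL Server side plus one SQL*Loader control file per table. A hypothetical control file (table, file names, and delimiter are illustrative; the generated ones use their own delimiters) looks roughly like:

```
LOAD DATA
INFILE 'data\customers.dat'
BADFILE 'log\customers.bad'
APPEND
INTO TABLE customers
FIELDS TERMINATED BY '|' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
(
  customer_id,
  customer_name,
  created_at DATE "Mon dd YYYY HH:mi:ssAM"
)
```

The per-column DATE mask (or a session-level NLS_DATE_FORMAT, as in the generated batch files) has to match the format bcp wrote out, otherwise the rows land in the bad file.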
Thanks for your reply.
Using the DDL scripts I was able to generate the database objects and also import the data from source to target. But I see one issue: loading 400 MB of data into one table takes 6 hours. I am doing this migration in DEV; the same table is 9 GB in PROD, and I am not sure how long that would take. You mentioned it should take only a couple of hours, so am I doing something wrong? I am running this from my local machine, and the source .dat file is also on my local machine.
The following are the contents of the oracle_ctl batch file (connection information removed); when running it I pass two parameters:
set NLS_DATE_FORMAT=Mon dd YYYY HH:mi:ssAM
REM set NLS_TIMESTAMP_FORMAT=Mon dd YYYY HH:mi:ffAM
REM set NLS_LANGUAGE=<insert the language of your database here e.g., US7ASCII>
sqlplus id/pass@server:1521/service < pre_load.sql
sqlldr %1/%2 control=control\log.ctl log=log\log.log
sqlplus id/pass@server:1521/service < post_load.sql
echo "Usage: oracle_ctl <user> <password> [<nls_date_format>]";
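One thing I am considering trying is a direct-path load, since by default sqlldr runs in conventional path and does a round trip per array of rows over the network. Using the standard sqlldr parameter, the invocation would change to something like:

```
sqlldr %1/%2 control=control\log.ctl log=log\log.log direct=true
```

From what I understand, direct path writes formatted blocks above the high-water mark instead of going through ordinary INSERTs, which is usually much faster for bulk loads, though it has restrictions (for example around enabled triggers and some constraint types).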
Please let me know what I am doing wrong here.
Thanks in advance