If this were from a running extract, I would guess you are not using a data pump. When the trail was sent to the remote server, you probably had a network glitch/hiccup and lost some data. That is precisely why a secondary extract process (a.k.a. a data pump) is recommended for zero data loss.
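For reference, a minimal data-pump setup might look like the sketch below. All names here (EPUMP, the trail prefixes, host, and port) are placeholders for illustration, not from your environment; PASSTHRU is only valid when the pump does no filtering or transformation.

```
-- Pump parameter file (dirprm/epump.prm)
EXTRACT epump
PASSTHRU
RMTHOST targethost, MGRPORT 7809
RMTTRAIL ./dirdat/rt
TABLE hr.*;
```

Created and started from GGSCI, reading the local trail that the primary extract writes:

```
GGSCI> ADD EXTRACT epump, EXTTRAILSOURCE ./dirdat/lt
GGSCI> ADD RMTTRAIL ./dirdat/rt, EXTRACT epump
GGSCI> START EXTRACT epump
```

With this in place, the primary extract only ever writes to local disk, and a network outage just pauses the pump instead of corrupting the remote trail.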
For an initial load, if you were going straight to the remote server, the same thing can happen. How were you doing the initial load? Loading data from file to Replicat?
This happened when running replicat.
We extracted the data to trail file and on target side performed the initial load.
So you would suggest adding another extract process on source side?
Just to be sure... The extract process creates trail files on a remote server, and then my replicat process uses these local trail files for an initial load.
Extract is what gets the trail to the target so that Replicat can process it (with the server collector process in front of Replicat, which re-assembles the trail, but we typically do not have to do anything with this component).
There are several approaches for performing initial load:
● Loading data with a database utility - The utility performs the initial load.
● Loading data from file to Replicat - Extract writes records to an extract file and Replicat applies them to the target tables. This is the slowest initial-load method.
● Loading data from file to database utility - Extract writes records to extract files in external ASCII format. The files are used as data files for input into target tables by a bulk-load utility. Replicat creates the run and control files.
● Loading data with an Oracle GoldenGate direct load - Extract communicates with Replicat directly across TCP/IP without using a Collector process or files. Replicat applies the data through the database engine.
● Loading data with a direct bulk load to SQL*Loader - Extract extracts records in external ASCII format and delivers them directly to Replicat, which delivers them to Oracle's SQL*Loader bulk-load utility. This is the fastest method of loading Oracle data with Oracle GoldenGate.
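To make the "file to Replicat" option concrete, here is a rough sketch of the two parameter files for a batch (command-line) run. Every name below (group names, user, file paths) is hypothetical, and the exact login syntax depends on your source database:

```
-- Initial-load Extract (SOURCEISTABLE reads the tables directly, not the log)
SOURCEISTABLE
USERID gguser, PASSWORD <password>
RMTHOST targethost, MGRPORT 7809
RMTFILE ./dirdat/initld.dat, PURGE
TABLE hr.*;
```

```
-- Initial-load Replicat (special run, reads the extract file directly)
SPECIALRUN
END RUNTIME
USERID gguser, PASSWORD <password>
EXTFILE ./dirdat/initld.dat
SOURCEDEFS ./dirdef/source.def
MAP hr.*, TARGET hr.*;
```

You run each from the OS shell with `extract paramfile ...` and `replicat paramfile ...` rather than from GGSCI. SOURCEDEFS is needed when source and target structures differ; if they are identical, ASSUMETARGETDEFS can be used instead.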
Given that you said you have a file from Extract, that is why I asked you to confirm that you used the "loading data from file to Replicat" method. Did you?
If so, you can start over (for at least that part of the data/load). If you continue to have file errors, try another approach, or fix the underlying network issue.
And besides, you've also picked the slowest method overall. How much data are you trying to load? If it is just a little bit (relatively), why bother with this method? You can do an insert into target tables as select * from source table. Or use Oracle Data Pump.
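For example, if the source tables are reachable from the target (here via a hypothetical database link called src_link; a non-Oracle source would need a gateway in between), a small initial load can be a single statement:

```sql
-- One-shot copy; the APPEND hint requests a direct-path insert
INSERT /*+ APPEND */ INTO hr.employees
SELECT * FROM hr.employees@src_link;
COMMIT;
```

For modest volumes this is far simpler than setting up an extract/replicat pair just for the load.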
Thanks for your reply.
Yes, I'm using the "loading data from file to Replicat" method, just because I'm new to GoldenGate and this seemed to be the easiest way.
I'm trying to load approx. 10 million rows from DB2 to Oracle.
I'll try this once again, and if it doesn't work I'll consider other methods.
By the way, what is the best method in your opinion?
For that amount of records/volume, I would probably dump the DB2 data into a CSV file and then SQL*Loader the files into the target.
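A sketch of what that looks like, with a hypothetical table and column list (the real control file has to match your CSV layout exactly):

```
-- employees.ctl: SQL*Loader control file (illustrative only)
LOAD DATA
INFILE 'employees.csv'
APPEND
INTO TABLE hr.employees
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(emp_id, first_name, last_name, hire_date DATE 'YYYY-MM-DD')
```

You then run something like `sqlldr hr control=employees.ctl direct=true`; direct-path mode is usually what makes SQL*Loader fast at volumes like 10 million rows.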
I forget where I saw this, but I think the trade-off point was around 4MB of data: less than that, use the GoldenGate methods; more than that, go the vendor RDBMS route. I'll see if I can find the reference/guideline.