I have a requirement to move the data of a few tables from one Oracle database (DB1, 11g) on one server (say S1) to another database (DB2, 11g) on another server (say S2), to generate reports using OBI.
This process has to be automated, and the load should happen on a daily basis. Assume a total of 30 tables, of which around 15 will be big, with row counts in the millions.
Please clarify my queries below.
- Can you please clarify what parameters need to be considered for this job?
I have shortlisted the options through which this can be achieved: DB links, the SQL*Plus COPY command, and export/import.
- If the data volume is very high, we use the Data Pump export and import option. I have not heard of this option being automated. Do you think it is a good idea to automate this process?
- If the data volume is huge, will DB links be the right choice?
Another question is
If not the entire dataset, is it a good idea to capture the incrementals into a table, move only those to the other server, and load just the deltas? If so, how can this be achieved for a general table, and what is the best method for it? Please clarify.
- Can you please suggest the best way to do it, or direct me in the right direction to achieve it?
Thanks for the link. I will verify that. But as per my current understanding of Oracle Streams, the server captures each and every change, creates an LCR, and then applies it to the destination database. And it is always a good idea to have constraints defined so that the updates will be faster. But do you think this is the best approach for fact tables, where the transaction volume may be high compared to dimension tables, which grow slowly? Please clarify.
If not, please let me know if there is a better approach, or will the Data Pump export and import option do it? Please clarify.
If this will be a one-time job, you are better off using one of two options:
1. Data Pump export on S1, then import on S2.
2. Create a DB link between S1 & S2, then use CTAS (Create Table As Select).
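A minimal sketch of the CTAS-over-DB-link option. The link name, credentials, TNS alias, and table name are all placeholders, not anything from the thread:

```sql
-- On S2: create a link pointing at S1 (credentials/alias are placeholders)
CREATE DATABASE LINK s1_link
  CONNECT TO report_user IDENTIFIED BY report_pwd
  USING 'S1_TNS_ALIAS';

-- One-time copy of a table across the link
CREATE TABLE orders AS
  SELECT * FROM orders@s1_link;
```

For multi-million-row tables you could additionally consider `CREATE TABLE ... NOLOGGING AS SELECT /*+ PARALLEL */ ...` to speed up the one-time load, at the cost of recoverability.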
If this will be permanent, I would not personally recommend Oracle Streams; better to use MVs over a DB link. So basically you would:
1. Create a DB link between S1 & S2.
2. Create fast-refresh MVs in S2 referencing tables in S1.
3. Run one complete refresh of the MVs in S2.
4. Then you can schedule a daily fast refresh of the MVs in S2.
Once you reach step 4, you would be living happily ever after.
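The steps above can be sketched as follows. All object names are placeholders; note that fast (incremental) refresh also requires a materialized view log on each source table in S1, which the numbered list does not mention explicitly:

```sql
-- Step 1, on S2: DB link to S1 (credentials/alias are placeholders)
CREATE DATABASE LINK s1_link
  CONNECT TO report_user IDENTIFIED BY report_pwd
  USING 'S1_TNS_ALIAS';

-- Prerequisite, on S1: MV log so fast refresh can ship only the deltas
CREATE MATERIALIZED VIEW LOG ON orders WITH PRIMARY KEY;

-- Step 2, on S2: fast-refresh MV over the link
CREATE MATERIALIZED VIEW orders_mv
  REFRESH FAST ON DEMAND
  AS SELECT * FROM orders@s1_link;

-- Step 3, on S2: one complete refresh to seed the MV ('C' = complete)
EXEC DBMS_MVIEW.REFRESH('ORDERS_MV', 'C');

-- Step 4, on S2: schedule a daily fast refresh ('F' = fast)
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'DAILY_MV_REFRESH',
    job_type        => 'PLSQL_BLOCK',
    job_action      => 'BEGIN DBMS_MVIEW.REFRESH(''ORDERS_MV'', ''F''); END;',
    repeat_interval => 'FREQ=DAILY;BYHOUR=2',
    enabled         => TRUE);
END;
/
```

This also answers the earlier question about moving only the incrementals: the MV log captures the deltas on S1, and the daily fast refresh pulls only those rows over the link.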
If you have to do it on a daily basis, the best option you have is Streams. You can set up Streams for a specific schema or a set of tables. However, all the redo logs have to be scanned; you can offload this part to the target DB by using downstream capture, without creating any overhead on your source.
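As a rough illustration of what table-level Streams rules look like (queue, capture, and database names are placeholders; a real downstream-capture setup additionally needs redo transport configured from the source to the target, which is omitted here):

```sql
-- On the downstream (target) DB: create the Streams queue
BEGIN
  DBMS_STREAMS_ADM.SET_UP_QUEUE(
    queue_table => 'streams_qt',
    queue_name  => 'streams_q');
END;
/

-- Capture rule for one table (repeat per table,
-- or use ADD_SCHEMA_RULES for a whole schema)
BEGIN
  DBMS_STREAMS_ADM.ADD_TABLE_RULES(
    table_name      => 'scott.orders',
    streams_type    => 'capture',
    streams_name    => 'downstream_capture',
    queue_name      => 'streams_q',
    include_dml     => TRUE,
    include_ddl     => FALSE,
    source_database => 'DB1');
END;
/
```

A matching `apply` rule set and the downstream capture process itself would also be needed; the fragment above only shows the shape of the rule-creation calls.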