I've used Data Pump over a database link in the past, and it should easily handle 1 GB of data. Of course, that depends on how fast or slow your network is.
From memory we had tens of GB of data (in terms of schema size in dba_segments) and it took less than half an hour to complete.
It was a case of:
1) Creating the appropriate database link(s) on your target database.
2) Creating the users that will do the export/import on both your source and target databases.
3) Creating the database directories where the Data Pump files will be written (if they don't already exist).
4) Creating a procedure that does the export/import (see the sketch after this list).
5) Scheduling this procedure in a database job.
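For what it's worth, here's a rough sketch of what those steps can look like. Every name in it (SRC_LINK, SRC_TNS, DPUMP_DIR, APP_OWNER, the password, the schedule) is a placeholder for your own values, the parameters are just one reasonable combination, and step 2 (creating the users and granting them privileges) is site-specific so I've left it out:

-- Steps 1 and 3: the database link and the directory for the log file
-- (run once on the target, as a suitably privileged user).
CREATE DATABASE LINK src_link
  CONNECT TO exp_user IDENTIFIED BY "change_me" USING 'SRC_TNS';

CREATE OR REPLACE DIRECTORY dpump_dir AS '/u01/app/oracle/dpump';

-- Step 4: a procedure that pulls the schema over the link with DBMS_DATAPUMP.
-- In network mode no dump file is written; only the log goes to the directory.
CREATE OR REPLACE PROCEDURE refresh_schema AS
  h     NUMBER;
  state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(
         operation   => 'IMPORT',
         job_mode    => 'SCHEMA',
         remote_link => 'SRC_LINK');

  DBMS_DATAPUMP.ADD_FILE(
    handle    => h,
    filename  => 'refresh_schema.log',
    directory => 'DPUMP_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Only bring across the one schema, replacing tables that already exist.
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''APP_OWNER'')');
  DBMS_DATAPUMP.SET_PARAMETER(h, 'TABLE_EXISTS_ACTION', 'REPLACE');

  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, state);  -- block until the job finishes
END refresh_schema;
/

-- Step 5: schedule it, here nightly at 02:00.
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'REFRESH_SCHEMA_JOB',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'REFRESH_SCHEMA',
    repeat_interval => 'FREQ=DAILY;BYHOUR=2',
    enabled         => TRUE);
END;
/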
Google "data pump network import" for more info and details on how to set this up.
I remember it was a bit fiddly to set up, but once up and running it worked like a dream.
Another option is to do something similar but use a shell script to do your export/import with the Data Pump command-line utilities, and schedule the shell script in cron (a minimal example follows).
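As a rough sketch only, the shell-script version could be as simple as this; again every name, path and the schedule are placeholders, and impdp with NETWORK_LINK does the same network-mode import as the procedure above:

#!/bin/sh
# Network-mode import driven by the impdp client; all names are placeholders.
export ORACLE_HOME=/u01/app/oracle/product/19.0.0/dbhome_1
export ORACLE_SID=TARGETDB
export PATH=$ORACLE_HOME/bin:$PATH

impdp imp_user/"change_me" \
  NETWORK_LINK=SRC_LINK \
  SCHEMAS=APP_OWNER \
  TABLE_EXISTS_ACTION=REPLACE \
  DIRECTORY=DPUMP_DIR \
  LOGFILE=refresh_schema.log

And the crontab entry to run it nightly at 02:00, capturing output for troubleshooting:

0 2 * * * /home/oracle/refresh_schema.sh >> /home/oracle/refresh_schema.out 2>&1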