I need to transfer about a GB of data from one Oracle database system to another on a daily basis. Does anyone see a problem with fetching that much data over a database link? If this is not a good option, what are my other options? We tried spooling the data and loading it, but that takes hours to complete.
I've used Data Pump over a database link in the past, and it should easily handle 1 GB of data. Of course, that depends on how fast or slow your network is.
From memory we had tens of GB of data (in terms of schema size in dba_segments) and this took less than half an hour to complete.
It was a case of:
1) Create the appropriate database links on your target database.
2) Create the users which will do the export/import on both your source and target databases.
3) Create database directories where Data Pump files will be created (if they don't already exist).
4) Create a procedure which does the export/import.
5) Schedule this procedure to run in a database job.
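Steps 4 and 5 might look roughly like the following sketch using `DBMS_DATAPUMP` in network mode and `DBMS_SCHEDULER`. All the names here are assumptions for illustration: `SRC_LINK` (the database link), `DP_DIR` (an existing database directory), and `APP_SCHEMA` (the schema being copied) are placeholders you'd replace with your own.

```sql
-- Hypothetical procedure: a network-mode import pulls data straight over
-- the database link, so no dump file ever lands on disk.
CREATE OR REPLACE PROCEDURE refresh_from_source AS
  h         NUMBER;
  job_state VARCHAR2(30);
BEGIN
  h := DBMS_DATAPUMP.OPEN(
         operation   => 'IMPORT',
         job_mode    => 'SCHEMA',
         remote_link => 'SRC_LINK');

  -- Log file goes to the database directory created in step 3.
  DBMS_DATAPUMP.ADD_FILE(
    handle    => h,
    filename  => 'daily_refresh.log',
    directory => 'DP_DIR',
    filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

  -- Limit the job to the schema being transferred.
  DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''APP_SCHEMA'')');

  -- Replace existing tables on each daily run.
  DBMS_DATAPUMP.SET_PARAMETER(h, 'TABLE_EXISTS_ACTION', 'REPLACE');

  DBMS_DATAPUMP.START_JOB(h);
  DBMS_DATAPUMP.WAIT_FOR_JOB(h, job_state);
END;
/

-- Schedule it as a daily database job (here, 2am).
BEGIN
  DBMS_SCHEDULER.CREATE_JOB(
    job_name        => 'DAILY_REFRESH_JOB',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'REFRESH_FROM_SOURCE',
    repeat_interval => 'FREQ=DAILY;BYHOUR=2',
    enabled         => TRUE);
END;
/
```

The importing user needs the appropriate privileges (e.g. `IMP_FULL_DATABASE` or ownership of the target schema) and read/write on the directory object, which is most of the fiddliness mentioned below.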
Google "data pump network import" for more info and details on how to set this up.
I remember it was a bit fiddly to set it up but once up and running it worked like a dream.
Another option is to do something similar but just use a shell script to do your export/import using datapump. Schedule the shell script in cron.
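The shell-script variant could be sketched like this. Again, the names are assumptions (`SRC_LINK`, `DP_DIR`, `APP_SCHEMA`, `target_db`), and in practice you'd keep credentials in an Oracle wallet rather than in the script:

```shell
#!/bin/sh
# Hypothetical daily refresh via impdp in network mode.
# NETWORK_LINK pulls rows over the db link, so no dump file is written.
impdp dp_user/password@target_db \
  network_link=SRC_LINK \
  schemas=APP_SCHEMA \
  table_exists_action=REPLACE \
  directory=DP_DIR \
  logfile=daily_refresh.log
```

And a matching crontab entry to run it at 2am every day:

```shell
0 2 * * * /home/oracle/scripts/daily_refresh.sh >> /tmp/daily_refresh.out 2>&1
```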