3 Replies. Latest reply: Apr 19, 2013 9:21 AM by DK2010

    Large data spooling

      Need to transfer about a GB of data from one Oracle database system to another on a daily basis. Does anyone see a problem with fetching that much data over a database link? If this is not a good option, what are my other options? We tried spooling the data and loading it; that takes hours to complete.
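
      For reference, the database-link approach we are considering is a direct pull, along these lines (the table, link, and column names here are placeholders for our real ones):

        -- Direct pull over a database link; SRC_LINK, BIG_TABLE and
        -- LOAD_DATE are placeholder names. The APPEND hint requests a
        -- direct-path insert, which reduces undo generation.
        INSERT /*+ APPEND */ INTO big_table
        SELECT *
        FROM   big_table@src_link
        WHERE  load_date >= TRUNC(SYSDATE);

        COMMIT;
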
        • 1. Re: Large data spooling

          What DB/OS version are you using?
          I believe a database link is not a good option.
          You can use SQL*Loader to load the data in batch files.

          Edited by: DK2010 on Apr 19, 2013 6:44 AM
          • 2. Re: Large data spooling
            I've used Data Pump over a database link in the past and it should easily handle 1GB of data. Of course, that depends on how fast or slow your network is.

            From memory we had tens of GB of data (in terms of schema size in dba_segments) and this took less than half an hour to complete.

            It was a case of:

            1) Create the appropriate database links on your target database.
            2) Create the users which will do the export/import on both your source and target databases.
            3) Create the database directories where the Data Pump files will be created (if they don't already exist).
            4) Create a procedure which does the export/import (see the sketch after this list).
            5) Schedule this procedure to run in a database job.
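
            For step 4, a minimal sketch using DBMS_DATAPUMP in network mode, assuming a link called SRC_LINK, a directory object called DP_DIR and a schema called APP_OWNER (all placeholder names, adjust to your own setup):

              CREATE OR REPLACE PROCEDURE refresh_from_source AS
                h     NUMBER;
                state VARCHAR2(30);
              BEGIN
                -- Network-mode import: rows are pulled straight over the
                -- database link, so no dump file is written to disk.
                h := DBMS_DATAPUMP.OPEN(
                       operation   => 'IMPORT',
                       job_mode    => 'SCHEMA',
                       remote_link => 'SRC_LINK');

                -- Optional log file in a pre-created directory object (step 3).
                DBMS_DATAPUMP.ADD_FILE(
                  handle    => h,
                  filename  => 'refresh_from_source.log',
                  directory => 'DP_DIR',
                  filetype  => DBMS_DATAPUMP.KU$_FILE_TYPE_LOG_FILE);

                -- Only bring over the schema we care about.
                DBMS_DATAPUMP.METADATA_FILTER(h, 'SCHEMA_EXPR', 'IN (''APP_OWNER'')');

                -- Replace any tables that already exist on the target.
                DBMS_DATAPUMP.SET_PARAMETER(h, 'TABLE_EXISTS_ACTION', 'REPLACE');

                DBMS_DATAPUMP.START_JOB(h);
                DBMS_DATAPUMP.WAIT_FOR_JOB(h, state);
              END refresh_from_source;
              /

            Note the importing user will typically need the IMP_FULL_DATABASE role to import a schema it doesn't own, plus write access on the directory for the log file.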

            Google "data pump network import" for more info and details on how to set this up.

            I remember it was a bit fiddly to set it up but once up and running it worked like a dream.
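
            For step 5, a DBMS_SCHEDULER job along these lines runs the procedure nightly (the job name, schedule and procedure name are again just placeholders):

              BEGIN
                DBMS_SCHEDULER.CREATE_JOB(
                  job_name        => 'NIGHTLY_REFRESH',
                  job_type        => 'STORED_PROCEDURE',
                  job_action      => 'REFRESH_FROM_SOURCE',   -- the step-4 procedure
                  start_date      => SYSTIMESTAMP,
                  repeat_interval => 'FREQ=DAILY; BYHOUR=2',  -- run at 02:00 every day
                  enabled         => TRUE);
              END;
              /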

            Another option is to do something similar but use a shell script to run the Data Pump export/import, and schedule the shell script in cron.
            • 3. Re: Large data spooling

              This is a duplicate of the thread:

              Large data spooling

              Please do not raise duplicate threads.