Reading and loading millions of rows from Oracle to SQL Server in batches — oracle-tech



Hi,

could anyone send me an example of how to load millions of rows from Oracle to SQL Server in batches (e.g. committing after every 1,000 rows read and loaded)?

I tried reading the data from Oracle with Oracle.ManagedDataAccess.Client.OracleDataAdapter and its Fill method, but it fails on memory limits.

I desperately tried adding properties like EnableStreaming and BatchSize, but the data is not loaded to the destination table continuously in batches; it is only written at the end of the read process (when the read doesn't fail on memory first).

This is the code I use:

$oracleConnectionString = "User Id=USERNAME;Password=PWD;Data Source=SERVICENAME";
$sourceQuery = "select * from SCHEMA1.SOURCE_TABLE_ORACLE where rownum < 50000000";
$adapter = New-Object Oracle.ManagedDataAccess.Client.OracleDataAdapter($sourceQuery, $oracleConnectionString);
$dtbl = New-Object System.Data.DataTable
$adapter.Fill($dtbl);

$sqlConnectionString = "server=SERVERNAME;database=DBNAME;trusted_connection=true";
$sqlbc = New-Object System.Data.SqlClient.SqlBulkCopy($sqlConnectionString);
$sqlbc.DestinationTableName = "SCHEMA1.DESTINATION_TABLE_SQL";
$sqlbc.BatchSize = 1000
$sqlbc.NotifyAfter = 1000
$sqlbc.EnableStreaming = $true
$sqlbc.BulkCopyTimeout = 0
$sqlbc.WriteToServer($dtbl);
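The memory problem comes from DataAdapter.Fill, which materializes the entire result set in a DataTable before WriteToServer ever runs, so EnableStreaming and BatchSize have nothing to stream against. SqlBulkCopy.WriteToServer also accepts an IDataReader, and with that overload rows are pulled from Oracle and pushed to SQL Server as they arrive. Below is a sketch of that approach, reusing the connection strings and table names from the post; the assembly path in Add-Type and the FetchSize value are assumptions to adapt:

```powershell
# Assumption: path to the ODP.NET managed driver on your machine.
Add-Type -Path "C:\oracle\Oracle.ManagedDataAccess.dll"

$oracleConnectionString = "User Id=USERNAME;Password=PWD;Data Source=SERVICENAME"
$sqlConnectionString    = "server=SERVERNAME;database=DBNAME;trusted_connection=true"

$oraConn = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($oracleConnectionString)
$oraConn.Open()
try {
    $cmd = $oraConn.CreateCommand()
    $cmd.CommandText = "select * from SCHEMA1.SOURCE_TABLE_ORACLE"
    $cmd.FetchSize = 4MB          # Oracle fetch buffer in bytes per round trip (tune as needed)
    $reader = $cmd.ExecuteReader()

    # UseInternalTransaction makes each batch its own transaction,
    # i.e. a commit roughly every BatchSize rows, as asked.
    $sqlbc = New-Object System.Data.SqlClient.SqlBulkCopy(
        $sqlConnectionString,
        [System.Data.SqlClient.SqlBulkCopyOptions]::UseInternalTransaction)
    $sqlbc.DestinationTableName = "SCHEMA1.DESTINATION_TABLE_SQL"
    $sqlbc.BatchSize = 1000
    $sqlbc.BulkCopyTimeout = 0
    $sqlbc.EnableStreaming = $true   # only has an effect with an IDataReader source
    $sqlbc.NotifyAfter = 100000
    $sqlbc.add_SqlRowsCopied({ param($s, $e) Write-Host "$($e.RowsCopied) rows copied" })

    # Streams row by row; the full result set is never held in memory.
    $sqlbc.WriteToServer($reader)

    $sqlbc.Close()
    $reader.Close()
} finally {
    $oraConn.Close()
}
```

Not tested against your schema; if the source and destination column orders differ, you would also need ColumnMappings on the SqlBulkCopy object.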
