
ODP.NET


Reading and loading millions of rows from Oracle to SQL Server in batches

Milan Hricko · Sep 14 2020

Hi,

Could anyone send me an example of how to load millions of rows from Oracle to SQL Server in batches (e.g. committing after every 1000 rows read and loaded)?

I tried to read the data from Oracle using Oracle.ManagedDataAccess.Client.OracleDataAdapter and its Fill method, but it fails with memory limits.

I also tried adding properties like EnableStreaming and BatchSize, but the data is not loaded to the destination table continuously in batches; it is written only at the end of the read process (when the read does not already fail on memory).

This is the code I use:

$oracleConnectionString = "User Id=USERNAME;Password=PWD;Data Source=SERVICENAME";
$sourceQuery = "select * from SCHEMA1.SOURCE_TABLE_ORACLE where rownum < 50000000";
$adapter = new-object Oracle.ManagedDataAccess.Client.OracleDataAdapter($sourceQuery, $oracleConnectionString);
$dtbl = new-object System.Data.DataTable
$adapter.Fill($dtbl);

$sqlConnectionString = "server=SERVERNAME;database=DBNAME;trusted_connection=true";
$sqlbc = new-object system.data.sqlclient.Sqlbulkcopy($sqlConnectionString);
$sqlbc.DestinationTableName = "SCHEMA1.DESTINATION_TABLE_SQL";
$sqlbc.BatchSize = 1000
$sqlbc.NotifyAfter = 1000
$sqlbc.EnableStreaming = $true
$sqlbc.BulkCopyTimeout = 0
$sqlbc.WriteToServer($dtbl);
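For reference, the memory problem in the code above comes from `Fill`, which buffers the entire result set in a DataTable before any rows are written; `EnableStreaming` has no effect when the source is an already-materialized DataTable. A streaming variant passes an OracleDataReader directly to `SqlBulkCopy.WriteToServer(IDataReader)`, so rows are fetched from Oracle as they are copied. This is a sketch only, not tested against a live database; the connection strings, table names, and `FetchSize` value are carried over or assumed from the post:

```powershell
# Sketch: stream rows with an OracleDataReader instead of filling a DataTable.
# Assumes the same Oracle.ManagedDataAccess and System.Data.SqlClient
# assemblies used above are already loaded.
$oracleConnectionString = "User Id=USERNAME;Password=PWD;Data Source=SERVICENAME"
$sqlConnectionString    = "server=SERVERNAME;database=DBNAME;trusted_connection=true"

$oraConn = New-Object Oracle.ManagedDataAccess.Client.OracleConnection($oracleConnectionString)
$oraConn.Open()
$cmd = $oraConn.CreateCommand()
$cmd.CommandText = "select * from SCHEMA1.SOURCE_TABLE_ORACLE where rownum < 50000000"
$cmd.FetchSize = 1048576           # bytes fetched per round trip; tune for row width
$reader = $cmd.ExecuteReader()     # forward-only reader; rows come over on demand

$sqlbc = New-Object System.Data.SqlClient.SqlBulkCopy($sqlConnectionString)
$sqlbc.DestinationTableName = "SCHEMA1.DESTINATION_TABLE_SQL"
$sqlbc.BatchSize = 1000            # send a batch to SQL Server every 1000 rows
$sqlbc.EnableStreaming = $true     # effective here, because the source is an IDataReader
$sqlbc.BulkCopyTimeout = 0

# Passing the reader (not a filled DataTable) lets SqlBulkCopy pull rows
# incrementally, so memory use stays bounded regardless of the row count.
$sqlbc.WriteToServer($reader)

$reader.Dispose()
$sqlbc.Close()
$oraConn.Close()
```

With this shape, `BatchSize` controls how often rows are committed to SQL Server, and memory is limited to roughly one fetch buffer plus one batch, rather than the full 50 million rows.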
