MERGE statement processing 4-5 million rows in one shot .. URGENT
I work in a data warehousing environment. I am designing a daily data load job (using ODI ELT) which has to load 4 million records from one table into a target table that has about 10 billion records. Over 95% of the records do not exist in the target and will be inserted; about 5% of the records will be updated. I am considering using a MERGE statement to do this in one shot.
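For context, here is a minimal sketch of the single-shot MERGE I have in mind. The table and column names (STG_DAILY, TGT_FACT, RECORD_ID, AMOUNT) are placeholders, not my actual schema:

```sql
MERGE INTO tgt_fact t
USING stg_daily s
   ON (t.record_id = s.record_id)
WHEN MATCHED THEN
  UPDATE SET t.amount     = s.amount,
             t.updated_dt = SYSDATE
WHEN NOT MATCHED THEN
  INSERT (record_id, amount, created_dt)
  VALUES (s.record_id, s.amount, SYSDATE);
```

So roughly 3.8 million rows would take the INSERT branch and 200 thousand the UPDATE branch, all in one transaction.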
1. Is there a chance that a merge of about 4 million rows could run out of rollback/undo space?
2. Is it advisable to break the merge up by some kind of range, so that it processes only about 500 K rows at a time (using a unique key range and running the merge in a loop)?
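If it helps clarify the second question, the batched variant I am considering looks roughly like the PL/SQL sketch below. It assumes a dense numeric key on the staging table so that a key-value range approximates a row count; again, all names are placeholders:

```sql
DECLARE
  c_batch CONSTANT PLS_INTEGER := 500000;  -- key-range width per pass
  v_lo    NUMBER;
  v_hi    NUMBER;
BEGIN
  SELECT MIN(record_id), MAX(record_id) INTO v_lo, v_hi FROM stg_daily;
  WHILE v_lo <= v_hi LOOP
    MERGE INTO tgt_fact t
    USING (SELECT record_id, amount
             FROM stg_daily
            WHERE record_id >= v_lo
              AND record_id <  v_lo + c_batch) s
       ON (t.record_id = s.record_id)
    WHEN MATCHED THEN
      UPDATE SET t.amount = s.amount
    WHEN NOT MATCHED THEN
      INSERT (record_id, amount) VALUES (s.record_id, s.amount);
    COMMIT;                         -- release undo after each batch
    v_lo := v_lo + c_batch;
  END LOOP;
END;
/
```

My concern is whether the intermediate commits actually buy anything on the undo side, or whether they just add restartability complexity (a failure mid-loop leaves the load partially applied).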