Prioritize some tables in datapump extract

Hi,
I've occasionally had the problem (and I've seen other people have it too) where, when doing a datapump extract of a schema or a large number of tables, you sometimes want to extract a particular table first. It may be very dynamic or have a truncate run against it at regular intervals, but its size means it isn't extracted until later on in the export, at which point it fails, rendering the whole export invalid.
Generally doing the biggest tables first makes sense, but it would be nice to flag problem tables in some way to say 'do this/these first'.
You can kind of trick datapump like this (Oracle DBA Blog 2.0: Getting datapump to follow orders?), but it would be much neater to just allow this functionality.
As the whole API has a PL/SQL wrapper round it, I don't think this would be that difficult to implement; it would just be a slight change to the ordering algorithm?
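For context, a table-mode export driven through that PL/SQL wrapper looks roughly like this today (a minimal sketch, assuming a DATA_PUMP_DIR directory object and a made-up SCOTT.VOLATILE_TAB table); notice there is no call for influencing the order in which tables are unloaded within a single job:

    DECLARE
      h     NUMBER;
      state VARCHAR2(30);
    BEGIN
      -- Open a table-mode export job via the DBMS_DATAPUMP API
      h := DBMS_DATAPUMP.OPEN(operation => 'EXPORT', job_mode => 'TABLE');
      -- Write the dump file to the DATA_PUMP_DIR directory object
      DBMS_DATAPUMP.ADD_FILE(handle => h, filename => 'volatile_tab.dmp',
                             directory => 'DATA_PUMP_DIR');
      -- Restrict the job to the one problem table
      DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'SCHEMA_EXPR',
                                    value => 'IN (''SCOTT'')');
      DBMS_DATAPUMP.METADATA_FILTER(handle => h, name => 'NAME_EXPR',
                                    value => 'IN (''VOLATILE_TAB'')');
      -- Run the job and block until it finishes
      DBMS_DATAPUMP.START_JOB(h);
      DBMS_DATAPUMP.WAIT_FOR_JOB(handle => h, job_state => state);
    END;
    /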
The same could also be required with import, where you may want to load parents before children if just uploading data.
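As a sketch of that import case (schema and table names made up), today you have to split the load into two data-only runs yourself, parents first:

    impdp scott DIRECTORY=DATA_PUMP_DIR DUMPFILE=app.dmp CONTENT=DATA_ONLY TABLES=SCOTT.PARENT
    impdp scott DIRECTORY=DATA_PUMP_DIR DUMPFILE=app.dmp CONTENT=DATA_ONLY TABLES=SCOTT.CHILD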
Cheers,
Rich
Comments
-
Makes sense
-
This feature could simplify the work.
-
Good idea. Syntax along these lines could be used for that (sketched below):
- add an option such as PRIORITIZE_IN_TABLES=Y
- when using TABLES=, take the priority order from the list
- when using SCHEMAS=, allow TABLES= alongside it to name the tables to process first (instead of raising 'UDE-00010: multiple job modes requested, schema and tables')
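To illustrate the proposal (PRIORITIZE_IN_TABLES does not exist today; it and the object names are hypothetical):

    Table mode, unloading tables in the order listed:
      expdp system DIRECTORY=DATA_PUMP_DIR DUMPFILE=app.dmp TABLES=APP.VOLATILE_TAB,APP.BIG_TAB PRIORITIZE_IN_TABLES=Y

    Schema mode, with TABLES= naming the tables to unload first:
      expdp system DIRECTORY=DATA_PUMP_DIR DUMPFILE=app.dmp SCHEMAS=APP TABLES=APP.VOLATILE_TAB PRIORITIZE_IN_TABLES=Y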
-
It is really a great idea.
Regards,
Pravin
-
Good idea, I have missed this for a long time. I hope it will be available soon!
-
Great idea. I used to face a different issue that could be solved by the same solution: I would get a "snapshot too old" error (ORA-01555) during an export, and I know for a fact that there are only 3 tables that are constantly changing, and they are the ones causing this error. So if I could just push those tables out first, it would be great.
-
You can use EXCLUDE for the big tables and export them using a separate datapump export.
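A sketch of that (schema and table names made up; in practice the EXCLUDE quoting is easiest to manage in a parameter file):

    expdp system DIRECTORY=DATA_PUMP_DIR DUMPFILE=app_rest.dmp SCHEMAS=APP EXCLUDE=TABLE:"IN ('BIG_TAB')"
    expdp system DIRECTORY=DATA_PUMP_DIR DUMPFILE=app_big.dmp TABLES=APP.BIG_TAB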
Regards
-
This is why you can create multiple datapump jobs: export the most critical table in one job, and the rest of the objects with a second job.
-
Doesn't your example just mean you want the export to fail after the first table? So you still get a failed export.
-
Great idea. Looking forward to it being implemented.