I would suggest this option:
Store the following connection pool values in a table and read them into your connection pool using Connection Pool Scripts:
Data Source Name
Whenever you want to change the database, you just need to edit these values; that way your BI environment switches with no downtime.
You might have to tweak it a little if needed.
We used more or less the same approach some time back for a client who wanted to switch databases from dev to test and so on.
Try it out.
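A minimal sketch of the table-driven idea above, assuming OBIEE-style repository variables (the table and column names here are illustrative, not from the thread): keep the Data Source Name in a small config table, populate a repository variable from it via an initialization block, and reference that variable in the connection pool.

```sql
-- Hypothetical config table holding the connection pool values
CREATE TABLE bi_conn_config (
  env_name VARCHAR2(20) PRIMARY KEY,
  dsn      VARCHAR2(100)  -- Data Source Name the connection pool should use
);

INSERT INTO bi_conn_config (env_name, dsn) VALUES ('CURRENT', 'PRODDB');

-- Initialization block query for a repository variable, e.g. CUR_DSN:
SELECT dsn FROM bi_conn_config WHERE env_name = 'CURRENT';

-- In the connection pool, set Data Source Name to: VALUEOF(CUR_DSN)
-- Switching databases is then just an update, with no RPD change:
UPDATE bi_conn_config SET dsn = 'TESTDB' WHERE env_name = 'CURRENT';
COMMIT;
```

The repository variable refreshes on the init block's schedule, so after the update the connection pool starts pointing at the new database without any downtime.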
BTW: is the 2.5-day load time for a full load or incremental?
Edited by: Srini VEERAVALLI on Feb 1, 2013 8:40 AM
Sarvan, as per my understanding, you're trying to take a snapshot of all the tables, including data, into another schema so that it doesn't impact your downtime (during your ETL run).
The above solution is workable.
But I still think the solution provided by Srini is far better, since it removes the need for another schema, tables, etc. So try to develop that concept.
And let us know the approach and solution you end up with.
If you are on Oracle Database, you can try implementing the Exchange Partition feature provided by Oracle at the ETL end for data movement/data load. The link below gives an idea of what Exchange Partition does:
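For reference, a minimal sketch of how an exchange partition typically looks (the table and partition names are illustrative, not from the thread): you load new data into a plain staging table with the same structure as the partitioned target, then swap it in.

```sql
-- Stage the fresh data in sales_fact_stage, then swap it into the
-- target partition in a single data-dictionary operation:
ALTER TABLE sales_fact
  EXCHANGE PARTITION p_current
  WITH TABLE sales_fact_stage
  INCLUDING INDEXES
  WITHOUT VALIDATION;
```

Because the exchange only updates metadata rather than moving rows, it completes in seconds regardless of data volume, which is what makes it attractive for minimizing downtime.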
Thanks for your response. It's 2.5 days for a full load. We are thinking of recreating only the fact/dimension MVs and aggregate tables in the new schema and switching the connection pool to point to this schema via a DB table, as suggested. I have started developing the POC in a lower environment.
How about the incremental load? Is it configured? If not, configuring an incremental load would be the ideal solution for you.
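One common way to set up an incremental load (a generic sketch; the control table and column names such as last_update_dt are assumptions, not from the thread) is to merge only the rows changed since the last successful run:

```sql
-- Pull only rows changed since the last successful ETL run,
-- tracked in a small control table:
MERGE INTO dw.sales_fact tgt
USING (
  SELECT * FROM stg.sales_src s
  WHERE s.last_update_dt > (SELECT c.last_run_dt
                            FROM dw.etl_control c
                            WHERE c.job_name = 'SALES_LOAD')
) src
ON (tgt.sale_id = src.sale_id)
WHEN MATCHED THEN
  UPDATE SET tgt.amount = src.amount,
             tgt.last_update_dt = src.last_update_dt
WHEN NOT MATCHED THEN
  INSERT (sale_id, amount, last_update_dt)
  VALUES (src.sale_id, src.amount, src.last_update_dt);

-- After a successful run, advance the watermark:
UPDATE dw.etl_control SET last_run_dt = SYSDATE
WHERE job_name = 'SALES_LOAD';
COMMIT;
```

With a working change-tracking column on the source, this usually turns a multi-day full load into a run proportional to the day's changed rows.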