I am looking for a way to capture TXs from one database and create a script (or use a tool) so we can replay TXs close to production while testing changes being implemented in the database.
OS: Red Hat Linux 5.0
Actual Requirement: We have an active-active GoldenGate setup for one of our DBs, and once a quarter we make changes in the database using DDL (CREATE / ALTER - TABLES, FUNCTIONS, TRIGGERS, etc.). We are a 24/7 environment, so no downtime is affordable. What I'd like to know is a way to capture all TXs from one of the DBs and create a script so we can run them while we are testing new changes in the database. Something like the DBMS_WORKLOAD_CAPTURE package (available only from 11gR1 onwards) or some tool available in the market.
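For reference, on 11gR1 and later a DBMS_WORKLOAD_CAPTURE run would look roughly like this sketch; the directory path, capture name, and duration are placeholders, not anything from our environment:

```sql
-- Directory object to hold the capture files (path is a placeholder)
CREATE DIRECTORY capture_dir AS '/u01/app/oracle/capture';

-- Start capturing the production workload (11gR1+ only)
BEGIN
  DBMS_WORKLOAD_CAPTURE.START_CAPTURE(
    name     => 'quarterly_test_capture',  -- placeholder name
    dir      => 'CAPTURE_DIR',
    duration => 3600);                     -- capture for one hour
END;
/

-- Stop the capture explicitly once enough workload is recorded
EXEC DBMS_WORKLOAD_CAPTURE.FINISH_CAPTURE;
```

The captured files could then be replayed against the test system with DBMS_WORKLOAD_REPLAY, which is exactly the part that needs 11g.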
I also tried looking into LoadRunner but felt it works at the App tier rather than the DB tier; I may be wrong there.
All over the place in those fine manuals it says you can capture the data in 10g, and in MOS there are lists of patches necessary to do so, but everywhere it says you need 11g to actually replay the data. Unless you mean replay it to an 11g and use GG to propagate back to 10g...???
Thanks again, but I guess I should have specified earlier.
I need a tool which can mine archive logs only and give me the DMLs in a form that I can run directly on the DB. Using Streams on the production DB to capture TXs is not advisable in my case. Hence, I was looking for something more along the lines of an archive log mining tool.
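For what it's worth, LogMiner (DBMS_LOGMNR, already available in 10g) can mine archive logs and expose the reconstructed DML in the SQL_REDO column, which can then be spooled into a replay script. A rough sketch, where the archive log path is a placeholder:

```sql
-- Register an archived log to mine (path is a placeholder)
BEGIN
  DBMS_LOGMNR.ADD_LOGFILE(
    logfilename => '/u01/arch/1_1234_987654321.arc',
    options     => DBMS_LOGMNR.NEW);
END;
/

-- Start the mining session, resolving object names
-- via the online data dictionary
BEGIN
  DBMS_LOGMNR.START_LOGMNR(
    options => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);
END;
/

-- Pull the reconstructed DML, ready to spool and re-run
SELECT scn, operation, sql_redo
  FROM v$logmnr_contents
 WHERE operation IN ('INSERT', 'UPDATE', 'DELETE');

-- Release the session's LogMiner resources
EXEC DBMS_LOGMNR.END_LOGMNR;
```

Note that DICT_FROM_ONLINE_CATALOG only resolves names correctly for tables whose definitions have not changed since the logs were written, so for mining logs from before a quarterly DDL change a dictionary extracted to the redo logs would be safer.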