
SQL & PL/SQL


reg: deletion of dump files

chandra_1986 · May 6 2021

Hi Experts,

I have written the script below (EXP_TAB_cmprss.sh) to take an export backup.

At the start of the script, I need to delete export dumps older than 3 days and only then start the export backup. The script should not run the export without deleting the old backups; if it fails to delete them, it should terminate and send us an alert.

#!/bin/bash
# $Header: EXP_TAB_cmprss.sh
# +====================================================================================+
# |
# | FILENAME
# |   EXP_TAB_cmprss.sh
# |
# | DESCRIPTION
# |   Daily export backup script for a list of tables
# |
# | PLATFORM
# |   Linux/Solaris
# |
# +====================================================================================+
echo "Set Oracle Database Env"
export ORACLE_SID=$1
export ORACLE_HOME=/oracle/app/oracle/product/12.1.0/dbhome_1
export LD_LIBRARY_PATH=$ORACLE_HOME/lib:/usr/lib
export PATH=$ORACLE_HOME/bin:$PATH:/usr/local/bin
export TIMESTAMP=$(date +%a%d%b%Y)
export EXP_DIR=/export/home/oracle

echo =======
echo "Export command"
echo =======
echo $ORACLE_HOME
$ORACLE_HOME/bin/expdp \'/ as sysdba\' directory=DB_DUMPS dumpfile=expdp_tab_${TIMESTAMP}_%U.dmp logfile=expdp_log_${TIMESTAMP}.log \
    tables=DBATEST.ORDER_TAB PARALLEL=5 COMPRESSION=ALL

echo "SEND MAIL TO STAKEHOLDERS"
echo =======
mailx -s "$ORACLE_SID $TIMESTAMP Export backup logfile" support@dbclass.com < $EXP_DIR/expdp_log_${TIMESTAMP}.log
echo "Export completed at $TIMESTAMP"
exit

In EXP_DIR, the following files are generated after running the script:

-rwxrwxrwx 1 tuser tuser 1.4K May 6 12:25 EXP_TAB_cmprss.sh
-rwxrwxrwx 1 t2n5463 odba 3.2G May 6 13:13 expdp_tab_Thu06May2021_05.dmp
-rwxrwxrwx 1 t2n5463 odba 3.1G May 6 13:13 expdp_tab_Thu06May2021_03.dmp
-rwxrwxrwx 1 t2n5463 odba 22G May 6 13:13 expdp_tab_Thu06May2021_02.dmp
-rwxrwxrwx 1 t2n5463 odba 2.9G May 6 13:13 expdp_tab_Thu06May2021_04.dmp
-rwxrwxrwx 1 t2n5463 odba 25G May 6 13:13 expdp_tab_Thu06May2021_01.dmp
-rw-r--r-- 1 t2n5463 odba 8.7K May 6 13:13 expdp_log_Thu06May2021.log

Any suggestions on how to add this to the script? At the start, it should delete export dumps older than 3 days and only then start the export backup. The script should not run without deleting the old backups; if it fails to delete them, it should terminate and send us an alert.
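One way to do this is a `find ... -mtime +3 -delete` guard before the export, aborting with an alert if the cleanup fails. Below is a sketch (not tested against your environment); the file-name pattern and mail address are taken from the script above, and the demo runs in a temporary directory so it is safe to execute anywhere. `touch -d` assumes GNU coreutils, as on Linux:

```shell
#!/bin/bash
# Sketch of the requested pre-export cleanup. cleanup_old_dumps removes
# expdp_tab_*.dmp files older than 3 days from the directory given as $1;
# find exits nonzero if the directory is missing or a delete fails.
cleanup_old_dumps() {
    find "$1" -maxdepth 1 -type f -name 'expdp_tab_*.dmp' -mtime +3 -delete
}

# Demo in a temporary directory; in the real script EXP_DIR would be
# /export/home/oracle as in the export script above.
EXP_DIR=$(mktemp -d)
touch "$EXP_DIR/expdp_tab_new.dmp"                  # fresh dump: keep
touch -d '5 days ago' "$EXP_DIR/expdp_tab_old.dmp"  # stale dump: delete

if ! cleanup_old_dumps "$EXP_DIR"; then
    # In the real script, alert and terminate before any export runs:
    # echo "Cleanup of old dumps in $EXP_DIR failed" |
    #     mailx -s "ALERT: export dump cleanup failed" support@dbclass.com
    echo "cleanup failed" >&2
    exit 1
fi
ls "$EXP_DIR"    # only expdp_tab_new.dmp remains; expdp would run next
```

Note that `-mtime +3` matches files last modified strictly more than 3 full days ago; placing the `exit 1` inside the failure branch guarantees the export is never attempted when the cleanup fails.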

Comments

SH_INT
The export/import via an XML file should only be used when creating a new application; it is not designed to support incremental updates between environments. You will have to create the new locations manually in your Production environment and migrate the mappings by exporting to Excel and then importing them to the locations in Production. Bear in mind that if your location IDs differ between environments, you will need to update the Excel mapping templates before importing them into your Production application.
user3027589
Thanks for the answer.

It seems it would be common practice to create new data load locations in a development environment first, and then have the ability to easily migrate to production. It sounds like this is not the case at all.

Am I at least able to export some other application elements, like scripts, into an XML file and then load them safely to production?

Edited by: user3027589 on Aug 20, 2012 1:12 PM
beyerch2
You can do this through SQL relatively easily. I know that is not the officially supported way, though.

The key information you would need to move would be in

tPOVPartition - Holds the basic location information
tStructPartitionHierarchy - Outlines order of locations
tStructPartitionLinks - Links partitions together
tDataMap - If you created a mapping

If you created a new import format...
---------------
tBhvLogicGroup - Import formats
tBhvImpItemFile - Details for Import Format

If you created a validation entity group for this location
--------------------
tBhvValEntGroup - Group information
tBhvValEntItem - items for the groups

If you create a specific validation rule group
--------------
tBhvValRuleGroup
tBhvValRuleItem

Controls Group...
------------
tControlsGroup
tControlsItem
tControlsItemLink
.
.
.


The trick would be to make sure all of the supporting items exist in the destination first, such as the import format, map, controls group, etc. You have to BE CAREFUL because the ID in the 'dev' environment for these items may not match the new ID (autogenerated, or you need to generate it) in the new environment. You'll need to keep note of the proper ID so that when you copy over the location, you provide the right information.

Then create the location (partition) record. Since the location table does not automatically generate the unique ID, query the table first, determine the highest unique ID, add 1, and use that when you create this record. When you create the hierarchy items, you may need to swap out the old ID with this one. Also, if the hierarchy elements don't exist in the destination, you'd need to recreate those as well.
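The "query the highest ID and add 1" step described above can be sketched generically. The snippet below uses an in-memory SQLite database as a stand-in, and the `PartitionKey`/`PartitionName` columns are assumptions for illustration; check the actual FDM schema before running anything like this against a real repository:

```python
import sqlite3

# Stand-in for the FDM repository; real table/column names may differ.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE tPOVPartition (PartitionKey INTEGER, PartitionName TEXT)")
con.executemany("INSERT INTO tPOVPartition VALUES (?, ?)",
                [(748, "LOC_A"), (749, "LOC_B")])

# No auto-generated key on this table, so compute the next free ID manually:
# highest existing key + 1 (COALESCE covers the empty-table case).
(max_key,) = con.execute(
    "SELECT COALESCE(MAX(PartitionKey), 0) FROM tPOVPartition").fetchone()
new_key = max_key + 1

# Use the computed key when inserting the new location record.
con.execute("INSERT INTO tPOVPartition VALUES (?, ?)", (new_key, "LOC_NEW"))
print(new_key)  # 750
```

The same `new_key` value would then be reused in the hierarchy and link tables so all the rows for the new location reference a consistent ID.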

If you need security, you should be able to locate the appropriate tables.....

I'm actually in the middle of making a utility to simplify a lot of the FDM tasks to build off of some things I did at Kscope and to prepare for next year... When I get closer to having something public friendly, I'll put it out here. I think you'd appreciate it given your question. ;)
JeffJon
The scripts are stored on the file system in the <Appname>\Data\Scripts directory. You can easily copy these from the source environment application folder to the target environment application folder and place them in the same directory.

Post Details

Added on May 6 2021
6 comments
1,872 views