6 Replies Latest reply: Nov 24, 2011 12:41 AM by Mohammed Bin Asif

    Upgrade using datapump from 10.2.0.4 to 11.2.0.3

    Mohammed Bin Asif
      Hello Experts,
      I have a few questions regarding an Oracle database upgrade from 10.2.0.4 to 11.2.0.3 using Data Pump.
      We would actually be comfortable upgrading the production database manually, which takes less time than Data Pump, but our client is insisting that we do the upgrade with Data Pump in order to improve the performance of the database.

      My questions are:

      1) Is it risky to delete the old database once the export is taken? We do not have enough space on the server to accommodate both databases plus the dump file; however, we are taking a cold backup of the entire filesystem to tape.

      2) Are there any objects that will not be imported and that we need to take into consideration?

      3) What about DBMS_SCHEDULER jobs and DBA jobs (DBMS_JOB): will they be imported successfully, or is there anything we need to do after the upgrade?

      4) I tried this method on my local test box and the import completed with around 240 errors.

      Finally, can anyone please point me to the exact steps and documentation for the upgrade using this method?

      Thanks in Advance
      Asif
        • 1. Re: Upgrade using datapump from 10.2.0.4 to 11.2.0.3
          Srini Chavali-Oracle
          Pl post OS details

          The steps are documented here - http://download.oracle.com/docs/cd/E11882_01/server.112/e23633/expimp.htm

          Pl elaborate on what performance issues are expected to be solved by this process. What are the errors that you got in your test import?

          HTH
          Srini
          • 2. Re: Upgrade using datapump from 10.2.0.4 to 11.2.0.3
            Mohammed Bin Asif
            Hi there,
            Actually there is a lot of white space in the database and fragmentation has increased, which is why an OCM recommended this method to us.

            Following are some of the errors that we got:

            ORA-31684: Object type TABLESPACE:"UNDOTBS1" already exists
            ORA-31684: Object type ROLE:"RECOVERY_CATALOG_OWNER" already exists
            ORA-31684: Object type SEQUENCE:"SYSTEM"."REPCAT$_EXCEPTIONS_S" already exists
            ORA-39082: Object type TYPE:"OE"."SHIPPING_INSTRUCTIONS_T" created with compilation warnings
            ORA-39082: Object type TYPE:"OE"."PURCHASEORDER_T" created with compilation warnings
            ORA-39083: Object type PROCOBJ failed to create with error:
            ORA-29357: object AUTO_TASK_CONSUMER_GROUP already exists
            ORA-39151: Table "OUTLN"."OL$" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
            ORA-39151: Table "SYSTEM"."LOGSTDBY$SCN" exists. All dependent metadata and data will be skipped due to table_exists_action of skip
            ORA-39151: Table "SYSTEM"."REPCAT$_EXCEPTIONS" exists. All dependent metadata and data will be skipped due to table_exists_action of skip

            ORA-39117: Type needed to create table is not included in this operation. Failing sql is:
            Processing object type DATABASE_EXPORT/SCHEMA/TABLE/PRE_TABLE_ACTION
            Processing object type DATABASE_EXPORT/SCHEMA/TABLE/TABLE_DATA
            . . imported "SH"."CUSTOMERS" 9.850 MB 55500 rows
            . . imported "SH"."SUPPLEMENTARY_DEMOGRAPHICS" 695.9 KB 4500 rows
            . . imported "OE"."PRODUCT_DESCRIPTIONS" 2.379 MB 8640 rows
            . . imported "SH"."SALES":"SALES_Q4_2001" 2.257 MB 69749 rows
            . . imported "SH"."SALES":"SALES_Q1_1999" 2.070 MB 64186 rows
            . . imported "SH"."SALES":"SALES_Q3_2001" 2.129 MB 65769 rows
            ... etc.
            ORA-31693: Table data object "OE"."LINEITEM_TABLE" failed to load/unload and is being skipped due to error:
            ORA-31603: object "LINEITEM_TABLE" of type TABLE not found in schema "OE"
            ORA-31693: Table data object "OE"."ACTION_TABLE" failed to load/unload and is being skipped due to error:
            ORA-31603: object "ACTION_TABLE" of type TABLE not found in schema "OE"

            ORA-39083: Object type INDEX_STATISTICS failed to create with error:
            ORA-20000: INDEX "OE"."LINEITEM_TABLE_MEMBERS" does not exist or insufficient privileges
            Failing sql is:
            DECLARE I_N VARCHAR2(60); I_O VARCHAR2(60); c DBMS_METADATA.T_VAR_COLL; df varchar2(21) := 'YYYY-MM-DD:HH24:MI:SS'; BEGIN DELETE FROM "SYS"."IMPDP_STATS"; i_n := 'LINEITEM_TABLE_MEMBERS'; i_o := 'OE'; INSERT INTO "SYS"."IMPDP_STATS" (type,version,flags,c1,c2,c3,c5,n1,n2,n3,n4,n5,n6,n7,n8,n9,n10,n11,n12,d1,cl1) VALUES ('I'
            ORA-39083: Object type INDEX_STATISTICS failed to create with error:
            ORA-20000: INDEX "OE"."ACTION_TABLE_MEMBERS" does not exist or insufficient privileges
            Failing sql is:

            ORA-31684: Object type VIEW:"SYSTEM"."MVIEW_WORKLOAD" already exists
            ORA-31684: Object type VIEW:"SYSTEM"."MVIEW_FILTER" already exists
            ORA-31684: Object type VIEW:"SYSTEM"."MVIEW_LOG" already exists

            ORA-39111: Dependent object type COMMENT skipped, base object type VIEW:"SYSTEM"."MVIEW_EVALUATIONS" already exists
            ORA-39111: Dependent object type COMMENT skipped, base object type VIEW:"SYSTEM"."MVIEW_EXCEPTIONS" already exists
            ORA-39082: Object type VIEW:"OLAPSYS"."ALL$OLAP2_AW_CATALOGS" created with compilation warnings
            ORA-39082: Object type VIEW:"OLAPSYS"."ALL$OLAP2_AW_CATALOG_MEASURES" created with compilation warnings
            ORA-39082: Object type VIEW:"OLAPSYS"."ALL$OLAP2_AW_PHYS_OBJ" created with compilation warnings

            ORA-39082: Object type PACKAGE_BODY:"OLAPSYS"."CWM2_OLAP_OLAPAPI_ENABLE" created with compilation warnings
            ORA-39082: Object type PACKAGE_BODY:"OLAPSYS"."DBMS_AWM" created with compilation warnings

            ORA-39083: Object type INDEX failed to create with error:
            ORA-06550: line 2, column 1:
            PLS-00201: identifier 'CTXSYS.DRIIMP' must be declared
            ORA-06550: line 2, column 1:
            PL/SQL: Statement ignored
            ORA-06550: line 8, column 1:
            PLS-00201: identifier 'CTXSYS.DRIIMP' must be declared
            ORA-06550: line 8, column 1:
            PL/SQL: Statement ignored
            ORA-06550: line 10, column 1:
            PLS-00201: identifier 'CTXSYS.DRIIMP' must be declared
            ORA-06550: line 10, column 1:

            ORA-39082: Object type TRIGGER:"OE"."ORDERS_ITEMS_TRG" created with compilation warnings
            Processing object type DATABASE_EXPORT/SCHEMA/MATERIALIZED_VIEW

            Job "SYS"."SYS_IMPORT_FULL_01" completed with 249 error(s) at 10:07:32

            ------------------------------------------------------------------------------------------------------------------------------------------------------------------
            export command:

            expdp \'/ as sysdba\' directory=testdir logfile=expdp_17Nov2011.log dumpfile=expdp_17Nov2011.dmp full=y EXCLUDE=SCHEMA:\"IN \(\'SYSMAN\', \'DBSNMP\'\)\"
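
            (A parameter file would avoid the shell escaping around EXCLUDE; the following is only a sketch and the parfile name is made up.)

            # expdp_full.par (hypothetical)
            directory=testdir
            dumpfile=expdp_17Nov2011.dmp
            logfile=expdp_17Nov2011.log
            full=y
            exclude=SCHEMA:"IN ('SYSMAN', 'DBSNMP')"

            expdp system/<password> parfile=expdp_full.par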

            create database statement:

            CREATE DATABASE SWBPROD
            user sys identified by set_password
            user system identified by set_password
            -- SET DEFAULT BIGFILE TABLESPACE
            maxdatafiles 1024
            maxinstances 5
            maxloghistory 1000
            maxlogmembers 5
            maxlogfiles 64
            DATAFILE '/dbo/sysdata/SWBPROD_system_00.dbf' SIZE 400M REUSE AUTOEXTEND ON NEXT 20M MAXSIZE unlimited
            extent management local
            SYSAUX DATAFILE '/dbo/sysdata/SWBPROD_sysaux_00.dbf' SIZE 20M REUSE AUTOEXTEND ON NEXT 20M MAXSIZE unlimited
            DEFAULT TABLESPACE USERS DATAFILE '/dbo/data2/SWBPROD_users_00.dbf' SIZE 512M reuse
            AUTOEXTEND ON NEXT 16M MAXSIZE 30G autoallocate
            SMALLFILE DEFAULT TEMPORARY TABLESPACE TEMP TEMPFILE '/dbo/temp/SWBPROD_temp_01.dbf' SIZE 1G reuse AUTOEXTEND ON
            NEXT 32M MAXSIZE 30G UNIFORM SIZE 1M
            UNDO TABLESPACE "UNDOTBS" DATAFILE '/dbo/undo/SWBPROD_undo_00.dbf' SIZE 200M reuse
            AUTOEXTEND ON NEXT 20M MAXSIZE 10G
            RETENTION GUARANTEE
            CHARACTER SET AL32UTF8
            NATIONAL CHARACTER SET AL16UTF16
            CONTROLFILE REUSE
            LOGFILE group 1 ('/dbo/redo1/SWBPROD/redo_01a.log','/dbo/redo2/SWBPROD/redo_01b.log') SIZE 50M REUSE,
            group 2 ('/dbo/redo2/SWBPROD/redo_02a.log','/dbo/redo1/SWBPROD/redo_02b.log') SIZE 50M REUSE,
            group 3 ('/dbo/redo1/SWBPROD/redo_03a.log','/dbo/redo2/SWBPROD/redo_03b.log') SIZE 50M REUSE,
            group 4 ('/dbo/redo2/SWBPROD/redo_04a.log','/dbo/redo1/SWBPROD/redo_04b.log') SIZE 50M REUSE;

            alter user outln identified by outln;
            alter user dbsnmp identified by dbsnmp;

            Alter system set SEC_CASE_SENSITIVE_LOGON=FALSE SCOPE=BOTH;

            We then ran the following scripts. Please advise if any other scripts need to be run.
            @?/rdbms/admin/catalog.sql
            @?/rdbms/admin/catproc.sql
            @?/rdbms/admin/dbmspool
            @?/rdbms/admin/prvtpool.plb
            @?/rdbms/admin/utlchain
            @?/rdbms/admin/utlxplan
            @?/rdbms/admin/catblock
            @?/sqlplus/admin/plustrce

            @?/javavm/install/initjvm.sql;
            @?/xdk/admin/initxml.sql;
            @?/xdk/admin/xmlja.sql;

            conn system/&&SYSTEMPW
            @?/sqlplus/admin/pupbld

            conn / as sysdba
            @?/rdbms/admin/utlrp.sql
            select owner, object_type, object_name from dba_objects where status = 'INVALID';
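
            It may also be worth confirming that the installed components are VALID before starting the import; a quick sanity check (just an illustration, not from any particular guide):

            select comp_name, version, status from dba_registry;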

            Import all the data from the dump file.
            impdp command:

            impdp \'/ as sysdba\' directory=testdir logfile=impdp_17Nov2011.log dumpfile=expdp_17Nov2011.dmp full=y EXCLUDE=SCHEMA:\"IN \(\'SYSMAN\', \'DBSNMP\'\)\"
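
            (For reference, the same import restricted to the application schemas would avoid most of the "already exists" errors on dictionary objects; the schema names below are just placeholders taken from the test box and would need to be replaced with the real application schemas.)

            impdp system/<password> directory=testdir logfile=impdp_app.log dumpfile=expdp_17Nov2011.dmp schemas=OE,SH table_exists_action=skip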


            Also, please let me know whether DBMS_SCHEDULER jobs and DBA jobs will also get imported.

            Thanks again
            Asif

            Edited by: 866995 on Nov 18, 2011 5:36 AM
            • 3. Re: Upgrade using datapump from 10.2.0.4 to 11.2.0.3
              Srini Chavali-Oracle
              Expdp and Impdp should be run as SYSTEM, not as SYSDBA. See the first "Note" in each of these two links.

              http://download.oracle.com/docs/cd/B19306_01/server.102/b14215/dp_export.htm#sthref57
              http://download.oracle.com/docs/cd/E11882_01/server.112/e22490/dp_import.htm#i1012504

              Pl retry expdp and impdp using SYSTEM.

              The "object already exists" errors can be ignored - others will need to be fixed. Pl ensure that expdp completes without any errors before starting impdp.

              HTH
              Srini
              • 4. Re: Upgrade using datapump from 10.2.0.4 to 11.2.0.3
                Mohammed Bin Asif
                Hello Srini

                Thanks for your suggestion. Actually we always use SYSDBA and not SYSTEM; I don't think there will be any problem, but as you suggested I will try with SYSTEM as well.
                I have a couple of questions for you:
                1) Currently there is a lot of fragmentation in our database. Is the Data Pump export/import upgrade method the correct way to deal with it on the production server?
                2) What is the Oracle-recommended best practice for a database upgrade? As per my understanding of the docs, it is DBUA?
                3) What about DBMS_SCHEDULER jobs? Do we need to re-create them after the upgrade?

                Thanks again
                Asif
                • 5. Re: Upgrade using datapump from 10.2.0.4 to 11.2.0.3
                  Srini Chavali-Oracle
                  866995 wrote:
                  Hello Srini

                  Thanks for your suggestion. Actually we always use SYSDBA and not SYSTEM; I don't think there will be any problem, but as you suggested I will try with SYSTEM as well.
                  As noted in the documentation, you should not use sysdba for export/imports. Oracle has a very good reason for making that statement.
                  I have a couple of questions for you:
                  1) Currently there is a lot of fragmentation in our database. Is the Data Pump export/import upgrade method the correct way to deal with it on the production server?
                  Yes - pl see this link - http://docs.oracle.com/cd/E11882_01/server.112/e23633/preup.htm#BABFHFIJ
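
                  A rough way to see the space reclaimed is to compare allocated space per schema before and after the import, e.g. (just an illustration):

                  select owner, round(sum(bytes)/1024/1024) mb from dba_segments group by owner order by 2 desc;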
                  2) What is the Oracle-recommended best practice for a database upgrade? As per my understanding of the docs, it is DBUA?
                  Yes - this is the recommended method - http://docs.oracle.com/cd/E11882_01/server.112/e23633/preup.htm#i694345
                  3) What about DBMS_SCHEDULER jobs? Do we need to re-create them after the upgrade?
                  Scheduler jobs will be exported/imported - http://docs.oracle.com/cd/E11882_01/server.112/e25494/schedadmin003.htm#i1007297
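
                  After the import you can verify them with something like (illustrative):

                  select owner, job_name, enabled, state from dba_scheduler_jobs;
                  select schema_user, job, what from dba_jobs;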
                  Thanks again
                  Asif
                  HTH
                  Srini
                  • 6. Re: Upgrade using datapump from 10.2.0.4 to 11.2.0.3
                    Mohammed Bin Asif
                    Hi Srini,
                    Thanks for your time and response.

                    Regards
                    Asif