8 Replies Latest reply on Nov 22, 2018 2:59 PM by 459893

    updating a very large table


      Scenario: the 'employee' table has 900 million records. I want to alter the table by adding a new column (along with a default value).

      Question: What is the best/quickest way to do this, given that I get only a short downtime window for this maintenance?


      Thanks and Regards,


        • 1. Re: updating a very large table

          Hi Lily,

          You posted this question in the Oracle Learning Library space.

          Are you using the Oracle Learning Library, or are you referring to a different Oracle product?


          Also, could you please visit the Getting Started page for instructions on how to change your display name so you're not just a number?



          • 2. Re: updating a very large table

            Hello Jim, thanks for your response. My question refers to Oracle Database Enterprise Edition. I'm not sure if this is the correct forum for that question.


            Regards, Lily.

            • 3. Re: updating a very large table

              Hi Lily,

              Since your post is not about the Oracle Learning Library, I don't think this is the right space.

              In the ACTIONS box on the top right of your post, you should be able to click on "Move", and then specify the correct subspace of Database so the right experts can see your question.

              Thanks and good luck!


              • 4. Re: updating a very large table

                What is your database version? Post the result of: select banner from v$version. Oracle 11.2/12.1 has optimized DDL operations: https://www.oracle.com/technetwork/articles/database/ddl-optimizaton-in-odb12c-2331068.html
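A minimal sketch of what that optimization looks like in practice (the column name 'status' and its default are hypothetical examples, not from the original post):

```sql
-- Check the exact version first:
select banner from v$version;

-- On 11.2+/12.1, adding a column with DEFAULT and NOT NULL together
-- is a metadata-only change: the default is recorded in the data
-- dictionary and the 900M existing rows are not physically updated.
alter table employee add (status varchar2(10) default 'ACTIVE' not null);
```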

                If testing shows that the above does not fit within your downtime window, you can try DBMS_REDEFINITION to do it online.
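As a rough outline only (schema, table, and interim-table names are placeholders; the column mapping and dependent-object options are deliberately left at their defaults, so see the DBMS_REDEFINITION documentation before using this):

```sql
-- Hypothetical sketch of an online redefinition, not a tested script.
-- 1. Verify the table can be redefined by primary key:
exec dbms_redefinition.can_redef_table('SCOTT', 'EMPLOYEE');

-- 2. Create an empty interim table that already has the new column,
--    then start the redefinition, copy dependent objects, and finish:
declare
  num_errors pls_integer;
begin
  dbms_redefinition.start_redef_table('SCOTT', 'EMPLOYEE', 'EMPLOYEE_INTERIM');
  dbms_redefinition.copy_table_dependents('SCOTT', 'EMPLOYEE', 'EMPLOYEE_INTERIM',
                                          num_errors => num_errors);
  dbms_redefinition.finish_redef_table('SCOTT', 'EMPLOYEE', 'EMPLOYEE_INTERIM');
end;
/
```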

                • 5. Re: updating a very large table

                  Better: first add the column, then update it, then add the NOT NULL constraint.

                  There will be some downtime when adding the column, because all the procedures referring to that table directly (not via dynamic SQL) will be invalidated and need to be recompiled.

                  After adding the column, you can populate it using dbms_parallel_execute, thus avoiding one single transaction for 900M records. That lets you split the work into chunks of several thousand or hundreds of thousands of records.

                  Is the table used for OLTP? If so, you may run into contention with such massive updates, and I'd keep the dbms_parallel_execute chunks rather small, no more than a couple of thousand records.

                  If it's used for OLAP, the dbms_parallel_execute chunks can be much larger, say 1M records each.

                  I say that because I fear one single update of 900M rows might never finish.
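A hedged sketch of that chunked update (the task name, column name, value, and chunk size are illustrative, not from the original post):

```sql
begin
  dbms_parallel_execute.create_task('fill_x_col');

  -- Split the table into rowid ranges of roughly 10000 rows each;
  -- keep this small on a busy OLTP system.
  dbms_parallel_execute.create_chunks_by_rowid(
    task_name   => 'fill_x_col',
    table_owner => 'SCOTT',
    table_name  => 'EMPLOYEE',
    by_row      => true,
    chunk_size  => 10000);

  -- Each chunk is committed as its own transaction; :start_id and
  -- :end_id are bound per chunk by dbms_parallel_execute.
  dbms_parallel_execute.run_task(
    task_name      => 'fill_x_col',
    sql_stmt       => 'update employee set x_col = ''SOME_VALUE''
                       where rowid between :start_id and :end_id',
    language_flag  => dbms_sql.native,
    parallel_level => 4);

  dbms_parallel_execute.drop_task('fill_x_col');
end;
/
```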

                  After updating the column, add the NOT NULL constraint with NOVALIDATE, something like below:

                  alter table zzz_t add constraint nn_xcol check (x_col is not null) enable novalidate;

                  • 6. Re: updating a very large table

                    Starting with 11gR2, when adding a new column with a DEFAULT value and a NOT NULL constraint, the default value is stored in the data dictionary instead of being written to every row of the table; as a result, the execution time is reduced, especially for large tables.

                    In 12.1 this is optimized further.

                    • 7. Re: updating a very large table

                      Good to know, I wasn't aware of that.

                      • 8. Re: updating a very large table

                        Thanks very much, Saubhik and Bede, for your valuable input; this is very helpful. By the way, my DB is Oracle version 12.1.


                        Warm Regards, Lily.