Per my understanding there is no other option...
You're asking for a 100% sample on a table with 237 million rows and gathering statistics for all the dependent objects. An hour seems pretty reasonable for that.
- How often are you gathering statistics? It seems unlikely that a table this large changes enough to justify gathering statistics all that frequently.
- Is the table partitioned? Could it be? Assuming this is part of a load process, perhaps you really only need to gather statistics on the most recently loaded partition.
- Do you really need a 100% sample? Could you use a smaller sample size (or use the automatic sample size)? Or did you encounter problems using something less than a 100% sample?
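To illustrate the partition and sample-size suggestions above, here is a minimal sketch. It assumes the owner and table names from this thread (APP.SALE_AMT) and that the table is range-partitioned; the partition name SALE_AMT_P202401 is made up for the example.

```sql
-- Sketch: gather statistics only for the most recently loaded partition,
-- using the automatic sample size instead of a 100% sample.
-- SALE_AMT_P202401 is a hypothetical partition name.
BEGIN
  dbms_stats.gather_table_stats(
    ownname          => 'APP',
    tabname          => 'SALE_AMT',
    partname         => 'SALE_AMT_P202401',  -- hypothetical: the newest partition
    granularity      => 'PARTITION',          -- skip global and other partition stats
    estimate_percent => dbms_stats.auto_sample_size,
    cascade          => TRUE);
END;
/
```

Gathering only the changed partition with the automatic sample size should cut the runtime dramatically compared with a 100% sample of all 237 million rows, assuming only recent partitions change between loads.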
user9077483 wrote: You will surely need to decrease the sample parameter.
I have used the following statement to gather the stats of a table.
Gathering the stats takes one hour, and the table SALE_AMT has 237213998 records.
Please help me: is there any way to gather the stats more quickly?
Thanks in advance.
Run this; it will use auto sampling and will reduce the time without impacting performance:
EXEC dbms_stats.gather_table_stats('APP', 'SALE_AMT', estimate_percent => dbms_stats.auto_sample_size, degree => 8, cascade => TRUE, no_invalidate => FALSE);
where 8 is the degree of parallelism (this should be set depending on the number of your CPUs).