But the final value used by Oracle is derived from statistics and may not be the same as the one you have set; hence, you need to test whether this change actually helps in your case.
SQL> alter session set db_file_multiblock_read_count=128;

Session altered.

SQL> alter session set workarea_size_policy=manual;

Session altered.
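With a manual workarea policy, the sort memory can then be sized explicitly for the session; this can speed up the sorts done during stats gathering and index rebuilds. A sketch (the 100 MB value is only an illustration to experiment with, not a recommendation):

```sql
-- With WORKAREA_SIZE_POLICY=MANUAL, sort memory is controlled per session.
-- 104857600 bytes (100 MB) is an example value to test, not a tuned setting.
alter session set sort_area_size = 104857600;
```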
user1000000 wrote:
How long does it take to populate the table that is going to be exchanged?
I am running a DBMS_STATS job to gather statistics for one table, which has one partition. After the partition exchange, I rebuild the index and then run the stats job, but gathering the stats takes 3 hours to complete. Is there any way I can tune this stats gathering? I need to run it every day.
The following are the steps
1) partition exchange
2) rebuild index
3) gather stats
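The three steps above might look like this in SQL (all object names here are hypothetical placeholders, not taken from the actual schema):

```sql
-- 1) Exchange the loaded staging table with the single partition
ALTER TABLE sendung
  EXCHANGE PARTITION p1 WITH TABLE sendung_stage
  INCLUDING INDEXES WITHOUT VALIDATION;

-- 2) Rebuild the index after the exchange
ALTER INDEX sendung_idx REBUILD;

-- 3) Gather statistics on the exchanged partition
EXEC DBMS_STATS.GATHER_TABLE_STATS('SIM', 'SENDUNG', PARTNAME => 'P1');
```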
How can I tune this gather stats? The following is the syntax I am using:

EXECUTE DBMS_STATS.GATHER_TABLE_STATS('SIM', 'SENDUNG', METHOD_OPT => 'FOR ALL COLUMNS SIZE 1', DEGREE => 5, ESTIMATE_PERCENT => DBMS_STATS.AUTO_SAMPLE_SIZE, GRANULARITY => 'ALL', CASCADE => FALSE);

Since you're using GRANULARITY => 'ALL', it actually scans all levels of the segment: all subpartitions individually (if applicable), all partitions individually, and then finally the whole table consisting of all (sub-)partitions. So if your table actually consists of only a single partition, the table is scanned twice due to this parameter (once for PARTITION level, and again for GLOBAL level statistics).
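One option to avoid the double scan is to let Oracle derive the global statistics from the partition-level ones. A sketch, assuming Oracle 10.2.0.4 or later (the partition name P1 is a placeholder; test on your version before relying on it):

```sql
-- Gather partition-level stats and derive global statistics from them,
-- avoiding a second full scan of the table (requires 10.2.0.4+).
EXECUTE DBMS_STATS.GATHER_TABLE_STATS(
  ownname          => 'SIM',
  tabname          => 'SENDUNG',
  partname         => 'P1',   -- hypothetical partition name
  granularity      => 'APPROX_GLOBAL AND PARTITION',
  method_opt       => 'FOR ALL COLUMNS SIZE 1',
  estimate_percent => DBMS_STATS.AUTO_SAMPLE_SIZE,
  degree           => 5,
  cascade          => FALSE);
```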
user1000000 wrote:
My point here is that you could use the "unpartitioned" table as your "users" table and the partitioned table as the "staging table" that will be populated by SQL*Loader. This way you simplify the optimizer's work and the statistics management on the final table. If you use GRANULARITY => 'ALL', then your partitioned table will effectively be scanned twice.
Thanks, I answered all your questions.

The reason why I use partition exchange:
I load the data into a stage table, and then users need to access this data. That is why I use exchange partition to move the data into another table and rebuild the index. I create one partition in order to use the exchange partition concept.
How long does it take to populate the table that is going to be exchanged?
So you say that loading the 200,000 records into the staging table takes only a couple of minutes using SQL*Loader, right?
I use sqlldr to load the data into the stage table, then partition exchange moves the same data into the other table (i.e., the permanent table, which only needs this data). It is done in just a few minutes.
How long does the index rebuild take?
I'm not sure why loading the data should take only a couple of minutes while rebuilding an index takes two hours; that sounds odd.
How large is the segment being exchanged (check DBA/ALL/USER_SEGMENTS)?
Sorry, I obviously wasn't precise enough: I asked for the segment size in bytes or blocks; the row count doesn't determine how long a scan will take.
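The segment size in bytes and blocks can be read from DBA_SEGMENTS, for example (owner and segment name are taken from the earlier GATHER_TABLE_STATS call and may need adjusting):

```sql
-- Size of the exchanged table's segments, per partition if partitioned.
SELECT segment_name, partition_name, bytes, blocks
FROM   dba_segments
WHERE  owner = 'SIM'
AND    segment_name = 'SENDUNG';
```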
user1000000 wrote:
And what is the reason why you need to calculate both the GLOBAL statistics and the PARTITION level statistics?
In the meantime, what is your suggestion for setting the granularity? Can I set it to AUTO? The reason I set it to ALL is to gather stats for both the table and the partition. Last time I used GRANULARITY => 'PARTITION', which did not calculate the stats for the table, only for the partition. I need to calculate them for both the partition and the table.
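If the database is 11g or later, incremental global statistics are another option worth testing: Oracle maintains per-partition synopses and derives the global stats from them instead of rescanning the whole table. A sketch, assuming the owner/table names from the earlier example (verify the behavior on your version):

```sql
-- Enable incremental global statistics for the table (11g+).
EXEC DBMS_STATS.SET_TABLE_PREFS('SIM', 'SENDUNG', 'INCREMENTAL', 'TRUE');

-- With INCREMENTAL=TRUE, GRANULARITY=>'AUTO' gathers the changed partition
-- and derives the global statistics without a second full-table scan.
EXEC DBMS_STATS.GATHER_TABLE_STATS('SIM', 'SENDUNG', GRANULARITY => 'AUTO');
```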
This is the detail which you asked for, for the partition:

Well, and how does your previous information regarding 200,000 rows fit into this picture? I see here 100 million rows allocating approx. 11 GB, assuming a block size of 8 KB.