GregG wrote:I think Hemant missed your reference to 9.2.0.8. The linear decay algorithm appeared in 10.1; in 9.2.0.8 the cardinality for "column out of range" (whether using equality, BETWEEN, et al.) is the same as for "column = constant inside range".
Let's say I've got a date column with increasing dates (populated as SYSDATE), and at some point I created an index (COMPUTE STATISTICS) on that column without a histogram (only min/max values).
So if I don't regather stats on that column, my cardinality estimate will drop day after day until it reaches 1, right (because of the out-of-range linear decay)?
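The linear decay being discussed can be sketched as arithmetic. This is an illustrative approximation of the 10g-style behaviour described in Cost Based Oracle, not Oracle's actual code: with no histogram, an in-range equality predicate is estimated at num_rows/NDV, and beyond the known high value the estimate ramps down linearly, reaching (roughly) zero once the predicate value is a full range-width past the high value, with a floor of 1:

```python
def out_of_range_card(num_rows, num_distinct, low, high, value):
    """Approximate 10g-style linear-decay cardinality for an equality
    predicate on a column with no histogram.

    Illustrative sketch only -- the real formula varies by version and
    patch level, as the thread notes."""
    base = num_rows / num_distinct            # in-range estimate: 1/NDV of rows
    if low <= value <= high:
        return round(base)
    # how far past the known range we are, as a fraction of the range width
    overshoot = (value - high) if value > high else (low - value)
    decay = 1 - overshoot / (high - low)      # linear ramp: 0 at one range-width out
    return max(1, round(base * max(decay, 0.0)))

# 1000 rows, 10 distinct daily values spanning days 0..9 (dates as day numbers)
print(out_of_range_card(1000, 10, 0, 9, 9))     # at the high value -> 100
print(out_of_range_card(1000, 10, 0, 9, 13.5))  # half a range-width out -> 50
print(out_of_range_card(1000, 10, 0, 9, 30))    # far out of range -> floor of 1
```

This matches the intuition in the question: without regathering stats, each passing day pushes SYSDATE further past the stored high value, so the estimate decays toward 1.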
Hemant K Chitale wrote:Cost Based Oracle - Fundamentals, p.49: the linear decay was introduced between 10.1.0.2 and 10.1.0.4
Aah! "The 'formula' may vary between versions -- even patch levels." So it was quite different in 9.2.0.8.
GregG wrote:In this version of Oracle - yes; but it's easy enough to check with a few examples to see if the claim is realistic.
So you mean the cardinality is constant regardless of how far we are from the max value stored in the statistics?
create table t (id number, data date);
insert into t select rownum, sysdate-(1+mod(rownum,10)) from dba_source where rownum <= 1000;
exec dbms_stats.gather_table_stats(user, 'T', cascade => true);

select count(*) from t where data = sysdate+100;   -- Card=100 in the plan
select count(*) from t where data = sysdate-10;    -- Card=100 in the plan
But why is that 10% of all rows?
Regards

GregG wrote:You've got 10 distinct values and no histograms, so any given value is assumed to return 1/10 of the rows.
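That 1/NDV rule can be checked with trivial arithmetic (a sketch, using the row count and distinct-value count from the example above): with no histogram the optimizer assumes a uniform distribution, so an equality predicate is estimated to return num_rows/num_distinct rows regardless of which value is supplied.

```python
# No histogram -> uniform assumption: equality selectivity = 1/NDV.
num_rows = 1000       # rows inserted into t
num_distinct = 10     # mod(rownum,10) yields 10 distinct dates
estimated_card = num_rows // num_distinct
print(estimated_card)  # 100, matching Card=100 for both queries above
```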
You're right.