
How to use the EVALUATE function in OBIEE?

862823 · Jan 11 2017 (edited Jan 12 2017)

Hi,

I have a function in the Oracle database called FNC_TEST(PDATE IN DATE, PQTDE NUMBER).

How do I use the EVALUATE function to call this database function?

Luciana
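A minimal sketch of the general EVALUATE syntax in OBIEE logical SQL, assuming two hypothetical logical columns "Sales"."Order Date" and "Sales"."Quantity" map to the function's parameters, and assuming a DOUBLE return type (neither is given in the post):

-- OBIEE logical SQL: ship the database function to Oracle for evaluation.
-- %1 and %2 are placeholders for the columns listed after the function string;
-- the column names and the AS DOUBLE return type are assumptions for illustration.
EVALUATE('FNC_TEST(%1, %2)' AS DOUBLE, "Sales"."Order Date", "Sales"."Quantity")

Note that EVALUATE is disabled by default; the EVALUATE_SUPPORT_LEVEL parameter in NQSConfig.INI has to allow it before the BI Server will accept the call.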

This post has been answered by cesar.advincula.o on Jan 11 2017

Comments

unknown-698157

1) What is the size of the table?

2) What is the value of db_file_multiblock_read_count?

3) Did you or Oracle gather system stats? (Oracle will do this automatically from 10g and higher.)

If you didn't, are you aware that the defaults of these two parameters favor full table scans?

Apparently a full table scan will consume the least amount of IO (which you don't show).

Please be aware that the assertion 'a full table scan is always evil' has to be labeled a myth; quite often full table scans are much more efficient.

BTW: there is nothing wrong with the cardinality estimation; what you see there is the number of rows that satisfies your access path, a full table scan.
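A sketch of the checks being asked for above, with T1 as a placeholder table name and run from SQL*Plus:

-- Segment size of the table in MB (T1 is a placeholder name)
select bytes/1024/1024 as size_mb from user_segments where segment_name = 'T1';

-- Current setting of the multiblock read count (SQL*Plus command)
show parameter db_file_multiblock_read_count

-- System statistics, if any have been gathered
select pname, pval1 from sys.aux_stats$ where sname = 'SYSSTATS_MAIN';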

---------

Sybrand Bakker

Senior Oracle DBA

User_OCZ1T

Thanks.

Table size: 211 MB

db_file_multiblock_read_count: 128

Here I am not chasing the FTS or assuming it is the culprit; I am just trying to figure out the facts Oracle works from, that is, the optimizer math.
I was just going through 'Cost-Based Oracle Fundamentals' by Jonathan Lewis, checked how it calculates cardinality/cost (in the case of a constant value in the filter) and arrives at the path of execution, and was trying to match that with my scenario.

As I already mentioned, the estimated value is quite different from the actual (estimated 1171K, actual 0). In a complex query this can significantly affect the overall path of execution!
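A sketch of one way to see the estimated and actual rows side by side, assuming the statement can be re-run with the gather_plan_statistics hint (table name and predicate are placeholders):

-- Re-run the statement with rowsource statistics enabled
select /*+ gather_plan_statistics */ count(*)
from   t1
where  col1 = 'some constant';

-- Show E-Rows vs A-Rows for the last statement run in this session
select * from table(dbms_xplan.display_cursor(null, null, 'ALLSTATS LAST'));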

Jonathan Lewis
Answer

You've got a frequency histogram which has captured one value - there's a known bug for that case (though I can't quote numbers - check Randolf Geist's blog). I'm a little surprised that it's still around in 11.2.0.3, but there were a few edge cases, perhaps not all fixed. I note that the histogram value you captured starts 'ID:414' - given that you used a 20% sample I wonder if the column has a million distinct values that all start with the same 15 characters (select min(), max() from table to satisfy my curiosity) - Oracle gets a little lost with the statistics for character strings after (roughly) the first 6.
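A sketch of the checks suggested here, with T1 and COL1 as placeholder table and column names:

-- Satisfy the min/max curiosity and get a feel for the number of distinct values
select min(col1), max(col1), count(distinct col1) from t1;

-- See what the frequency histogram actually captured
select endpoint_number, endpoint_actual_value
from   user_tab_histograms
where  table_name = 'T1' and column_name = 'COL1';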

Regards

Jonathan Lewis

Marked as Answer by User_OCZ1T · Sep 27 2020
User_OCZ1T

Thank you very much.

But if I check dba_tab_col_statistics, the low_value and high_value are the same, as below. So it means that almost any value I put in the filter condition will be out of range.

HIGH_VALUE                                                      LOW_VALUE
--------------------------------------------------------------  --------------------------------------------------------------
49443A3431346435313230346534313522222222222222432303230323032  49443A3431346435313230346534313522222222222222432303230323032


A few questions:

1) Regarding a fix for this issue: since this column holds mostly distinct values, I am thinking of dropping the histogram from this column and leaving it at that. Would that be a good solution?

2) Is there a patch available for this bug (if you can mention the bug number it would be really great), so I can ask the techops guys to work with Oracle Support and get it applied?

Jonathan Lewis

Your values are all identical over the first 35 characters - the optimiser doesn't look past 32 when collecting histograms (until 12c; see 'Frequency Histogram 5' on the Oracle Scratchpad), so even the "endpoint_actual_value" would be the same for all of them. It's possible that the column doesn't even have a histogram - check user_tab_columns to see if you have histogram = FREQUENCY or histogram = NONE. If it does have a histogram, get rid of the histogram.
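A sketch of that check and fix, again with T1/COL1 as placeholder names (run from SQL*Plus):

-- Does the column have a histogram at all?
select column_name, histogram
from   user_tab_columns
where  table_name = 'T1' and column_name = 'COL1';

-- If it does, regather the column's stats with no histogram (bucket size 1)
exec dbms_stats.gather_table_stats(user, 'T1', method_opt => 'for columns COL1 size 1')

Setting a matching table preference with dbms_stats.set_table_prefs would stop the default overnight stats job from re-creating the histogram later.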

I said I didn't have the bug number, but if you get your techops to raise an SR, Oracle Support should be able to find any relevant bugs and patches very easily.

Regards

Jonathan Lewis

Peter Fong-Oracle

In 12c histograms use the first 64 bytes, though this change is not backportable. Though not a perfect fix, you may want to look into applying the patch for the following bug, which adds a cardinality sanity check so that the cardinality cannot be greater than num_rows - (ndv - 1) when an equality operator is used and histograms are present.
Bug 18377553 : POOR CARDINALITY ESTIMATE WITH HISTOGRAMS AND VALUES > 32 BYTES
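For illustration with made-up figures: on a nearly unique column with num_rows = 1,000,000 and ndv = 999,999, the sanity check caps the equality estimate at 1,000,000 - (999,999 - 1) = 2 rows, instead of the full 1,000,000 that a single-bucket histogram over a long string could otherwise suggest.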

Jonathan Lewis

Excellent news - and an elegant workaround.

Thanks for posting.

Regards

Jonathan Lewis
