Oracle Analytics Cloud and Server


OAC - Table visualization in DV duplicates rows, but LSQL is correct


Hi,

I'm working with OAC Jan 2024 and adding a table visualization in a workbook based on a subject area.

If I enable the "Developer" option, look at the queries behind my visualization, and execute the LSQL query, I get 4 rows as the result. If I take the physical SQL (run against an Oracle Autonomous Data Warehouse) and execute it, I also get 4 rows.

The expected result is therefore 4 rows in my table visualization in DV, because the LSQL (and the SQL) is what the visualization is built on in the first place.

But no, the table shows 8 rows: my 4 rows are duplicated. And if I keep adding a few more columns to it, while the LSQL and SQL still return 4 rows, the table starts showing me 16 rows instead of 4.

Why is the visualization multiplying my rows despite the LSQL and SQL being correct and returning exactly what is expected?

If I can't trust the page to render the result of the LSQL (and SQL) queries in my table visualization, how am I supposed to trust that the numbers in my charts won't start being multiplied by something mysterious that decided the LSQL wasn't right and that multiplying records was a good idea? Or trust anything visible in DV at all?

The same table view built in a "classic" analysis generates the same LSQL and the same SQL, and correctly displays 4 rows.

I expect the webpage to display at most the number of rows produced by the LSQL query, not more. That sounds like a key principle of the product.

Thanks





Answers


  • Thanks @Gianni Ceresa for sharing this with the community. It will save me a lot of time if I ever face the same issue.


  • Hello fellow nerds (it's always the same three, innit? so boring...) Edit: Four. I saw Michal also had a peek at the thread ;)

    I've reproduced this on my side, and it really is the DV GUI that's just broken with regard to what Gianni posted. Some added titbits:

    1.) The JSON goes through no parsing whatsoever, so going into the JSON and checking whether a duplicate "columnID" exists in the first place is your only way of ever knowing that this is happening (a quick way to check is sketched after this list).

    2.) The creation of these duplicate IDs is random and I could not find any pattern.

    3.) Worst JSON I've managed to get so far:

    4.) As you can see, it's got nothing to do with query grain or the data stream changing, because it's my measure that is triplicated, not any dimensionality around it.

    5.) You will see diddly squat in the log file. The mess-up happens in the data rendering. Physical SQL, logical SQL and presentation layer request are 100% fine.

    6.) If you keep adding columns, the GUI keeps looping and keeps adding more and more rows... but doesn't apply any arithmetic to them - have a look at the grand total.
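
    A minimal sketch of the check from point 1, not OAC code: it walks an exported workbook JSON and reports any "columnID" value that appears more than once. The file name workbook.json and the assumption that column references sit under a "columnID" key are taken from this thread; adjust them to whatever your export actually contains.

    ```python
    # Scan an exported DV workbook JSON for duplicate "columnID" values.
    # The file name and the "columnID" key name are assumptions from this thread.
    import json
    from collections import Counter

    def collect_column_ids(node, found):
        """Recursively gather every value stored under a 'columnID' key."""
        if isinstance(node, dict):
            for key, value in node.items():
                if key == "columnID" and isinstance(value, str):
                    found.append(value)
                collect_column_ids(value, found)
        elif isinstance(node, list):
            for item in node:
                collect_column_ids(item, found)

    with open("workbook.json", encoding="utf-8") as fh:  # hypothetical export file
        workbook = json.load(fh)

    ids = []
    collect_column_ids(workbook, ids)
    duplicates = [cid for cid, count in Counter(ids).items() if count > 1]
    print("Duplicate columnID values:", duplicates or "none found")
    ```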

    Fun stuff for a Friday geek-out! Thanks @Gianni Ceresa !


  • @Gianni Ceresa @Christian Berg Hi guys, what a crazy Friday... This is really very weird behavior in the DV UI and it definitely deserves someone's attention. Since it seems it can be reproduced quite well, creating an SR (and insisting this behavior is admitted as a bug) would be good, I think. Have a nice weekend. Thanks to both of you.

  • Hi,

    My first thought when I read the subject was marking queries, but that does not appear to be the case here.

    @Christian Berg - If you can get me a generic test case (reach out to me internally), we can get a bug logged and some investigation started on this. It may be something already addressed.

    If someone has a Service Request logged with a reproducible case, even better, so we can attach it to the code defect.

    Or both.

    I will try to assist / expedite this.

    ~Steve

  • Thanks folks for detailing.

    @SteveF-Oracle I can also circulate this with the Dev team from my side. Will copy you.

    Thanks

    Philippe

  • @Philippe Lions-Oracle and @SteveF-Oracle, I don't know how to reproduce the generation/saving of the wrong JSON in the GUI, mostly because I don't know at what point things went wrong, with all the other issues I had yesterday, or how I managed to do it. Since it is a customer's environment, I don't have the permissions to raise an SR for them or to share their data (the corrupted JSON).

    The GUI behavior once the JSON is wrong can be reproduced (as Christian said).

    We can call it a bug, or a missing functionality. I see 2 actions that could help:

    • Before writing the JSON to the database (catalog), run it through a validator that enforces the DV workbook JSON rules, instead of just checking that the JSON is valid JSON.
    • When rendering something in the GUI, add some safety checks to avoid looping twice over the row set when a duplicate ID exists. I imagine nothing is validated there right now because it should have been validated upfront, when generating/creating the JSON. That isn't enough either, because there are ways to write JSON into the catalog that bypass the GUI. But a quick check like "the GUI shouldn't show more rows than what has been retrieved" sounds like a cheap, generic safety net.

    I would really expect a stronger check of the validity of the JSON when saving it. It isn't an operation you perform after every click in DV; you save only once you are done with what you are trying to achieve (or at least a consistent part of it). Taking 0.1 seconds longer to validate the content of the JSON, and not just its format, sounds legit to me.
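
    Just to illustrate the idea (this is not OAC code; the function names and the row counts are made up), here is a rough sketch of the two checks described above: a content validator run before saving the workbook JSON, and a rendering-side guard that refuses to show more rows than the query returned.

    ```python
    # Hypothetical sketch of the two proposed safety checks; not actual OAC code.
    from collections import Counter

    def validate_workbook(column_ids):
        """Pre-save check: reject a workbook that references the same column ID twice."""
        duplicates = [cid for cid, n in Counter(column_ids).items() if n > 1]
        if duplicates:
            raise ValueError(f"Invalid workbook JSON, duplicate column IDs: {duplicates}")

    def guard_row_count(rendered_rows, fetched_rows):
        """Rendering check: the GUI should never show more rows than the query returned."""
        if rendered_rows > fetched_rows:
            raise RuntimeError(
                f"Rendering produced {rendered_rows} rows but the query returned "
                f"only {fetched_rows}; refusing to display multiplied data."
            )

    validate_workbook(["c1", "c2", "c3"])  # passes: all IDs unique
    try:
        guard_row_count(rendered_rows=8, fetched_rows=4)  # the 4-rows-shown-as-8 case
    except RuntimeError as err:
        print(err)
    ```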

    For the rendering of the GUI objects, the scary part is that the logs can lead you to believe it's all good: the LSQL is fine, the SQL is fine, so the conclusion is that you can fully trust what you see on screen. I also don't believe there is a log anywhere that could tell you the records are being duplicated "visually". And because the GUI is all compressed JavaScript files, I'm not going to dig through them to find which piece has the issue...

  • @Gianni Ceresa - please open an SR so we can do a deeper dive on the details and start an investigation. We appreciate the feedback here and would like to get to the bottom of it.
