OAC - Table visualization in DV duplicates rows, but LSQL is correct
Hi,
I'm working with OAC Jan 2024 and adding a table visualization in a workbook based on a subject area.
If I enable the "developer" option, look at the queries of my visualization, and execute the LSQL query, I get 4 rows as a result. If I take the physical SQL and run it (on an Oracle Autonomous DWH) I also get 4 rows as a result.
The expected result is therefore 4 rows in my table visualization in DV, because the LSQL (and the SQL) are what produce the data behind it.
But no, the table shows 8 rows: my 4 rows are duplicated. And if I keep adding a few more columns to it, while the LSQL and SQL still return 4 rows, the table starts showing me 16 rows instead of 4.
Why is the visualization multiplying my rows despite the LSQL and SQL being correct and returning exactly what is expected?
If I can't trust the page to render the result of the LSQL (and SQL) queries in my table visualization, how am I supposed to trust that the numbers in my charts won't start being multiplied by something mysterious that decided the LSQL wasn't right and multiplying records was a good idea? Or trust anything visible in DV at all?
The same table view built in a "classic" analysis generates the same LSQL and the same SQL, and correctly displays 4 rows.
I expect the webpage to display at most the number of rows produced by the LSQL query, not more. That sounds like a key principle of the product.
Thanks
Best Answer
-
And the answer is ... a "corrupted" DV workbook JSON, but not corrupted enough for the tool to see it and report it as broken.
I can't explain how, why, or how to reproduce it, but the DV workbook JSON, for 2 of the 30 columns I needed in that table, contained the same "columnID" twice, pointing to presentation columns (the "expression") that were the same but written in a different case. The "expression" isn't case sensitive, and an expression pointing to
"subject area"."presentation table"."presentation column 1"
is the exact same as "subject area"."presentation table"."PRESENTATION Column 1".
On this environment I had been having tons of issues yesterday, with Semantic Modeler not finding any error or warning in a model and deploying it just fine, but after a single query the error "missing navigation space for subject area" was everywhere. I was renaming things, changing the case of presentation columns, and setting aliases while it kept failing.
There may be some correlation between these errors and inconsistencies and the workbook JSON becoming "corrupted" with a duplicated columnID (not forgetting browser cache between DV screens etc.). I can't reproduce it in a new workbook; it was failing in the workbook where I needed it.
Lesson learnt: if there is a duplicated columnID, the GUI can start duplicating records. Not that it makes sense, but that's it.
I would highly appreciate it if some form of checking and validation were added when saving the JSON, like making sure identifiers are unique, just in case. In the end it's JSON: anything can be stored in there, but the product should enforce the business logic it requires to give the JSON a meaning.
An example of what the JSON was like for the 2 columns giving me issues is below. I just can't explain how it is possible, mostly because it is ... impossible and can't really happen. It's maybe an unlucky chain of events among all the issues I had on that environment yesterday (solved by dropping the semantic model, building an RPD with the Model Administration Tool and importing that into Semantic Modeler: all the "missing navigation space" errors disappeared right after deploying).
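For illustration only, a minimal sketch of what the duplicated entries amounted to; the field names "columnID" and "expression" come from the observation above, while the ID value, column names and surrounding structure are made up and are not the customer's actual JSON:

```python
# Hypothetical reconstruction of the problematic fragment of the workbook JSON.
# Only the "columnID" and "expression" field names come from the real observation;
# the ID, names and structure are invented for illustration.
columns = [
    {"columnID": "col-123",
     "expression": '"subject area"."presentation table"."presentation column 1"'},
    {"columnID": "col-123",
     "expression": '"subject area"."presentation table"."PRESENTATION Column 1"'},
]

# Same columnID twice, and the two expressions differ only by case,
# so logically they point to the same presentation column.
ids = [c["columnID"] for c in columns]
assert len(ids) != len(set(ids)), "expected a duplicated columnID"
assert columns[0]["expression"].lower() == columns[1]["expression"].lower()
```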
Answers
-
Thanks @Gianni Ceresa for sharing this with the community. It will save me a lot of time if I need to face the same issue.
-
Hello fellow nerds (it's always the same three innit? so boring...) Edit: Four. I saw Michal also had a peek at the thread ;)
I've reproduced this on my side and it's really the DV GUI that's just broken with regard to what Gianni posted. Added titbits:
1.) There is no parsing or validation of the JSON whatsoever, so going into the JSON and checking whether a duplicate "columnID" exists in the first place is your only way of ever knowing that this is happening (see the sketch after this list):
2.) The creation of these duplicate IDs is random and I could not find any pattern.
3.) Worst JSON I've managed to get so far:
4.) As you can see, it's got nothing to do with query grain or the data stream changing, because my measure is triplicated without any dimensionality around it.
5.) You will see diddly squat in the log file. The mess-up happens in the data rendering. Physical SQL, logical SQL and presentation layer request are 100% fine.
6.) If you keep adding columns the GUI keeps looping and keeps adding more and more rows... but doesn't apply any arithmetic to them - have a look at the grand total:
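A rough helper for point 1 (only a sketch: it assumes you have copied the workbook JSON to a local file, "workbook.json" is a placeholder path, and the script simply walks the whole document counting repeated "columnID" values rather than relying on any particular nesting):

```python
import json
from collections import Counter

def collect_column_ids(node, ids):
    """Recursively collect every value stored under a "columnID" key."""
    if isinstance(node, dict):
        for key, value in node.items():
            if key == "columnID":
                ids.append(value)
            else:
                collect_column_ids(value, ids)
    elif isinstance(node, list):
        for item in node:
            collect_column_ids(item, ids)

# "workbook.json" is just a placeholder for wherever you copied the workbook JSON.
with open("workbook.json", encoding="utf-8") as f:
    workbook = json.load(f)

ids = []
collect_column_ids(workbook, ids)
duplicates = {cid: count for cid, count in Counter(ids).items() if count > 1}
print(duplicates if duplicates else "no duplicated columnID found")
```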
Fun stuff for a Friday geek-out! Thanks @Gianni Ceresa !
-
@Gianni Ceresa @Christian Berg Hi guys, what a crazy Friday... This is really very weird behavior in the DV UI and would definitely deserve someone's attention. Since it seems it can be reproduced quite well, creating an SR (and insisting that this behavior is admitted as a bug) would be good, I think. Have a nice weekend. Thanks to both of you.
-
Hi,
My first thought when I read the subject was marking queries, but that does not appear to be the case here.
@Christian Berg - If you can get me a generic test case (reach out to me internally), we can get a bug logged and some investigation started on this. It may be something already addressed.
If someone has a Service Request logged with a reproducible case, even better, so we can attach it to the code defect.
Or both.
I will try to assist / expedite this.
~Steve
-
Thanks folks for the details.
@SteveF-Oracle I can also circulate this with the Dev team from my side. Will copy you.
Thanks
Philippe
-
@Philippe Lions-Oracle and @SteveF-Oracle, I don't know how to reproduce the generation/saving of the wrong JSON in the GUI, mostly because I don't know at what point it went wrong with all the other issues I had yesterday, or how I managed to do it. Since it is a customer's environment, I don't have the permissions to raise an SR for them or share their data (the corrupted JSON).
The GUI behavior once the JSON is wrong can be reproduced (as Christian said).
We can call it a bug, or a missing functionality. I see 2 actions that could help:
- Before writing the JSON to the database (catalog), run it through a validator that enforces the DV workbook JSON rules, rather than just checking that the JSON is valid JSON.
- When rendering something in the GUI, add some safety checks to avoid looping twice over the row set if a duplicate ID exists. I can imagine nothing is validated there now because it should have been validated upfront, when generating/creating the JSON. That's not enough either, because there can be ways to write JSON into the catalog bypassing the GUI. But a quick check like "the GUI shouldn't show more rows than what has been retrieved" sounds like a generic quick safety (a rough sketch of that kind of guard follows below).
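To make that second point concrete, a minimal sketch of the kind of guard meant here (plain Python pseudo-logic, not the actual DV rendering code, which isn't visible; the row and column structures are assumptions): de-duplicate column definitions by ID before building the grid and refuse to emit more rows than the query returned.

```python
def safe_render(rows, column_defs):
    """Sketch of a render-time guard: ignore duplicated column IDs and never
    emit more rows than the query actually returned."""
    seen, unique_columns = set(), []
    for col in column_defs:
        if col["columnID"] in seen:
            # A duplicated ID must not trigger a second pass over the row set.
            continue
        seen.add(col["columnID"])
        unique_columns.append(col)

    rendered = [[row[col["expression"]] for col in unique_columns] for row in rows]
    assert len(rendered) <= len(rows), "GUI would show more rows than were retrieved"
    return rendered
```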
I would really expect a stronger check of the validity of the JSON when saving it. It is not an operation you perform after every click in DV; you save only once you are done with what you are trying to achieve (or at least a consistent part of it). Taking 0.1 second longer to validate the content of the JSON and not just the format sounds legit to me.
For the rendering of the GUI objects, the scary part is that the logs can lead you to believe it's all good: the LSQL is fine, the SQL is fine, so the conclusion is that you can fully trust what you see on screen. Also, I don't believe there is a log anywhere that could let you see that the records are being duplicated "visually". And because the GUI is all compressed JavaScript files, I'm not going to look for which piece of it has the issue...
-
@Gianni Ceresa - please open an SR so we can do a deeper dive into the details and start an investigation. We appreciate the feedback here and would like to get to the bottom of it.