Kit.net wrote: Can you first share your experience on the subject? Right now, I don't see any facts.
but we're not convinced that Oracle provides a robust engine for this kind of application
Are you having difficulties with building the queries?
Are you looking for performance, maintainability, both?
Can anyone offer examples of using large SELECT XmlElement/XmlAgg/XmlForest/XmlConcat queries in use?
Why do you need specific examples?
SQL/XML functions can be used to build any structure you require, with no additional layer to manipulate the data.
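For example, a master-detail document can be produced in a single statement. Here's a sketch against the classic DEPT/EMP demo schema (element names are just illustrative):

```sql
-- One <dept> element per department, with its employees
-- aggregated as repeating <emp> child elements.
SELECT XMLElement("dept",
         XMLAttributes(d.deptno AS "id"),
         XMLForest(d.dname AS "name", d.loc AS "location"),
         XMLElement("employees",
           XMLAgg(
             XMLElement("emp",
               XMLForest(e.ename AS "name", e.sal AS "salary"))
             ORDER BY e.ename)))
FROM dept d
JOIN emp e ON e.deptno = d.deptno
GROUP BY d.deptno, d.dname, d.loc;
```

No DOM calls, no string concatenation; the nesting of the functions mirrors the nesting of the document.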
Would you prefer using a tedious DOM approach, building the document element by element, row by row?
XSLT may be another option. SQL/XML functions would be only used to generate a "raw" XML doc with the necessary data, then the stylesheet would take care of building the final document.
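As a sketch of that two-step approach (the table, the stylesheet storage, and all names below are invented for illustration):

```sql
-- Step 1: build a "raw" XML doc with SQL/XML.
-- Step 2: apply an XSLT stylesheet with XMLTransform.
-- my_stylesheet is assumed to hold the XSLT source in a CLOB column.
SELECT XMLTransform(
         XMLElement("rows",
           XMLAgg(XMLElement("row", XMLForest(t.col1, t.col2)))),
         XMLType(s.xsl_source))
FROM my_table t
CROSS JOIN my_stylesheet s;
```

The SQL stays trivial that way, and all the presentation logic lives in the stylesheet.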
Again, share your use case.
We have a series of web services that return XML documents from data stored in Oracle. I implemented one with a big SELECT XmlElement(...), but we started seeing core dumps, so we have an SR (service request) open with support.oracle.com. I found SQL/XML the most elegant (fewest lines of code, easy to read and maintain) and efficient (100x faster) way to develop, as you point out, but the response we're getting from Oracle is underwhelming.
I'm trying to decide whether it's worth pursuing, or whether to give up and generate the XML using client code. If there are other industrial-strength applications using SQL/XML, that would be worth knowing at this point.
I am using SQL/XML to generate XML and I find the code easy to create and maintain. I have tried client tools such as Altova, but they were very slow when creating XML for a large dataset. I am creating the XML in a CLOB and then using DBMS_XSLPROCESSOR to create the OS file. In terms of how big the OS file can be, I am finding that the limit is somewhere around 700 MB. I have also been working with Oracle on an SR for over 4 months concerning a memory error when the XML gets too big; Oracle created Bug 16219864 for it. For my latest issue see "xmlagg is executing slow on unicode database" in Oracle Discussion Forums / XML / General XML.
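For reference, the CLOB-to-file step I'm describing is just a call like the following (the directory object, table, and file names are examples):

```sql
-- Assumes an Oracle directory object XML_DIR pointing at the target folder.
DECLARE
  l_xml CLOB;
BEGIN
  SELECT XMLAgg(XMLElement("row", XMLForest(t.col1, t.col2))).getClobVal()
    INTO l_xml
    FROM my_table t;

  -- Write the CLOB out as an OS file.
  DBMS_XSLPROCESSOR.clob2file(l_xml, 'XML_DIR', 'output.xml');
END;
/
```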
The maximum possible size of data was 2,476,831 bytes (selecting 25 records at a time); but the core dumps recurred even after stepping down to a single record.
Once the database is in a state where it will core dump, it seems to core dump repeatedly on certain sets of data, but this is mysterious to me, because it runs through all the data multiple times without a problem before it gets into this state.
The maximum possible size of data was 2,476,831 bytes (selecting 25 records at a time); but the core dumps recurred even after stepping down to a single record.
This is strange indeed. You're far from the limit I observed; 2 MB is still a small amount of data.
I'm curious to know what Oracle Support has to say.
Another issue I used to have (one that also produced a core dump) occurred when using an ORDER BY clause in a SELECT that was supposed to return a single row and column (via XMLAgg).
Pretty innocuous at first sight, but that query crashed the instance every time, regardless of the data volume. Removing the unnecessary ORDER BY clause solved the problem. The version was 10.2.0.5 back then.
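For the record, when an ordered aggregate is actually needed, the ORDER BY belongs inside XMLAgg itself rather than on the outer single-row query; something like this (table and column names invented):

```sql
-- Ordering is pushed into the aggregate, so the outer SELECT
-- returns a single row with no ORDER BY clause of its own.
SELECT XMLAgg(XMLElement("item", t.name) ORDER BY t.name)
FROM my_table t;
```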
What version of the database are you on? I am on 188.8.131.52.
I too have XML queries with a large number of XMLAgg calls (the most is 30). My problem is not size (as long as I keep the OS file below 700 MB); it is speed on a Unicode database. I have also used views to limit the number of joins and the size of the WHERE clause, which makes the XML query much easier to read.
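As an illustration of the view approach (all names invented): the joins live in the view once, and the XML query reads almost like the document it produces.

```sql
-- Pre-join once in a view so the SQL/XML query stays readable.
CREATE OR REPLACE VIEW order_detail_v AS
SELECT o.order_id, c.cust_name, l.product, l.qty
FROM orders o
JOIN customers c ON c.cust_id = o.cust_id
JOIN order_lines l ON l.order_id = o.order_id;

-- One <order> element per order, lines aggregated beneath it.
SELECT XMLElement("order",
         XMLForest(v.order_id, v.cust_name),
         XMLAgg(XMLElement("line", XMLForest(v.product, v.qty))))
FROM order_detail_v v
GROUP BY v.order_id, v.cust_name;
```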