Don't go there: it's not how you design an efficient system on a relational DB.
Even the article you quote says
"A major downside of EAV is its lower efficiency when retrieving data in bulk in comparison to conventional structure. In EAV model the entity data is more fragmented and so selecting an entire entity record requires multiple table joins. What is more, when working with large volumes of EAV data, it can be necessary to transiently or permanently convert between columnar and row/EAV -modelled representation of the same set of data. The operation can be error-prone and CPU intensive."
and you won't know whether to cry or laugh.
Now, I'm not an XML expert, but it is possible to store XML data in a column of type XMLTYPE, which has advantages such as optimized handling of the XML and indexing.
Thanks for the response.
Totally agree that complex join queries would need to be written.
Let me give some background details: our system is an event-based messaging system that uses JBPM for orchestrating the events. Oracle is used in the back end.
Each time a message in the workflow enters the compareTrade node, we hit the database and return a 'heavy-weight' ResultSet that is converted to XML DOM objects,
and we do the matching and comparing. The issue we recently faced is that when the volume of trades is high, the JVM heap is not able to withstand the load.
Currently we iterate through the DOM objects and match values at XPath expressions (not ideal).
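To make that concrete, the per-trade matching is roughly like this (a minimal sketch; the XML shape, the XPath expressions, and the `matches` helper are invented for illustration, not our actual code):

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

public class TradeMatcher {

    // Evaluate an XPath expression against one trade's DOM and compare
    // the result to an expected value (hypothetical rule shape).
    static boolean matches(String xml, String xpathExpr, String expected) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        XPath xpath = XPathFactory.newInstance().newXPath();
        String actual = xpath.evaluate(xpathExpr, doc);
        return expected.equals(actual);
    }

    public static void main(String[] args) throws Exception {
        String trade = "<trade><id>T1</id><ccy>USD</ccy><amount>100</amount></trade>";
        System.out.println(matches(trade, "/trade/ccy", "USD"));    // true
        System.out.println(matches(trade, "/trade/amount", "200")); // false
    }
}
```

The heap problem is visible here: every trade is fully materialized as a DOM before any rule is evaluated.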
So the solution we are thinking of is to have an EAV model in the schema and normalize the CLOB into the EAV model,
perform the 'basic' set of matching in Oracle via a query, and return a 'filtered' set of results on which to do a second round of matching.
The second round of matching will be lightweight, since the majority of trades will already have been filtered out.
Also, the filtering done in Oracle will be 'fixed', i.e. clients don't want to change those rules, while the second set of filtering should not be at the database level,
since those matching rules will be configurable at run time and a client may decide to "deploy the matching rules at run-time".
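A sketch of what that second, runtime-configurable round could look like (the `Rule` shape and attribute names are assumptions for illustration; the point is only that rules are data that can be reloaded without touching the database):

```java
import java.util.List;
import java.util.Map;

public class RuleEngine {

    // A runtime-configurable rule: which attribute to check and what
    // value it must have (hypothetical shape).
    record Rule(String attribute, String expected) {}

    // Second-round matching: a trade (as an attribute -> value map)
    // passes if it satisfies every currently deployed rule.
    static boolean passes(Map<String, String> trade, List<Rule> rules) {
        return rules.stream()
                .allMatch(r -> r.expected().equals(trade.get(r.attribute())));
    }

    public static void main(String[] args) {
        // Rules could be (re)loaded at run time, e.g. from a config file or table.
        List<Rule> rules = List.of(new Rule("ccy", "USD"), new Rule("status", "NEW"));
        System.out.println(passes(Map.of("ccy", "USD", "status", "NEW"), rules)); // true
        System.out.println(passes(Map.of("ccy", "EUR", "status", "NEW"), rules)); // false
    }
}
```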
So breaking down the XML into EAV seems to be the better approach.
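To make the EAV idea concrete, here is a minimal in-memory sketch of the (entity, attribute, value) row shape the CLOB would be normalized into, and of a 'basic' fixed filter over those rows (attribute names and the rule are invented for illustration; in the real system this filtering would be a SQL query over the EAV table):

```java
import java.util.List;
import java.util.stream.Collectors;

public class EavFilter {

    // One EAV row: (entityId, attribute, value) -- the shape the trade
    // CLOB would be normalized into.
    record Row(String entityId, String attribute, String value) {}

    // 'Basic' fixed filter: ids of entities whose attribute equals the
    // given value (in SQL this would be a WHERE clause on the EAV table).
    static List<String> entityIdsWhere(List<Row> rows, String attribute, String value) {
        return rows.stream()
                .filter(r -> r.attribute().equals(attribute) && r.value().equals(value))
                .map(Row::entityId)
                .distinct()
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Row> rows = List.of(
                new Row("T1", "ccy", "USD"),
                new Row("T1", "amount", "100"),
                new Row("T2", "ccy", "EUR"),
                new Row("T2", "amount", "100"));
        System.out.println(entityIdsWhere(rows, "ccy", "USD")); // [T1]
    }
}
```

Note that even this trivial "ccy = USD AND amount = 100" style of filter already needs one join (or subquery) per attribute when expressed as SQL over the EAV table, which is exactly the downside the quoted article warns about.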
I must admit that I sent my reply before I had finished reading the entire article.
I am not an Oracle expert, but I am sure a lot must have changed since Oracle 7 in terms of handling large volumes of data in a single table.
Does anything in the docs relating to XMLTYPE help you?
Yes, I read about XMLTYPE. It looks like Oracle has good support for XQuery. That looks promising.
The only catch is that we sell our product as deployable on any SQL database. I will need to check whether I get enough XML support in SQL Server as well.
This "solution" will scale as well as my goat can fly.
You will soon discover that you are inside a deep, dark hole without any ladder to extricate yourself.
> normalize the CLOB into EAV model.
Isn't this a contradiction in terms, a.k.a. an oxymoron? :-)
It's mainly not about the amount of data but about the mess you will sooner or later get in your queries when you use EAV.
I used EAV only once, and only because any other solution would have been too academic for simple, seldom-used, low-volume data where I knew the data would be accessed in only a single, previously known way.
XML in an XMLTYPE column can be searched/filtered quite easily, though I don't know how your requirement that matching rules be configurable at run time fits in.
AFAIK they have something similar in SQL Server.
It's not very clear (at least to me) whether or not the EAV approach might be appropriate in your case.
Just take a look at
As our Italian neighbours say: Uomo avvisato, mezzo salvato ("forewarned is forearmed"). http://translate.google.com/#it/en/Uomo%20avvisato%2C%20mezzo%20salvato