
incremental inferencing questions

696067 · Nov 13 2009 — edited Nov 18 2009
I am very confused about how to make incremental inferencing work. Here is what I tried in java code:

Attachment attachment = Attachment.createInstance(new String[]{}, "OWLPRIME",
InferenceMaintenanceMode.UPDATE_WHEN_COMMIT,
QueryOptions.ALLOW_QUERY_INCOMPLETE);

graph = new GraphOracleSem(oracle, "mc_model", attachment);
graph.setInferenceOption("INC=T");

The trouble is, when I try to do something like this:

ModelOracleSem oracle_sem_model = new ModelOracleSem(graph);
OntModel model = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM, oracle_sem_model);
Individual d = model.createIndividual(NS + "New_Individual", OWL.Thing);
parentIndividual.addProperty(transitiveProp, d);
model.commit();
graph.commit();

The trouble is that graph.commit() takes around 5 seconds, the same amount of time as when the entailments are created without incremental inferencing. Is there something I am doing wrong? Can I use the OntModel APIs together with incremental inferencing, or do I have to add the statements to the model manually? BTW, I checked the SEM_APIS change-tracking property and it appears to be enabled and has a timestamp.

Edited by: alexi on Nov 13, 2009 7:18 AM
This post has been answered by 715399 on Nov 13 2009

Comments

715399
Answer
Hi,

It looks like incremental inference is indeed being used. Note that there is some fixed overhead with both incremental and non-incremental inference, so even for small models a call can take a few seconds to finish. Part of that overhead comes from the large number of rules in the OWLPRIME rulebase; if you don't need all of them, you can run only the components you need by passing them to the GraphOracleSem.performInference(String components) method.
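
For example, something along these lines (a sketch only, reusing the graph from your snippet; I'm assuming the component list is comma-separated, as with SEM_APIS.CREATE_ENTAILMENT):

// Keep incremental maintenance on, then build the entailment using only the
// components this ontology actually needs. Component names such as SCOH, SPOH
// and TRANS come from the OWLPRIME rulebase.
graph.setInferenceOption("INC=T");
graph.performInference("SCOH,SPOH,TRANS");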

Performance also depends on your dataset. For instance, if you are adding only one triple, but that triple declares some heavily used property to be transitive, then that addition can trigger many additional inferences and updating the inferred graph will take correspondingly more time.

Regarding the OntModel APIs and incremental inference, it depends on the loading method you use. Incremental inference works best with incremental loading. Please refer to Section 2.2.9 of the Semantic Technologies Developer's Guide for more details.
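
To give a concrete (untested) sketch of what incremental loading through the Graph API could look like, again reusing graph and NS from your code, with made-up node names:

// With InferenceMaintenanceMode.UPDATE_WHEN_COMMIT set on the attachment, adding the
// new triples directly to the graph and committing lets the database maintain the
// entailment for just the delta. Node and Triple are from the Jena graph package.
Node parent = Node.createURI(NS + "Existing_Parent");   // assumed to already be in the model
Node prop   = Node.createURI(NS + "transitiveProp");    // your transitive property
Node child  = Node.createURI(NS + "New_Individual");
graph.add(Triple.create(parent, prop, child));
graph.commit();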

Cheers,
Vladimir
Marked as Answer by 696067 · Sep 27 2020
696067
That is very strange. I am basically inserting 'd' into a 5-level-deep tree of transitive links. The tree has around 5300 nodes, but 5000 of them are leaves.
It takes 5 seconds for the inference to finish whether I insert one leaf like 'd' or 100 of them.
By doing a performInference with only SCOH, SPOH, and TRANS it goes down to 3 seconds (removing TRANS as well brings it down to 2 seconds, but I absolutely need TRANS).

How come it takes the same 3-5 seconds whether it has to infer 5 transitive links or 500?
715399
Hi,

Can you let us know:
a) how large is the asserted dataset, and how many triples are generated by inference from scratch?
b) what is the performance target for the incremental inference calls?

Regarding b), if your inference performance target is on the order of milliseconds, you might want to try out PelletDB [1].

Cheers,
Vlad

[1] http://clarkparsia.com/pelletdb/
696067
Thanks Vlad, it looks like in-memory forward-chaining is more of what I am looking for in this regard (see the sketch at the end of this post).
As to your questions:
1) I basically have a 5-level-deep tree where every node is linked transitively to the root, with 5000 leaves and 200, 100, 30, and 1 nodes at the levels above, which to my understanding comes to 15530 transitive links.
2) My performance target is indeed in the milliseconds, but that is for adding 1-50 leaves at a time.

Thank you for your answers.
Alexi
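
P.S. Here is a rough sketch of the kind of in-memory forward-chaining I have in mind, using plain Jena's micro OWL rule reasoner (the property and individual names are just placeholders):

// In-memory only: OWL_MEM_MICRO_RULE_INF forward-chains owl:TransitiveProperty,
// so adding a leaf and reading the inferred links never touches the database.
// The classes come from the com.hp.hpl.jena.ontology and .vocabulary packages.
OntModel mem = ModelFactory.createOntologyModel(OntModelSpec.OWL_MEM_MICRO_RULE_INF);
TransitiveProperty linksTo = mem.createTransitiveProperty(NS + "linksTo");
Individual root = mem.createIndividual(NS + "Root", OWL.Thing);
Individual mid  = mem.createIndividual(NS + "Mid", OWL.Thing);
Individual leaf = mem.createIndividual(NS + "New_Leaf", OWL.Thing);
root.addProperty(linksTo, mid);
mid.addProperty(linksTo, leaf);
// mem.contains(root, linksTo, leaf) now returns true via the transitivity rule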

Post Details

Locked on Dec 16 2009
Added on Nov 13 2009
4 comments
2,555 views