
ORDS, SODA & JSON in the Database

POST (insert SQL) returns a 200 response but all values in the DB appear to be null

3505905 · Dec 4 2019, edited Dec 5 2019

Hi expert,

I am working with ORDS on Autonomous Data Warehouse, and the GET and DELETE methods are working as expected.

A POST (insert SQL) with a JSON body also goes through successfully with a 200 response, as shown below:

[screenshot: POST.PNG]

However, all values in the DB appear to be null.

[screenshot: GET.PNG]

The following is my SQL for the ORDS handler:

insert into ADB_DEMO.LOANS values (
  :Loan_ID,
  :State_Code,
  :Loan_Amount,
  :Term,
  :Interest_Rate,
  :Grade,
  :Sub_Grade,
  :Score_FICO,
  :Employment_Length,
  :Home_Ownership,
  :Annual_Income,
  :Is_Income_Verified,
  :Issue_Date,
  :Loan_Status,
  :Purpose,
  :Product,
  :Debt_To_Income,
  :Last_24_months,
  :Inquiries_6_months,
  :Current_Credit_Lines,
  :Total_Credit_Lines,
  :Total_Payments,
  :Total_Principal,
  :Last_Payment_Date,
  :Last_Payment_Amount
)
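
For context, a POST handler like this is usually registered through the ORDS PL/SQL API roughly as sketched below. The module name, base path, and URI template are placeholders (not taken from the original service), the insert is wrapped in an anonymous block for a PL/SQL source type, and the schema is assumed to be REST-enabled already; this is a sketch of the setup, not a fix for the null values.

BEGIN
  -- assumes ORDS.enable_schema has already been run for this schema
  ORDS.define_module(
    p_module_name => 'loans.demo',        -- hypothetical module name
    p_base_path   => '/loans/');          -- hypothetical base path
  ORDS.define_template(
    p_module_name => 'loans.demo',
    p_pattern     => 'rows/');            -- hypothetical URI template
  ORDS.define_handler(
    p_module_name   => 'loans.demo',
    p_pattern       => 'rows/',
    p_method        => 'POST',
    p_source_type   => ORDS.source_type_plsql,
    p_mimes_allowed => 'application/json',
    p_source        => q'[
      BEGIN
        -- ORDS is expected to populate these binds from the JSON payload keys
        INSERT INTO adb_demo.loans VALUES (
          :Loan_ID, :State_Code, :Loan_Amount, :Term, :Interest_Rate,
          :Grade, :Sub_Grade, :Score_FICO, :Employment_Length, :Home_Ownership,
          :Annual_Income, :Is_Income_Verified, :Issue_Date, :Loan_Status,
          :Purpose, :Product, :Debt_To_Income, :Last_24_months,
          :Inquiries_6_months, :Current_Credit_Lines, :Total_Credit_Lines,
          :Total_Payments, :Total_Principal, :Last_Payment_Date,
          :Last_Payment_Amount);
      END;]');
  COMMIT;
END;
/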

The following is my JSON request body:

{
  "loan_id": "123456",
  "state_code": "AZ",
  "loan_amount": "1234",
  "term": "36",
  "interest_rate": "0.11",
  "grade": "C",
  "sub_grade": "B0",
  "score_fico": "123",
  "employment_length": "10+ years",
  "home_ownership": "Rent",
  "annual_income": "24000",
  "is_income_verified": "Verified",
  "issue_date": "2011-12-16T00:00:00Z",
  "loan_status": "Current",
  "purpose": "Credit Card",
  "product": "Credit card",
  "debt_to_income": "0.28",
  "last_24_months": "0",
  "inquiries_6_months": "1",
  "current_credit_lines": "3",
  "total_credit_lines": "9",
  "total_payments": "1234.71",
  "total_principal": "1234.18",
  "last_payment_date": "2014-10-01T00:00:00Z",
  "last_payment_amount": "123.87"
}
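
A quick way to check whether ORDS maps these JSON keys onto the handler's bind variables at all is to point the same POST route at a minimal PL/SQL handler that only echoes the received values. This is a sketch, not from the original post: it assumes a PL/SQL source type, where htp output becomes the response body, and uses the :status_code implicit parameter available in recent ORDS releases.

BEGIN
  -- Echo a few of the incoming binds back to the caller for debugging.
  -- Empty values here would suggest ORDS did not map the JSON keys to these binds.
  htp.print('loan_id='     || :Loan_ID);
  htp.print('state_code='  || :State_Code);
  htp.print('loan_amount=' || :Loan_Amount);
  htp.print('issue_date='  || :Issue_Date);
  :status_code := 200;  -- ORDS implicit parameter for the HTTP response status
END;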

DESC of my table looks like this:

Name                 Null? Type
-------------------- ----- ------------
LOAN_ID                    NUMBER(10)
STATE_CODE                 VARCHAR2(8)
LOAN_AMOUNT                NUMBER(10)
TERM                       NUMBER(4)
INTEREST_RATE              NUMBER(4,2)
GRADE                      VARCHAR2(4)
SUB_GRADE                  VARCHAR2(4)
SCORE_FICO                 NUMBER(4)
EMPLOYMENT_LENGTH          VARCHAR2(10)
HOME_OWNERSHIP             VARCHAR2(10)
ANNUAL_INCOME              NUMBER(10)
IS_INCOME_VERIFIED         VARCHAR2(20)
ISSUE_DATE                 DATE
LOAN_STATUS                VARCHAR2(20)
PURPOSE                    VARCHAR2(40)
PRODUCT                    VARCHAR2(20)
DEBT_TO_INCOME             NUMBER(4,2)
LAST_24_MONTHS             NUMBER(4)
INQUIRIES_6_MONTHS         NUMBER(4)
CURRENT_CREDIT_LINES       NUMBER(4)
TOTAL_CREDIT_LINES         NUMBER(10,2)
TOTAL_PAYMENTS             NUMBER(10,2)
TOTAL_PRINCIPAL            NUMBER(10,2)
LAST_PAYMENT_DATE          DATE
LAST_PAYMENT_AMOUNT        NUMBER(10,2)
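
For anyone trying to reproduce this, the DESC output above corresponds to a table definition roughly like the following (a reconstruction from the description, not the original DDL):

CREATE TABLE adb_demo.loans (
  loan_id              NUMBER(10),
  state_code           VARCHAR2(8),
  loan_amount          NUMBER(10),
  term                 NUMBER(4),
  interest_rate        NUMBER(4,2),
  grade                VARCHAR2(4),
  sub_grade            VARCHAR2(4),
  score_fico           NUMBER(4),
  employment_length    VARCHAR2(10),
  home_ownership       VARCHAR2(10),
  annual_income        NUMBER(10),
  is_income_verified   VARCHAR2(20),
  issue_date           DATE,
  loan_status          VARCHAR2(20),
  purpose              VARCHAR2(40),
  product              VARCHAR2(20),
  debt_to_income       NUMBER(4,2),
  last_24_months       NUMBER(4),
  inquiries_6_months   NUMBER(4),
  current_credit_lines NUMBER(4),
  total_credit_lines   NUMBER(10,2),
  total_payments       NUMBER(10,2),
  total_principal      NUMBER(10,2),
  last_payment_date    DATE,
  last_payment_amount  NUMBER(10,2)
);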

Can you please shed some light on what is missing?

Thanks for your time in advance.

Best Regards,

Jhan

This post has been answered by 3505905 on Dec 5 2019

Comments

715399
Answer
Hi,

It looks like incremental inference is indeed being used. Note that there is some overhead involved with doing incremental and non-incremental inference in general, so even with small models it might take a few seconds to finish. Some of the overhead is caused by the large number of rules in the OWLPRIME rulebase, so if you don't need all of them you can selectively disable some components using the GraphOracleSem.performInference(String components) procedure.

Also, it depends on your dataset. For instance, if you're adding only one triple, but that triple declares some heavily used property to be transitive, then that addition might trigger many additional inferences and updating the inferred graph will take more time.

Regarding OntModel APIs and incremental inference, it depends on the loading method you use. Incremental inference works best with incremental loading. Please refer to Section 2.2.9 of the Semantic Technologies Developer's Guide for more details.

Cheers,
Vladimir
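
For anyone doing this from SQL rather than the Jena adapter, restricting inference to selected components can also be requested through SEM_APIS.CREATE_ENTAILMENT. The sketch below uses hypothetical model and entailment names and is not the incremental performInference call discussed above:

BEGIN
  -- Build an entailment over a semantic model, restricting OWLPRIME
  -- inference to the SCOH, SPOH and TRANS components only.
  -- 'family' and 'family_inf' are hypothetical model/entailment names.
  SEM_APIS.create_entailment(
    'family_inf',
    SEM_MODELS('family'),
    SEM_RULEBASES('OWLPRIME'),
    SEM_APIS.REACH_CLOSURE,
    'SCOH,SPOH,TRANS');
END;
/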
Marked as Answer by 696067 · Sep 27 2020
696067
That is very strange. I am basically inserting 'd' into a 5-level-deep tree of Transitive links. The tree has around 5300 nodes but 5000 of them are leaves.
It takes 5 seconds for the inference to be done whether I insert one leaf like 'd' or 100 of them.
By running performInference with only SCOH, SPOH, and TRANS it goes down to 3 seconds (removing TRANS as well brings it down to 2 seconds, but I absolutely need TRANS).

How come it takes the same 3-5 seconds to infer 5 transitive links and 500 of them?
715399
Hi,

Can you let us know:
a) how large is the asserted dataset and how many triples are generated with inference (from scratch) ?
b) what is the performance target for the incremental inference calls?

Regarding b), if your inference performance target is on the order of milliseconds, you might want to try out PelletDB [1].

Cheers,
Vlad

[1] http://clarkparsia.com/pelletdb/
696067
Thanks Vlad, it looks like in-memory forward-chaining is more of what I am looking for in this regard.
As to your questions:
1) I basically have a 5-level-deep tree where every node is transitively linked to the root, with 5000 leaves and 200, 100, 30, and 1 nodes at the levels above, which to my understanding is 15,530 transitive links.
2) My performance target is indeed in the milliseconds but that's when it comes to adding 1-50 leaves.

Thank you for your answers.
Alexi
