Using Custom Machine Learning Model fails in DV with ORA-06512: at "SYS.XMLTYPE"

OAS 2024. OML installed on Oracle Database 19c. Developed an SVM model using the OML4Py client on Red Hat 7. We are able to use the model to predict values. Exported the model to SQL, then registered it in DV.
Created a Data Flow to apply the model and save the predicted values as a new dataset. The Data Flow fails to complete with an error message stating:
[nQSError: 17001] Oracle Error code: 6502, message: ORA-06502: PL/SQL: numeric or value error: character string buffer too small ORA-06512: at "SYS.XMLTYPE", line 169 at OCI call OCIStmtFetch.
The dataset that the model is being applied to has the following characteristics:
PE - VARCHAR2(10) -- max(length()): 10
BLI - VARCHAR2(50) -- max(length()): 27
BLI TITLE - VARCHAR2(200) -- max(length()): 62
LFD_PROGRAM - VARCHAR2(255) -- max(length()): 0
PGM_ID - VARCHAR2(4) -- max(length()): 4
PGM_ID_TITLE - VARCHAR2(80) -- max(length()): 80
PGM_CATEGORY_1 - VARCHAR2(100) -- max(length()): 25
PGM_CATEGORY_2 - VARCHAR2(100) -- max(length()): 41
PGM_CATEGORY_3 - VARCHAR2(100) -- max(length()): 54
PGM_DEFINITION - VARCHAR2(4000) -- max(length()): 808
PE_LONG_TITLE - VARCHAR2(200) -- max(length()): 105
PE_DEFINITION - VARCHAR2(4000) -- max(length()): 3662
LEAD_JCA - VARCHAR2(4000) -- max(length()): 29
DMC_TITLE - VARCHAR2(200) -- max(length()): 55
FIC_TITLE - VARCHAR2(85) -- max(length()): 66
As noted above, none of the rows contain data larger than the column size.
Full stack trace:
Caused By: oracle.bi.tech.bips.fwk.exception.BIPSSoapException: Code: OY6IVZ6M. Error occurred while processing the query.
Code: OPR4ONWY. Error during query processing (SQLExecDirectW).
Code: U9IM8TAC. BISCode: 10058. State: HY000. Code: 10058. [NQODBC][SQL_STATE: HY000] [nQSError: 10058] A general error has occurred. (HY000)
Code: U9IM8TAC. BISCode: 43275. State: HY000. Code: 43275. [nQSError: 43275] Message returned from OBIS [ecid: 006BwCNKYeQ5yWmY8pI7VT0000а3000I91,0:1:1:9:1:6 ts: 2025-03-07T11:27:03.454-05:00]. (HY000)
Code: U9IMTAC. BISCode: 17001. State: HY000. Code: 17001. [nQSError: 17001] Oracle Error code: 6502, message: ORA-06502: PL/SQL: numeric or value error: character string buffer too small ORA-06512: at "SYS.XMLTYPE", line 169 at OCI call OCIStmtFetch. (HY000)
Code: U9IMTAC. BISCode: 17012. State: HY000. Code: 17012. [nQSError: 17012] Bulk fetch failed. (HY000)
Code: 0I2DL65P. SQL Issued:
SET VARIABLE OBIS_IS_DATAFLOW_QUERY=1, OBIS_DATAFLOW_PREVIEW_COMPLETE=1, DISABLE_PLAN_CACHE_HIT=1;
select
  "Prediction" as "Prediction", "PredictionProbability" as "predictionProbability",
  "PGM_ID" as "PGM_ID", "PGM_DEFINITION" as "PGM_DEFINITION", "PE_DEFINITION" as "PE_DEFINITION",
  "BLI_TITLE" as "BLI_TITLE", "FIC_TITLE" as "FIC_TITLE", "PGM_CATEGORY_2" as "PGM_CATEGORY_2",
  "PE_LONG_TITLE" as "PE_LONG_TITLE", "DMC_TITLE" as "DMC_TITLE", "LEAD_JCA" as "LEAD_JCA",
  "PGM_CATEGORY_1" as "PGM_CATEGORY_1", "PGM_ID_TITLE" as "PGM_ID_TITLE", "BLI" as "BLI",
  "PGM_CATEGORY_3" as "PGM_CATEGORY_3", "PE" as "PE", "LFD_PROGRAM" as "LFD_PROGRAM"
from EVALUATE_TABLE_SCRIPT(
  user.name@mail.com.<view_name>,
  'is_splitted_details=true;is_for_set=true;cost_model_auto=true;includeInputColumns=TRUE;extendedDataset=NONE;targetForLift=;__isApplyModel=true;outputColumnsSize=2;outputColumnKey0=Prediction;outputColumnValue0=Prediction;outputColumnKey1=PredictionProbability;outputColumnValue1=PredictionProbability;',
  select
    "PGM_ID" as "PGM_ID", "PGM_DEFINITION" as "PGM_DEFINITION", "PE_DEFINITION" as "PE_DEFINITION",
    "BLI TITLE" as "BLI_TITLE", "FIC_TITLE" as "FIC_TITLE", "PGM_CATEGORY_2" as "PGM_CATEGORY_2",
    "PE_LONG_TITLE" as "PE_LONG_TITLE", "DMC_TITLE" as "DMC_TITLE", "LEAD_JCA" as "LEAD_JCA",
    "PGM_CATEGORY_1" as "PGM_CATEGORY_1", "PGM_ID_TITLE" as "PGM_ID_TITLE", "BLI" as "BLI",
    "PGM_CATEGORY_3" as "PGM_CATEGORY_3", "PE" as "PE", "LFD_PROGRAM" as "LFD_PROGRAM"
  from (
    select
      "PE" as "PE", "BLI" as "BLI", "BLI TITLE" as "BLI_TITLE", "LFD_PROGRAM" as "LFD_PROGRAM",
      "PGM_ID" as "PGM_ID", "PGM_ID_TITLE" as "PGM_ID_TITLE",
      "PGM_CATEGORY_1" as "PGM_CATEGORY_1", "PGM_CATEGORY_2" as "PGM_CATEGORY_2",
      "PGM_CATEGORY_3" as "PGM_CATEGORY_3", "PGM_DEFINITION" as "PGM_DEFINITION",
      "PE_LONG_TITLE" as "PE_LONG_TITLE", "PE_DEFINITION" as "PE_DEFINITION",
      "LEAD_JCA" as "LEAD_JCA", "DMC_TITLE" as "DMC_TITLE", "FIC_TITLE" as "FIC_TITLE"
    from (
      select
        "V_LFD_PROGRAM_XT_REF"."PE" as "PE", "V_LFD_PROGRAM_XT_REF"."BLI" as "BLI",
        "V_LFD_PROGRAM_XT_REF"."BLI TITLE" as "BLI_TITLE",
        "V_LFD_PROGRAM_XT_REF"."LFD_PROGRAM" as "LFD_PROGRAM",
        "V_LFD_PROGRAM_XT_REF"."PGM_ID" as "PGM_ID",
        "V_LFD_PROGRAM_XT_REF"."PGM_ID_TITLE" as "PGM_ID_TITLE",
        "V_LFD_PROGRAM_XT_REF"."PGM_CATEGORY_1" as "PGM_CATEGORY_1",
        "V_LFD_PROGRAM_XT_REF"."PGM_CATEGORY_2" as "PGM_CATEGORY_2",
        "V_LFD_PROGRAM_XT_REF"."PGM_CATEGORY_3" as "PGM_CATEGORY_3",
        "V_LFD_PROGRAM_XT_REF"."PGM_DEFINITION" as "PGM_DEFINITION",
        "V_LFD_PROGRAM_XT_REF"."PE_LONG_TITLE" as "PE_LONG_TITLE",
        "V_LFD_PROGRAM_XT_REF"."PE_DEFINITION" as "PE_DEFINITION",
        "V_LFD_PROGRAM_XT_REF"."LEAD_JCA" as "LEAD_JCA",
        "V_LFD_PROGRAM_XT_REF"."DMC_TITLE" as "DMC_TITLE",
        "V_LFD_PROGRAM_XT_REF"."FIC_TITLE" as "FIC_TITLE"
      from XSA(user.name@mail.com.<view_name>)
    ) InputDataset_0
  ) ApplyModel_0_Internal
) FETCH FIRST 30 ROWS ONLY
The January 2025 CPU patch has been applied.
Answers
Thanks @User_27SWQ for the question. Can you confirm you are able to run the model (score) using a SQL statement directly in the database?
Thanks
Yes, the model functions properly in Python and SQL.
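For reference, a direct in-database scoring call can be sketched as below. This is only a sketch: SVM_MODEL is a placeholder for the actual registered model name (not given in the thread), and it assumes the model and the V_LFD_PROGRAM_XT_REF view from the stack trace are visible to the connected schema.

```sql
-- Sketch: score rows in-database with Oracle's PREDICTION /
-- PREDICTION_PROBABILITY SQL functions. SVM_MODEL is a placeholder name.
SELECT PREDICTION(SVM_MODEL USING *)             AS prediction,
       PREDICTION_PROBABILITY(SVM_MODEL USING *) AS prediction_probability
FROM   v_lfd_program_xt_ref
FETCH FIRST 10 ROWS ONLY;
```

If this runs cleanly but the Data Flow still fails, the problem is more likely in the extra output OAS requests around the prediction than in the model itself.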
Thanks @User_27SWQ, let us check internally and get back to you asap. I suspect the VARCHAR2(4000) fields may be a reason for the issue, but I need to check. I assume this size is required by your data, right?
It's hard to say what's causing the issue without looking at the log. When the dataflow is executed, the OAS server issues a long SQL statement to get all the prediction-related output, which includes the actual prediction but also several other prediction details. These prediction details are stored as XML values in the database and are parsed by the OAS server for better readability, and there may be a problem with this part of the SQL.
One option is to look at the obis1-query log, get the actual SQL being executed, and try it manually on the database to see where it fails.
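Since the error points at SYS.XMLTYPE, one hedged way to reproduce the XML part outside OAS is to query the prediction details directly; PREDICTION_DETAILS returns an XMLType value per row. Again, SVM_MODEL is a placeholder model name and the view name is taken from the stack trace.

```sql
-- Sketch: PREDICTION_DETAILS returns per-row details as XMLType. If the
-- serialized XML overflows a VARCHAR2 buffer somewhere downstream, this
-- query may reproduce the same ORA-06502. SVM_MODEL is a placeholder.
SELECT PREDICTION_DETAILS(SVM_MODEL USING *) AS details_xml
FROM   v_lfd_program_xt_ref
FETCH FIRST 5 ROWS ONLY;
```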
VARCHAR2(4000) is needed; there are a few records with more than 3,600 characters.
We've tried stripping off all non-printable characters, including carriage returns and tabs, using:
update tablex
set pe_definition = regexp_replace(pe_definition, '[^'||chr(1)||'-'||chr(127)||']')
where regexp_like(pe_definition, '[^'||chr(1)||'-'||chr(127)||']');
Then I discovered that this left chr(13), chr(10), and chr(9) behind, since those fall inside the chr(1)-chr(127) range the pattern keeps.
So I excluded those as well and tried again, but it did not make a difference; I still get the error shown below.
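A second pass that also removes tab, LF, and CR could be sketched by narrowing the kept range to printable ASCII; this is one way to express the exclusion described above, not necessarily the exact statement that was run.

```sql
-- Sketch: keep only printable ASCII. chr(9)/chr(10)/chr(13) fall outside
-- the chr(32)-chr(126) range and are therefore removed, unlike with the
-- original chr(1)-chr(127) range.
update tablex
set pe_definition = regexp_replace(pe_definition, '[^'||chr(32)||'-'||chr(126)||']')
where regexp_like(pe_definition, '[^'||chr(32)||'-'||chr(126)||']');
```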
obis1-query.log says the following:
[2025—date time--] [OBIS] [TRACE:2] [USER-114] [] [ecid: — rando uuid —] [sik: ssi] [tid: —somevalue—] [messageId: USER-114] [requestid: —session value—] [sessionid: —session value—] [username: —me—] ——— Asynchronous job status check info: Job id =—other rando uuid —, status: Failed, extendedMessage: [nQSError: 17011] SQL statement execution failed. [[[nQSError: 17001] Oracle Error code: 6502, message: ORA-06502: PL/SQL: numeric or value error: character string buffer too small
ORA-06512: at "SYS.XMLTYPE", line 169
at OCI call OCIStmtExecute.
[nQSError: 43224] The Dataflow "—my dataflow name—" failed during the execution.
[nQSError: 43204] Asynchronous Job Manager failed to execute the asynchronous job.
]]
That is all the detail there is for this event.
I didn't fully read the thread, so I may be totally out of scope…
The Oracle Database error you get is generally raised when your actual value is larger than the column definition allows.
If you defined a VARCHAR2(4000) but have a value that is more than 4000, you will get that error.
I see two possible solutions: make your value smaller, perhaps wrapping it in a substring so that only the first 4000 characters are taken when it's larger, or make the definition larger. In the Oracle database, if you enable extended data types, your VARCHAR2 can go up to 32,767 bytes (there are pros and cons to enabling extended data types; talk with your DBA if needed), which should give you more than enough space.
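Enabling extended data types looks roughly like the sequence below. This is a sketch only: the change is one-way, must be done with the database (or PDB) in UPGRADE mode, and the exact steps should be confirmed against the documentation for your version with your DBA.

```sql
-- Sketch: MAX_STRING_SIZE=EXTENDED raises the VARCHAR2 limit to 32767 bytes.
-- The change is irreversible and requires UPGRADE mode, roughly:
--   SHUTDOWN IMMEDIATE;
--   STARTUP UPGRADE;
ALTER SYSTEM SET max_string_size = EXTENDED SCOPE = SPFILE;
-- then run $ORACLE_HOME/rdbms/admin/utl32k.sql and restart normally
```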
An alternative would be to change the data type and switch to a CLOB or something like that.
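A VARCHAR2 column cannot be altered to CLOB in place, so the switch is usually done through a temporary column. A sketch against the poster's tablex/pe_definition (table and column names taken from the thread; test on a copy first):

```sql
-- Sketch: migrate PE_DEFINITION from VARCHAR2(4000) to CLOB via a copy column.
ALTER TABLE tablex ADD (pe_definition_clob CLOB);
UPDATE tablex SET pe_definition_clob = pe_definition;
ALTER TABLE tablex DROP COLUMN pe_definition;
ALTER TABLE tablex RENAME COLUMN pe_definition_clob TO pe_definition;
```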