OAS Connection Pool Databricks [nQSError: 16003] Column description failed

Hi, I created a connection pool to Databricks and can successfully import physical tables (import metadata into the physical layer) in the RPD. However, when trying to update row counts or view data for any table, I encounter "[nQSError: 16003] Column description failed.". Can you please suggest what the issue or resolution could be?
Best Answer
-
The latest client tools are updated with the quarterly patch releases.
Critical Patch Update (CPU) Advisor For Oracle Analytics Server and Oracle Business Intelligence - Updated for January 2025 (Doc ID 2832967.2)
> "Analytics Server (OAS)" tab
> "Analytics Server 2024" subtab
> see the comments. Click the "bug" icon to see the OAS bundle patch readme bug-fix list, including how to update the Model Administration client tool in section 3.3.
Section 3.3: Update on Oracle Analytics Developer Client Tool for OAS Bundle Patch 7.6.0.0.0.241220
Oracle recommends using the Oracle Analytics Developer Client Tool installer included in the bundle patch to install the updated Oracle Analytics Developer Client Tool.
Download patch 37419174 from My Oracle Support for the Oracle Analytics Developer Client Tool installer available in the January 2025 bundle patch.
Databricks is not specifically tested or certified, but generally speaking a generic JDBC/ODBC connection can work.
Try using JDBC.
The first step is to create the JDBC connection in WebLogic according to the documentation.
Please be aware that Oracle does not provide the needed JAR file; it must be procured from the vendor.- Install the JAR (link to doc)
- Create JDBC (JNDI) (link to doc)
- Load Data Sources (link to doc)
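For reference, the WebLogic data source will need a Databricks JDBC URL. Below is a minimal sketch in Python of how such a URL is typically assembled, assuming the current Databricks JDBC driver (older Simba-based drivers use a `jdbc:spark://` prefix instead); the host, HTTP path, and token values are placeholders, not real credentials:

```python
def build_databricks_jdbc_url(host: str, http_path: str, token: str) -> str:
    """Assemble a Databricks JDBC URL (current Databricks driver style).

    host      -- workspace hostname (placeholder)
    http_path -- cluster/warehouse HTTP path from the Databricks UI (placeholder)
    token     -- a Databricks personal access token (AuthMech=3 = token auth)
    """
    return (
        f"jdbc:databricks://{host}:443/default"
        f";transportMode=http;ssl=1"
        f";httpPath={http_path}"
        f";AuthMech=3;UID=token;PWD={token}"
    )

url = build_databricks_jdbc_url(
    "adb-1234567890123456.7.azuredatabricks.net",  # placeholder host
    "/sql/1.0/warehouses/abc123",                  # placeholder HTTP path
    "dapiXXXXXXXX",                                # placeholder token
)
print(url)
```

Check the exact parameter names against the driver documentation shipped with the JAR you obtain from Databricks, as they can vary between driver versions.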
The second step is to configure the RPD Connection Pool.
It is important to note that Databricks is similar to "Apache Spark SQL" by design.
When creating the Database in the Physical layer of the RPD, be sure to select "Apache Spark SQL" as the database type and not any other option.
Answers
-
@User_LTL5P What exact version of which product are you using? And which version of the Administration tool?
You are tagging both "OBIEE 12c" and "OAS" in your post, which is contradictory and confusing. OBIEE 12c versions were released between August 2010 and April 2018. OAS versions have been released since January 2020.
-
Thank you for the response. It is OAS (2024) and Admin tool 12.2.1.4.0
-
Hi,
OAS 2024, also known as OAS 7.6, comes with Admin tool version 12.4.2.0.0.
You should try to at least use products of the same versions.
Not saying that this is your issue…
What kind of database type did you select for your Databricks connection? The metadata import may have worked, but that doesn't mean the RPD is configured correctly to speak the "dialect" of your source. Also, viewing data or updating row counts in the RPD isn't required; you should instead check whether an analysis against your source works.
-
@Gianni ceresa, thank you for your response. I used Apache Spark SQL (ODBC) for the connection. Please share any points/references to consider so the RPD "correctly speaks the dialect" of the Databricks source. I am trying to test connectivity with the source (offline) before uploading the RPD.
-
As Gianni correctly stated, and as hinted at in my comment: please first make sure that you are using an Admin Tool version that actually matches the OAS version you use and the functionality you're after.
-
Not sure you can do much.
ODBC is very generic, and almost every source has its own little special behaviors. The RPD supports generic ODBC, and that's what it tries to speak with the sources.
That's why I suggested directly checking whether data comes through in an analysis: a row count in the RPD doesn't tell you much, whereas if no data comes through in your analysis, the whole effort is pointless.
I imagine you had a look in MOS to see if there was anything specific for Databricks and/or Apache Spark (ODBC)?
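One quick way to rule driver-level problems in or out is to probe the ODBC DSN outside OAS entirely. Below is a minimal sketch using the third-party `pyodbc` package, assuming a DSN named "Databricks" is already configured with the Databricks/Simba Spark ODBC driver; the DSN and table names are placeholders:

```python
def check_odbc_dsn(dsn_name: str = "Databricks") -> None:
    """Probe an ODBC DSN outside OAS. "Databricks" is an assumed DSN name."""
    import pyodbc  # third-party: pip install pyodbc (needs the ODBC driver installed)

    conn = pyodbc.connect(f"DSN={dsn_name}", autocommit=True)
    cur = conn.cursor()

    # Trivial query: if this fails, the problem is in the driver/DSN, not the RPD.
    cur.execute("SELECT 1")
    print(cur.fetchone()[0])

    # cursor.columns() exercises the same column-metadata lookups that the
    # Admin tool performs when it fails with nQSError 16003.
    for col in cur.columns(table="some_table"):  # placeholder table name
        print(col.column_name, col.type_name)

    conn.close()

# Usage (requires a working DSN on the same machine):
# check_odbc_dsn("Databricks")
```

If this probe fails the same way, the issue sits in the ODBC driver configuration rather than in the RPD.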
Also, because you are on OAS 2024, you do have the Semantic Modeler available. Based on (link to doc), Apache Spark SQL can be defined as a source in the Semantic Modeler, but not in the RPD via the Admin tool (and that's why you used an ODBC connection).