Support DDLs in External Catalogs
Currently, writing data to an external catalog configured in AIDP requires a pre-existing table in the target database; DDL operations are not supported through the Spark API for configured external catalogs.
This creates inefficiencies in managing your gold layer: somewhere between defining your silver layer and loading your gold layer (assuming you maintain gold in something like ADW), you must step out of the AIDP APIs and create the DDL "manually".
A customer might be able to automate this through something like the ORACLEDB package, but having the Spark catalog abstractions would make the process significantly easier. A call such as df.write.mode("overwrite").saveAsTable('tablenamehere') would infer the source Spark data types and structure, map them to DDL, and generate or update the DDL while handling the DML in the same operation, rather than handling DDL separately.
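To illustrate the kind of work a catalog-aware saveAsTable would take off the user's hands, here is a minimal sketch of the DDL-generation step: mapping Spark type names to Oracle column types and emitting a CREATE TABLE statement from an inferred schema. The type mapping and function names are hypothetical and purely illustrative, not an actual AIDP or ADW API.

```python
# Hypothetical sketch: deriving Oracle DDL from a Spark-style schema,
# the step a catalog-aware saveAsTable would perform before the DML.
# The type mapping below is illustrative, not an official mapping.

SPARK_TO_ORACLE = {
    "string": "VARCHAR2(4000)",
    "int": "NUMBER(10)",
    "bigint": "NUMBER(19)",
    "double": "BINARY_DOUBLE",
    "boolean": "NUMBER(1)",
    "date": "DATE",
    "timestamp": "TIMESTAMP",
}

def generate_ddl(table_name, schema):
    """Build a CREATE TABLE statement from (column_name, spark_type) pairs."""
    columns = ",\n  ".join(
        f"{name} {SPARK_TO_ORACLE[spark_type]}" for name, spark_type in schema
    )
    return f"CREATE TABLE {table_name} (\n  {columns}\n)"

# Example: a schema Spark might infer from a DataFrame's dtypes
inferred = [("order_id", "bigint"), ("customer", "string"), ("amount", "double")]
print(generate_ddl("gold.orders", inferred))
```

With the abstraction requested above, this translation (plus updating an existing table's DDL on overwrite) would happen inside the Spark catalog integration, so a single df.write call covers both DDL and DML.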