Is there a best practice process for making changes to datasets without breaking existing workbooks?

Hi everyone! We are relatively new to OAC and are trying to determine whether there is a best practice for making changes to datasets without impacting the existing workbooks that use them. With version control not (yet) built into OAC, we have been looking at less traditional development and maintenance approaches.
Currently, when we make changes to a dataset (e.g. DS_Employee), we "Save As" to a new dataset name (e.g. DS_Employee_new). We make our changes to DS_Employee_new and, when ready, delete the original dataset (DS_Employee), open our modified dataset (DS_Employee_new), and "Save As" back to the original dataset name (DS_Employee). With this approach, we find that we maintain the original Object_ID of the dataset, and workbooks using it are not broken (assuming, of course, we haven't changed any fields used in the workbook).
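Since this dance hinges on the Object_ID surviving, we have been thinking about verifying the ID before and after each cycle. A minimal sketch of such a check follows; the `/api/20210901/datasets` endpoint path, its response shape, and the bearer-token auth are our assumptions rather than confirmed public OAC REST API details, so treat it as an outline of the idea, not working automation.

```python
# Sketch: verify a dataset keeps its Object_ID across the save-as / delete /
# save-as cycle described above. ASSUMPTION: the endpoint path and response
# shape below are hypothetical; consult the OAC REST API docs for real ones.
import requests

OAC_URL = "https://<instance>.analytics.ocp.oraclecloud.com"  # placeholder host
HEADERS = {"Authorization": "Bearer <token>"}                 # token setup omitted

def dataset_id(name: str) -> str:
    """Look up a dataset by display name and return its object ID."""
    resp = requests.get(f"{OAC_URL}/api/20210901/datasets",
                        params={"name": name}, headers=HEADERS, timeout=30)
    resp.raise_for_status()
    items = resp.json().get("items", [])
    if not items:
        raise LookupError(f"no dataset named {name!r}")
    return items[0]["id"]

id_before = dataset_id("DS_Employee")
input("Perform the save-as / delete / save-as cycle, then press Enter...")
id_after = dataset_id("DS_Employee")
print("Object_ID preserved" if id_before == id_after else
      f"Object_ID changed: {id_before} -> {id_after}; expect broken workbooks")
```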
While this approach seems to work for us, it is very cumbersome and error prone, and we are hoping that others have perfected a more elegant method for maintaining their datasets!
Looking forward to everyone's thoughts!
Answers
-
Hi @OAldrich, depending on how many workbooks depend on that dataset, a feature that may help you in that process is the dataset-replace capability. For a given workbook, you can fully repoint it from the original dataset it's using to another, distinct dataset, as long as you are able to map each column consumed by the workbook. See this (rather old) video about it: https://youtu.be/2cvmQZItCHE
This is mostly practical if you don't have too many workbooks to update, but it may help in your process too.
0 -
Thanks very much @Philippe Lions-Oracle! While we don't have many datasets and dependent workbooks (yet), we are hoping not to have to modify each reliant workbook and "replace" the dataset. Our goal is to update the dataset and let dependent workbooks continue to function as normal.
Using the example provided, we do test the new dataset (DS_Employee_new) against a copy of the existing production workbooks using the replace-dataset function, and it seems to work very well! Thank you for confirming the validity of this function; we weren't certain we had interpreted its usage correctly!
With our current approach to dataset modifications, we find ourselves retaining prior versions of datasets for safekeeping until we archive them (e.g. DS_Employee_v1, DS_Employee_v2, etc.). Hopefully there is an upcoming feature that will let us organize datasets the way we can organize workbooks, with folders and so on.
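In the meantime, rather than letting DS_Employee_v1, DS_Employee_v2, etc. pile up in the catalog, one option we have considered is scripting the archiving step so old versions land in files instead. A minimal sketch under the same caveat as above (the export endpoint here is hypothetical; the real OAC API may only support dataset export via snapshots or the UI):

```python
# Sketch: archive a dataset definition to a timestamped local file before
# modifying it, so prior versions don't accumulate in the OAC catalog.
# ASSUMPTION: the export endpoint below is hypothetical; verify against the
# current OAC REST API documentation before relying on it.
import datetime
import pathlib
import requests

OAC_URL = "https://<instance>.analytics.ocp.oraclecloud.com"  # placeholder host
HEADERS = {"Authorization": "Bearer <token>"}                 # token setup omitted
ARCHIVE_DIR = pathlib.Path("dataset_archive")

def archive_dataset(dataset_id: str, name: str) -> pathlib.Path:
    """Download a dataset definition and store it with a timestamp suffix."""
    resp = requests.get(f"{OAC_URL}/api/20210901/datasets/{dataset_id}/export",
                        headers=HEADERS, timeout=60)
    resp.raise_for_status()
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    ARCHIVE_DIR.mkdir(exist_ok=True)
    target = ARCHIVE_DIR / f"{name}_{stamp}.json"
    target.write_bytes(resp.content)
    return target

print(archive_dataset("<dataset-id>", "DS_Employee"))
```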
0 -
Your current solution is what I call "cheating", because you play on something that could be seen as a weakness in the tool's constraints. By deleting and renaming, you get an ID which wasn't supposed to be given to you: it was used in the past (even if you deleted that dataset), and there are even objects still depending on it. This process could also break the moment Oracle decides to enforce some constraints, at which point your rename will generate a new ID instead of re-creating the same one as the original dataset you deleted.
The answer to your question is the same as the answer to "what is the good practice for making changes to a database design?".
The solution Philippe mentioned, remapping an existing workbook to the new version of the dataset, would be cleaner, because it cuts the implicit dependency you currently have: you must keep all the existing columns in a dataset exactly as they are, because if you rename or remove one, you break the existing references and your trick stops working. Currently you are limited to incremental changes on datasets that should really be considered read-only: you can extend them, but not truly modify them.
I would say that a good practice would be to have tools in place to perform impact analysis when you change a dataset; after that, you still have to handle the changes.
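In the absence of a built-in tool, one homegrown approach is to export your workbooks and scan their definitions for the dataset in question. A rough sketch is below; it assumes exported .dva files are zip archives containing text (JSON/XML) definitions, which you should verify against your own exports before trusting the result:

```python
# Sketch: crude impact analysis -- list exported workbook files that mention
# a given dataset name. ASSUMPTION: .dva exports are zip archives with text
# (JSON/XML) entries; check this holds for your OAC version first.
import zipfile
from pathlib import Path

def workbooks_using(dataset_name: str, export_dir: str) -> list[Path]:
    """Return exported workbook files whose contents mention dataset_name."""
    hits = []
    for dva in Path(export_dir).glob("*.dva"):
        try:
            with zipfile.ZipFile(dva) as archive:
                for entry in archive.namelist():
                    text = archive.read(entry).decode("utf-8", errors="ignore")
                    if dataset_name in text:
                        hits.append(dva)
                        break
        except zipfile.BadZipFile:
            print(f"skipping {dva}: not a zip archive")
    return hits

for wb in workbooks_using("DS_Employee", "exports/"):
    print("depends on DS_Employee:", wb)
```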
Is there an API to modify the definition of workbooks or data flows based on the changes you are making? There are ways to do the job; I just don't believe they are public APIs (yet).
All in all, a dataset should be treated "like" a database: the good practices for database changes and development can work well here, even if you don't have the same tools available.
What you ask for, and hope exists, is the equivalent of EBR (edition-based redefinition) in the Oracle database: the same object exists with multiple structures in parallel because they are different versions, and each application can select the version (the structure) of the object it uses. This feature doesn't exist in OAC, and I wouldn't expect to see it any time soon (if ever).
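To illustrate the EBR idea for anyone unfamiliar with it, here is a purely conceptual sketch: several versions of an object's structure coexist, and each consumer pins the edition it was built against. This is only an analogy for what dataset versioning could look like; nothing like it exists in OAC today.

```python
# Conceptual illustration of edition-based redefinition (EBR): multiple
# structures of the "same" object coexist, and each consumer selects one.
# This is an analogy only -- OAC offers no such feature for datasets.
EDITIONS = {
    "v1": ["EMP_ID", "NAME", "DEPT"],                     # original structure
    "v2": ["EMP_ID", "FIRST_NAME", "LAST_NAME", "DEPT"],  # redefined structure
}

class Workbook:
    """A consumer pinned to the edition of DS_Employee it was built against."""
    def __init__(self, name: str, edition: str):
        self.name = name
        self.columns = EDITIONS[edition]  # older workbooks keep seeing v1

legacy = Workbook("Headcount Report", "v1")
modern = Workbook("Name Analysis", "v2")
print(legacy.columns)  # ['EMP_ID', 'NAME', 'DEPT'], unaffected by v2
print(modern.columns)  # the new structure, only for consumers that opt in
```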
3 -
Thanks @Gianni Ceresa. About whether the needed catalog API is public (yet) or not, it's better to contact PM directly on this; I don't have the exact info about it. Just email me if you want to follow up and I will circulate it.
One more note, about your remark on "...have tools in place to evaluate impact analysis when you change a dataset...": we used to have this with Catalog Manager for Classic, and for DV (self-service) there is currently an extension (a plugin) that helps you get there; see this video https://youtu.be/QWi06ivvcDI (a bit old, but the concept remains). The extension is available in the OA library here: https://www.oracle.com/business-analytics/data-visualization/extensions/.
2 -
@Philippe Lions-Oracle Hi Philippe, I know this product area is probably not within your responsibility, but it is very, very overlooked by Product Mgmt/Development (I am trying to stress this in many places: I have created several Ideas and contributed to discussions within other Ideas, but so far without any impact...).
Without
a) being able to perform impact analysis (if I change/delete some object in DV, which other objects will be influenced by this action?), and
b) having good tools in place (as part of the product) supporting DEV2PROD processes (or automation of DevOps processes within DV in general),
DV cannot be perceived as an "enterprise-wide" analytics tool (Classic possesses all of the above).
This plugin you are mentioning is just a very poor attempt to fill the gaps the product (DV) has had since its launch (and that is quite a long time, which could have been used to fill those gaps, but apparently this is not the priority). So I am stressing this again, hoping that it will finally gain attention.
Thanks
Michal
3 -
Thanks @Philippe Lions-Oracle,
With "tools" I was thinking of that extension but couldn't remember the name, so I stayed "vague" (also because I don't use that extension; I developed my own ways to access everything in OAC/OAS over time...).
That extension should become part of the product by default; it's one of the minimum required tools, for administrators at least, and also for developers, as the use case exposed above shows.
1 -
Thanks @Philippe Lions-Oracle for the DV Governance extension. However, the video mentions a useful DV project (DV Objects Governance.dva) we can't find in the library. Any help with this, please?
0 -
You are correct @AGilabert, sorry about this: over time we had to remove the prebuilt projects, as well as the data structure itself, for maintenance/legal reasons related to the data. However, once you get the dataset outputs from the plugin itself, it should be fairly straightforward to query these with OA. Let us know if you need help with this last step; we will be happy to assist in your case.
1 -
Great feedback everyone! We're certainly looking forward to built-in tools within OAC to help with maintaining our environment.
Given the challenges identified with making changes to a productionized dataset, does anyone have a solid method for making a backup of a dataset that could be 'restored' in the event that changes need to be undone?
1