Data Flow Caching for Enhanced Performance
I propose the implementation of a Data Flow Caching feature within Oracle Analytics Cloud. This feature would allow users to cache the entire data flow, so that each preview step uses cached data rather than rerunning the entire process each time.
Key Benefits:
- Improved Performance: By caching data at each step, users can significantly reduce processing time, leading to faster insights and decision-making.
- Resource Optimization: This feature would minimize the computational load on the system, allowing for more efficient use of resources and reducing costs associated with data processing.
- User Experience Enhancement: Users can interact with their data flows more fluidly, making it easier to iterate on analyses without the frustration of repeated long processing times.
This caching capability would not only streamline workflows but also empower users to focus on deriving insights rather than waiting for data to process.
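To illustrate the kind of step-level caching being proposed, here is a minimal sketch (not Oracle's implementation; the class and key scheme are hypothetical). Each step's output is memoized on a key derived from the step's definition and its upstream step's cache key, so editing step 5 invalidates steps 5 onward while steps 1-4 stay cached:

```python
import hashlib
import json

class CachedFlow:
    """Hypothetical sketch of step-level data-flow caching: each step's
    output is memoized so previews reuse cached results instead of
    re-running all upstream steps."""

    def __init__(self):
        self._cache = {}  # cache key -> step output

    def run_step(self, step_name, params, fn, upstream_key=None, upstream_data=None):
        # The key combines this step's definition with the upstream cache
        # key, so changing any step invalidates only it and its downstream.
        key_src = json.dumps([step_name, params, upstream_key], sort_keys=True)
        key = hashlib.sha256(key_src.encode()).hexdigest()
        if key not in self._cache:
            self._cache[key] = fn(upstream_data, **params)
        return key, self._cache[key]
```

Re-invoking `run_step` with an unchanged definition returns the cached output without calling the step function again, which is the "don't re-run steps 1-5" behavior the idea asks for.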
Comments
In addition to being able to cache the entire data flow, we should be able to run a data flow up to a certain step. If the flow has 10 steps and I just worked on step 5, I want to run it to step 5, check the data, then proceed with step 6. Then, as I work on steps 6-10, steps 1-5 don't have to re-run because they are already cached.
@Branden Pavol Just during development … like a "save a temporary dataset at this point and proceed from here" type of thing? Would sample data alone be enough?
A not-great workaround today is to save that data flow and start another one at step 6. But then you would have to put them into a sequence to run end to end, and the middle checkpoint would have some limits; it also might not need to be materialized at all if steps 6-10 include aggregations, etc.
@Bret Grinslade - Oracle Analytics-Oracle Yes, saving a temporary dataset at that point would be good. Sample data would NOT be enough for me. I would be OK with being able to insert a save step and then continue on beyond that step. Right now, as soon as you put a save step in, you can't do anything past it. Ideally, I would like to run the entire dataset to that point and then do the following:
- check the data profile (like you can with datasets)
- open it in a DV workbook so I can check for abnormalities and data statistics
- go back to the data flow and pick up where I left off
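The save-and-resume pattern described above can be approximated outside the tool. A hedged sketch (the checkpoint path and step functions are illustrative, not an OAC API): persist the output of the expensive upstream steps once, then let later runs pick up from the saved file instead of recomputing:

```python
import csv
import os
import tempfile

# Illustrative checkpoint path; in practice this would be a saved dataset.
CHECKPOINT = os.path.join(tempfile.gettempdir(), "flow_step5_checkpoint.csv")

computed = []  # tracks how often the upstream steps actually run

def run_steps_1_to_5(rows):
    """Stand-in for the expensive upstream steps of the flow."""
    computed.append(rows)
    return [{"id": r, "value": r * 10} for r in range(rows)]

def get_step5_output(rows):
    """Reuse the checkpoint if present; otherwise compute and persist it."""
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, newline="") as f:
            return [{"id": int(r["id"]), "value": int(r["value"])}
                    for r in csv.DictReader(f)]
    data = run_steps_1_to_5(rows)
    with open(CHECKPOINT, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["id", "value"])
        writer.writeheader()
        writer.writerows(data)
    return data
```

Because the checkpoint is a real file, it can also be opened elsewhere (e.g. in a DV workbook) to inspect the data profile before continuing with the downstream steps.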