What is the max size limit of a data flow?

While running a data flow, I am getting the error "Your data size crossed the 5GB limit".
Is the MAX_ANALYTICS_FILE_SZ limit fixed at 5GB, or can it be customized to a higher value?
Answers
Hi @Rajakumar Burra,
While the community can infer some information from your post, it is best practice to provide complete contextual details about your exact steps with the data flow. There are certain limits that are documented, and some that are embedded directly in the code itself.
"While running a Dataflow" is very vague.
Please provide more detail about your reproduction steps so that community members can respond accurately.
I have a data flow that joins multiple tables with filters and expressions. The datasets have many rows and columns.
When I run the data flow, it fails with the error message "Data is exceeded the limit of 5000 MB".
My question is: can this size limit be increased?
https://support.oracle.com/epmos/faces/DocContentDisplay?id=2722590.1
For Oracle Analytics Cloud (OAC):
This quota cannot be changed and is managed by Oracle.
Generally speaking, it is a rare use case that these limits would need to be changed. If they do, then it would be an Idea Labs request (enhancement). If your dataset is more than 5GB, you should perhaps question your usage: is that really the right tool for the job? Or maybe it's time to use something built to handle a lot of data, such as a database.
OAC only has a "lightweight" ETL; don't abuse it as a full ETL to handle all your data movement, transformation, and storage needs.
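Since the 5000 MB ceiling is fixed, one practical habit is to estimate the size of a join's output before running the data flow, especially when a one-to-many join fans rows out. The sketch below is a rough back-of-the-envelope check, not an Oracle API; the row counts, fan-out factor, and average row width are made-up numbers you would measure on your own sources.

```python
# Hedged sketch: estimate whether a data flow join's output is likely to
# exceed the 5000 MB limit before running it. Not an Oracle API; all
# numbers here are illustrative assumptions.

LIMIT_BYTES = 5_000 * 1024 * 1024  # the 5000 MB limit from the error message


def estimated_join_bytes(left_rows, right_rows_per_key, avg_row_bytes):
    """Estimate the output size of a one-to-many join.

    left_rows          -- rows in the driving table
    right_rows_per_key -- average matching rows on the many side (fan-out)
    avg_row_bytes      -- average width of a joined output row, in bytes
    """
    return left_rows * right_rows_per_key * avg_row_bytes


def exceeds_limit(estimate_bytes, limit=LIMIT_BYTES):
    """True if the estimate is over the data flow size limit."""
    return estimate_bytes > limit


# Example with assumed numbers: 10M driving rows, 3x fan-out, ~200 bytes/row.
est = estimated_join_bytes(10_000_000, 3, 200)
print(est, exceeds_limit(est))
```

If the estimate lands over the limit, that is a signal to push the joins, filters, and aggregations down into the source database first, then feed the already-reduced result into the data flow.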
The dataset size is not 5GB. Due to the complexity of the filters and calculations and the number of joins, the size is going beyond the limit.
The error message seems to have changed from your original posting.
In general, no, the limit cannot be increased; however, behavior can differ depending on the data source, whether it is automatically cached or live, and whether you are saving to a database or to a dataset.
It is still unclear at which step this is failing: the query, saving to a dataset, etc.
If you need more, please open a service request and provide screenshots of your data flow, the error message returned, and the exact step where it fails. Support may also request further information.