Oracle Analytics Cloud and Server

What is the maximum size limit of a data flow?

While running a data flow, I am getting the error "Your data size crossed the 5GB limit".

Is the MAX_ANALYTICS_FILE_SZ limit fixed at 5GB, or can it be customized to a higher value?

Answers

  • Hi @Rajakumar Burra,

    While the community can infer some information from your post, it is best practice to provide complete contextual details about your exact steps with the data flow.

    There are certain limits that are documented, and some that are embedded directly in the code itself.

    "While running a Dataflow." is very vague.
    Please provide more detail about your reproduction steps, so that community members can accurately respond.

  • Rajakumar Burra
    Rank 6 - Analytics Lead

    I have a data flow that joins multiple tables, with filters and expressions. The datasets have a large number of rows and columns.

    When I run the data flow, it fails with the error message "Data is exceeded the limit of 5000 MB".

    My question is: can this size limit be increased?

  • Gianni Ceresa
    edited Apr 2, 2024 2:26PM

    https://support.oracle.com/epmos/faces/DocContentDisplay?id=2722590.1

    For Oracle Analytics Cloud (OAC):

    This quota cannot be changed and is managed by Oracle.
    Generally speaking, it is rare that these limits need to be changed. If they do, it would be an Ideas Lab request (enhancement).

    If your dataset is more than 5 GB, you should maybe question your usage: is that really the right tool for the job? Or maybe it's time to use what has been built to handle lots of data, something like a database for example.

    OAC only has a "lightweight" ETL; don't abuse it as a full ETL to handle all your data moves, transformations, and storage needs.
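
    A minimal sketch of that suggestion, assuming an Oracle database source and the python-oracledb driver (the connection details, table names, and columns are all hypothetical): build the joins, filters, and derived columns into a database view once, then point the OAC dataset at the view so the data flow only sees the already-reduced result.

    ```python
    # Sketch: push the heavy joins/filters into the database instead of an
    # OAC data flow. Assumes python-oracledb; all object names are hypothetical.
    import oracledb

    conn = oracledb.connect(user="analytics", password="secret", dsn="dbhost/orclpdb1")
    cur = conn.cursor()

    # Do the joins, filters, and calculated columns in the database once,
    # so OAC only reads the (much smaller) result set.
    cur.execute("""
        CREATE OR REPLACE VIEW sales_enriched_v AS
        SELECT s.order_id,
               s.order_date,
               c.region,
               p.category,
               s.quantity * s.unit_price AS revenue
        FROM   sales s
        JOIN   customers c ON c.customer_id = s.customer_id
        JOIN   products  p ON p.product_id  = s.product_id
        WHERE  s.order_date >= DATE '2023-01-01'
    """)

    cur.close()
    conn.close()
    ```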

  • Rajakumar Burra
    Rank 6 - Analytics Lead

    The dataset size is not 5GB. Due to the complexity of the filters and calculations and the number of joins, the intermediate size goes beyond the limit.
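
    A rough back-of-the-envelope estimate of how that can happen (all numbers below are made up for illustration): joins with fan-out multiply the row count, so the intermediate result can be far larger than any single input.

    ```python
    # Back-of-the-envelope estimate of why a join can blow past the 5 GB limit
    # even when each input dataset is small. All numbers are made up.
    rows_fact = 10_000_000       # rows in the main (fact) table
    fan_out = 3                  # matching rows per join key across the joins
    cols_after_join = 60         # columns kept after joins and expressions
    avg_bytes_per_cell = 12      # rough average size of one cell

    intermediate_bytes = rows_fact * fan_out * cols_after_join * avg_bytes_per_cell
    print(f"estimated intermediate size: {intermediate_bytes / 1024**3:.1f} GiB")
    # -> estimated intermediate size: 20.1 GiB (well above the 5 GB cap)
    ```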

  • The error message seems to have changed from your original posting.

    In general, no, the limit cannot be increased; however, the behavior can differ depending on the data source, whether it is automatically cached or live, and whether you are saving to a database or to a dataset.

    It is still unclear at which step this is failing: the query, saving to a dataset, etc.

    If you need more, please open a service request and provide screenshots of your data flow, the error message returned, and the exact step where it is failing. Support may also request further information.