Oracle Analytics Forum


Dataflow Failing with “User Exceeded Per File or Overall Quota Limit” Despite Low Storage Usage

Accepted answer
142 Views · 12 Comments

Hi,

We are facing repeated failures when executing a Dataflow in Oracle Analytics Cloud (OAC).


The error message is as follows:

[nQSError: 46267] Failed to stream data in chunk through HTTP request - {"prefix":"DSS","code":400301,"message":"User exceeded per file or overall quota limit","subMessage":"User has exceeded maximum allowed quota limit"}
[nQSError: 46275] Failed to send the last data chunk through HTTP request to Server.
[nQSError: 43232] Failed to populate data through HTTP request.
[nQSError: 43267] Failed to finish parquet data chunked transfer in HTTP request.
[nQSError: 43224] The Dataflow "Candidate Conversion 12th Mar_Current status" failed during execution.
[nQSError: 43204] Asynchronous Job Manager failed to execute the asynchronous job.

However, the user has utilized only 6 GB of the 50 GB default user quota, so the quota should not have been exceeded.

Any suggestions to fix this issue?

Thanks in advance.

Regards,

Subhakara Netala


Best Answers

  • Brendan T (Rank 6 - Analytics & AI Lead)
    Answer ✓

    Check if your dataflow is attempting to process more than five million rows from a single source. If it is, you may need to filter the data earlier in the flow or split the processing into multiple stages.
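The splitting approach can be sketched outside of OAC as well. The helper below is a hypothetical illustration (not an OAC API) of pre-chunking a source dataset so that no single batch exceeds the 5-million-row limit before it is fed to a dataflow; the function name and the small limit in the example are assumptions for demonstration.

```python
def split_into_chunks(rows, max_rows=5_000_000):
    """Yield successive slices of `rows`, each at most `max_rows` long.

    Illustrative only: in OAC you would instead add a filter step or
    create multiple dataflows, but the batching logic is the same.
    """
    for start in range(0, len(rows), max_rows):
        yield rows[start:start + max_rows]


# Small limit used here purely to show the chunking behavior:
rows = list(range(12))
chunk_sizes = [len(chunk) for chunk in split_into_chunks(rows, max_rows=5)]
print(chunk_sizes)  # → [5, 5, 2]
```

Each resulting chunk stays under the limit, mirroring how the single oversized dataflow can be replaced by several smaller ones.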

  • Subhakara Netala-Oracle (Rank 6 - Analytics & AI Lead)

    @Chere-Oracle:

    We have split this into multiple dataflows and created a union report at the Analysis and Workbook level.


Answers