Encountering "Max Rows Query Limit" Error
We’re encountering the [nQSError: 60008] "Exceeded the maximum query governing rows 2000000 from the database" error while trying to retrieve data for a specific period. Even when reducing the period to 1 day, 1 hour, or 1 minute, we still receive the same error. We’ve observed this issue in two scenarios:
Case 1:
The data volume for the query is around 23–25 lakh (2.3–2.5 million) records. It’s a simple pivot query. We’ve applied all possible filters to reduce the data volume and optimised it as far as we can on our end. We’re running the report to retrieve just a day's worth of data, which shouldn't trigger the above error, as that volume is well below the 2,000,000-row limit.
Case 2:
There are two tables, T1 and T2, in the dataset. Table T1 contains almost 20 million records. After applying a left outer join on the ID column, the result is limited to 189,000 rows. When attempting to create a bar chart with this data, we encounter the same error. We tried reducing the report time period to just 1 day, but the error persists; even at 1 hour or 1 minute, it remains.
Has anyone else experienced this issue? Any known workarounds available?
Answers
- With Table T1 containing 20 million records, if the join logic or the filter is not fully pushed down to the database, OAC may be attempting to pull a massive volume of data into its internal memory to perform the join or calculation, thereby hitting the 2-million-row governor. You could scale up the instance or change the Data Access method from Live to Extracted; pre-aggregation with data flows is also an option.
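To see why pre-aggregation helps, here is a minimal sketch using SQLite and a hypothetical fact table `t1`: when aggregation happens at the source (which is what materializing a data flow achieves), only one row per group crosses the wire, instead of every raw row being pulled into memory and counted against the governor.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical fact table: many raw rows per (id, day)
cur.execute("CREATE TABLE t1 (id INTEGER, day TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO t1 VALUES (?, ?, ?)",
    [(i % 100, "2024-01-01", 1.0) for i in range(10_000)],
)
conn.commit()

# Without pushdown: all 10,000 raw rows are fetched, then aggregated client-side
raw_rows = cur.execute("SELECT id, day, amount FROM t1").fetchall()

# With aggregation pushed to the database: one row per (id, day) group
agg_rows = cur.execute(
    "SELECT id, day, SUM(amount) FROM t1 GROUP BY id, day"
).fetchall()

print(len(raw_rows))  # 10000
print(len(agg_rows))  # 100
```

The table name and schema are illustrative only; the point is the two orders of magnitude between the raw fetch and the grouped fetch.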
- You definitely need to revisit the requirement with this volume of records. Left joins are notorious for preventing filter pushdown, inflating row counts, and breaking star-schema optimization. If the business logic allows, replace the LEFT JOIN between T1 and T2 with an INNER JOIN.
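A small SQLite sketch with hypothetical tables `t1` and `t2` illustrates the row-count difference: the LEFT JOIN preserves every `t1` row whether or not it matches, while the INNER JOIN keeps only matched rows, so the engine has far fewer rows to carry.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE t1 (id INTEGER, val TEXT)")
cur.execute("CREATE TABLE t2 (id INTEGER, attr TEXT)")

# Only a fraction of t1's ids have a match in t2
cur.executemany("INSERT INTO t1 VALUES (?, ?)", [(i, "v") for i in range(1000)])
cur.executemany("INSERT INTO t2 VALUES (?, ?)", [(i, "a") for i in range(100)])
conn.commit()

# LEFT JOIN keeps every t1 row, matched or not
left_count = cur.execute(
    "SELECT COUNT(*) FROM t1 LEFT JOIN t2 ON t1.id = t2.id"
).fetchone()[0]

# INNER JOIN keeps only the matched rows
inner_count = cur.execute(
    "SELECT COUNT(*) FROM t1 INNER JOIN t2 ON t1.id = t2.id"
).fetchone()[0]

print(left_count, inner_count)  # 1000 100
```

Whether the swap is valid depends entirely on whether unmatched T1 rows are meaningful to the report; the code only shows the mechanical effect on row counts.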