-
1. Re: ORA-30009: Not enough memory for CONNECT BY operation
John Thorton Jun 8, 2018 3:01 AM (in response to Quanwen Zhao)unwilling or incapable to use GOOGLE yourself?
[oracle@localhost trace]$ oerr ora 30009
30009, 0000, "Not enough memory for %s operation"
// *Cause: The memory size was not sufficient to process all the levels of the
// hierarchy specified by the query.
// *Action: In WORKAREA_SIZE_POLICY=AUTO mode, set PGA_AGGREGATE_TARGET to
// a reasonably larger value.
// Or, in WORKAREA_SIZE_POLICY=MANUAL mode, set SORT_AREA_SIZE to a
// reasonably larger value.
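For reference, the documented action can be sketched as SQL statements. This is only an illustration of the advice above; the 2G figure is an arbitrary example, not a recommendation:

```sql
-- Check the current settings (AUTO is the usual default)
show parameter workarea_size_policy
show parameter pga_aggregate_target

-- In AUTO mode, raise PGA_AGGREGATE_TARGET (value below is illustrative only)
alter system set pga_aggregate_target = 2G scope = both;
```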
-
2. Re: ORA-30009: Not enough memory for CONNECT BY operation
AndrewSayer Jun 8, 2018 6:00 AM (in response to Quanwen Zhao)What has arraysize got to do with this at all? That is just a setting that determines how many rows to fetch from a cursor on each iteration.
Either reduce the amount of times you are connecting the row in dual to itself - do you really need 100 million rows just to try something?
Or rewrite your statement to use cross joins or something else to prevent needing to store so much in memory.
-
3. Re: ORA-30009: Not enough memory for CONNECT BY operation
Quanwen Zhao Jun 8, 2018 6:45 AM (in response to AndrewSayer)Either reduce the amount of times you are connecting the row in dual to itself - do you really need 100 million rows just to try something?
Or rewrite your statement to use cross joins or something else to prevent needing to store so much in memory.
Hi, Andrew
Yep, I want to use this test table to simulate my production system's table size (approximately 6 gigabytes) for testing something.
Last time I used the figure "10000000" with connect by, but the resulting table's size was only about 600 megabytes.
SQL> set timing on
SQL>
SQL> create table test1
  2  segment creation immediate
  3  nologging
  4  as
  5  select rownum as id
  6       , to_char(sysdate + rownum/24/3600, 'yyyy-mm-dd hh24:mi:ss') as inc_datetime
  7       , trunc(dbms_random.value(0, 100)) as random_id
  8       , dbms_random.string('x', 20) random_string
  9  from dual
 10  connect by level <= 10000000    <<==
 11  ;

Table TEST1 created.

Elapsed: 00:15:58.255

SQL> select sum(blocks), sum(bytes)/1024/1024 size_mb from user_segments
  2  where segment_name = 'TEST1'
  3  and segment_type = 'TABLE';

SUM(BLOCKS)    SIZE_MB
----------- ----------
      76800        600
That is still far from 6 gigabytes, so I changed that figure to "100000000" and retried.
Best Regards
Quanwen Zhao
-
4. Re: ORA-30009: Not enough memory for CONNECT BY operation
AndrewSayer Jun 8, 2018 6:50 AM (in response to Quanwen Zhao)Quanwen Zhao wrote:
Either reduce the amount of times you are connecting the row in dual to itself - do you really need 100 million rows just to try something?
Or rewrite your statement to use cross joins or something else to prevent needing to store so much in memory.
Hi, Andrew
Yep, I want to use this test table to simulate my production system's table size (approximately 6 gigabytes) for testing something.
Last time I used the figure "10000000" with connect by, but the resulting table's size was only about 600 megabytes.
SQL> set timing on
SQL>
SQL> create table test1
  2  segment creation immediate
  3  nologging
  4  as
  5  select rownum as id
  6       , to_char(sysdate + rownum/24/3600, 'yyyy-mm-dd hh24:mi:ss') as inc_datetime
  7       , trunc(dbms_random.value(0, 100)) as random_id
  8       , dbms_random.string('x', 20) random_string
  9  from dual
 10  connect by level <= 10000000    <<==
 11  ;

Table TEST1 created.

Elapsed: 00:15:58.255

SQL> select sum(blocks), sum(bytes)/1024/1024 size_mb from user_segments
  2  where segment_name = 'TEST1'
  3  and segment_type = 'TABLE';

SUM(BLOCKS)    SIZE_MB
----------- ----------
      76800        600
That is still far from 6 gigabytes, so I changed that figure to "100000000" and retried.
Best Regards
Quanwen Zhao
It's likely you need the table to have the same row distribution, the same indexes, etc. if you want your tests to be meaningful.
Can't you spin up a duplicate DB from a backup of production? You would need to obfuscate data that doesn't belong outside of production before you use it, though (and if this affects your selectivities then you'll need to change your queries accordingly).
To directly answer this: as I said before, you can use a cross join to bump this up. If level <= 100,000 is fine, then cross join that to a generated row source of 100 rows (I can't count your 0s properly on my phone screen, so you'll have to do the maths).
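A sketch of that cross-join idea, with a split I picked myself (10,000 x 10,000 = 100,000,000; the figures are assumptions, not from the thread):

```sql
-- Two shallow CONNECT BY row sources cross-joined, instead of one
-- 100,000,000-level hierarchy that exhausts workarea memory.
select rownum as id
from   (select level from dual connect by level <= 10000) a,
       (select level from dual connect by level <= 10000) b;
```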
-
5. Re: ORA-30009: Not enough memory for CONNECT BY operation
Quanwen Zhao Jun 8, 2018 6:57 AM (in response to John Thorton)unwilling or incapable to use GOOGLE yourself?
[oracle@localhost trace]$ oerr ora 30009
30009, 0000, "Not enough memory for %s operation"
// *Cause: The memory size was not sufficient to process all the levels of the
// hierarchy specified by the query.
// *Action: In WORKAREA_SIZE_POLICY=AUTO mode, set PGA_AGGREGATE_TARGET to
// a reasonably larger value.
// Or, in WORKAREA_SIZE_POLICY=MANUAL mode, set SORT_AREA_SIZE to a
// reasonably larger value.
Hello, John Thorton
As you know, due to my country's specific situation, I have never been able to log in to Google from China. BTW, I've searched other posts but found no clues or answers, so I came here.
Best Regards
Quanwen Zhao
-
6. Re: ORA-30009: Not enough memory for CONNECT BY operation
BEDE Jun 8, 2018 7:00 AM (in response to Quanwen Zhao)Wouldn't export and import provide better conditions to test whatever you wish to test?
-
7. Re: ORA-30009: Not enough memory for CONNECT BY operation
AndrewSayer Jun 8, 2018 7:07 AM (in response to BEDE)BEDE wrote:
Wouldn't export and import provide better conditions to test whatever you wish to test?
Any logical duplication (like export and import) will leave the rows in any order it likes. Your indexes will then all be built from scratch - your statistics will be different and the performance of using your indexes will be different.
If you want to test performance then you absolutely must use physical duplication
-
8. Re: ORA-30009: Not enough memory for CONNECT BY operation
Gaz in Oz Jun 8, 2018 7:14 AM (in response to Quanwen Zhao)Try using less memory with "connect by":
For example:
create table test1
segment creation immediate
nologging
select r as id
     , to_char(sysdate + r/24/3600, 'yyyy-mm-dd hh24:mi:ss') as inc_datetime
     , trunc(dbms_random.value(0, 100)) as random_id
     , dbms_random.string('x', 20) random_string
from (SELECT rownum r
      FROM (SELECT level r FROM dual CONNECT BY level <= 1000) d1,
           (SELECT level r FROM dual CONNECT BY level <= 1000) d2,
           (SELECT level r FROM dual CONNECT BY level <= 1000) d3
      WHERE rownum <= 100000000
     );
...generating 100,000,000 rows will take time.
-
9. Re: ORA-30009: Not enough memory for CONNECT BY operation
BEDE Jun 8, 2018 7:22 AM (in response to AndrewSayer)Physical duplication would mean copying the database files as they are: a cold backup. But that may not be feasible, so for some tests export and import of some data may be used, although, as you have pointed out, it will not be the same for performance. Moreover, exact duplication would require not only a cold backup but also exactly the same kind of system (processors, memory and disks), which I doubt many organizations have. Development and test environments often run on less powerful machines than production, and if a batch process performs satisfactorily on a less powerful test machine, then in production it should do no worse...
I meant export and import in order to have the same data.
-
10. Re: ORA-30009: Not enough memory for CONNECT BY operation
John_K Jun 8, 2018 7:25 AM (in response to Quanwen Zhao)So instead of using a very narrow table where each row is going to consume a very small amount of space, why not base your query on something a bit more hefty like dba_objects?
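One way that suggestion might look (the multiplier of 100 is my own guess; row counts in dba_objects vary per database, so you would tune it to reach the target size):

```sql
-- Hypothetical sketch: fan out a wide dictionary view with a small generator
create table test1 nologging as
select o.*
from   dba_objects o,
       (select level n from dual connect by level <= 100);
```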
-
11. Re: ORA-30009: Not enough memory for CONNECT BY operation
AndrewSayer Jun 8, 2018 7:43 AM (in response to BEDE)BEDE wrote:
Physical duplication would mean copying the database files as they are: a cold backup. But that may not be feasible, so for some tests export and import of some data may be used, although, as you have pointed out, it will not be the same for performance. Moreover, exact duplication would require not only a cold backup but also exactly the same kind of system (processors, memory and disks), which I doubt many organizations have. Development and test environments often run on less powerful machines than production, and if a batch process performs satisfactorily on a less powerful test machine, then in production it should do no worse...
I meant export and import in order to have the same data.
If export and import are good enough, then there is absolutely no way it's necessary to create such a big random data set from dual.
You don't need a cold backup for physical duplication. Hot backups work fine and should already be available. I imagine PDB cloning can achieve the same.
Sure, different hardware will perform differently. But as long as you use default system stats (which is the current recommendation), your queries will use the same plans (for the same bind-peeking profile), and any system-related time can be scaled up or down depending on your known hardware differences. Sure, it's not quick to determine this, but if you're going to test on different hardware you are going to have to do it; otherwise, what's the point of testing?
If your company says they can't afford to test properly, then they'll have to deal with the consequences.
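For the PDB cloning mentioned above, a minimal multitenant (12c+) sketch with hypothetical PDB names; depending on your storage layout you may also need a FILE_NAME_CONVERT clause:

```sql
-- Clone a production PDB into a test PDB (names are hypothetical)
create pluggable database pdbtest from pdbprod;
alter pluggable database pdbtest open;
```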
-
12. Re: ORA-30009: Not enough memory for CONNECT BY operation
Quanwen Zhao Jun 10, 2018 2:55 AM (in response to Gaz in Oz)create table test1
segment creation immediate
nologging
select r as id
     , to_char(sysdate + r/24/3600, 'yyyy-mm-dd hh24:mi:ss') as inc_datetime
     , trunc(dbms_random.value(0, 100)) as random_id
     , dbms_random.string('x', 20) random_string
from (SELECT rownum r
      FROM (SELECT level r FROM dual CONNECT BY level <= 1000) d1,
           (SELECT level r FROM dual CONNECT BY level <= 1000) d2,
           (SELECT level r FROM dual CONNECT BY level <= 1000) d3
      WHERE rownum <= 100000000
     );
Hi, Gaz in Oz
Using your SQL code above, I tried to create the test table test1, but it failed with the ORA-00922 error shown below,
SQL> create table test1
  2  segment creation immediate
  3  nologging
  4  select r as id
  5       , to_char(sysdate + r/24/3600, 'yyyy-mm-dd hh24:mi:ss') as inc_datetime
  6       , trunc(dbms_random.value(0, 100)) as random_id
  7       , dbms_random.string('x', 20) random_string
  8  from (SELECT rownum r
  9        FROM (SELECT level r FROM dual CONNECT BY level <= 1000) d1,
 10             (SELECT level r FROM dual CONNECT BY level <= 1000) d2,
 11             (SELECT level r FROM dual CONNECT BY level <= 1000) d3
 12        WHERE rownum <= 100000000
 13  );

Error starting at line : 1 in command -
create table test1 segment creation immediate nologging select r as id , to_char(sysdate + r/24/3600, 'yyyy-mm-dd hh24:mi:ss') as inc_datetime , trunc(dbms_random.value(0, 100)) as random_id , dbms_random.string('x', 20) random_string from (SELECT rownum r FROM (SELECT level r FROM dual CONNECT BY level <= 1000) d1, (SELECT level r FROM dual CONNECT BY level <= 1000) d2, (SELECT level r FROM dual CONNECT BY level <= 1000) d3 WHERE rownum <= 100000000 )
Error report -
ORA-00922: missing or invalid option
Best Regards
Quanwen Zhao
-
13. Re: ORA-30009: Not enough memory for CONNECT BY operation
Gaz in Oz Jun 8, 2018 9:36 AM (in response to Quanwen Zhao)Missing "as" keyword.
create table test1
segment creation immediate
nologging
as
select r as id
     , to_char(sysdate + r/24/3600, 'yyyy-mm-dd hh24:mi:ss') as inc_datetime
     , trunc(dbms_random.value(0, 100)) as random_id
     , dbms_random.string('x', 20) random_string
from (SELECT rownum r
      FROM (SELECT level r FROM dual CONNECT BY level <= 1000) d1,
           (SELECT level r FROM dual CONNECT BY level <= 1000) d2,
           (SELECT level r FROM dual CONNECT BY level <= 1000) d3
      WHERE rownum <= 100000000
     );
-
14. Re: ORA-30009: Not enough memory for CONNECT BY operation
BEDE Jun 8, 2018 9:38 AM (in response to Quanwen Zhao)The select itself runs fine in Oracle 11.2.0.4, and so does the create table as select. All you need is an "AS" after nologging (if that is exactly as you posted it).
Of course, the figure I used was only 10000, for I wasn't insane enough to produce as many rows as you need.