
Oracle Database Discussions


Block corruption with creating tables/inserting data

658887 (Sep 9 2008, edited Sep 9 2008)
I appear to have a corrupt block that I encounter when I reach a certain point in creating a table or inserting new data in an Oracle 10gR2 database. The error is

SQL Error: ORA-00604: error occurred at recursive SQL level 2
ORA-01578: ORACLE data block corrupted (file # 1, block # 59522)
ORA-01110: data file 1: 'E:\ORACLE\PRODUCT\10.2.0\ORADATA\IMAGES\SYSTEM01.DBF'
00604. 00000 - "error occurred at recursive SQL level %s"
*Cause: An error occurred while processing a recursive SQL statement
(a statement applying to internal dictionary tables).
*Action: If the situation described in the next error on the stack
can be corrected, do so; otherwise contact Oracle Support.



(I only get the first two lines when inserting new data)

Is there a way to mark this block so that it is not used? There is no existing object using this block -- the issue appears to occur when something tries to use it for the first time.
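A first step (a sketch only, using the standard data dictionary view DBA_EXTENTS) is to confirm whether any segment actually owns the block reported in ORA-01578, using the file and block numbers from the error message:

```sql
-- Find the segment (if any) that owns file 1, block 59522
-- (numbers taken from the ORA-01578 message above).
SELECT owner, segment_name, segment_type
  FROM dba_extents
 WHERE file_id = 1
   AND 59522 BETWEEN block_id AND block_id + blocks - 1;
```

If no row comes back, the block is currently free space, and segment-level options such as DBMS_REPAIR.SKIP_CORRUPT_BLOCKS do not apply. Note also that file 1 is the SYSTEM tablespace, where block corruption generally calls for restoring and recovering the datafile (e.g. with RMAN) rather than trying to skip the block.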

Edited by: user8026807 on Sep 9, 2008 8:13 AM

Comments

21205
why aren't you using Oracle's built-in replication for this?
Anurag Tibrewal
Hi,
This morning's text file was processed, but failed to insert records in either table.
Did you get any error? If yes, what is the error?
I do not see any reason why you would not get an error if the data is not inserted.
Yesterday's hourly feed caused 35 records to be inserted in exchange_rate, but only 12 of these were echoed to DM107ExchangeRatesODS.
Are you sure all these 35 records have distinct TS.CNTR_CRNCY_ID, TS.RATE_PRPS and TS.REF_CURNCY? Why do you believe that none of the triggers fired by these 35 inserts would have deleted records from DM107ExchangeRatesODS?

Regards
Anurag
730392
I'm not using Oracle's built-in replication, because it's not appropriate for the task.

You are right about the 12/35 issue, further data analysis shows that the hourly feed records are not unique on the three columns used for the deletion, so only the most recent row for those three values is retained. This is the intended behaviour, but I hadn't thought that through.
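To confirm this kind of duplication, a query along these lines (column and table names taken from the trigger quoted later in the thread; adjust to the actual schema) shows which deletion-key combinations arrive more than once:

```sql
-- Count rows per deletion key; any count > 1 means the trigger's
-- DELETE will discard all but the last-inserted row for that key.
SELECT cntr_crncy_id, rate_prps, ref_curncy, COUNT(*) AS rows_per_key
  FROM exchange_rate
 GROUP BY cntr_crncy_id, rate_prps, ref_curncy
HAVING COUNT(*) > 1;
```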

So the only issue that remains is why nothing from the daily feed was processed. I haven't seen any errors, it just fails silently.
21205
user2143457 wrote:
I'm not using Oracle's built-in replication, because it's not appropriate for the task.
Not appropriate?... Could you elaborate on that?
Anurag Tibrewal
user2143457 wrote:

So the only issue that remains is why nothing from the daily feed was processed. I haven't seen any errors, it just fails silently.
This is hard to believe.
As far as I know, Oracle does not fail without reporting an error, and it's typically an ORA-XXXX.
It is possible you are suppressing this error in your code. If that is the case, check your code and then check for the error.

If it is not possible to change the code, can you run a small test with the code below as your trigger?
CREATE OR REPLACE TRIGGER EXRATE_AFT_INS_ROW_ODS
AFTER INSERT ON EXCHANGE_RATE
REFERENCING NEW AS NEW
FOR EACH ROW
BEGIN
  DELETE FROM DM107EXCHANGERATESODS@DM53 TS
   WHERE TS.CNTR_CRNCY_ID = :NEW.CNTR_CRNCY_ID
     AND TS.RATE_PRPS = :NEW.RATE_PRPS
     AND TS.REF_CURNCY = :NEW.REF_CURNCY;

  INSERT INTO DM107EXCHANGERATESODS@DM53 (
    CNTR_CRNCY_ID, RATE_PRPS, PRC_DT, REF_CURNCY,
    AMT_FROM, BID_RATE, AMT_TO
  ) VALUES (
    :NEW.CNTR_CRNCY_ID, :NEW.RATE_PRPS, :NEW.PRC_DT, :NEW.REF_CURNCY,
    :NEW.AMT_FROM, :NEW.BID_RATE, :NEW.AMT_TO
  );

EXCEPTION
  WHEN OTHERS THEN
    -- SQLCODE and SQLERRM cannot be used directly inside a SQL
    -- statement, so copy them into local variables first.
    DECLARE
      v_code PLS_INTEGER    := SQLCODE;
      v_msg  VARCHAR2(4000) := SQLERRM;
    BEGIN
      INSERT INTO error_tab (err_code, err_msg) VALUES (v_code, v_msg);
    END;
END;
Here it is assumed you have a table named error_tab with two columns (here called err_code and err_msg) that can store the above values.

Regards
Anurag Tibrewal.

Regards
Anurag Tibrewal.
kendenny
Is PRC_DT the effective date? If so are you loading the records in PRC_DT order? If not you could load the most current rate and then overlay it with an older rate.
730392
The reason it's not applicable is that I'm not simply replicating, I am removing old records and only retaining the most recent record.
730392
Yeah, unfortunately the code which processes the morning file and does the inserts is a third-party app to which I don't have source access, and it is eminently possible that they are suppressing the error.

I'll give the code you provided a go, as soon as I finish sorting the Live system issues this has created, and can get on to a test system.

Thanks.
730392
PRC_DT is indeed the effective date, but it is supplied by the original data provider in the data feed, and is the date of the last working day. It has no time field.

Not sure what you mean by "load the most current rate and then overlay it with an older rate". There isn't always a current rate on the Bridge system, and the basis for retention is that we should keep in the Bridge system only the most recent record added to the Live system, for a given currency, counter-currency and purpose.
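If loading out of PRC_DT order ever becomes a concern, one possible variant (a sketch only, reusing the column names from the trigger quoted earlier in the thread) is to make the trigger's DELETE date-aware, so an older record can never displace a newer one:

```sql
-- Only delete rows that are not newer than the incoming record;
-- the matching INSERT should then be skipped when a newer row remains.
DELETE FROM DM107EXCHANGERATESODS@DM53 TS
 WHERE TS.CNTR_CRNCY_ID = :NEW.CNTR_CRNCY_ID
   AND TS.RATE_PRPS     = :NEW.RATE_PRPS
   AND TS.REF_CURNCY    = :NEW.REF_CURNCY
   AND TS.PRC_DT       <= :NEW.PRC_DT;
```

Since PRC_DT carries no time component, two records for the same day would still replace each other in insertion order.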
21205
I think it's a great candidate for replication... after loading the file, refresh...
something like
SQL> create table test
  2  as
  3  select rownum id
  4       , 'THIS' cd
  5       , sysdate - rownum dt
  6    from dual
  7   connect by level <= 5
  8  /

Table created.

SQL> 
SQL> alter table test
  2  add constraint test_pk primary key (id)
  3  /

Table altered.

SQL> 
SQL> create materialized view log
  2  on test
  3  /

Materialized view log created.

SQL> 
SQL> conn usr/pw@other_side
Connected.
SQL> 
SQL> create materialized view test_mv
  2  as
  3  select cd
  4       , max (dt)
  5    from test@db_link
  6   group by cd
  7  /

Materialized view created.
730392
Ah, now I see what you mean. We can't use a materialised view because the table in Bridge is the subject of a number of Transformation Server Pipes, which don't work properly with materialised views.

We have however solved the problem.

It turns out that there was a second, conditional AFTER INSERT trigger on the table, which only fired for the records from the morning feed, hence the hourly feed working OK.

The two triggers set up a deadlock between them: each was updating a remote table and required a commit or rollback from the other before it could proceed, resulting in a time-out and a trigger failure.

The solution was to merge the two triggers and apply the conditions to the insert statement from the original conditional trigger (since the new merged trigger is unconditional).

Hence adding:

INSERT INTO blah@remote (<column list>)
SELECT :new.<column>, ...
  FROM dual
 WHERE :new.<conditional column> = <value>;
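Put together, the merged trigger might look something like this (a sketch only; the remote table, conditional column, value and second column list are placeholders standing in for the ones in the original conditional trigger):

```sql
CREATE OR REPLACE TRIGGER exrate_aft_ins_row_merged
AFTER INSERT ON exchange_rate
FOR EACH ROW
BEGIN
  -- Unconditional part (from the first trigger): refresh the ODS copy.
  DELETE FROM DM107EXCHANGERATESODS@DM53 TS
   WHERE TS.CNTR_CRNCY_ID = :NEW.CNTR_CRNCY_ID
     AND TS.RATE_PRPS     = :NEW.RATE_PRPS
     AND TS.REF_CURNCY    = :NEW.REF_CURNCY;

  INSERT INTO DM107EXCHANGERATESODS@DM53 (
    CNTR_CRNCY_ID, RATE_PRPS, PRC_DT, REF_CURNCY,
    AMT_FROM, BID_RATE, AMT_TO
  ) VALUES (
    :NEW.CNTR_CRNCY_ID, :NEW.RATE_PRPS, :NEW.PRC_DT, :NEW.REF_CURNCY,
    :NEW.AMT_FROM, :NEW.BID_RATE, :NEW.AMT_TO
  );

  -- Conditional part (from the second trigger), expressed as a
  -- filtered INSERT ... SELECT so the merged trigger stays unconditional.
  INSERT INTO blah@remote (<column list>)
  SELECT :NEW.<column>, ...
    FROM dual
   WHERE :NEW.<conditional column> = <value>;
END;
/
```

Because both remote statements now run in the same trigger, in the same transaction, the cross-trigger wait that caused the time-out cannot arise.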


Thanks for all the suggestions and help guys. I was particularly impressed by how quickly they came, and they all helped get us to the solution.

Post Details

Locked on Oct 7 2008
Added on Sep 9 2008
8 comments
583 views