Autonomous Transaction Triggers

rvstucke Member Posts: 11
edited May 27, 2016 2:46PM in Database Ideas - Ideas

There are many times I have had to create an insert or update trigger with a cursor that queries the same table I am inserting into or updating.  I need it to satisfy a complex constraint, to ensure data integrity.  The problem is, in certain situations I get a "mutating table" error (ORA-04091) telling me the result of the cursor could change based on what I'm inserting or updating in my table.  I end up having to create a complex set of tables which are then used to crosscheck, through foreign key constraints, what I am changing.  This works because I'm not changing data in those tables until AFTER I change the table I'm in the midst of changing, through a trigger which fires after I write.  PL/SQL has a pragma, AUTONOMOUS_TRANSACTION, which executes DML operations and a commit in an isolated transaction, without committing my current session's DML activity.
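A minimal sketch of the kind of row-level trigger that hits this error, using a hypothetical emp(empno, deptno, sal) table and a made-up business rule (no employee may earn more than twice the department average):

```sql
CREATE TABLE emp (
  empno   NUMBER PRIMARY KEY,
  deptno  NUMBER,
  sal     NUMBER
);

CREATE OR REPLACE TRIGGER emp_sal_check
  BEFORE INSERT OR UPDATE OF sal ON emp
  FOR EACH ROW
DECLARE
  l_avg emp.sal%TYPE;
BEGIN
  -- Querying the very table the trigger is firing on...
  SELECT AVG(sal) INTO l_avg FROM emp WHERE deptno = :NEW.deptno;
  IF :NEW.sal > 2 * l_avg THEN
    RAISE_APPLICATION_ERROR(-20001, 'Salary exceeds 2x department average');
  END IF;
END;
/

-- A multi-row UPDATE then fails with:
-- ORA-04091: table EMP is mutating, trigger/function may not see it
UPDATE emp SET sal = sal * 1.1 WHERE deptno = 10;
```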

My solution would be to extend that pragma to triggers.  It would isolate the DML currently being executed in my session from the cursors being executed inside the trigger, by running them as though in a different session which, until my commit, doesn't even see my changes.
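For context, a trigger body can already declare PRAGMA AUTONOMOUS_TRANSACTION today, but that only detaches the trigger's own DML and commit; any query it runs sees only committed data, never the in-flight rows that fired it, which is the usual objection to using it for integrity checks.  A hedged sketch (the emp and emp_audit tables are hypothetical):

```sql
CREATE OR REPLACE TRIGGER emp_audit_trg
  AFTER UPDATE OF sal ON emp
  FOR EACH ROW
DECLARE
  PRAGMA AUTONOMOUS_TRANSACTION;  -- runs in its own independent transaction
BEGIN
  -- This transaction does NOT see the uncommitted UPDATE that fired the
  -- trigger, so any integrity check placed here would validate stale data.
  INSERT INTO emp_audit (empno, old_sal, new_sal, changed_at)
  VALUES (:OLD.empno, :OLD.sal, :NEW.sal, SYSTIMESTAMP);
  COMMIT;  -- commits only the audit row, not the caller's transaction
END;
/
```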

4 votes


Comments

  • Mike Kutz Member Posts: 6,251 Gold Crown

    umm... no.

    You should use an ASSERTION instead.

    MK
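For reference, MK is alluding to the SQL-standard CREATE ASSERTION statement, a cross-row declarative constraint that Oracle Database does not yet implement.  Against a hypothetical emp(empno, deptno, sal) table, the standard's syntax would look roughly like:

```sql
-- SQL-standard syntax; NOT available in Oracle Database today.
CREATE ASSERTION emp_sal_cap CHECK (
  NOT EXISTS (
    SELECT 1
    FROM emp e
    WHERE e.sal > 2 * (SELECT AVG(d.sal) FROM emp d WHERE d.deptno = e.deptno)
  )
);
```

The database, not trigger code, would then be responsible for re-checking the condition whenever affected rows change.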

  • Hi,

    Have you considered using a view with an INSTEAD OF trigger to implement your business rules?  Your application issues DML against the view, and the INSTEAD OF trigger can implement any complex logic without running into mutating table problems, then issues the DML against the original table.

    Hope this helps.

    Kind Regards,

    Yalim
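Yalim's suggestion can be sketched as follows, assuming a hypothetical emp(empno, deptno, sal) table and a made-up rule that no salary may exceed twice the department average:

```sql
-- Expose a view over the base table; the application writes to the view.
CREATE OR REPLACE VIEW emp_v AS SELECT empno, deptno, sal FROM emp;

CREATE OR REPLACE TRIGGER emp_v_ioi
  INSTEAD OF INSERT ON emp_v
  FOR EACH ROW
DECLARE
  l_avg emp.sal%TYPE;
BEGIN
  -- The base table is not mutating here, so it can be queried freely.
  SELECT AVG(sal) INTO l_avg FROM emp WHERE deptno = :NEW.deptno;
  IF :NEW.sal > 2 * NVL(l_avg, :NEW.sal) THEN
    RAISE_APPLICATION_ERROR(-20001, 'Salary exceeds 2x department average');
  END IF;
  INSERT INTO emp (empno, deptno, sal)
  VALUES (:NEW.empno, :NEW.deptno, :NEW.sal);
END;
/
```

Note this only guards DML that goes through the view; direct writes to the base table bypass the check.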

  • Sven W. Member Posts: 10,559 Gold Crown
    edited May 30, 2016 7:30AM

    The standard solution to the mutating table problem is to put the relevant code into an AFTER STATEMENT trigger.

    I don't see why anyone would create a complex set of tables instead.

    Also the idea of using an autonomous transaction instead looks very much like a misunderstanding of what a constraint does.

    Instead of enforcing data integrity it would probably do the exact opposite.

    The main problem is the misleading error message when the mutating table error occurs.

    Myriads of novice developers fall into that trap and think they can solve their logical issue using autonomous transactions, which is wrong.
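The AFTER STATEMENT approach Sven describes is typically written today as a compound trigger: remember the affected keys in the row-level section, then validate once the statement finishes, when the table is no longer mutating.  A sketch against a hypothetical emp(empno, deptno, sal) table with a made-up salary-cap rule:

```sql
CREATE OR REPLACE TRIGGER emp_sal_check
  FOR INSERT OR UPDATE OF sal ON emp
  COMPOUND TRIGGER

  TYPE dept_tab_t IS TABLE OF BOOLEAN INDEX BY PLS_INTEGER;
  g_depts dept_tab_t;  -- departments touched by this statement

  AFTER EACH ROW IS
  BEGIN
    g_depts(:NEW.deptno) := TRUE;
  END AFTER EACH ROW;

  AFTER STATEMENT IS
    l_bad  PLS_INTEGER;
    l_dept PLS_INTEGER := g_depts.FIRST;
  BEGIN
    WHILE l_dept IS NOT NULL LOOP
      -- The statement has completed, so emp is no longer mutating.
      SELECT COUNT(*) INTO l_bad
      FROM emp e
      WHERE e.deptno = l_dept
        AND e.sal > 2 * (SELECT AVG(d.sal) FROM emp d WHERE d.deptno = l_dept);
      IF l_bad > 0 THEN
        RAISE_APPLICATION_ERROR(-20001,
          'Salary cap violated in dept ' || l_dept);
      END IF;
      l_dept := g_depts.NEXT(l_dept);
    END LOOP;
    g_depts.DELETE;
  END AFTER STATEMENT;

END emp_sal_check;
/
```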

  • rvstucke Member Posts: 11

    I disagree.

    The problem is a simple one: don't consider what I am doing to the table data when checking whether the data qualifies.  An AFTER statement trigger cannot work on a per-row basis if I am updating multiple rows in a table, so I still have to keep track separately of which rows I've changed in order to validate them.  The autonomous transaction, or whatever you want to call it, would isolate the actual DML action from any queries executed inside the trigger.  I wouldn't need all the additional "stuff" just to satisfy a business rule.

  • Mike Kutz Member Posts: 6,251 Gold Crown

    I disagree.

    The problem is a simple one: don't consider what I am doing to the table data when checking whether the data qualifies.  An AFTER statement trigger cannot work on a per-row basis if I am updating multiple rows in a table, so I still have to keep track separately of which rows I've changed in order to validate them.  The autonomous transaction, or whatever you want to call it, would isolate the actual DML action from any queries executed inside the trigger.  I wouldn't need all the additional "stuff" just to satisfy a business rule.

    yeah.. you really need to use the (not yet existing) ASSERTIONs to do what you want.

    In the meantime:

    Create a TAPI (Table API) package that can handle BULK processing.

    Modify your ETL process to use a GTT (global temporary table) to temporarily store the values so you can bulk process all the rows.

    My $0.02

    MK
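Mike's TAPI-plus-GTT idea might look like the following sketch (all names hypothetical): callers stage rows in a global temporary table, and the package validates the whole batch before applying it to a hypothetical emp(empno, deptno, sal) base table.

```sql
-- Staging area, private to each session; emptied on commit.
CREATE GLOBAL TEMPORARY TABLE emp_stage (
  empno  NUMBER,
  deptno NUMBER,
  sal    NUMBER
) ON COMMIT DELETE ROWS;

CREATE OR REPLACE PACKAGE emp_tapi AS
  PROCEDURE apply_stage;  -- validate emp_stage, then flush it into emp
END emp_tapi;
/

CREATE OR REPLACE PACKAGE BODY emp_tapi AS
  PROCEDURE apply_stage IS
    l_bad PLS_INTEGER;
  BEGIN
    -- Validate the whole batch against the business rule before touching emp.
    SELECT COUNT(*) INTO l_bad
    FROM emp_stage s
    WHERE s.sal > 2 * (SELECT AVG(e.sal) FROM emp e WHERE e.deptno = s.deptno);
    IF l_bad > 0 THEN
      RAISE_APPLICATION_ERROR(-20001,
        l_bad || ' staged row(s) violate the salary cap');
    END IF;
    -- Bulk-apply the validated rows; no row-level trigger needed.
    INSERT INTO emp (empno, deptno, sal)
    SELECT empno, deptno, sal FROM emp_stage;
  END apply_stage;
END emp_tapi;
/
```

The design choice here is that validation happens once per batch, set-based, instead of once per row inside a trigger.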