This discussion is archived
8 Replies Latest reply: Nov 13, 2012 12:58 AM by Tony Garabedian

DBMS_COMPRESSION tables and schema level Streams

Tony Garabedian Pro
Hello,

I am configuring schema-level Streams replication between two 11gR2 databases. I need to exclude the two compression tables DBMS_TABCOMP_TEMP_UNCMP and DBMS_TABCOMP_TEMP_CMP.
Should I create a rule for each schema to exclude these tables?

Thanks in Advance.

Regards,
Tony
  • 1. Re: DBMS_COMPRESSION tables and schema level Streams
    Mathias Zarick Newbie
    Hi Tony,
    You will need both a positive rule set (schema-level rules) and a negative rule set (exclusion of tables not to capture) for the capture process. Set the inclusion_rule parameter of DBMS_STREAMS_ADM.ADD_TABLE_RULES to FALSE to add rules to a negative rule set.
    See http://docs.oracle.com/cd/E11882_01/appdev.112/e25788/d_streams_adm.htm
    and http://docs.oracle.com/cd/E11882_01/server.112/e17069/strms_rules.htm#i1010994 for details.
    HTH Mathias
  • 2. Re: DBMS_COMPRESSION tables and schema level Streams
    Tony Garabedian Pro
    Hello Mathias, thanks for your reply.
    I will read the docs you pointed to right away.

    I wasn't very clear in my question, I will blame it on the lack of coffee.

    I actually have the positive rule sets in place and functioning smoothly. The problem I faced is when the database started the automated maintenance tasks, specifically the Segment Advisor.

    According to Metalink Doc ID 1082323.1, it appears that the "Automated Maintenance Window" Segment Advisor job calls DBMS_COMPRESSION, which creates two tables called DBMS_TABCOMP_TEMP_UNCMP and DBMS_TABCOMP_TEMP_CMP in the user's schema.
    Since these tables are created with the NOLOGGING option, the apply processes aborted.

    I have several schemas configured with positive rule sets. My question is: should I create the negative rule set for each schema in its own right, or will a negative rule for a table handle that table regardless of which schema it is DDLed or DMLed in?

    Thanks in Advance.

    Tony
  • 3. Re: DBMS_COMPRESSION tables and schema level Streams
    934532 Newbie
    Hi Tony,

    You can create negative rules for the two tables only, to exclude them from replication, specifying the table name as OWNER.table_name. There is no need to create a negative rule for the whole schema.


    BEGIN
      DBMS_STREAMS_ADM.ADD_TABLE_RULES(
        table_name      => 'hr.unwantedtable',
        streams_type    => 'capture',
        streams_name    => 'strmadmin_capture',
        queue_name      => 'strmadmin.streams_queue',
        include_dml     => TRUE,
        include_ddl     => TRUE,
        source_database => 'dbs1.net',
        inclusion_rule  => FALSE  -- FALSE adds the rule to the negative rule set
      );
    END;
    /



    How To Exclude A Table From Schema Capture And Replication When Using Schema Level Streams Replication [ID 239623.1]

    Thanks.

    Regards,

    Tarun
  • 4. Re: DBMS_COMPRESSION tables and schema level Streams
    Tony Garabedian Pro
    Thanks Tarun for your reply.

    I think that when the Segment Advisor is executed for each schema in the database by the automated maintenance task, the temp tables are created and dropped under each schema being analyzed by the advisor, so each schema would have an OWNER.DBMS_TABCOMP_TEMP_UNCMP and OWNER.DBMS_TABCOMP_TEMP_CMP created and dropped, right?

    If that's correct, and in the rule I have to specify the owner, then I would need a negative rule in each schema, right?
    What do you think, guys?

    I will test it in the test environment and let you guys know the results.

    Thanks


    Tony
  • 5. Re: DBMS_COMPRESSION tables and schema level Streams
    user8886876 Newbie
    If each schema is part of Streams replication, then we have to set a negative rule for each schema.
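
    A sketch of that, reusing the capture/queue names from the earlier example; the schema list here (HR, SCOTT) is only a placeholder for your replicated schemas:

    ```sql
    BEGIN
      -- Loop over each replicated schema and add both Segment Advisor
      -- temp tables to the capture process's negative rule set.
      FOR s IN (SELECT column_value AS owner
                  FROM TABLE(sys.odcivarchar2list('HR', 'SCOTT')))
      LOOP
        FOR t IN (SELECT column_value AS tab
                    FROM TABLE(sys.odcivarchar2list(
                           'DBMS_TABCOMP_TEMP_UNCMP',
                           'DBMS_TABCOMP_TEMP_CMP')))
        LOOP
          DBMS_STREAMS_ADM.ADD_TABLE_RULES(
            table_name      => s.owner || '.' || t.tab,
            streams_type    => 'capture',
            streams_name    => 'strmadmin_capture',
            queue_name      => 'strmadmin.streams_queue',
            include_dml     => TRUE,
            include_ddl     => TRUE,
            source_database => 'dbs1.net',
            inclusion_rule  => FALSE);  -- FALSE = negative rule set
        END LOOP;
      END LOOP;
    END;
    /
    ```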

    Thanks

    Tarun
  • 6. Re: DBMS_COMPRESSION tables and schema level Streams
    Tony Garabedian Pro
    sorry for the late reply guys,

    I initially disabled the automated Segment Advisor to avoid the error temporarily. It worked fine, but as I continued testing I hit an unpublished bug: when a distributed transaction fails, the capture process aborts and you'll get a LogMiner fatal error followed by an ORA-600 [krvuatla20] error.

    Metalink recommends installing 11.2.0.3 or upgrading to this version, so I'm in the middle of that now. I will post the final results as soon as I finish.

    Thanks for your help, guys.

    Regards,
    Tony
  • 7. Re: DBMS_COMPRESSION tables and schema level Streams
    Mathias Zarick Newbie
    Hi Tony,
    I have now experienced similar problems. My solution was simple: as I do not need all the advisors running, I disabled them.
    Cheers Mathias
  • 8. Re: DBMS_COMPRESSION tables and schema level Streams
    Tony Garabedian Pro
    Hello Mathias, sorry for the late reply.

    I implemented the same solution: I simply disabled the Segment Advisor from EM, and Streams is running smoothly.
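
    For reference, the same thing can be done from SQL*Plus instead of EM; a minimal sketch using the 11g DBMS_AUTO_TASK_ADMIN package:

    ```sql
    BEGIN
      -- Disable only the Segment Advisor maintenance task, leaving the
      -- optimizer statistics and SQL tuning advisor tasks enabled.
      DBMS_AUTO_TASK_ADMIN.DISABLE(
        client_name => 'auto space advisor',
        operation   => NULL,
        window_name => NULL);  -- NULL = all maintenance windows
    END;
    /

    -- Verify the task status afterwards:
    SELECT client_name, status FROM dba_autotask_client;
    ```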


    Cheers.
