5 Replies Latest reply on Jul 18, 2018 12:13 PM by AndrewSayer

    Streaming data from Oracle (without Golden Gate)

    Michael Tsilikidis

      Hello all,

       

      I'd like to know if anyone out there has implemented (or tried to implement) a way of streaming data out of Oracle, without using the obvious solution, Golden Gate.

       

      I'm trying to export data from Oracle on a row-by-row basis, which also means I'm not interested in doing it via files.

      The data would be sent either to an external queue (like RabbitMQ) using Java, or to an API with UTL_HTTP, but I'm not sure these two options I've explored are optimal.

       

      Any suggestions and remarks would be appreciated.

       

      Thanks,

      Michael

        • 1. Re: Streaming data from Oracle (without Golden Gate)
          John_K

          Well, you could use a trigger on the table which then sends the data row by row; however, that is generally considered bad practice because it can add a lot of overhead to your system. Do you actually need this to be real-time?
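          To make that concrete, here's a minimal sketch of such a row-level trigger. The ORDERS table, its columns, and my_stream_pkg are all hypothetical names used for illustration:

```sql
-- Hypothetical ORDERS table and my_stream_pkg package, for illustration only.
-- Whatever the trigger calls runs inside the user's transaction, so any
-- latency here slows down every INSERT/UPDATE on the table.
CREATE OR REPLACE TRIGGER orders_stream_trg
AFTER INSERT OR UPDATE ON orders
FOR EACH ROW
BEGIN
  my_stream_pkg.publish_row(:NEW.order_id, :NEW.status);
END;
/
```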

          • 2. Re: Streaming data from Oracle (without Golden Gate)
            Michael Tsilikidis

            I have actually done so.

             

            I've used a trigger to "capture" data and do something else with it. To avoid the overhead on my system, I've decoupled what I want to do with the data by sending messages to an Advanced Queue, i.e. the trigger sends a message per row, which is pretty fast, to be honest.
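            For anyone reading along, the trigger-plus-AQ decoupling described above could look roughly like this. The queue (STREAM_Q), payload type (stream_msg_t), and ORDERS table are made-up names:

```sql
-- Assumes STREAM_Q is an AQ queue with payload type stream_msg_t,
-- and ORDERS is the source table; all names are hypothetical.
CREATE OR REPLACE TRIGGER orders_to_aq_trg
AFTER INSERT ON orders
FOR EACH ROW
DECLARE
  l_enq_opts  DBMS_AQ.ENQUEUE_OPTIONS_T;
  l_msg_props DBMS_AQ.MESSAGE_PROPERTIES_T;
  l_msg_id    RAW(16);
BEGIN
  -- The trigger only enqueues; consumers dequeue asynchronously,
  -- which keeps the heavy work out of the user's transaction.
  DBMS_AQ.ENQUEUE(
    queue_name         => 'STREAM_Q',
    enqueue_options    => l_enq_opts,
    message_properties => l_msg_props,
    payload            => stream_msg_t(:NEW.order_id, :NEW.status),
    msgid              => l_msg_id);
END;
/
```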

             

            My question is mainly about how to quickly export the data from the Advanced Queue. I have tried the approach of having an app (service) that lives outside Oracle and constantly polls the AQ, but I want to avoid that.

            Instead, I'd like Oracle to push the data out.

             

            I need this to be as close to real-time as it can be, without having to pay extra for a product.

            • 3. Re: Streaming data from Oracle (without Golden Gate)

              Instead, I'd like Oracle to push the data out.

              Oracle already 'pushes' the data out.

               

              That is what is captured in the REDO logs.

               

              You can use log miner functionality to process the info in those logs.
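              A minimal LogMiner session looks roughly like this. The archived log path and the ORDERS table are placeholders, and supplemental logging plus the appropriate privileges are prerequisites:

```sql
-- Log file path and table name are placeholders.
BEGIN
  DBMS_LOGMNR.ADD_LOGFILE(
    logfilename => '/u01/arch/log_1234.arc',
    options     => DBMS_LOGMNR.NEW);
  DBMS_LOGMNR.START_LOGMNR(
    options => DBMS_LOGMNR.DICT_FROM_ONLINE_CATALOG);
END;
/

-- SCN gives you the change order; SQL_REDO is a re-playable statement.
SELECT scn, timestamp, operation, sql_redo
  FROM v$logmnr_contents
 WHERE seg_name = 'ORDERS';

BEGIN
  DBMS_LOGMNR.END_LOGMNR;
END;
/
```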

              I've used a trigger to "capture" data and do something else with it. To avoid the overhead on my system, I've decoupled what I want to do with the data by sending messages to an Advanced Queue, i.e. the trigger sends a message per row, which is pretty fast, to be honest.

              Again - Oracle already has that functionality.

               

              It is called a materialized view log: changes to the base table are captured in the MV log table. Then you can do whatever you want with that info.
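              For example (ORDERS and its columns are hypothetical names):

```sql
-- Oracle maintains MLOG$_ORDERS automatically; every change to ORDERS
-- is recorded there until a materialized view (or you) consumes it.
CREATE MATERIALIZED VIEW LOG ON orders
  WITH PRIMARY KEY, SEQUENCE (status, amount)
  INCLUDING NEW VALUES;

-- The log can be queried like any other table:
SELECT * FROM mlog$_orders;
```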

              My question is

              And my question is: have you produced WRITTEN REQUIREMENTS that specify what needs to happen for all possibilities?

               

              If so, you haven't posted that info.

               

              A key missing piece of info is read-consistency. In order to apply data to sync a second system, you not only need to know which changes were made, but also the ORDER they were made in, and you need to be able to apply those changes in that order.

               

              Oracle's REDO logs, MV logs and Golden Gate do that.

               

              I need this to be as close to real-time it can be, without having to pay extra for a product.

              Ain't gonna happen - because you can NOT manually do what I just told you above that REDO, MV logs and GG can do.

               

              We can't help you with a workaround solution because you haven't told us the ENTIRE scope of the requirements.

              • 4. Re: Streaming data from Oracle (without Golden Gate)
                Michael Tsilikidis

                And my question is: have you produced WRITTEN REQUIREMENTS that specify what needs to happen for all possibilities?

                 

                If so, you haven't posted that info.

                If I had posted the WRITTEN REQUIREMENTS, my post would probably have been a TL;DR.

                 

                My question to you is: Have you read my initial POST?

                Copy-pasting that again: I'm trying to export data from Oracle on a row-by-row basis ...

                 

                Ain't gonna happen - because you can NOT manually do what I just told you above that REDO, MV logs and GG can do.

                 

                We can't help you with a workaround solution because you haven't told us the ENTIRE scope of the requirements.

                I have no intention of rewriting what Oracle already does at the low levels of its functionality, and I appreciate that I can't do that. Also, I stated it in the title (without Golden Gate).

                I'm interested in help focused on workarounds from an application development perspective, if possible, from anyone in here.

                 

                As I said, I'm not going to create a post with the full requirements; it would be long, and I don't see that as absolutely necessary to get some advice/help.

                 

                Thanks for your help though.

                 

                The current picture of my approach is, I have an Advanced Queue to which I'm adding data (that part works fast) and I'd like to have something that can push the data out in the fastest way possible.

                I have added a subscriber, and the two ways I can think of doing this from within a callback procedure are:

                - POST messages using the UTL_HTTP package to an API

                - Have Java send messages to RabbitMQ using the relevant library
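                A rough sketch of the first option: a notification callback registered on the queue, which dequeues the message and POSTs it with UTL_HTTP. The queue, subscriber, payload type, and URL are all placeholders; a real version needs error handling and a network ACL for the outbound call. Note the callback runs in a background process, so a slow endpoint delays the queue, not the original transaction:

```sql
-- All names (STREAM_Q, my_subscriber, stream_msg_t, the URL) are placeholders.
CREATE OR REPLACE PROCEDURE stream_q_callback(
  context  RAW,
  reginfo  SYS.AQ$_REG_INFO,
  descr    SYS.AQ$_DESCRIPTOR,
  payload  RAW,
  payloadl NUMBER)
AS
  l_deq_opts  DBMS_AQ.DEQUEUE_OPTIONS_T;
  l_msg_props DBMS_AQ.MESSAGE_PROPERTIES_T;
  l_msg       stream_msg_t;
  l_msg_id    RAW(16);
  l_req       UTL_HTTP.REQ;
  l_resp      UTL_HTTP.RESP;
  l_body      VARCHAR2(4000);
BEGIN
  -- Dequeue exactly the message this notification is for.
  l_deq_opts.msgid         := descr.msg_id;
  l_deq_opts.consumer_name := descr.consumer_name;
  DBMS_AQ.DEQUEUE(
    queue_name         => descr.queue_name,
    dequeue_options    => l_deq_opts,
    message_properties => l_msg_props,
    payload            => l_msg,
    msgid              => l_msg_id);

  -- POST the payload to a (placeholder) HTTP endpoint.
  l_body := '{"id":' || l_msg.order_id ||
            ',"status":"' || l_msg.status || '"}';
  l_req := UTL_HTTP.BEGIN_REQUEST('http://example.com/api/events', 'POST');
  UTL_HTTP.SET_HEADER(l_req, 'Content-Type', 'application/json');
  UTL_HTTP.SET_HEADER(l_req, 'Content-Length', LENGTH(l_body));
  UTL_HTTP.WRITE_TEXT(l_req, l_body);
  l_resp := UTL_HTTP.GET_RESPONSE(l_req);
  UTL_HTTP.END_RESPONSE(l_resp);
  COMMIT;
END;
/

-- Register the callback against an existing subscriber:
BEGIN
  DBMS_AQ.REGISTER(
    SYS.AQ$_REG_INFO_LIST(
      SYS.AQ$_REG_INFO('stream_q:my_subscriber',
                       DBMS_AQ.NAMESPACE_AQ,
                       'plsql://stream_q_callback',
                       HEXTORAW('FF'))),
    1);
END;
/
```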

                 

                I don't know if what I'm doing is good performance-wise.

                 

                Any extra help would be appreciated.

                • 5. Re: Streaming data from Oracle (without Golden Gate)
                  AndrewSayer

                  Michael Tsilikidis wrote:

                   

                  I have actually done so

                   

                  I've used a trigger to "capture" data and do something else with it. To avoid the overhead on my system, I've decoupled what I want to do with the data by sending messages to an Advanced Queue, i.e. the trigger sends a message per row, which is pretty fast, to be honest.

                   

                  My question is mainly about how to quickly export the data from the Advanced Queue. I have tried the approach of having an app (service) that lives outside Oracle and constantly polls the AQ, but I want to avoid that.

                  Instead, I'd like Oracle to push the data out.

                   

                  I need this to be as close to real-time as it can be, without having to pay extra for a product.

                  I can't give you specific code because I've not done this before but...

                   

                  it should be very possible to hook whatever you're using up to the Advanced Queue. That is sort of how Streams propagation used to work (before Oracle bought GG).
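                  As a sketch, native AQ propagation can be scheduled so Oracle itself pushes messages from a local queue to a queue in another database over a database link (the queue name and link name here are placeholders):

```sql
-- Oracle pushes messages from STREAM_Q to its counterpart at the
-- destination reachable via the database link; names are placeholders.
BEGIN
  DBMS_AQADM.SCHEDULE_PROPAGATION(
    queue_name  => 'STREAM_Q',
    destination => 'remote_db_link',
    latency     => 0);  -- propagate as soon as messages arrive
END;
/
```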

                   

                  Connor McDonald discussed a metric ton of ways to handle replication in an AskTom Office Hours session; you should be able to find it on the AskTom site, and it's well worth going over. Unfortunately I can't check which session it was because my work firewall blocks the videos, but start here: https://asktom.oracle.com/pls/apex/f?p=100:500:::NO:RP,500::

                  1 person found this helpful