9 Replies. Latest reply: Apr 4, 2012 6:31 AM by 920680

    Simultaneous package execution

    920680
      Hi All,

      I am just starting with ODI, but I have experience in data warehouse design, so bear with me.
      I wonder if the following is going to work:
      Creating a package with interfaces that will be executed using context substitution.
      Will it work if I set the agent to run the same package SIMULTANEOUSLY using different contexts?
      The package would then use different sources, but the target would be one common table for all the executing processes.
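
      What I have in mind, roughly, is starting the scenario compiled from that package twice with different contexts (the names below are made up, and the exact startscen syntax depends on the ODI version):

          startscen LOAD_DWH_PKG 001 CTX_SOURCE_A
          startscen LOAD_DWH_PKG 001 CTX_SOURCE_B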

      We have a fairly complex project, and I am trying to make future supervision of the ETL, and changes to it when necessary, easier for myself.

      Any remarks are welcome :-)
        • 1. Re: Simultaneous package execution
          721646
          Hi Turtle,

          The only issue is the "if SIMULTANEOUSLY I set the agent to run the same package using different contexts" part: it will error out while creating the intermediary tables (C$, I$).
          If you could modify the KM to use the context as the basis for naming the intermediary tables, then I think you should be able to run the same package for multiple contexts. The exact substitution method to add in the KM, so that the context is used as the basis for creating the temp/intermediary tables, would need looking up...
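
          Something along these lines in the step that creates the C$ work table might be a starting point (untested sketch, and whether odiRef.getContext("CTX_CODE") is the right substitution method would itself need checking against the API reference):

              -- untested sketch: suffix the C$ table with the context code so that runs in
              -- different contexts don't collide; the same suffix would have to be applied
              -- in every other step that creates, loads or drops this table
              create table <%=odiRef.getTable("L", "COLL_NAME", "W")%>_<%=odiRef.getContext("CTX_CODE")%>
              (
                  <%=odiRef.getColList("", "[CX_COL_NAME]\t[DEST_WRI_DT] NULL", ",\n\t", "", "")%>
              )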


          Regards,
          kk
          • 2. Re: Simultaneous package execution
            920680
            Hi,

            Thanks for your thoughts. I see that ODI uses its "internal" staging, error and exception tables with the naming convention x$xxx.
            Is there any way to skip this part? I'd like to insert from the source table into the target table without any middleman or extra complication.
            • 3. Re: Simultaneous package execution
              PeakIndicators_Alastair
              You can hack up the knowledge modules if you want to skip the staging tables. I did it many moons ago: basically, create a dummy LKM that does nothing, rip the API substitution code out of the existing LKM that provides the 'Load Data' functionality, and use that API code in the IKM, either in the 'insert flow into I$ table' step (if you want to keep the I$ table) or straight in the 'insert new rows' step to skip all the intermediate tables.

              That's a rough guide; like I said, it was ages ago and I don't have it to hand. You need to be careful with the API code: the source data for an IKM is by default the C$ table, and the target table for an LKM is by default the C$ table. Get my drift? You need to take the source from the LKM step and use it as the source for the IKM step (or vice versa).
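
              To give you a flavour, once you point the IKM at the source instead of the C$ table, the 'insert new rows' type step ends up looking something like this (a rough sketch from memory, not lifted from a working KM - the exact getColList selectors vary between KMs):

                  insert into <%=odiRef.getTable("L", "TARG_NAME", "A")%>
                  (
                      <%=odiRef.getColList("", "[COL_NAME]", ",\n\t", "", "INS")%>
                  )
                  select  <%=odiRef.getColList("", "[EXPRESSION]", ",\n\t", "", "INS")%>
                  from    <%=odiRef.getFrom()%>
                  where   (1=1)
                  <%=odiRef.getJoin()%>
                  <%=odiRef.getFilter()%>
                  <%=odiRef.getGrpBy()%>
                  <%=odiRef.getHaving()%>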

              Can I ask why you don't want the intermediate objects?
              • 4. Re: Simultaneous package execution
                920680
                Thanks for that tip :-). As I am not very familiar with ODI I wouldn't start by doing that myself.
                I'll keep it in mind for the future.

                Why do I want to skip that ODI "internal staging"? Let me explain with an example:
                my source dimension (goes to) -> my staging dimension table -> my target dimension table
                Reason 1: I have full control over "my staging dimension table", while the ODI "internal staging" is ODI dependent.
                Reason 2: if I use ODI as it is, I load the same data twice (into my staging and into the ODI internal staging), so there is a performance argument.

                but perhaps my thinking is skewed by working with OWB.
                  • 5. Re: Simultaneous package execution
                  PeakIndicators_Alastair
                  Turtle wrote:
                  Reason 1: I have full control over "my staging dimension table", while the ODI "internal staging" is ODI dependent.
                  Reason 2: if I use ODI as it is, I load the same data twice (into my staging and into the ODI internal staging), so there is a performance argument.

                  but perhaps my thinking is skewed by working with OWB.
                    Haha, I made the jump before 11gR1 and I've not looked back. In all honesty you could set up ODI so that the staging tables sit in a separate 'staging' schema (use the 'work' schema on your data server to do this), and you could go as far as dropping the C$ prefix, leaving you with an object name identical to the one in the target schema. (If it won't let you have a null 'Work tables prefix' in the physical schema, you could get around it with some Java find/replace tags in the Knowledge Module API.) But I do know what you mean: in OWB you hand-cranked everything, while ODI pushes a certain methodology onto you that's difficult to work with at first. My reason for skipping the C$ table was that we were using ODI to pull data from an AS/400 and load a 'landing' table in an Oracle DWH, using...... OWB for the ETL stuff! Since it was truncate/reload we didn't need the C$ stuff either, as it's a double load when there is no transformation going on.
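
                    For the find/replace idea, I mean something roughly like this wherever the KM builds the work table name (untested, but the tag is just Java and getTable() hands back a plain string, so an ordinary String replace should work on it):

                        <%=odiRef.getTable("L", "COLL_NAME", "W").replace("C$_", "")%>

                    i.e. let the API build the C$ name as normal and then strip the prefix off, rather than fighting the 'Work tables prefix' setting.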

                    For really complex mappings I'd take OWB all day long, but with ODI 11g at least you can use temporary interfaces to get the 'derived select' functionality that OWB gives you for every single joiner/splitter etc.
                    • 6. Re: Simultaneous package execution
                    920680
                    Big thanks for all your posts
                      • 7. Re: Simultaneous package execution
                      920680
                        A simultaneous run of the same interface will cause the ODI staging tables to error out.
                        • 8. Re: Simultaneous package execution
                        PeakIndicators_Alastair
                          Hi - I remembered this post earlier today. Check out this thread: http://odiexperts.com/interface-parallel-execution-a-new-solution

                          It uses the session number to make each temp table name dynamic, which should resolve your conflicts.
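
                          The gist is that every step that touches the C$/I$ tables gets the session number appended, along the lines of (sketch only - the article has the complete KM changes):

                              -- each parallel run gets its own copy of the work table, keyed by session number
                              create table <%=odiRef.getTable("L", "COLL_NAME", "W")%>_<%=odiRef.getSession("SESS_NO")%>
                              (
                                  <%=odiRef.getColList("", "[CX_COL_NAME]\t[DEST_WRI_DT] NULL", ",\n\t", "", "")%>
                              )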
                        Regards
                        Alastair
                          • 9. Re: Simultaneous package execution
                          920680
                            Simple and awesome :-) thanks!