Using Oracle TimesTen on Exalytics as an Operational Data Store

Version 3


     

    By Kiran Tailor (kt1)

    The Oracle Exalytics In-Memory Machine features an optimized Oracle BI Foundation Suite that includes Oracle Business Intelligence Enterprise Edition and can be installed with either the Oracle TimesTen In-Memory Database (IMDB) or an Oracle Database 12c with the In-Memory option.  Exalytics is an engineered system, and most engineered systems ship with large configurations and huge amounts of memory.  Most Exalytics users who use TimesTen will only store aggregates, but it is also possible to store selected tables (hot data), or even the entire data warehouse, in TimesTen.

     

    In this article we will look at how this can be done using the TimesTen ttImportFromOracle utility and the Oracle DAC.

     

    Before We Begin

     

    This article assumes you have a good understanding of:

    1. Oracle Exalytics and TimesTen; for more information please refer to http://www.oracle.com/technetwork/documentation/exalytics-doc-1645656.html
    2. Oracle DAC; for more information please refer to http://docs.oracle.com/cd/E35287_01/fusionapps.7964/e14849/toc.htm

    Using the TimesTen ttImportFromOracle Utility

     

    In the steps below we will use the ttImportFromOracle utility to load a table into TimesTen.  The document http://download.oracle.com/otn_hosted_doc/timesten/1122/quickstart/html/admin/howto_util_ttimportfromoracle.html is a good read to familiarize yourself with this utility.  It is not pre-installed by default, so you will need to download and install it from http://www.oracle.com/technetwork/indexes/samplecode/utilities-2180928.html.

     

    1) Open a PuTTY/terminal session to the server where TimesTen is installed. In these steps we will assume $DTT is /u01/app/oracle/product/TimesTen/tt1122

     

    2) Make sure your LD_LIBRARY_PATH variable includes the TimesTen instant client folder:
    $ export LD_LIBRARY_PATH=$LD_LIBRARY_PATH:$DTT/ttoracle_home/instantclient_11_2

     

    3) Create a directory ‘impscripts’ to store your files that will be generated by ttImportFromOracle

     

    4) cd impscripts

     

    5) Below we make a connection to the Oracle database using the -oraConn switch; the -tables switch lists the tables you want to import (in this example, the table sam.wc_king_f); the -typeMap switch controls how Oracle datatypes are mapped to TimesTen datatypes; and the prefix tm1 is used to name the generated files.

     

    $  ./ttImportFromOracle -oraConn read/only@21.231.123.111:1521/samlive -tables sam.wc_king_f -typeMap 1 -prefix tm1

     

    Beginning processing
    Resolving any tablename wildcards
    Eliminating any duplicate tables
    Getting metadata from source
    Generating database user list
    Assigning TimesTen datatypes
    Analyzing source tables
    Analyzing table 'SAM.WC_KING_F' ...
    Estimating table sizes
    Evaluating parallel data load
    Generating output files
    Finished processing

    6) Let's have a look at what we have created:

    $ls
    tm1CreateIndexes.sql  
    tm1CreateTables.sql  
    tm1CreateUsers.sql  
    tm1DropIndexes.sql  
    tm1DropTables.sql  
    tm1LoadData.sql  
    tm1TableList.txt  
    tm1ttPDL.sh  
    tm1ttSizing.sh  
    tm1UpdateStats.sql

    Note: The generated files reference the Oracle database schema owner, so replace it with the TimesTen database owner.
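    For example (the owner names SAM and SUPERTEST are taken from this article's examples, and the sed pattern is a rough sketch, so review the result before running the scripts), the substitution across the generated files might look like:

```shell
# Demo input: one line as ttImportFromOracle might generate it, owned by SAM.
echo 'CREATE TABLE SAM.WC_KING_F (ROW_WID NUMBER(10) NOT NULL);' > tm1CreateTables.sql

# Replace the Oracle schema owner (SAM) with the TimesTen owner (SUPERTEST)
# in every generated tm1 script.
for f in tm1*.sql; do
  sed -i 's/SAM\./SUPERTEST./g' "$f"
done

cat tm1CreateTables.sql
```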

     

    7) Using ttIsql, we can run the generated scripts in the following way to load SAM.WC_KING_F.

     

    TimestenCodesnip1.jpg
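    The ttIsql session shown above can also be sketched as a small driver script that runs the generated tm1 files in dependency order; the DSN name tt_aggr_store and the commented ttIsql invocation are assumptions, so substitute your own:

```shell
# Build a driver script that chains the generated tm1 files in order.
cat > tm1RunAll.sql <<'EOF'
run tm1CreateUsers.sql;
run tm1CreateTables.sql;
run tm1LoadData.sql;
run tm1CreateIndexes.sql;
run tm1UpdateStats.sql;
EOF

# Execute it against the TimesTen data store (DSN name is an assumption):
# $DTT/bin/ttIsql -f tm1RunAll.sql "DSN=tt_aggr_store"
cat tm1RunAll.sql
```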

     

    Summary: ttImportFromOracle

     

    This approach is great for full loads, as it can be scripted; but if we wish to carry out incremental loads, we would then have to use a tool such as the Oracle DAC or ODI.

     

    Using the Oracle DAC

     

    The Data Warehouse Administration Console (DAC) is used to load the Oracle Business Analytics Warehouse: it executes Extract, Transform and Load (ETL) routines from your source database to a target database such as a warehouse.  As well as loading the warehouse, the DAC can also be used to copy data to another target.  In this article we will show how you could use this data copy feature to extend your existing DAC setup to copy data to another target.  The target in this case is the TimesTen database, which resides on the Exalytics machine, where a schema called ‘supertest’ exists.  TimesTen is a full-featured, memory-optimized, in-memory relational database (IMDB).

    In this setup we will use a common DAC/BI configuration found across many organizations: DAC 11.1.1.6.4, currently set up to run normal execution loads from an Oracle database (OLTP) to a warehouse (OLAP), with the In Memory property available for tables.  The setup looks like Fig 1.1.  You could achieve the same results using load plans and ODI.

     

     

    Fig1.1OTN.jpg

     

    What we will do is:

    1. Configure another target, TimesTen (IMDB), residing on an Exalytics machine that has already been configured by the Oracle engineered systems team, and execute a job to copy a table from the OLAP database to the IMDB. For your own testing you could install a TimesTen database on a machine of your choice.
    2. Configure the same job to copy incremental changes on the OLAP table to the IMDB.
    3. Use DAC version 11.1.1.6.4 or higher, which includes an In Memory property for tables.

     

     

    Environment Preparation

     

    If your environment has the DAC client and DAC server on different machines, then this needs to be done on both machines.
      1) Install the TimesTen 64-bit client on your DAC machine
      2) Copy the TimesTen library .jar files to the DAC environment


        a. Navigate to $CLIENT_TT/lib
        b. Select all .jar files
        c. Copy them to the local $DAC/lib folder
        d. Navigate to the $CLIENT_TT/3rdparty/jms1.1/lib folder
        e. Select all .jar files
        f. Copy them to the local $DAC/lib folder
        g. Create an ODBC connection to the TimesTen server; name it something such as DACTT.
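    On Linux, the client DSN from step (g) typically lives in an ODBC ini file. A minimal client DSN entry might look like the following sketch; the host name, server DSN and the file location are assumptions for illustration (we write to a local file here rather than the real one):

```shell
# Append a hypothetical TimesTen client DSN named DACTT.
# TTC_SERVER (host) and TTC_SERVER_DSN (server-side DSN) are placeholders.
cat >> ./odbc.ini <<'EOF'
[DACTT]
TTC_SERVER=exalytics-host
TTC_SERVER_DSN=tt_aggr_store
EOF

grep -A2 '\[DACTT\]' ./odbc.ini
```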

    Configure the DAC

    1) Start your DAC client
    2) Click Setup → Physical Data Sources → New and fill in the details as per Fig 2a, substituting the table owner and table owner password relevant to your environment.

     

     

    Fig 2a.jpg

     

    Click Test Connection to verify that the DAC can connect to the TimesTen data source, then click Save.

    3) Create the logical connection ‘DACTT’ for the TimesTen database; Tools → Seed Data → Logical Data Sources → New (see Fig 3a)

     

    Fig 3a.jpg

     

    4) Create the task phase ‘DataCopy to TimesTen’ for the datacopy process; Tools → Seed Data → Task Phases → New (see Fig 4a)

    Fig 4a.jpg

     

    5) Create an external executor called ‘TT’ (as per Fig 5a); click Setup → External Executors → New

    Fig 5a.jpg

     

    6) We now select the tables we want copied to TimesTen. Go to the tables screen, Design → Tables, and use the query function to find the tables to copy to our IMDB.  For this example we have selected the table ‘People’ and checked the ‘In Memory’ tick box (Fig 6a)

     

    Fig 6a.jpg

    If we had more than one table in the list, we could set them all at once: with the cursor on one of the table names in the list, Right-Click → Update Records → select In Memory → click Set Values → check In Memory → check Update Referenced Records → click OK


    7) We now need to make sure that, for all the tables we want to copy, the columns in the Columns tab are populated; with the cursor on one of the table names in the list, Right-Click → Import From Database → Import Database Columns → select the data source for the OLAP → Read Columns → Import Columns → click OK

     

    8) For all the tables we selected in step 6, we need to make sure the ‘In Memory’ column in the Columns tab is checked for all the columns; with your cursor on one column, Right-Click → Update Records → select In Memory → click Set Values → check In Memory → check Update Referenced Records → click OK

     

    9) We can now generate the scripts to be run on the TimesTen database.  Within the Design → Tables tab, query for the table(s) we are copying → Right-Click → Generate DW Table Scripts → select ‘All Records in the list’.  Do not execute; just copy the file contents and run them as a script on the TimesTen server.  This will create the schema.

     

    10) The data copy can be a full load or an incremental load.  For an incremental load we need to identify the changed data during each load; in this example we will use the field etl_proc_wid to distinguish load changes.  Within the Design → Tables tab, query for the table(s) we want to load incrementally → Right-Click → Generate Data Copy Tasks → All Records in the list.  For the Primary Logical Source, choose your connection to the OLAP; for the Primary Logical Target, choose your TimesTen connection; for your task phase, DataCopy To TimesTen.  (See Fig 7a)

     

    Fig 10a.jpg

     

    Note: If we were doing full loads every time, we would just select the tables we want full loads for and untick Enable Incremental in the pop-up screen above.

     

    11) We will now create the subject area to run these data copy tasks; Design → Subject Areas → New → enter a name of your choice → Save

     

    12) Select the Tasks tab for your subject area, query for DATA* in the Name field, and then select Add/Add All → OK

     

    13) We will now create the global parameters needed for the datacopy to work; Design → Source System Parameters → New; create the two parameters.

    TimestenasODSfig8.jpg

     

    Parameter 1 tells the datacopy tasks what the unique identifier is for the tables; by checking whether a row already exists in the target table, the task decides between an update and an insert.
    Parameter 2 is used by the datacopy tasks to identify the rows that were inserted or changed by the last load and need to be copied to your IMDB target.
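    Conceptually, the two parameters drive an upsert filtered by the last load: rows whose etl_proc_wid is newer than the previous run are merged into the target on the unique key.  As a rough illustration only (this is not the DAC's actual generated SQL; the owner, table and key names are hypothetical examples):

```shell
# Write an illustrative upsert so the shape of the incremental copy is visible.
cat > datacopy_sketch.sql <<'EOF'
MERGE INTO SUPERTEST.PEOPLE t
USING (SELECT * FROM OLAP.PEOPLE
        WHERE ETL_PROC_WID > :last_etl_proc_wid) s   -- Parameter 2: changed rows
   ON (t.ROW_WID = s.ROW_WID)                        -- Parameter 1: unique key
 WHEN MATCHED     THEN UPDATE SET t.ETL_PROC_WID = s.ETL_PROC_WID
 WHEN NOT MATCHED THEN INSERT (ROW_WID, ETL_PROC_WID)
                       VALUES (s.ROW_WID, s.ETL_PROC_WID);
EOF
cat datacopy_sketch.sql
```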

     

    14) Now that we have finished the configuration, we can build the execution plan.  Execute → Execution Plans → New; give your execution plan a name and save it.  With your new execution plan selected, navigate to the Subject Areas tab → Add/Remove → Go → select the subject area you created in step 11 → Add → OK

     

    15) We now need to generate the logical-to-physical maps.  With your new execution plan selected, navigate to the Connectivity Parameters tab and click Generate, then update the value columns so they point to the correct values.  You will find this easier if you kept the logical and physical names the same in the earlier steps.

     

    16) When we run this plan we can schedule it to run after our normal load to the OLAP warehouse, so we need to tell the plan how to identify the last load.  With your new execution plan selected, navigate to Execution Parameters → New (see Fig 9a for the values), then Save.

     

    TimesTenasODSfig9a.jpg

     

    17) Build the execution plan and run it; Execute → Execution Plans → select your plan → Build; once built, you can click Run Now

     

    Troubleshooting

     

    Data Copy Execution Logs

     

    Look in the folder Dac/Log/<ExecutionPlanName>.<ProcessId> for log files with the following name format: DataCopy_<TaskName>_<SourceDb>_<TargetDb>.log.
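    To pull the datacopy logs from the shell, a quick sketch (the plan name and process id below are placeholders created just for the demonstration):

```shell
# Demo layout mirroring Dac/Log/<ExecutionPlanName>.<ProcessId>/...
mkdir -p Dac/Log/TTCopyPlan.12345
touch Dac/Log/TTCopyPlan.12345/DataCopy_PEOPLE_OLAP_DACTT.log

# List every datacopy log under the DAC log tree, newest first.
find Dac/Log -name 'DataCopy_*.log' -type f -exec ls -t {} +
```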

     

    Patches and Updates

     

      Please refer to Oracle Technology Network and My Oracle Support for any late breaking patches or fixes.

     

    Conclusion

     

    We have successfully loaded a table into the TimesTen IMDB residing on an Exalytics box.  Using the same principle, you could choose as many tables as you wish and gain dramatic performance improvements; your only limitation is the amount of memory you have available.  We used an Exalytics box, as they come with a minimum of 1 TB of RAM, but you could use any hardware of your choice.  So if your warehouse is smaller than the amount of memory you have available, there is no reason why you can’t load your entire warehouse into memory.

     

    About the Author:

     

    Kiran Tailor is an Oracle ACE Associate with over 18 years’ experience as a senior Oracle database administrator on Oracle 8i, 9i, 10g and 11g, on Linux, Solaris 9 & 10 and Windows, covering installations, configurations and upgrades.  This has allowed him to build up his skills and knowledge in Oracle ASM, Oracle RAC, Data Guard, application SQL tuning, and backup & recovery through RMAN.  He has also worked extensively with Oracle Exalytics, covering OBIEE configuration and full integration of the Oracle TimesTen (In-Memory) database.