LKM performance issue

779170 Member Posts: 52
edited Jun 29, 2010 10:20AM in Data Integrator
Hi gurus!

We have experienced serious performance issues using LKMs.
I have tried LKM SQL to Oracle and found that it works very slowly on tables with a large number of columns.
Loading a table of 3 million records is much faster for a table with 20 columns than for one with 200 columns.

Between two separate Oracle servers I solved the problem with a dblink instead of an LKM (see the sketch at the end of this post).

But how can this be solved for a load from MS SQL to Oracle, where no such db-link is available?
Has anyone found the cause of this performance issue?
I believe the bottleneck is in the ODI agent, but I don't know how to configure it to use more hardware resources.
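
For reference, a minimal sketch of the dblink approach mentioned above; the link, credential, and table names are placeholders, not from this thread:

    -- Hedged sketch of the Oracle-to-Oracle dblink workaround.
    -- All names below are placeholders.
    CREATE DATABASE LINK src_link
      CONNECT TO src_user IDENTIFIED BY src_password
      USING 'SRC_TNS_ALIAS';

    -- One set-based statement replaces the LKM's row-by-row transfer.
    INSERT /*+ APPEND */ INTO target_table
    SELECT * FROM source_table@src_link;
    COMMIT;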

Answers

  • Ankit J Member Posts: 750 Bronze Trophy
    Well, the agent is not the bottleneck here.
    The way LKM SQL to Oracle works is that it extracts data from the source with an SQL query and performs one "insert into" per row on the Oracle side. So this is bound to be slow for millions of records.
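    Roughly the shape of what gets executed, with placeholder staging-table and column names (the actual generated code differs):

        -- Hedged illustration: one bind-variable INSERT per source row into
        -- the C$_ staging table. All names here are placeholders.
        INSERT INTO ODI_WORK.C$_0TARGET (COL1, COL2, /* ... */ COL200)
        VALUES (:1, :2, /* ... */ :200);
        -- Executed once per row: 3,000,000 round trips for 3M records, each
        -- carrying one bound value per column, hence the column-count effect.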

    For bulk record loading, other KMs should be considered that use the database's native bulk utilities, such as SQLLDR for Oracle or BCP for MSSQL.
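    As a rough command-line equivalent of what such a KM automates (server names, credentials, paths, and table names below are all placeholders):

        # Hedged sketch of the BCP-export / SQLLDR-load pipeline.
        # 1. Export from MS SQL to a delimited flat file (character mode, '|' terminator).
        bcp SourceDb.dbo.SRC_TABLE out src_data.txt -S mssql_host -U src_user -P src_pass -c -t "|"
        # 2. Bulk-load the flat file into Oracle with a direct-path load.
        sqlldr userid=tgt_user/tgt_pass control=load.ctl log=load.log direct=true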

    Did you check out LKM MSSQL to Oracle (BCP SQLLDR)?
    Ankit J
  • 779170 Member Posts: 52
    I'm working on that (BCP -> SQLLDR).
    I'm just surprised that ODI has no standard way to do this without additional utilities.
    If the bottleneck is the inserts, why doesn't the "commit" = "1000 rows / no commit" option help?

    Are you sure the performance issue is caused by the insert execution and not by formatting on the agent side?
  • 779170 Member Posts: 52
    Hello again!

    I'm still fighting with the LKM BCP->SQLLDR and have run into a lot of problems with control file generation.
    Maybe it would be easier to create a dblink to MS SQL?
    Has anybody compared the performance of a db link to MS SQL against the SQLLDR LKM?
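    For reference, a dblink from Oracle to MS SQL would go through an Oracle heterogeneous gateway; a minimal sketch, with the link name, credentials, and TNS alias all placeholders:

        -- Hedged sketch: Oracle-to-MSSQL db link via a heterogeneous gateway
        -- (e.g. DG4ODBC). Every name below is a placeholder.
        CREATE DATABASE LINK mssql_link
          CONNECT TO "mssql_user" IDENTIFIED BY "mssql_password"
          USING 'MSSQL_GATEWAY_ALIAS';

        INSERT INTO target_table
        SELECT * FROM "dbo"."SRC_TABLE"@mssql_link;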
  • Ankit J Member Posts: 750 Bronze Trophy
    What are the problems that you are experiencing with CTL file generation?
    Is the CTL file malformed?
    Pasting the contents of the command and the error encountered will help forum members help you.
  • 779170 Member Posts: 52
    For example, in the export file the decimal separator was ".", but since I use a Russian NLS_LANG the separator should be ",". I solved that with the translate function (see the sketch below).
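    A sketch of that fix using SQL*Loader's per-field SQL expressions (the table name, field name, and terminator below are placeholders):

        -- Hedged sketch: correcting the decimal separator in the generated CTL file.
        LOAD DATA
        INFILE 'export.txt'
        INTO TABLE TARGET_TABLE
        FIELDS TERMINATED BY '|'
        (
          AMOUNT "TRANSLATE(:AMOUNT, '.', ',')"
        )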

    Now I have a problem loading a text field that contains 262 Russian characters (code page 1251); I can't load it into a VARCHAR2(4000) column.
    When I insert the same value manually, it goes in without error.

    In the log file I get an error message like this:
    Error in table xxx, column yyy.
    The field in the file exceeded the maximum length.
    (Translated, since I have to use a Russian NLS_LANG.)
  • Ankit J Member Posts: 750 Bronze Trophy
    There are 2 things that you can try:
    1.) UNICODE_SUPPORT - check this option in the KM
    or
    2.) make the target column definition NVARCHAR2
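    Option 2 would look roughly like this (table and column names are placeholders):

        -- Hedged sketch: widening the target column to the national character set.
        ALTER TABLE TARGET_TABLE MODIFY (TEXT_COL NVARCHAR2(2000));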
  • 779170 Member Posts: 52
    No, I have found the cause: the default field size for sqlldr is 255 characters. Now I need to change my CTL generation again and put all the datatypes and lengths in it.
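    For reference, SQL*Loader treats a character field as CHAR(255) unless a length is declared, so the generated CTL needs explicit lengths, roughly like this (the field name is a placeholder):

        -- Hedged sketch: declaring an explicit length so SQL*Loader does not
        -- fall back to its 255-character default for this field.
        (
          TEXT_COL CHAR(4000)
        )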
  • Ankit J Member Posts: 750 Bronze Trophy
    That's very good to know that you are getting your hands dirty with KM customization.

    As a starting point, you can take LKM File to Oracle (SQLLDR) from ODI releases newer than 10.1.3.6.0 and reuse the code from the "Generate CTL file" step in that KM. It has most of the conditions handled, including char > 255.
    Ankit J
  • 779170 Member Posts: 52
    Thanks, I will try that out.