
Sql loader not able to load more than 4.2 billion rows

883044 Newbie
Hi,

I am facing a problem with the SQL*Loader utility.
The SQL*Loader process stops after loading 342 GB of data; that is, it loads about 4.294 billion rows out of 6.9 billion, and the loader process completes without throwing any error.

Is there any limit on the volume of data SQL*Loader can load?
Also, how can I tell that SQL*Loader has not loaded all the data from the file, given that it does not throw any error?

Thanks in Advance.
  • 1. Re: Sql loader not able to load more than 4.2 billion rows
    882093 Newbie
    It may be a problem with the database configuration rather than with SQL*Loader.

    Check all the parameters.

    Maximizing SQL*Loader Performance

    SQL*Loader is flexible and offers many options that should be considered to maximize the speed of data loads. These include the following (a combined example is sketched after the list):



    1. Use Direct Path Loads - The conventional path loader essentially loads the data by using standard insert statements. The direct path loader (direct=true) loads directly into the Oracle data files and creates blocks in Oracle database block format. The fact that SQL is not being issued makes the entire process much less taxing on the database. There are certain cases, however, in which direct path loads cannot be used (for example, clustered tables). To prepare the database for direct path loads, the script $ORACLE_HOME/rdbms/admin/catldr.sql must be executed.

    2. Disable Indexes and Constraints. For conventional data loads only, the disabling of indexes and constraints can greatly enhance the performance of SQL*Loader.

    3. Use a Larger Bind Array. For conventional data loads only, larger bind arrays limit the number of calls to the database and increase performance. The size of the bind array is specified using the bindsize parameter. The bind array's size is equivalent to the number of rows it contains (rows=) times the maximum length of each row.

    4. Use ROWS=n to Commit Less Frequently. For conventional data loads only, the rows parameter specifies the number of rows per commit. Issuing fewer commits will enhance performance.

    5. Use Parallel Loads. Available with direct path data loads only, this option allows multiple SQL*Loader jobs to execute concurrently.

    $ sqlldr control=first.ctl parallel=true direct=true

    $ sqlldr control=second.ctl parallel=true direct=true



    6. Use Fixed Width Data. Fixed width data format saves Oracle some processing when parsing the data. The savings can be tremendous, depending on the type of data and number of rows.

    7. Disable Archiving During Load. While this may not be feasible in certain environments, disabling database archiving can increase performance considerably.

    8. Use unrecoverable. The unrecoverable option (unrecoverable load data) disables the writing of the data to the redo logs. This option is available for direct path loads only.
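
    To make tips 3, 4, and 8 concrete, here is a minimal sketch. The control file names, table name, and columns below are made up for illustration; adjust the userid, paths, and sizes for your environment. A conventional path run tuned per tips 3 and 4 might look like:

    $ sqlldr userid=scott/tiger control=conv_load.ctl bindsize=20000000 rows=100000

    A direct path run with redo logging disabled per tip 8 uses a control file that begins with unrecoverable:

    $ cat big_load.ctl
    UNRECOVERABLE LOAD DATA
    INFILE 'big_data.dat'
    APPEND
    INTO TABLE xyz
    FIELDS TERMINATED BY ','
    (col1, col2, col3)

    $ sqlldr userid=scott/tiger control=big_load.ctl direct=true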

  • 2. Re: Sql loader not able to load more than 4.2 billion rows
    BluShadow Guru Moderator
    Is it producing discard and/or bad files?
    What does the log file say about rejected rows?
  • 3. Re: Sql loader not able to load more than 4.2 billion rows
    883044 Newbie
    The utility is correctly invoked in my program. It is not throwing any error; it completes successfully without loading all the data from the file.
    Is there any way to know whether the utility has loaded all the data from the file?
  • 4. Re: Sql loader not able to load more than 4.2 billion rows
    883044 Newbie
    Those records are not getting written to either the bad file or the discard file.
  • 5. Re: Sql loader not able to load more than 4.2 billion rows
    BluShadow Guru Moderator
    And what does the log file say?
  • 6. Re: Sql loader not able to load more than 4.2 billion rows
    883044 Newbie
    It just prints the following:

    Table XYZ:
    4294291572 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    675723 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.

    Date conversion cache disabled due to overflow (default size: 1000)


    Table ABC:
    675723 Rows successfully loaded.
    0 Rows not loaded due to data errors.
    4294291572 Rows not loaded because all WHEN clauses were failed.
    0 Rows not loaded because all fields were null.

    Date conversion cache disabled due to overflow (default size: 1000)

    Bind array size not used in direct path.
    Column array rows : 300000
    Stream buffer bytes: 2304000
    Read buffer bytes: 9437184

    Total logical records skipped: 0
    Total logical records read: 4294967295
    Total logical records rejected: 0
    Total logical records discarded: 0

    Total stream buffers loaded by SQL*Loader main thread: 71853
    Total stream buffers loaded by SQL*Loader load thread: 107858
  • 7. Re: Sql loader not able to load more than 4.2 billion rows
    BillyVerreynne Oracle ACE
    In that case, are you sure that there are in fact 6.9 billion rows in that file? SQL*Loader does not seem to think so.

    So before pointing to SQL*Loader as the problem, do a sanity check and ensure that the file actually contains the number of rows you expect it to contain. Also make sure the file format is correct - for example, the file could contain an EOF or similar marker at the start of a line, which makes SQL*Loader believe that everything was processed. (I am not sure that today's SQL*Loader will choke on such an error, but it is worthwhile to do a sanity check on the file format anyway.)

    Also, is there any specific reason for such a large file? Loading and managing the load will be a lot easier with a thousand smaller files than with a single gigantic file.
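
    For example, a rough sketch using the standard Unix split utility (the file name and chunk prefix are illustrative):

    $ split -l 50000000 -a 4 big_data.dat chunk_
    $ ls chunk_* | head

    Each chunk can then be loaded with its own sqlldr run, and a failed chunk can be re-run without repeating the whole load.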
  • 8. Re: Sql loader not able to load more than 4.2 billion rows
    bluefrog Expert
    If your file is on a Unix server, find the line count as follows:
    wc -l filename
    Compare that number to the sum of the two figures reported by SQL*Loader.
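
    For reference, the sum of the two "successfully loaded" figures from the log above can be computed in the shell:

    $ echo $((4294291572 + 675723))
    4294967295

    That matches the "Total logical records read" line exactly, so any shortfall must be in records that were never read from the file at all.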
  • 9. Re: Sql loader not able to load more than 4.2 billion rows
    883044 Newbie
    I used this wc -l approach, but it is taking a very long time to complete; the wc command runs for more than 2 hours. I am not prepared to use such a time-consuming approach.
  • 10. Re: Sql loader not able to load more than 4.2 billion rows
    BluShadow Guru Moderator
    883044 wrote:
    I used this wc -l approach, but it is taking a very long time to complete; the wc command runs for more than 2 hours. I am not prepared to use such a time-consuming approach.
    Well it's an operating system command written in low-level C language, so it's about as fast as it's going to get for counting the lines in a file.

    Other than that, how do you know how many records are in the file? How did you determine it?
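
    If the records are fixed width, a far cheaper rough estimate is to divide the file size by the record length instead of scanning every line. A sketch (the file name and the 80-byte record length are purely hypothetical):

    $ stat -c %s big_data.dat                      # file size in bytes (GNU stat)
    $ echo $(( $(stat -c %s big_data.dat) / 80 ))  # assuming 80-byte fixed-width records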
  • 11. Re: Sql loader not able to load more than 4.2 billion rows
    883044 Newbie
    The thing is, I used this command to get the line count, and that is how I found out how many rows were loaded and how many were skipped. But using this approach as a permanent solution is not feasible: I have many such files to load, and if it takes 2 hours for one file, you can imagine how long it would take for all of them.
  • 12. Re: Sql loader not able to load more than 4.2 billion rows
    BluShadow Guru Moderator
    Nobody was suggesting it as a permanent solution; we were asking you to confirm how many rows were in your file so that we could narrow down the cause of your initial issue and hopefully find a solution.

    So, have you confirmed that you definitely have a well structured file containing 6.9 billion rows?
  • 13. Re: Sql loader not able to load more than 4.2 billion rows
    883044 Newbie
    Yes, the file is well structured.
    I also wanted to mention one more thing: there is no unique key present on the table.
  • 14. Re: Sql loader not able to load more than 4.2 billion rows
    thundef Newbie

    I'm facing the same issue. The problem seems to be what Benoit mentions in this post:

    SQL*Loader stuck after loading 4.2 billion records - Stack Overflow

    The log posted here by the original poster gives a further indication that this is true, since the "Total logical records read" figure is 4294967295, which is EXACTLY 2^32 - 1, the largest value an unsigned 32-bit counter can hold.

    This is a Y2K-type of bug. "Who would -ever- try and load 5 billion records, right? right?"
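
    If that 32-bit counter is indeed the culprit, one possible workaround (untested here, and assuming the skip and load counters are not subject to the same overflow) is to chunk the load with SQL*Loader's standard skip= and load= command-line parameters:

    $ sqlldr control=big_load.ctl direct=true skip=0 load=2000000000
    $ sqlldr control=big_load.ctl direct=true skip=2000000000 load=2000000000
    $ sqlldr control=big_load.ctl direct=true skip=4000000000 load=2000000000

    Splitting the input file itself, as suggested earlier in the thread, avoids relying on those counters at all.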
