2 Replies Latest reply: Apr 28, 2005 8:24 AM by 32387

    sql loader 510

    32387

      I am loading a file with some BLOBs. Most of the data seems to have loaded OK, but I am now getting this error:

      SQL*Loader-510: Physical record in data file
      (c:\Sheets_2005.dat) is longer than the maximum(20971520)

      The ctl file was auto-generated by Migration Workbench... I have added the OPTIONS clause in:

      options (BINDSIZE=20971520, READSIZE=20971520)
      load data
      infile 'c:\sheets_2005.dat' "str '<EORD>'"
      append
      into table SHEETS
      fields terminated by '<EOFD>'
      trailing nullcols
      (REFNO,
      SHEETNO,
      DETAIL CHAR(100000000),
      MESSAGE,
      SIZE_)

      Any ways around this error?

      Thanks


        • 1. Re: sql loader 510
          user9630 - oracle
          Hello,

          Can you tell me which plugin you are using?


          option#1
          ********

          Cause: from the error message it appears that the data file has a physical record that is too long.

          If that is the case, try changing the length of the column [the problem is most likely at the BLOB/CLOB column]. Also try using CONCATENATE or CONTINUEIF, or break up the physical records.
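          For the CONCATENATE route, a control file along these lines would reassemble logical records that were split into shorter physical records. This is a sketch only, based on the column list posted above: the figure of 4 physical records per logical record is an assumption, and CONCATENATE replaces the "str" record terminator, so the .dat file would have to be re-exported with plain line breaks for this to apply.

```sql
-- Sketch only: assumes each logical record was exported as exactly
-- 4 newline-terminated physical records.
OPTIONS (BINDSIZE=20971520, READSIZE=20971520)
LOAD DATA
INFILE 'c:\sheets_2005.dat'
CONCATENATE 4
APPEND
INTO TABLE SHEETS
FIELDS TERMINATED BY '<EOFD>'
TRAILING NULLCOLS
(REFNO,
 SHEETNO,
 DETAIL CHAR(100000000),
 MESSAGE,
 SIZE_)
```

          With CONTINUEIF, the reassembly is driven by a continuation marker in the data instead of a fixed record count.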

          OPTION#2
          ********

          If you are using the SQL Server or Sybase plugin, this workaround may work:

          Cause: the exported binary data may be a bit too big, so it needs to be converted to HEX format. The HEX data thus produced can be saved into a CLOB column.

          The task is split into 4 sub-tasks:

          1. CREATE A TABLESPACE TO HOLD ALL THE LOB DATA
               --log into your system schema and create a tablespace
               --Create a new tablespace for the CLOB and BLOB column
               --You may resize this to fit your data,
               --Remember that we save the data once as CLOB and then as BLOB
               --create tablespace lob_tablespace datafile 'lob_tablespace' SIZE 1000M AUTOEXTEND ON NEXT 50M;

          ----------------------------------------------------------------------------------------------------------------
          ----------------------------------------------------------------------------------------------------------------
          2. LOG INTO YOUR TABLE SCHEMA IN ORACLE
               --Modify this script to fit your requirements
               --START.SQL (this script will do the following tasks)
                    ~~Modify your current schema so that it can accept HEX data
                    ~~Modify your current schema so that it can hold that huge amount of data.
                    ~~Modify the new tablespace to suit your requirements [can be estimated based on the size of the BLOBs/CLOBs and number of rows]
                    ~~Disable triggers, indexes & primary keys on tblfiles
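          As an illustration, the kind of DDL that START.SQL runs might look like this (all object names below are placeholders/assumptions; the script generated by Migration Workbench will differ in detail):

```sql
-- Add a CLOB column to receive the HEX text, stored in the new tablespace
ALTER TABLE sheets ADD (detail_clob CLOB)
  LOB (detail_clob) STORE AS (TABLESPACE lob_tablespace);

-- Disable triggers, constraints and indexes before the bulk load
ALTER TABLE sheets DISABLE ALL TRIGGERS;
ALTER TABLE sheets DISABLE CONSTRAINT pk_sheets;  -- constraint name is an assumption
ALTER INDEX ix_sheets_refno UNUSABLE;             -- index name is an assumption
```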

          ----------------------------------------------------------------------------------------------------------------
          ----------------------------------------------------------------------------------------------------------------
          3. DATA MOVE: The data move now involves moving the HEX data in the .dat files to a CLOB.

               --The START.SQL script adds a new column to <tablename> called <blob_column>_CLOB.  This is where the HEX values will be stored.
               --MODIFY YOUR CONTROL FILE TO LOOK LIKE THIS
                    ~~load data
                    ~~infile '<tablename>.dat' "str '<er>'"
                    ~~into table <tablename>
                    ~~fields terminated by '<ec>'
                    ~~trailing nullcols
                    ~~(
                    ~~ <blob_column>_CLOB CHAR(200000000),
                    ~~)

          The important part is the "_CLOB" suffix appended to your BLOB column name, with the datatype set to CHAR(200000000).

               --RUN sql_loader_script.bat
               --log into your schema to check if the data was loaded successfully
               --now you can see that the hex values were sent to the CLOB column
               --SQL> select dbms_lob.getlength(<blob_column>),dbms_lob.getlength(<blob_column>_clob) from <tablename>;

          ----------------------------------------------------------------------------------------------------------------
          ----------------------------------------------------------------------------------------------------------------
          4. LOG INTO YOUR SCHEMA
               --Run FINISH.SQL.  This script will do the following tasks:
                    ~~Creates the procedure needed to perform the CLOB to BLOB transformation
                    ~~Executes the procedure (this may take some time if 500MB has to be converted to BLOB)
                    ~~Alters the table back to its original form (removes the <blob_column>_clob)
                    ~~Enables the triggers, indexes and primary keys
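          The CLOB-to-BLOB step in FINISH.SQL boils down to reading the HEX text in slices, decoding each slice with HEXTORAW, and appending the result to the BLOB. A minimal hand-written sketch of that idea (table and column names are assumptions taken from the posted control file; the generated script will differ):

```sql
CREATE OR REPLACE PROCEDURE hex_clob_to_blob IS
  v_raw    RAW(16000);
  v_chunk  PLS_INTEGER := 32000;  -- hex characters per slice (2 per byte)
  v_len    PLS_INTEGER;
  v_off    PLS_INTEGER;
  v_blob   BLOB;
BEGIN
  FOR r IN (SELECT rowid rid, detail_clob
              FROM sheets
             WHERE detail_clob IS NOT NULL) LOOP
    -- start from an empty BLOB locator in the target column
    UPDATE sheets SET detail = EMPTY_BLOB()
     WHERE rowid = r.rid
    RETURNING detail INTO v_blob;

    v_len := DBMS_LOB.GETLENGTH(r.detail_clob);
    v_off := 1;
    WHILE v_off <= v_len LOOP
      -- decode one slice of HEX text back to binary and append it
      v_raw := HEXTORAW(DBMS_LOB.SUBSTR(r.detail_clob, v_chunk, v_off));
      DBMS_LOB.WRITEAPPEND(v_blob, UTL_RAW.LENGTH(v_raw), v_raw);
      v_off := v_off + v_chunk;
    END LOOP;
  END LOOP;
  COMMIT;
END;
/
```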


          Good luck
          Srinivas Nandavanam
          • 2. Re: sql loader 510
            32387
            Thanks,

            Access 97.

            The problem must be with the images from the Access table. The column length is already at the maximum, DETAIL CHAR(100000000); it doesn't run if I increase it any further. How can I use CONTINUEIF or CONCATENATE for this type of .dat file, i.e. one with images?