      • 30. Re: processing very large file with util_file
        Etbin
        It works for me on Oracle Database 11g Enterprise Edition Release 11.2.0.2.0 - 64bit Production
        (done on company time ;) but I might need it too)
        CREATE TABLE EXTERNAL_CLOB
        (
          THE_ROW  CLOB
        )
        ORGANIZATION EXTERNAL
          (  TYPE ORACLE_LOADER
             DEFAULT DIRECTORY DIRECTORY_NAME
             ACCESS PARAMETERS
             (
              records delimited by 'YYYYYYY'   -- the literal YYYYYYY ends a record, so a record may span many lines
              badfile 'EXTERNAL_CLOB'
              logfile 'EXTERNAL_CLOB'
              discardfile 'EXTERNAL_CLOB'
              fields missing field values are null
              reject rows with all null fields
              (the_row char(32000))            -- raise the field buffer above the 255-byte default
             )
             LOCATION ('clob_test.txt')
          )
        REJECT LIMIT UNLIMITED
        NOPARALLEL
        NOMONITORING;
        This produces two single-column CLOB rows (the second row, as expected, begins with a newline character).
        Adding 50 lines of some 300+ characters each to the second record did not throw an error - it seems you're safe as long as record lengths stay under 32000.
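        For anyone reproducing this, a minimal setup sketch - the directory path, the grantee and the quick check below are assumptions for illustration, not part of the original post:
        -- assumed setup: adjust the path, directory name and grantee to your environment
        CREATE OR REPLACE DIRECTORY DIRECTORY_NAME AS '/tmp/ext_data';
        GRANT READ, WRITE ON DIRECTORY DIRECTORY_NAME TO scott;

        -- quick sanity check against the external table defined above
        SELECT ROWNUM                          AS rec_no,
               LENGTH(the_row)                 AS rec_length,
               DBMS_LOB.SUBSTR(the_row, 80, 1) AS first_80_chars
          FROM external_clob;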

        Regards

        Etbin

        Edited by: Etbin on 23.7.2012 10:44
        CREATE TABLE SKLADI_DEV.EXTERNAL_CLOB_1
        (
          AN_ID  NUMBER,
          NOTES  CLOB
        )
        ORGANIZATION EXTERNAL
          (  TYPE ORACLE_LOADER
             DEFAULT DIRECTORY skrbnistvo
             ACCESS PARAMETERS 
             (records delimited by 'YYYYYYY'
              badfile 'EXTERNAL_CLOB_1' 
              logfile 'EXTERNAL_CLOB_1' 
              discardfile 'EXTERNAL_CLOB_1' 
              fields terminated by 'XXXXXXX' lrtrim   -- 'XXXXXXX' separates the two fields within a record; lrtrim strips surrounding whitespace
              missing field values are null
              reject rows with all null fields
              (an_id char(15),
               notes char(1000000)                    -- generous field buffer so very long notes are not truncated
              )
             )
             LOCATION ('clob_test.txt')
          )
        REJECT LIMIT UNLIMITED
        NOPARALLEL
        NOMONITORING;
        Seems to work too:
        select an_id,length(notes) clob_length 
          from external_clob_1
        returns to me
           AN_ID  CLOB_LENGTH
        --------  -----------
        74511453          719
        74511454        52035
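        A possible next step, if the goal is to get the file contents into the database without a UTL_FILE read loop, is a straight INSERT ... SELECT from the external table; the target table AN_NOTES below is only an assumed example, not something from the thread:
        -- assumed target table for illustration
        CREATE TABLE AN_NOTES
        (
          AN_ID  NUMBER,
          NOTES  CLOB
        );

        -- set-based load from the external table; no UTL_FILE loop needed
        INSERT /*+ APPEND */ INTO AN_NOTES (an_id, notes)
        SELECT an_id, notes
          FROM external_clob_1;

        COMMIT;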
        • 31. Re: processing very large file with util_file
          Gor_Mahia
          Etbin,
          I tested this in all scenarios and it looks good. Thank you & best regards.