This discussion is archived
5 Replies. Latest reply: Oct 15, 2013 11:15 AM by 937454

external table load when issue

937454 Newbie

Hi,

My DB version is 11g.

I have a field in the CSV file that is greater than 4000 characters, but my external table defines that field as VARCHAR2(4000). I cannot change it to CLOB (checked with my lead), so I need to get the CSV data to populate the external table as-is. Is there a way I can do it within the external table definition?

 

I tried using:

    LOAD WHEN (LENGTH (acdfsummary) <= 4000)

But I understand I cannot use that, because LOAD WHEN rejects the whole record instead of truncating the field.

I checked the documentation and can't find any other way. Please advise on any alternatives.

  • 1. Re: external table load when issue
    Hoek Guru

    Why can't you change it to a CLOB?

    ORACLE-BASE - External Tables Containing LOB Data

  • 2. Re: external table load when issue
    937454 Newbie


    The data is enormous and our servers don't have that much capacity, so we are avoiding CLOB.

    Those fields contain a lot of junk data; only the first 1000 or 4000 characters make sense, and that is what we want to load.

    I tried applying SUBSTR to the field within the external table definition, but it gives a syntax error. I think that is not supported.

  • 3. Re: external table load when issue
    BluShadow Guru Moderator

    But the external table definition is just a mechanism for viewing the data in the file.

    What you actually store in the database depends on what you do when you query it, so you can apply SUBSTR when you select from the external table and insert just the first 4000 characters into your real table.
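    For example, a sketch of that approach (target_table, ext_table, and the id column are hypothetical names for illustration, not from the original post):

        -- Select through the external table and keep only the first
        -- 4000 characters of the long field; the full value is read
        -- from the file but never stored in the database.
        INSERT INTO target_table (id, acdfsummary)
        SELECT  id
        ,       SUBSTR (acdfsummary, 1, 4000)
        FROM    ext_table;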

  • 4. Re: external table load when issue
    Frank Kulash Guru

    Hi,

    937454 wrote:

     


    The data is enormous and our servers don't have that much capacity, so we are avoiding CLOB.

    Those fields contain a lot of junk data; only the first 1000 or 4000 characters make sense, and that is what we want to load.

    I tried applying SUBSTR to the field within the external table definition, but it gives a syntax error. I think that is not supported.

     

    As Blushadow said, the data in an external table exists only in the external file.  If your csv file already has the huge strings, then you're already using all the space you need, and nothing you do in the CREATE TABLE statement will make it worse.

     

    If, for some reason, you want people to only see the first 4000 characters, you can create a view that only includes the first 4000 characters, like this:

     

    CREATE TABLE table_ext
    (
      col_a  VARCHAR2 (50 BYTE)
    , col_b  CLOB
    )
    ORGANIZATION EXTERNAL
    (
      TYPE ORACLE_LOADER
      DEFAULT DIRECTORY DIR_EX
      ACCESS PARAMETERS
      (
        RECORDS DELIMITED BY NEWLINE SKIP 1
        FIELDS TERMINATED BY ',' MISSING FIELD VALUES ARE NULL
        ( col_a
        , col_b  CHAR (32000)
        )
      )
      LOCATION ('table_ext.csv')
    )
    REJECT LIMIT UNLIMITED;

     

     

    CREATE OR REPLACE VIEW view_ext
    AS
    SELECT  col_a
    ,       CAST ( SUBSTR (col_b, 1, 4000)
                   AS VARCHAR2 (4000)
                 )  AS col_b
    FROM    table_ext;

    You don't need to give users any privileges on the table (table_ext); just give them privileges on the view (view_ext).
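    For example (report_user is a hypothetical account name, just for illustration):

        -- Grant access to the view only; the external table itself
        -- stays private, so users only ever see the first 4000 characters.
        GRANT SELECT ON view_ext TO report_user;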

  • 5. Re: external table load when issue
    937454 Newbie

    @BluShadow: I missed that; I get it now. It doesn't create any memory issue, since I am just reading the data from the external file.

    @Frank: I will apply the transformation you provided in my custom ETL process while loading into the target table.

    Thank you.
