4 Replies Latest reply on May 24, 2018 11:05 PM by rp0428

    Bug in SQL Developer 18.1 (import .csv file)

    _Dima_

      It looks like a bug: when I try to import a .csv file into a database table, I can't choose the CLOB data type during the import. As a result I can't import a .csv file whose cells contain more than 4,000 characters, because VARCHAR2 is limited to 4,000 characters.

       

      I worked around it by creating a table with a CLOB column first, then importing the .csv file into that table. In that case the Import Wizard doesn't ask me to specify data types for the columns, because the table already exists.

       

      For people who work with large data sets, SQL Developer is simply unable to do the import. Where can I file a bug?

        • 1. Re: Bug in SQL Developer 18.1 (import .csv file)
          thatJeffSmith-Oracle

          Not sure this is a bug. Sounds more like a 'should you' vs. 'could you' issue.

           

          For LOBs, I would bypass CSV imports and go straight to SQL*Loader. CSVs are also problematic when your text contains CR/LFs, CLOB or no CLOB.
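          As an illustration, a minimal SQL*Loader control file for this kind of load might look like the sketch below. The table and file names (json_stage, data.csv, the id and payload columns) are assumptions, not from the thread; the key points are raising SQL*Loader's default 255-character field limit with CHAR(n) and mapping the large field onto a CLOB column.

          ```
          LOAD DATA
          INFILE 'data.csv'
          INTO TABLE json_stage
          FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
          TRAILING NULLCOLS
          (
            id,
            payload CHAR(1000000)  -- target column is a CLOB; CHAR(n) lifts the 255-char default
          )
          ```

          For very large values, SQL*Loader can also read each LOB from its own secondary file via the LOBFILE clause instead of keeping the value inline in the CSV.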

          • 2. Re: Bug in SQL Developer 18.1 (import .csv file)

            Where can I file a bug?

            You may report the issue to Oracle using your My Oracle Support (MOS) account.

             

            It is NOT a bug - what you report shows SQL Developer is working EXACTLY as designed.

             

            The most common CSV file protocols (e.g. simple comma separator for fields and LF/CRLF for records) are NOT designed to work with data that might actually contain the embedded delimiters - particularly the record delimiter.

             

            1. Those protocols do NOT have the necessary escape mechanism to allow such data to be parsed/loaded properly. CLOBs, large ones in particular, can contain almost any character allowed in the character set. Although some of these protocols allow 'enclosure' characters (e.g. double quotes when the data contains the field separator), few of them actually support the only reliable escape mechanism: the simple 'one character escape', where a single occurrence of the 'escape' character means to take the next character as data, no matter what it is.
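            The point about enclosure/escape mechanisms can be seen in a small Python sketch (my example, not from the thread): a writer that follows the RFC 4180 quoting rules round-trips a field containing the separator, the quote character, and an embedded newline, while a naive split-on-newline parser breaks the same record.

            ```python
            import csv
            import io

            # A field containing every delimiter a line-based parser trips on:
            # the field separator, the enclosure character, and a record separator.
            tricky = 'line one, has a comma\n"quoted" line two'

            buf = io.StringIO()
            csv.writer(buf).writerow(["id-1", tricky])

            # A parser that implements the quoting rules recovers the field intact,
            # because the writer enclosed it and doubled the embedded quote chars.
            row = next(csv.reader(io.StringIO(buf.getvalue())))
            assert row == ["id-1", tricky]

            # A naive split on newlines sees two "records" -- this is the failure
            # mode that bites CLOB-sized text with embedded line breaks.
            naive_records = buf.getvalue().splitlines()
            ```

            The round trip only works because both sides speak the same quoting dialect; a record-based loader that splits the file on line terminators first never gets that chance.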

             

            2. Record-based parsers also typically restrict the length of each record. In some cases the length must be less than 1000 characters. This prevents larger records from being loaded properly by those parsers since the data would have to be truncated.

             

            3. If you try to extract data that includes a large CLOB, you will find that the resulting file won't contain the CLOB data either. That is also NOT a bug; it works as designed.

            • 3. Re: Bug in SQL Developer 18.1 (import .csv file)
              _Dima_

              Thank you for the detailed answer.

              Yes, it looks more like a feature request than a bug.

               

              3. If you try to extract your data that includes a large clob you will find that the resulting file won't contain the clob data either. That is also NOT a bug but works as designed.

              I have JSON data in cells of the .csv files that I imported into the database. I will process the JSON and extract only some pieces of information from it.

              In case I need to import all data from the CLOB fields into a file (.csv/json/xml, whatever), is there any workaround to do it?

              • 4. Re: Bug in SQL Developer 18.1 (import .csv file)

                I have JSON data in cells of the .csv files that I imported into the database. I will process the JSON and extract only some pieces of information from it.

                In case I need to import all data from the CLOB fields into a file (.csv/json/xml, whatever), is there any workaround to do it?

                Not sure which 'direction' you are asking about.

                 

                The first sentence above appears to be about 'file into database'.

                 

                but the second says 'CLOB to file'.

                 

                Oracle 12c adds EXTENSIVE support for JSON data. So unless your need is a special case, or you only need a very small part of the JSON data in the DB, I suggest you load the entire set of data into the DB and then use Oracle's horsepower to 'extract only some pieces'.

                https://docs.oracle.com/database/121/ADXDB/json.htm#ADXDB6246

                39 JSON in Oracle Database

                Oracle Database supports JavaScript Object Notation (JSON) data natively with relational database features, including transactions, indexing, declarative querying, and views.

                Otherwise you will need to find/write a small utility (e.g. Java) that can parse out what you want and load it. Java can stream CLOBs easily to the DB.
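                If a small external utility is the route you take, here is a rough sketch in Python rather than Java (my illustration: the column names and JSON shape are invented, and the actual database insert is deliberately left out).

                ```python
                import csv
                import io
                import json

                # Invented sample: each record carries an id plus a JSON document
                # in the second cell, roughly the shape the original poster described.
                raw = '''id,payload
                1,"{""customer"": {""name"": ""Ann"", ""tier"": ""gold""}, ""items"": [1, 2]}"
                '''

                extracted = []
                for row in csv.DictReader(io.StringIO(raw.replace("                ", ""))):
                    doc = json.loads(row["payload"])  # parse the JSON cell
                    # Keep only the pieces you actually need; the insert itself
                    # (e.g. a JDBC or python-oracledb executemany) is omitted here.
                    extracted.append((row["id"], doc["customer"]["name"], doc["customer"]["tier"]))

                print(extracted)
                ```

                The same parse-then-trim loop works for files too large to stage in the DB, at the cost of redoing it whenever you need a different 'piece' of the JSON.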