12 Replies Latest reply on Mar 23, 2012 6:17 PM by 924924

    Unable to import data from .csv into Oracle table

    924924
      Hi Everyone,

      I am trying to import an Excel file into an Oracle table using the Data Import Wizard of Oracle SQL Developer 1.5.5 Build MAIN-5969. When I get to the Data Preview dialog (step 1 of 4), the preview does not look right. For example, the data in column 1 contains some data from column 4. I import my data using a .csv file, and when I open the .csv file in Excel the data looks fine. I tried using a different delimiter, since my data fields contain commas that I want to preserve, but even that does not seem to work.
      Any help would be greatly appreciated.


      Thanks,
      S
        • 1. Re: Unable to import data from .csv into Oracle table
          Gary Graham-Oracle
          Hi Sammy,

          Well, 1.5.5 works fine for me in the following case:
          "WAREHOUSE_ID","WAREHOUSE_SPEC","WAREHOUSE_NAME","LOCATION_ID","WH_GEO_LOCATION"
          1,,"Southlake, Texas",1400, 
          Note the embedded comma in WAREHOUSE_NAME and the use of double quote as the left/right enclosure character. I also tried both Windows and Linux line termination styles. No problem with either one.

          That said, SQL Developer 1.5.5 is outdated. Why not upgrade to the latest 3.1 production release? Its Data Import wizard now lets you specify the line termination style, enclosure characters, character encoding, and so on.

          Regards,
          Gary
          SQL Developer Team
          • 2. Re: Unable to import data from .csv into Oracle table
            >
            Any help would be greatly appreciated.
            >
            Can you post a few sample rows of data and tell us what you think the problem is?
            • 3. Re: Unable to import data from .csv into Oracle table
              924924
              Hi,
              Thank you for replying.

              For example:
              In my excel file
              Column1 contains
              123
              456
              while column 2 contains data

              When I convert the file into .csv, the data looks fine.
              But in the Data Import wizard, the Data Preview looks like this:

              123
              data,data,data
              data
              datadata
              data
              456
              data,data
              data
              data

              It should be
              123|data,data,data
              456|data


              Hope this helps!
              (P.S. I can send you a screenshot if it helps, but I am not sure if we can paste a picture in the text box.)
              S
              • 4. Re: Unable to import data from .csv into Oracle table
                924924
                Thank you, Gary. I will ask my higher-ups if we can upgrade to 3.1.
                Regards,
                S
                • 5. Re: Unable to import data from .csv into Oracle table
                  >
                  Hope this helps!
                  >
                  No - it doesn't help. There is no way to try to reproduce your problem without a sample of the data you are using.
                  I asked you to 'post a few sample rows of data'. Instead you posted 123 and 456 for column 1 and then just said 'column 2 contains data'. Could you create a test file if I just said 'columns 3 and 4 contain data'?

                  If I create an .xls file with A1 => 123 and B1 => 'data,data,data' and save it as a CSV it comes out like this:
                  123,"data,data,data"
                  If I save it as tab-delimited it comes out like this:
                  123     "data,data,data"
                  So if the data starts in Excel and is then saved, it saves OK for me.
                  If you want help with this, post a few rows of sample data and don't mix your use of terms.

                  What is the ORIGINAL source of the data? CSV file or Excel (.XLS) file?
                  Originally you said 'I am trying to import an Excel file'. That would mean the file has an extension like 'XLS'.
                  A file with an extension of 'CSV' is NOT an Excel file. It can be opened by Excel, Notepad, or WordPad, but it is not an Excel file.

                  Do you have an Excel file ('.XLS') that you are trying to save as a CSV file for import?
                  Or do you have a CSV file you want to import but you view it using Excel?

                  Which of those two you are trying to do makes a difference.
                  • 6. Re: Unable to import data from .csv into Oracle table
                    924924
                    I apologize for being so vague.
                    >
                    What is the ORIGINAL source of the data? CSV file or Excel (.XLS) file?
                    >
                    Both. I use .XLS files as well as .CSV files. The .XLS file is converted into .CSV before I import it into SQL Developer.

                    But most of the time my original source file is a .CSV that I get from my business users as an import file for SQL Developer.

                    This is a copy of my test.csv file:
                     
                    Warehouse Number,Group,Description,Area,Team 
                    PNR12345,G1,"Ron 
                    Josh 
                    Bob 
                    Ken 
                    Kristi 
                    Kate 
                    Sam 
                    ",Americas,ABC 
                    PNR12346,G2,"Xxxxxxxx 
                    Test File Test File Test File Test File Test File Test File Test File Test File Test File 
                    Test File Test File Test File Test File Test File Test File 
                    Test File Test File Test File 
                    Test File Test File Test File 
                    Test File Test File Test File, Test File Test File Test File, “Test File Test File Test File” 
                    ",EMEA,LMN 
                    PNR12347,G3,"Test File Test File Test File 
                    Test File Test File Test File 
                    ",APJ,PQR 
                    PNR12348,G4,,Americas,XYZ 
                    PNR12349,G5,,Americas,ABC 
                    PNR12350,G6,,EMEA,LMN 
                    PNR12351,G7,,APJ,PQR 
                    PNR12352,G8,,Americas,XYZ 
                    PNR12353,G9,,EMEA,XYZ 
                    PNR12354,G10,,APJ,LMN 
                    Also, here is an image of how the preview looks in the Import wizard:
                    [Data Import wizard|http://i1156.photobucket.com/albums/p578/nairsowmya/Capture.png]

                    S
                    • 7. Re: Unable to import data from .csv into Oracle table
                      Gary Graham-Oracle
                      Hi Sammy,

                      It looks like SQL Developer (even 3.1) does not really handle enclosed values with embedded line termination characters as in your test case.

                      I was easily able to create a table with a description column, insert row data into it so the description value spanned multiple lines, then export to a csv file. Opening it in Excel, the appearance is just as you say. However, upon attempting to import that back into 3.1, an informational confirmation dialog ("Import Data Field Format") gets displayed on top of the preview with this message:
                      {code}
                      File contains invalid format data. Details for the error at line 2 follow.
                      Line contains invalid enclosed character data or delimiter at position 17.
                      LINE CONTENTS:
                      PNR12345,G1,"Ron
                      {code}
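
                      Here is a minimal sketch of that round trip (the table and column names are hypothetical, loosely based on the test.csv above): a description value spanning multiple lines which, once exported, becomes an enclosed CSV field with embedded line terminators.
                      {code}
                      -- Hypothetical repro table with a free-text description column.
                      CREATE TABLE warehouse_test (
                        warehouse_number VARCHAR2(20),
                        description      VARCHAR2(4000)
                      );

                      -- CHR(10) embeds line terminators inside the value.
                      INSERT INTO warehouse_test (warehouse_number, description)
                      VALUES ('PNR12345', 'Ron' || CHR(10) || 'Josh' || CHR(10) || 'Bob');
                      COMMIT;

                      -- Exporting this table to CSV from SQL Developer encloses the value
                      -- in double quotes:
                      --   PNR12345,"Ron
                      --   Josh
                      --   Bob"
                      -- and it is that enclosed, multi-line field the import wizard rejects.
                      {code}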
                      In reading about the csv format standard, it seems some products support double quoted data with embedded line termination characters (more common on Windows) but not all.  On Linux, escaping the termination character \n by preceding it with an extra \ is a more typical requirement.
                      
                      So it seems SQL Developer 3.1 is working as-designed, but this would make for a good enhancement request candidate on the SQL Developer Exchange.
                      
                      Regards,
                      Gary
                      SQL Developer Team
                      • 8. Re: Unable to import data from .csv into Oracle table
                        924924
                        Thanks Gary.
                        I actually started loading the data directly as an .xls file and it seems to work, so I am going to just use that for now.

                        Also, I have started using SQL Developer 3.1 and it's so much better than 1.5.5.
                        Thank You!

                        S
                        • 9. Re: Unable to import data from .csv into Oracle table
                          Gary Graham-Oracle
                          Sammy,

                          Great to hear about your progress on this issue. My thinking was to favor the csv format over xls since there is more of a memory limitation with xls when exporting. Perhaps there is no such similar limitation when importing. And if it is working for you now, that's what matters most.

                          All the best,
                          Gary
                          • 10. Re: Unable to import data from .csv into Oracle table
                            924924
                            Hi Gary,
                            >
                            My thinking was to favor the csv format over xls since there is more of a memory limitation with xls when exporting. Perhaps there is no such similar limitation when importing.
                            >
                            The limitation still exists when importing: the import fails if the data in my description column is more than 4000 bytes, since my data type is VARCHAR2.
                            If I use LONG or CLOB as the data type, the error below pops up:
                            Data Types LONG, not supported for import
                            Right now my data set does not exceed the limit, so I am OK using .xls. But I would like to figure out a solution before I face the memory limitation issue. Any recommendations?
                            Thanks,
                            S
                            • 11. Re: Unable to import data from .csv into Oracle table
                              Gary Graham-Oracle
                              Hi Sammy,

                              Actually the memory limitation I referred to is a Java VM memory issue. In terms of import/export data type restrictions, I have seen (very old) enhancement requests for CLOB support but nothing is scheduled yet, as far as I know. As for LONG, it has been deprecated for many years, so there is virtually no hope of any enhancement there:
                              Re: SQL Error: Stream has already been closed using SQLDev 3.1

                              If your data starts exceeding the 4000-byte VARCHAR2 database limit, your options at this time are:
                              1. Push for CLOB support in import.
                              2. Define an external table on top of the csv file. External tables support CLOB columns (a sketch follows below).

                              For (2), however, you would probably need to convince your users to reformat the description values to remove the embedded line termination characters.
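
                              For (2), here is a minimal sketch of what such an external table could look like; the directory path, directory object, table name, and field sizes are assumptions based on the test.csv posted earlier, not a tested definition.
                              {code}
                              -- Assumes a directory object pointing at the folder that holds test.csv
                              -- (the path below is hypothetical).
                              CREATE OR REPLACE DIRECTORY import_dir AS '/path/to/csv';

                              -- External table over the csv; Description is a CLOB column.
                              CREATE TABLE warehouse_ext (
                                warehouse_number VARCHAR2(20),
                                grp              VARCHAR2(10),   -- "GROUP" is a reserved word, so renamed
                                description      CLOB,
                                area             VARCHAR2(20),
                                team             VARCHAR2(10)
                              )
                              ORGANIZATION EXTERNAL (
                                TYPE ORACLE_LOADER
                                DEFAULT DIRECTORY import_dir
                                ACCESS PARAMETERS (
                                  RECORDS DELIMITED BY NEWLINE
                                  SKIP 1
                                  FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
                                  MISSING FIELD VALUES ARE NULL
                                  (
                                    warehouse_number CHAR(20),
                                    grp              CHAR(10),
                                    description      CHAR(100000),
                                    area             CHAR(20),
                                    team             CHAR(10)
                                  )
                                )
                                LOCATION ('test.csv')
                              )
                              REJECT LIMIT UNLIMITED;
                              {code}
                              Here SKIP 1 skips the header row, and since RECORDS DELIMITED BY NEWLINE still treats every line break as the end of a record, the embedded line terminators in the description values would have to be removed first, as mentioned above.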

                              Cheers,
                              Gary
                              • 12. Re: Unable to import data from .csv into Oracle table
                                924924
                                Thank you for all the help, Gary!
                                Appreciate it!