This discussion is archived
1 Reply. Latest reply: Jan 3, 2013 11:56 PM by fac586

Unable to upload CSV File through APEX 4.2 (Blob) to Oracle table.

Financial_Monitoring Newbie
Hi all,

I needed to add a page to my existing APEX application that would allow the user to upload a file. I followed an online tutorial in which the author created a dummy table and loaded a CSV file into it through APEX. Following that simple example I can load the tutorial's simple CSV file into the tutorial's dummy table, but when I attempt to insert actual (or dummy) data into my own table, which has many more fields of different types, using the exact same process, I am unable to do so.

Ironically, I cannot insert even dummy values, despite having inserted the same dummy values successfully through SQL Developer. The icing on the cake is that APEX does not produce any error; this lack of debugging (especially line-by-line debugging) is such a pain. I should add that I can load the values into an array and successfully print the delimited values from the array, yet I am still unable to insert the same values into my table. Here is the table I am attempting to insert into (actual names replaced by DUMMYxx):

CREATE TABLE "TABLE"."STG" (

     "DUMMY01" NUMBER,
     "DUMMY02" NUMBER,
     "DUMMY03" NUMBER,
     "DUMMY04" DATE,
     "DUMMY05" CHAR(3 BYTE),
     "DUMMY06" CHAR(3 BYTE),
     "DUMMY07" VARCHAR2(20 BYTE),
     "DUMMY08" VARCHAR2(20 BYTE),
     "DUMMY09" VARCHAR2(100 BYTE),
     "DUMMY10" VARCHAR2(100 BYTE),
     "DUMMY11" VARCHAR2(30 BYTE),
     "DUMMY12" VARCHAR2(30 BYTE),
     "DUMMY13" VARCHAR2(100 BYTE),
     "DUMMY14" VARCHAR2(30 BYTE),
     "DUMMY15" VARCHAR2(100 BYTE),
     "DUMMY16" VARCHAR2(100 BYTE),
     "DUMMY17" VARCHAR2(30 BYTE),
     "DUMMY18" VARCHAR2(30 BYTE),
     "DUMMY19" VARCHAR2(100 BYTE),
     "DUMMY20" VARCHAR2(30 BYTE),
     "DUMMY21" VARCHAR2(100 BYTE),
     "DUMMY22" VARCHAR2(100 BYTE),
     "DUMMY23" VARCHAR2(255 BYTE),
     "DUMMY24" VARCHAR2(255 BYTE),
     "DUMMY25" VARCHAR2(255 BYTE),
     "DUMMY26" VARCHAR2(30 BYTE),
     "DUMMY27" VARCHAR2(30 BYTE),
     "DUMMY28" VARCHAR2(30 BYTE),
     "DUMMY29" VARCHAR2(30 BYTE),
     "DUMMY30" VARCHAR2(30 BYTE),
     "DUMMY31" VARCHAR2(30 BYTE),
     "DUMMY32" VARCHAR2(30 BYTE),
     "DUMMY33" VARCHAR2(30 BYTE),
     "DUMMY34" DATE,
     "DUMMY35" VARCHAR2(100 BYTE),
     "DUMMY36" VARCHAR2(100 BYTE),
     "DUMMY37" CHAR(3 BYTE),
     "DUMMY38" VARCHAR2(255 BYTE),
     "DUMMY39" CHAR(1 BYTE),
     "DUMMY40" DATE,
     "DUMMY41" CHAR(3 BYTE),
     "DUMMY42" VARCHAR2(255 BYTE),
     "DUMMY43" DATE,
     "DUMMY44" CHAR(3 BYTE),
     "DUMMY45" NUMBER,
     "DUMMY46" NUMBER,
     "DUMMY47" CHAR(3 BYTE),
     "DUMMY48" VARCHAR2(255 BYTE),
     "DUMMY49" VARCHAR2(100 BYTE),
     "DUMMY50" VARCHAR2(100 BYTE),
     "DUMMY51" VARCHAR2(255 BYTE),
     "DUMMY52" VARCHAR2(255 BYTE),
     "DUMMY53" VARCHAR2(255 BYTE),
     "DUMMY54" VARCHAR2(30 BYTE),
     "DUMMY55" VARCHAR2(30 BYTE),
     "DUMMY56" VARCHAR2(30 BYTE),
     "DUMMY57" VARCHAR2(30 BYTE),
     "DUMMY58" VARCHAR2(30 BYTE),
     "DUMMY59" VARCHAR2(30 BYTE),
     "DUMMY60" VARCHAR2(30 BYTE),
     "DUMMY61" VARCHAR2(30 BYTE),
     "DUMMY62" DATE,
     "DUMMY63" VARCHAR2(100 BYTE),
     "DUMMY64" VARCHAR2(100 BYTE),
     "DUMMY65" CHAR(3 BYTE),
     "DUMMY66" NUMBER DEFAULT 1,
     "DUMMY67" DATE DEFAULT SYSDATE,
     "DUMMY68" VARCHAR2(30 BYTE),
     "DUMMY69" VARCHAR2(50 BYTE),
     "DUMMY70" VARCHAR2(100 BYTE),
     "DUMMY71" VARCHAR2(100 BYTE),
     "DUMMY72" VARCHAR2(255 BYTE),
     "DUMMY73" CHAR(3 BYTE)
)


And here's the insertion process:

DECLARE

v_blob_data BLOB;
v_blob_len NUMBER;
v_position NUMBER;
v_raw_chunk RAW(10000);
v_char CHAR(1);
c_chunk_len NUMBER:= 1;
v_line VARCHAR2 (32767):= 'NULL';
v_line2 VARCHAR (32767):='NULL';
v_data_array wwv_flow_global.vc_arr2;
v_rows NUMBER;
v_sr_no NUMBER:=1;
dlm VARCHAR(1):=':';
array_count NUMBER:=1;



BEGIN

-- Read data from wwv_flow_files

select blob_content into v_blob_data from wwv_flow_files
where last_updated = (select max(last_updated) from wwv_flow_files where UPDATED_BY = :APP_USER)
and id = (select max(id) from wwv_flow_files where updated_by = :APP_USER);

v_blob_len := dbms_lob.getlength(v_blob_data);
v_position := 1;


-- Read and convert binary to char

WHILE (v_position <= v_blob_len ) LOOP
v_raw_chunk := dbms_lob.substr(v_blob_data,c_chunk_len,v_position);
v_char := chr(hex_to_decimal(rawtohex(v_raw_chunk)));
v_line := v_line || v_char;
v_position := v_position + c_chunk_len;


-- When a whole line is retrieved

IF v_char = CHR(10) THEN


-- Convert comma to : to use wwv_flow_utilities

v_line := REPLACE (v_line,',',':');


-- Convert each column separated by : into an array of data

v_data_array := wwv_flow_utilities.string_to_table (v_line);


-- If I run a process to produce the output of the data array onto the screen, the values displayed are in the pattern desired.


-- Inserting Dummy values instead of values from the Array (v_data_array) to test

EXECUTE IMMEDIATE 'insert into TABLE.STG (DUMMY01, DUMMY02 ..... DUMMY65)

VALUES (
60040,
5,
TO_DATE("14-Dec-12","DD-MON-RRRR HH:MI:SS"),
"ABC",
"ABC",
60,
"11DEC1201",
"331",
"SCOTT TIGER",
"MARKETING",
21327,
21328,
"scott@abc.com",
NULL,
"SCOTT 1",
"AUDIT",
21327,
21328,
"scott1@abc.com",
NULL,
"SCOTT2",
"SCOTT3",
"HOUSE",
NULL,
NULL,
NULL,
NULL,
NULL,
33181,
81302,
NULL,
NULL,
NULL,
"20-Jun-68",
"UNO",
"HOUSE",
"ABC",
NULL,
NULL,
NULL,
"XYZ",
2650,
5250,
"ABC",
"XYZ",
NULL,
11620,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL,
NULL)';


-- Clear out

v_line := 'NULL';
v_sr_no := v_sr_no + 1;

END IF;

END LOOP;

END;
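
For reference, binding the parsed values with a USING clause instead of embedding literals in the statement would look something like this (a sketch only; the column names and array positions shown are assumed):

DECLARE
  v_data_array wwv_flow_global.vc_arr2;
BEGIN
  -- Bind variables sidestep all quoting of values inside the statement text.
  EXECUTE IMMEDIATE
    'insert into TABLE.STG (DUMMY01, DUMMY02, DUMMY07) values (:1, :2, :3)'
    USING to_number(v_data_array(1)), to_number(v_data_array(2)), v_data_array(7);
END;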


Please note that the CSV does not contain all of the fields: the CSV files expected to be loaded into the system contain 65 fields, whereas the table has 73. Also note that the same load runs fine through SQL*Loader invoked from a different server, which I need to release; hence this attempt to load the table through APEX. The SQL*Loader procedure on that server is also quite complex, involving Java, Unix shell scripts, etc., which I would prefer to avoid.

Any help is much appreciated.

Thanks & Regards.

Edited by: 975015 on 03-Jan-2013 21:03
  • 1. Re: Unable to upload CSV File through APEX 4.2 (Blob) to Oracle table.
    fac586 Guru

    Please update your forum profile with a real handle instead of "975015".
    I needed to add a page to my existing APEX application that would allow the user to upload a file. I followed an online tutorial in which the author created a dummy table and loaded a CSV file into it through APEX.
    Why? With APEX 4.2 the ability to create applications with data loading capabilities is built-in.

    If you're using any online tutorials or articles, then provide a link to them.
    The icing on the cake is that APEX does not produce any error; this lack of debugging (especially line-by-line debugging) is such a pain.
    APEX has many debugging features, most importantly Debug Mode. In properly designed applications based on PL/SQL program units, full debugging using SQL Developer is available.
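
    For example, instrumentation can be added to the upload process so that values appear in the debug report when the page is run in Debug Mode. A sketch using the APEX_DEBUG API (named APEX_DEBUG_MESSAGE before 4.2); the variable names are taken from the code above:
    {code}
    -- Log each parsed line and its columns to the APEX debug output.
    apex_debug.message('line %s parsed into %s columns', v_sr_no, v_data_array.count);
    FOR i IN 1 .. v_data_array.count LOOP
      apex_debug.message('column %s = %s', i, v_data_array(i));
    END LOOP;
    {code}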

    Always post code wrapped in {code}...{code} tags as described in the FAQ.
    CREATE TABLE "TABLE"."STG" (
    
         "DUMMY01" NUMBER, 
         "DUMMY02" NUMBER, 
    ...
    DO NOT use quoted identifiers unnecessarily.
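
    Unquoted, the same declaration is case-insensitive and far less error-prone; the first few columns would simply be:
    {code}
    CREATE TABLE stg (
      dummy01 NUMBER,
      dummy02 NUMBER,
    ...
    {code}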
    EXECUTE IMMEDIATE 'insert into TABLE.STG (DUMMY01, DUMMY02 ..... DUMMY65)
    ...
    DO NOT use dynamic SQL unnecessarily.
    VALUES (
    60040,
    5,
    TO_DATE("14-Dec-12","DD-MON-RRRR HH:MI:SS"),
    "ABC",
    "ABC",
    ...
    Certainly a major part of the problem is the use of double quotes in this dynamic SQL. As described in the link above, in Oracle SQL double quotes (") are used to delimit identifiers, and single quotes (') to delimit string values. When a single quote is required to appear inside a text literal like this dynamic SQL statement, +two single quotes+ are used to represent it:
    {code}
    EXECUTE IMMEDIATE 'insert into TABLE.STG (DUMMY01, DUMMY02 ..... DUMMY65)
    VALUES (
    60040,
    5,
    TO_DATE(''14-Dec-12'',''DD-MON-RRRR HH:MI:SS''),
    ''ABC'',
    ''ABC'',
    ...
    {code}
    The better approach when working with dynamic SQL is to use the <a href="http://docs.oracle.com/cd/E11882_01/server.112/e26088/sql_elements003.htm">alternative quoting mechanism</a>:
    {code}
    EXECUTE IMMEDIATE q'{insert into TABLE.STG (DUMMY01, DUMMY02 ..... DUMMY65)
    VALUES (
    60040,
    5,
    TO_DATE('14-Dec-12','DD-MON-RRRR HH:MI:SS'),
    'ABC',
    'ABC',
    ...
    }';
    {code}
    The +best+ approach is not to use dynamic SQL unnecessarily, and it's not necessary here.
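
    A static statement referencing the parsed array directly would look something like this (a sketch only; the mapping of array positions to columns and the date format are assumed):
    {code}
    INSERT INTO stg (dummy01, dummy02, dummy04, dummy05)
    VALUES (to_number(v_data_array(1)),
            to_number(v_data_array(2)),
            to_date(v_data_array(4), 'DD-MON-RR'),
            v_data_array(5));
    {code}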
