Hello All
While reading data from the DB, our middleware interface threw the following error:
java.sql.SQLException: Fail to convert between UTF8 and UCS2: failUTF8Conv
I understand that this failure is caused by a multi-byte character, and that the 10g driver fixes this bug.
I suggested that the integration admin team replace the current 9i driver with the 10g one, and they are working on it.
In addition, I wanted to point out to the data input team exactly where the failure occurred.
I asked them for the .dat file and downloaded it; my intention was to find out exactly where
the multi-byte character that caused this failure is located.
I wrote the following code to check this:
import java.io.*;

public class X
{
    public static void main(String ar[])
    {
        int linenumber = 1, columnnumber = 1;
        long totalcharacters = 0;
        try
        {
            File file = new File("inputfile.dat");
            FileInputStream fin = new FileInputStream(file);
            byte fileContent[] = new byte[(int) file.length()];
            fin.read(fileContent);
            for (int i = 0; i < fileContent.length; i++)
            {
                columnnumber++;
                totalcharacters++;
                if (fileContent[i] < 0 && fileContent[i] != 10 && fileContent[i] != 13 && fileContent[i] > 300) // if invalid
                {
                    System.out.println("failure at position: " + i);
                    break;
                }
                if (fileContent[i] == 10 || fileContent[i] == 13) // if new line
                {
                    linenumber++;
                    columnnumber = 1;
                }
            }
            fin.close();
            System.out.println("Finished successfully, total lines : " + linenumber + " total file size : " + totalcharacters);
        }
        catch (Exception e)
        {
            e.printStackTrace();
            System.out.println("Exception at Line: " + linenumber + " columnnumber: " + columnnumber);
        }
    }
}
But this shows that the file is fine and reports no issue.
Whereas the middleware interface fails with the above exception while reading exactly the same input file.
Am I doing anything wrong in how I am trying to locate that multi-byte character?
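One direction I am considering, though I have not verified it yet, is to decode the file with a strict UTF-8 decoder (java.nio.charset.CharsetDecoder with CodingErrorAction.REPORT), which should report the exact byte offset of any malformed sequence, closer to what the driver's UTF8-to-UCS2 conversion actually does. A rough sketch along those lines (the class name StrictUtf8Check and the 8 KB output buffer are just placeholders I picked; the file name is the same inputfile.dat as above):

import java.io.*;
import java.nio.*;
import java.nio.charset.*;

public class StrictUtf8Check
{
    public static void main(String[] args) throws IOException
    {
        File file = new File("inputfile.dat");
        byte[] bytes = new byte[(int) file.length()];
        DataInputStream din = new DataInputStream(new FileInputStream(file));
        din.readFully(bytes); // unlike a plain read(), this fills the whole array
        din.close();

        // A strict decoder reports malformed input instead of silently replacing it
        CharsetDecoder decoder = Charset.forName("UTF-8").newDecoder()
                .onMalformedInput(CodingErrorAction.REPORT)
                .onUnmappableCharacter(CodingErrorAction.REPORT);

        ByteBuffer in = ByteBuffer.wrap(bytes);
        CharBuffer out = CharBuffer.allocate(8192);
        CoderResult result;
        do
        {
            result = decoder.decode(in, out, true);
            if (result.isError())
            {
                // in.position() is the byte offset where the bad sequence starts
                System.out.println("Invalid UTF-8 sequence of length " + result.length()
                        + " at byte offset " + in.position());
                return;
            }
            out.clear(); // decoded characters are not needed, only the error location
        } while (result.isOverflow());

        System.out.println("File decodes cleanly as UTF-8");
    }
}

If this does report an offset, counting the newline bytes up to that offset should give the line and column to pass on to the data input team.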
Any help is greatly appreciated, everyone!
Thanks.