OCINumberToReal and SQLT_NUM
Hi,
I have the following code (note: the variable is declared as errcode and checked against OCI_SUCCESS, since OCI_ERROR is -1 and a `> 0` test would never catch it):

    unsigned short datatype[255];
    unsigned char *outvar[255];
    double num_val;
    int i = 0;
    sword errcode;

    if (datatype[i] == SQLT_NUM)
    {
        errcode = OCINumberToReal((OCIError *) errhp,
                                  (CONST OCINumber *) outvar[i],
                                  (uword) sizeof(double),
                                  (dvoid *) &num_val);
        if (errcode != OCI_SUCCESS)
            printf("\nOCINumberToReal error code is: %d\n", errcode);
        else
            printf("value is %f\n", num_val);
    }
outvar[i] has been defined with OCIDefineByPos using a char[21] buffer, as described in the 11g OCI documentation.
The problem is that the call to OCINumberToReal reports no error, yet the value 2.22 stored in the database is printed as -82004701173600.000000.
The SELECT retrieves values from a NUMBER datatype column in the database.