2 Replies Latest reply on Jan 7, 2020 10:00 AM by user11763611

    Distinguish whole and real numbers in numeric expressions with OCI_ATTR_PRECISION/OCI_ATTR_SCALE




      In our OCI application we need to describe the column types of SQL statements.


      We use the OCI APIs for that:


      /* describe one select-list position; error checking omitted for brevity */
      OCIParam *p = NULL;
      ub2 data_type = 0;
      ub2 precision = 0;   /* ub2 on implicit describe */
      sb1 scale     = 0;

      OCIParamGet(stmth, OCI_HTYPE_STMT, eh, (void **)&p, pos);
      OCIAttrGet(p, OCI_DTYPE_PARAM, &data_type, 0, OCI_ATTR_DATA_TYPE, eh);
      OCIAttrGet(p, OCI_DTYPE_PARAM, &precision, 0, OCI_ATTR_PRECISION, eh);
      OCIAttrGet(p, OCI_DTYPE_PARAM, &scale, 0, OCI_ATTR_SCALE, eh);


      When the SQL statement contains numeric expressions such as (col * 2), or aggregates like SUM(col), the data type is SQLT_NUM and the precision and scale are ALWAYS ZERO, regardless of whether the original column type is NUMBER(P,0)/INTEGER or NUMBER(P,S).


      As a result, we cannot distinguish NUMBER(P,0)/INTEGER results from NUMBER(P,S) results, so we cannot decide whether to format the value with or without decimal digits.
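      To make the goal concrete, this is essentially the formatting decision we want to drive from the describe metadata (a minimal sketch; format_number is a hypothetical helper of ours, not an OCI API, and scale is the sb1 value read via OCI_ATTR_SCALE):

```c
#include <stdio.h>
#include <stddef.h>

/* Hypothetical helper: render a fetched numeric value either as a whole
   number (described scale <= 0) or with the described number of decimals. */
void format_number(double value, signed char scale, char *buf, size_t buflen)
{
    if (scale <= 0)
        snprintf(buf, buflen, "%.0f", value);   /* NUMBER(P,0)/INTEGER style */
    else
        snprintf(buf, buflen, "%.*f", scale, value); /* NUMBER(P,S) style */
}
```

      The problem is precisely that for expression columns the described scale is always 0, so this helper would wrongly drop the decimals of SUM(cn).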


      Is there another OCI descriptor attribute that we could use?


      SQL*Plus seems to handle this well: the result of a whole-number expression is displayed without a decimal point, while the result of a real-valued expression shows its decimals:


      SQL> create table mytable ( ci integer, cn number(10,2) );

      SQL> insert into mytable values ( 123, 123.45 );

      SQL> select sum(ci), sum(cn) from mytable;

         SUM(CI)    SUM(CN)

      ---------- ----------

             123     123.45


      SQL> select ci * 0.3,  cn * 3 from mytable;

          CI*0.3       CN*3

      ---------- ----------

            36.9     370.35


      How is this done?


      We know the workaround is to use CAST(), but that means a lot of code changes for what we consider an OCI limitation...
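      For reference, this is the kind of CAST() rewrite we mean (using the table from the example above; the target types are our choice, since CAST makes the precision/scale visible to the describe again):

```sql
-- Each affected expression has to be wrapped so OCI describes
-- a concrete NUMBER(P,S) instead of an untyped SQLT_NUM:
select cast(sum(ci) as integer)       as sum_ci,
       cast(sum(cn) as number(12, 2)) as sum_cn
from   mytable;
```

      Doing this for every numeric expression across the application is the code change we are trying to avoid.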