6 Replies Latest reply on May 2, 2012 2:12 PM by Fischert

    IR group by function and timestamp datatype

      Are there any limitations on the GROUP BY function, e.g. based on the datatype?

      My problem/misunderstanding is with the IR:

      I have a TIMESTAMP column (column name TM_TIME) and I can choose it in the GROUP BY clause, but not in the function section. Is there a reason why, and how could I get it to work?


      Edited by: Fischert on 02.05.2012 03:05
        • 1. Re: IR group by function and timestamp datatype
          I think it has to do with the basic concepts of SQL and the TIMESTAMP datatype.
          The TIMESTAMP datatype stores fractional seconds. So when you perform a GROUP BY on a TIMESTAMP column, the probability of two or more rows having exactly the same fractional seconds is almost zero.
          As a consequence, practically no grouping/aggregation takes place and the number of rows returned is almost the same as without the GROUP BY clause.

          Food for thought?
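
          To make the point concrete, here is a quick sketch (ORDERS is a hypothetical table name, holding the TM_TIME column from the original post):

          ```sql
          -- Every TM_TIME value is unique down to fractional seconds,
          -- so this returns roughly one "group" per row:
          SELECT tm_time, COUNT(*)
          FROM   orders
          GROUP  BY tm_time;
          ```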

          • 2. Re: IR group by function and timestamp datatype
            Hi Pradbodh,
            thank you.
            That's what I mean: choosing TIMESTAMP in GROUP BY is possible but useless, while it's impossible to select it as the column for a function like MIN or MAX. That is exactly where TIMESTAMP could be very useful.

            What do you think?

            • 3. Re: IR group by function and timestamp datatype
              It really depends on the granularity you need from the timestamp to make it meaningful. The key word is aggregation, so you should ask: what level of aggregation do I need?

              You can reduce the granularity from fractional seconds to some coarser level, such as seconds, minutes, hours or even the date (the equivalent of trunc(date_column)), using CAST, TRUNC, or to_date(to_char(...)) with an appropriate format mask.

              It just depends on the application and data.
              E.g. for the LHC at CERN chasing the Higgs boson, a timestamp is just not fine-grained enough (you need 10^-23 seconds or lower, I guess!).
              E.g. if you look at seismic data for oil exploration, a hell of a lot happens in the 5th and 6th decimal places of the timestamp.
              But if you are looking at data logged by a normal SCADA system, then maybe a second is detailed enough for the purpose.

              In normal business applications we are better off "rounding" the timestamp to some meaningful level for aggregation/reporting.
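
              A minimal sketch of that "rounding" idea in plain SQL (ORDERS is a hypothetical table holding the TM_TIME column from the original post):

              ```sql
              -- CAST to DATE drops the fractional seconds,
              -- TRUNC(date, 'HH24') then rounds down to the hour:
              SELECT TRUNC(CAST(tm_time AS DATE), 'HH24') AS tm_hour,
                     COUNT(*)                             AS events
              FROM   orders
              GROUP  BY TRUNC(CAST(tm_time AS DATE), 'HH24')
              ORDER  BY tm_hour;

              -- Alternatively, reduce to whole seconds with a format mask:
              SELECT TO_TIMESTAMP(TO_CHAR(tm_time, 'YYYY-MM-DD HH24:MI:SS'),
                                  'YYYY-MM-DD HH24:MI:SS') AS tm_sec,
                     COUNT(*)
              FROM   orders
              GROUP  BY TO_TIMESTAMP(TO_CHAR(tm_time, 'YYYY-MM-DD HH24:MI:SS'),
                                     'YYYY-MM-DD HH24:MI:SS');
              ```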

              • 4. Re: IR group by function and timestamp datatype
                From my point of view there is no reason not to allow the DATE, TIME or TIMESTAMP datatypes with the MIN and MAX functions.

                Or is there any reason?

                • 5. Re: IR group by function and timestamp datatype
                  Not sure how you are trying it. I was able to build an IR with GROUP BY on a TIMESTAMP column and MIN and MAX functions on other columns.

                  Here is how I did it, just for you to compare the steps.

                  a. Created an IR with a simple query with two columns, Reference_No (VARCHAR2) and Date_Posted (TIMESTAMP). No GROUP BY or aggregation functions
                  b. At run time, opened Actions > Format > Group By
                  c. Selected the Timestamp column in Group By Column 1
                  d. Added Min > Reference_no > Min Ref in the first row of the first table
                  e. Added Max > Reference_no > Max Ref in the second row
                  f. Clicked Apply

                  It works!
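
                  For reference, the source query behind step a was nothing more complicated than this (the table name is just my test data, so treat it as a placeholder):

                  ```sql
                  SELECT reference_no,   -- VARCHAR2
                         date_posted     -- TIMESTAMP
                  FROM   my_test_table;
                  ```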

                  • 6. Re: IR group by function and timestamp datatype
                    Sorry for my poor English. What I meant is that the date/time datatypes are not available as columns for functions like MIN and MAX.

                    The answer on Metalink:

                    As you have noticed, DATE and TIMESTAMP columns do not allow functions like MIN, MAX, etc. in an interactive report.
                    Currently those functions are only supported for columns of type NUMBER.
                    The problem has been reported to development as a bug:


                    As far as I can see, it will be addressed in the next release (APEX 4.2). Unfortunately we do not have a timeframe for the next release.
                    For the time being, please monitor the bug via My Oracle Support.