1 Reply Latest reply: Aug 17, 2012 8:59 AM by AdamMartin

    Manage historical data in OLAP

    956315
      Hi All,

      We have been preparing incident and availability reports for many clients for many years, using OLAP processing at the back end for the reports. After more than 3 years, the amount of data stored in the OLAP tables has grown very large, and the front end that shows the reports is very slow. Most of the time the generated reports only need the last 3 months of data, but sometimes we need the old data as well, so we cannot delete it from the OLAP tables. We have tried creating more and more indexes but could not improve the performance of the reports. Can someone tell me how to manage a huge amount of data in OLAP systems in this scenario without affecting performance?
        • 1. Re: Manage historical data in OLAP
          AdamMartin
          It looks like you are writing about the physical tables and indexes in your data model. In this forum, OLAP means something different, specifically the OLAP option of Oracle, which involves building multidimensional cubes in an analytic workspace. The cubes are generally loaded using a data warehouse or star schema as the source.

          You could use OLAP to build cubes based on your tables, but I think that would be jumping ahead a little.

          To address your immediate performance issues, you may want to consider partitioning the large tables you described, and using local indexes. Since the most recent data is needed most, consider partitioning on time (perhaps by month or quarter).
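          For example, a range-partitioned table with a local index might look something like the sketch below. The table and column names (incident_fact, incident_date, client_id, and so on) are just placeholders for whatever your real schema uses, and the interval partitioning syntax assumes Oracle 11g or later; treat it as a starting point, not a drop-in script.

              -- Hypothetical fact table, range partitioned by month on the report date.
              -- Table and column names are placeholders; adapt them to your actual schema.
              CREATE TABLE incident_fact (
                incident_id    NUMBER        NOT NULL,
                client_id      NUMBER        NOT NULL,
                incident_date  DATE          NOT NULL,
                status         VARCHAR2(30),
                resolution_hrs NUMBER
              )
              PARTITION BY RANGE (incident_date)
              INTERVAL (NUMTOYMINTERVAL(1, 'MONTH'))   -- new partition created automatically each month
              (
                PARTITION p_initial VALUES LESS THAN (DATE '2010-01-01')
              );

              -- A local index is partitioned the same way as the table,
              -- so queries on recent months only touch the recent index partitions.
              CREATE INDEX incident_fact_client_ix
                ON incident_fact (client_id, incident_date)
                LOCAL;

              -- A report on the last 3 months prunes down to just a few partitions.
              SELECT client_id, COUNT(*)
              FROM   incident_fact
              WHERE  incident_date >= ADD_MONTHS(TRUNC(SYSDATE, 'MM'), -3)
              GROUP  BY client_id;

          With a layout like this, a report that filters on the last 3 months only reads the most recent partitions (partition pruning), while the older partitions and their local index segments stay out of the way. That lets you keep the historical data available for the occasional older report without paying for it on every query.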

          Without knowing more about your system, that's the most advice I can give at the moment. Please describe your tables and indexes in more detail, and also describe the reports (or post the SQL) if you would like additional help.