Use of information drives model design ...
You need a snapshot fact table that stores the balances monthly (keyed by whatever other dimensions are in the grain) ... then you're just comparing the (selected month/year) balance against the (month12/year-1) balance from the snapshot table. The logical variance and percent-change columns are then super easy to build.
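A minimal sketch of the idea, using SQLite in place of the warehouse; the table and column names (balance_snapshot, snap_year, snap_month) are hypothetical, not from the thread:

```python
import sqlite3

# Hypothetical monthly snapshot fact: one row per account per month-end.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE balance_snapshot (
    account_id  INTEGER,
    snap_year   INTEGER,
    snap_month  INTEGER,
    balance     REAL
);
INSERT INTO balance_snapshot VALUES
    (1, 2022, 12, 1000.0),   -- prior year-end (month12/year-1) balance
    (1, 2023,  6, 1250.0);   -- selected month balance
""")

# Selected month vs. December of the prior year, with the variance and
# percent-change columns built right in the query.
row = con.execute("""
SELECT cur.balance                                        AS cur_balance,
       prev.balance                                       AS prior_ye_balance,
       cur.balance - prev.balance                         AS variance,
       (cur.balance - prev.balance) * 100.0 / prev.balance AS pct_change
FROM   balance_snapshot cur
JOIN   balance_snapshot prev
       ON  prev.account_id = cur.account_id
       AND prev.snap_year  = cur.snap_year - 1
       AND prev.snap_month = 12
WHERE  cur.account_id = 1 AND cur.snap_year = 2023 AND cur.snap_month = 6
""").fetchone()
# row -> (1250.0, 1000.0, 250.0, 25.0)
```

The self-join touches exactly two snapshot rows per account, which is why this stays fast no matter how the front-end tool slices it.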
Thank you for the reply.
I'm not sure I understand how this can be implemented. However, I used the AGO function with a presentation variable holding the month as the offset parameter, and it worked just fine.
Performance Rule of Thumb is to push as much work back to the database as possible ... yes, you can do it logically, but I can assure you your approach only works for small data sets and it only works for the one tool (OBI) you've implemented it in -- in other words it's a limited approach. I'll never advise you on a limited approach.
Yes, you are right. The report took too long to run, and I also couldn't manage to calculate the change.
Can you please illustrate your approach in more detail?
How can I get the measure from the snapshot to filter on month12/year-1 while I'm filtering on a specific date from the time dimension?
You need to build the snapshot fact first ... to get you started ...
Okay, I built the snapshot fact table.
I don't understand how I will filter the measure from the snapshot to capture the end-of-last-year data, since I'm filtering on a specific date from the time dimension.
Now that you have your snapshot and the date on which it was taken, you can build a logical column which is pinned at the LAST aggregation level (based on snapshot date) and additionally you can logically filter that by your current year minus one.
I have found the LAST aggregation method to be extremely poor performing. In essence, if you have data at the "day" level and pull back the "last" value for a year, it pulls back data for all 365 days and then keeps only the last day that contains data. This is because LAST means "last day that has data", not "last day of the period".
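The distinction between the two semantics can be shown in plain SQL (SQLite here; daily_fact and its dates are made-up data for illustration). Note what happens when the true period-end date has no row:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE daily_fact (snap_date TEXT, balance REAL);
INSERT INTO daily_fact VALUES
    ('2023-12-28', 900.0),
    ('2023-12-29', 950.0);   -- no rows exist for 12-30 or 12-31
""")

# "LAST" semantics: scan the whole period, keep the last day that HAS data.
# This is the expensive shape -- the database reads the full year's rows.
last_with_data = con.execute("""
SELECT balance FROM daily_fact
WHERE snap_date BETWEEN '2023-01-01' AND '2023-12-31'
ORDER BY snap_date DESC LIMIT 1
""").fetchone()[0]
# -> 950.0 (the 12-29 row)

# "Last day of period" semantics: a direct lookup of the period-end date.
# Cheap, but returns nothing if that exact date has no row.
last_of_period = con.execute(
    "SELECT balance FROM daily_fact WHERE snap_date = '2023-12-31'"
).fetchone()
# -> None
```

The two queries can disagree, which is why an aggregation type that targets the period-end date directly would be both faster and semantically different, not just an optimization.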
We also report by last day of month / quarter / year. To do this, I first added columns to our date dimension that contain what the last day of each month / quarter / year should be. I then added aliases of the fact table and joined them using those "last date" columns in the date dimension. This works fine and is performant, but it adds quite a bit of complexity to the logical layer (multiple LTSs for each fact and the time dimension, etc.).
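The alias-join pattern above can be sketched like this (SQLite again; date_dim, last_day_of_month, and the sample dates are all hypothetical names, and the fact alias is just a second reference to the same table):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE date_dim (
    cal_date          TEXT PRIMARY KEY,
    last_day_of_month TEXT          -- precomputed period-end date
);
CREATE TABLE daily_fact (snap_date TEXT, balance REAL);
INSERT INTO date_dim VALUES
    ('2023-06-15', '2023-06-30'),
    ('2023-06-30', '2023-06-30');
INSERT INTO daily_fact VALUES
    ('2023-06-15', 1100.0),
    ('2023-06-30', 1250.0);
""")

# The fact alias joins on last_day_of_month instead of cal_date, so any
# date filtered in June resolves straight to the 30-Jun row -- a single
# indexed lookup rather than a scan-and-keep-last over the whole month.
row = con.execute("""
SELECT eom.balance
FROM   date_dim  d
JOIN   daily_fact eom ON eom.snap_date = d.last_day_of_month
WHERE  d.cal_date = '2023-06-15'
""").fetchone()
# -> (1250.0,)
```

In the RPD this becomes an extra logical table source per fact pointing at the alias, with the join pinned to the "last date" column, which is exactly where the added complexity lives.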
I submitted an enhancement request years ago to add a new aggregation type that would actually pick the last day of a period rather than the last day that had data (assuming "FIRST" works the same way) -- it would make things perform MUCH faster on the database side. But unfortunately that's never happened, so I've been stuck forcing the logical layer to work by adding complexity.
Hope this helps,
That's why I recommended a PHYSICAL structure FIRST ....