What is the size of the cube?
What is the difference in the number of page files before and after aggregation?
What is the hit ratio on the Index and Data Cache?
Is this a migrated application? If so, how long did the aggregation take in the base version?
A few general things to reduce the calc time:
1. Use parallel calculation,
2. Set the calc task dimensions (CALCTASKDIMS),
3. Increase the data and index caches depending on the hit ratios,
4. Use an hourglass / hourglass-on-a-stick outline order,
5. Restructure the database, etc.
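Items 1 and 2 above can be sketched in a calc script like this. This is only an illustration: the thread count, task-dimension count, and cache setting are placeholders you would tune for your own server and outline, not recommended values.

```
/* Sketch only -- tune the numbers to your hardware and outline */
SET CALCPARALLEL 4;   /* use up to 4 threads for this calculation */
SET CALCTASKDIMS 2;   /* use the last 2 sparse dims to split the calc into parallel tasks */
SET CACHE HIGH;       /* request the high calculator-cache setting from essbase.cfg */

CALC ALL;
```

Note that parallel calculation only helps if the task dimensions generate enough independent tasks; check the application log to see how many tasks were actually scheduled.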
If you think 10 minutes is a long time, I dread to think what you would make of some of our calcs!!
Try changing your script to just an AGG command on your sparse dims and see how long that takes. My guess is that calculating the dense dims is responsible for the majority of the calc time. If so, making the upper-level members of the dense dims dynamic might be the best bet, but I can't be sure what implications that would have for the functionality of the rest of your model.
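For the test above, something like the following would aggregate only the sparse dimensions. The dimension names here are hypothetical (taken from the Sample.Basic demo layout); substitute the sparse dims from your own outline.

```
/* Aggregate sparse dims only -- dense dims are left alone,
   so the timing difference isolates the dense-calc cost */
AGG ("Product", "Market");
```

Comparing this run time against your full calc should tell you how much of the 10 minutes is spent on the dense dimensions.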