My cube size is 2 GB. I did an export and re-import, but the size stayed the same. I then reloaded the data using buffered I/O with RLE compression and managed to bring it down to about 1.7 GB,
but I assume it can be reduced further.
I have 10 standard dimensions; two of them are dense (Account and Time) and the rest are sparse.
The outline follows the hourglass model.
Are there any other ways to optimize the data and the cube? And yes, it's a BSO cube.
What is the commit blocks setting for your Essbase database? Is it the default of 3,000?
Increase the commit blocks to 20,000, change the compression type to bitmap, and run the data export and reload again.
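A rough sketch of those steps in MaxL is below. The application/database name (Sample.Basic), the login credentials, and the export file name are placeholders, and the exact statement clauses should be verified against the MaxL reference for your Essbase version before running this:

```
/* Assumed names: Sample.Basic, admin/password, Sample_lev0.txt */
login admin password on localhost;

/* Raise the implicit commit interval to 20,000 blocks */
alter database Sample.Basic set implicit_commit after 20000 blocks;

/* Switch the compression type from RLE to bitmap */
alter database Sample.Basic set compression bitmap;

/* Re-export the data so it can be cleared and reloaded */
export database Sample.Basic level0 data to data_file 'Sample_lev0.txt';

logout;
```

Note that a compression change generally applies to blocks as they are rewritten, so the export/clear/reload cycle is what actually realizes the size reduction.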
In general, if the commit block interval needs to be set larger than 20,000, it is recommended to reconsider your block density. A poor choice of block density creates too many blocks and is a potential cause of increased fragmentation.