I am not 100% clear on a good backup strategy for HFM. The documentation from Oracle isn't too clear on some points.
My main concern is Financial Reports: what is the best way to back these up? Is it really exporting the reports from Workspace, or is there a way to do this on the server? I believe the files/reports are stored there.
Does anybody have any documentation on how they do their backups? Would be really helpful.
Backing up FR is fun. If you don't want to run exports / imports (and I don't blame you), you'll have to jump through a couple of hoops.
The actual reports are stored on your file system in the location you specified to serve as the Reporting and Analysis repository. If you don't recall where that is, open your workspace database and query the V8_SERVICEAGENT table. Look at the HOME_DIRECTORY value for the RM1 AGENT_UUID. (select HOME_DIRECTORY from V8_SERVICEAGENT where AGENT_UUID = 'RM1') You can then backup the files using any typical file backup approach.
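Once you know the repository path from that query, the "typical file backup approach" can be as simple as a timestamped copy. A minimal sketch (the paths here are placeholders, and Python is just one convenient way to script it):

```python
import shutil
from datetime import datetime
from pathlib import Path

def backup_repository(repo_dir: str, backup_root: str) -> Path:
    """Copy the Reporting and Analysis repository tree into a
    timestamped folder under backup_root, and return that folder."""
    src = Path(repo_dir)
    dest = Path(backup_root) / f"fr_repo_{datetime.now():%Y%m%d_%H%M%S}"
    shutil.copytree(src, dest)  # copies folders and report files as-is
    return dest

# Example (hypothetical paths -- substitute the HOME_DIRECTORY value
# the V8_SERVICEAGENT query returned for your environment):
# backup_repository(r"D:\Hyperion\ReportingAnalysis\data\RM1",
#                   r"E:\backups\fr")
```

Schedule something like this alongside your database backups so the file system and the repository tables stay roughly in sync.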
This may not solve all of your problems if you're trying to restore after an accidental report deletion, or if database corruption occurred along with losing the reports. To restore the reports into Workspace, you'd need (at a minimum) the following tables: V8_Container, V8_Folder, V8_Folder_Tree... Since those tables contain other information, and there are other tables you'd need as well (i.e. security), you may just have to settle for backing up the entire database, which is relatively straightforward. This isn't perfect, though: if your goal is just to restore individual reports from time to time, restoring the entire database would probably blow away other data you weren't looking to change.
With all that file system / database stuff said, I generally just export the reports periodically and then re-import as needed (rarely).
For our DR environment, we just clone our VMWare setup; therefore, it's a lot easier to deal with all of these issues. ;)
I know what you mean. There are many things that partially work or aren't worth the effort, but using LCM to back up artifacts such as FR reports, FM rules, data grids/forms, metadata (FM or EPMA), member lists, security, etc., works very well. It's quite easy to back up and restore these artifacts, and it's usually something an admin will feel comfortable doing after a time or two.
Version 11 works much better than version 9, and has a better interface through Shared Services.
The LCM process also differs between point releases of version 11, not just between major versions.
Backing up to the same environment is simply done through the front end of SS, but crossing environments requires a user folder on the server to be copied (can be done via batch script).
Yep. You can choose any repository object. It can be all reports, a certain folder, or selected reports.
The high level steps would be:
1) Export whatever you want to the file system via LCM (this exports within the current environment)
2) Find import_export server folder where LCM artifacts were exported: \Oracle\Middleware\user_projects\epmsystem1_FOUNDATION\import_export
3) Copy desired import_export subfolder to new environment
4) Open Shared Svcs in new environment and restore!
Why am i reading this while envisioning you as the ShamWow guy? :-)
One question though: when you restore, is it smart enough to bulk update connections? It seems to me it isn't. If I have multiple test applications, or a different database connection on the dev machine, then I'd have to update all of the reports to point to the new connection? This is annoying, since you'd have to go through and make the DB connection updates manually. While you can update multiple reports in one folder, I don't believe there's a way to point it at a parent folder and have it recursively go through the structure updating connections?
I feel more like the Dell dude today. I don't know of any way to bulk update the DB connection once migrated. You'd still have to do that process the same as when you do the export/import from the repository.
If you export the top level of the reports, it sends you a 7z file that contains the folders and DES files. The DES file is basically XML/plain text; therefore, if you know where the connection information is defined in the file, you can use something like WinGrep to bulk update everything.
It takes a little work to figure out where the connection data is stored in the DES files; once you do, though, it's pretty easy to work with.
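The same find-and-replace idea works in a script. A minimal sketch, assuming you've already extracted the 7z archive and that the connection name appears as plain text inside the DES XML (where exactly it lives varies, which is the "figure it out" part):

```python
from pathlib import Path

def retarget_connections(extracted_dir: str,
                         old_conn: str, new_conn: str) -> int:
    """Replace the old database-connection name in every .des file
    under extracted_dir; return how many files were changed."""
    changed = 0
    for des in Path(extracted_dir).rglob("*.des"):
        text = des.read_text(encoding="utf-8", errors="ignore")
        if old_conn in text:
            des.write_text(text.replace(old_conn, new_conn),
                           encoding="utf-8")
            changed += 1
    return changed

# Example (hypothetical connection names):
# retarget_connections(r"C:\temp\fr_export", "DevHFM_Conn", "ProdHFM_Conn")
```

Re-archive the tree and import it, and the reports come in pointing at the new connection.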