Backing up FR is fun. If you don't want to run exports / imports (and I don't blame you), you'll have to jump through a couple of hoops.
The actual reports are stored on your file system in the location you specified as the Reporting and Analysis repository. If you don't recall where that is, open your Workspace database and query the V8_SERVICEAGENT table. Look at the HOME_DIRECTORY value for the RM1 AGENT_UUID: (select HOME_DIRECTORY from V8_SERVICEAGENT where AGENT_UUID = 'RM1') You can then back up the files using any typical file backup approach.
This may not solve all of your problems if you're trying to restore after an accidental report deletion, or if database corruption occurred along with losing the reports. To restore the reports into Workspace, you'd need (at a minimum) the following tables: V8_Container, V8_Folder, V8_Folder_Tree... Since those tables contain other information, and there are other tables/information you'd need as well (e.g. security), you may just have to settle for backing up the entire database, which is relatively straightforward. This isn't perfect, though: if your goal is just to restore individual reports from time to time, restoring the entire database would probably blow away other data you weren't looking to change.
With all that file system / database stuff said, I generally just export the reports periodically and then re-import as needed (rarely).
For our DR environment, we just clone our VMWare setup; therefore, it's a lot easier to deal with all of these issues. ;)
Sounds like LCM would be excellent for this, and very easy too. FR reports can be backed up and the entire process can be automated.
The process can literally be set up in a few minutes to back up to the same environment. It takes just one extra step to copy to a different environment.
Restoring is very easy as well.
Check out Lifecycle Management in the guides. Here's some more info: https://support.oracle.com/epmos/faces/DocContentDisplay?id=1072301.1
We're not running that here, so bear with me: does it actually work as advertised? I've been let down by a couple of these things, so I'm curious whether you have some first-hand feedback on this one.
I know what you mean. There are many things that partially work or aren't worth the effort, but using LCM to back up artifacts such as FR reports, FM rules, data grids/forms, metadata (FM or EPMA), member lists, security, etc., works very well. It's quite easy to backup/restore these artifacts and usually is something an admin will feel comfortable doing after a time or two.
Version 11 works much better than version 9 and has a better interface through Shared Services; the exact process also differs between point releases, so check the docs for your release.
Backing up to the same environment is done entirely through the Shared Services front end, but crossing environments requires copying a user folder on the server (which can be done via batch script).
Give it a try. I think you will like it!
Just a quick follow-up: will LCM export just the HFM FR reports? This sounds like a great way to move reports from one environment to another (hosted on different machines).
I'm fairly new to LCM and just want to sync the reports being used by the HFM application. The application itself does not need to be migrated via LCM.
Yep. You can choose any repository object. It can be all reports, a certain folder, or selected reports.
The high-level steps would be:
1) Export whatever you want to the file system in LCM (this exports within the current environment)
2) Find the import_export server folder where the LCM artifacts were exported: \Oracle\Middleware\user_projects\epmsystem1_FOUNDATION\import_export
3) Copy the desired import_export subfolder to the new environment
4) Open Shared Services in the new environment and restore!
Why am i reading this while envisioning you as the ShamWow guy? :-)
One question, though: when you restore, is it smart enough to bulk-update connections? It seems to me it isn't. If I have multiple test applications or a different database connection on the dev machine, then I'd have to update all of the reports for the new connection? That's annoying, since you'd have to go through and make DB connection updates... While you can update multiple reports in one folder, I don't believe there's a way to point it at a parent folder and have it recursively go through the structure updating?
I feel more like the Dell dude today. I don't know of any way to bulk-update the DB connection once migrated. You'd still have to do that process the same as when you do the export/import from the repository.
DUDE, you got a DELL??!?!?!?!? :)
In regards to bulk updating, I do it off-line. :)
If you export the top level of the reports, it gives you a 7z file containing the folders and DES files. The DES file is basically XML/plain text; therefore, if you know where the connection information is defined in the file, you can use something like WinGrep to bulk-update everything.
It takes a little bit of work to figure out where the connection data is stored in the DES files; however, once you figure it out, it's pretty easy to work with.
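That offline bulk update could be scripted instead of done in WinGrep. A sketch only (the connection names and folder here are hypothetical, and the exact element holding the data source varies by release, so always work on a copy of the export):

```python
import os

OLD_CONN = "DevConnection"   # hypothetical old data source name
NEW_CONN = "ProdConnection"  # hypothetical new data source name
EXPORT_DIR = "fr_export"     # folder extracted from the 7z export

# Walk the extracted export and rewrite the connection string in
# every .des file -- the same thing WinGrep does interactively.
for root, _dirs, files in os.walk(EXPORT_DIR):
    for name in files:
        if not name.lower().endswith(".des"):
            continue
        path = os.path.join(root, name)
        with open(path, encoding="utf-8") as f:
            text = f.read()
        if OLD_CONN in text:
            with open(path, "w", encoding="utf-8") as f:
                f.write(text.replace(OLD_CONN, NEW_CONN))
```

Re-zip the result and import it, and every report comes in already pointed at the new connection.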