Usually mapped-drive access to the log is sufficient, as only admins look at it.
You can also direct it to a folder served by a web server (HFM needs one anyway); it can then be viewed via a URL.
You could use a Task Flow that does the consolidations in the proper order.
You could write a call in the Rule file to consolidate those scenarios before the data is copied over.
Either of the above would prevent user error and ensure consistency if there is turnover in the department.
You can define the output folder as a share, which has already been mentioned. That said, I strongly advise against any solution that will regularly generate an external file in production, for the reasons below.
1) The output file is generated by the DCOM user. If you want this user to generate a file in a specific location, that location must be writeable by the DCOM user and readable by the intended human user. Make sure both the NTFS and the share-level permissions on the target file and its containing folder allow for this. Latency on the file write can also degrade performance.
2) Most HFM implementations have two or more HFM application servers. You have no control over which server will write the file, and no control over what happens when two or more servers try to write to the same file at the same time. If Sub Calculate for every entity on every server has to open a single file, you effectively make HFM single-threaded, because access to the file object is serialized. Writing out to a database instead could remain multi-threaded, but that is far more complex.
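To make the serialization point concrete, here is a minimal sketch — not HFM rule code, just Python with POSIX advisory locks standing in for whatever locking the file system imposes. Several threads play the role of several application servers; because every writer must take the same exclusive lock on the one shared file, they queue up and write one at a time:

```python
import fcntl
import tempfile
import threading

def append_line(path, line):
    """Append one line to a shared file under an exclusive lock."""
    with open(path, "a") as f:
        # Every writer must acquire this exclusive lock first, so all
        # writers queue up behind it -- the "single-threading" effect
        # a single shared output file imposes.
        fcntl.flock(f, fcntl.LOCK_EX)
        f.write(line + "\n")
        fcntl.flock(f, fcntl.LOCK_UN)

# Four threads standing in for four HFM app servers: the lock
# serializes them, so every line arrives intact but one at a time.
log = tempfile.NamedTemporaryFile(mode="w", delete=False, suffix=".log")
log.close()
threads = [threading.Thread(target=append_line, args=(log.name, f"server-{i}"))
           for i in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

The file names and "server" labels here are invented for the illustration; the takeaway is only that one shared file turns concurrent writers into a queue.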
3) Wait state: as in #2, this single-threads the process, and you must decide whether your process errors out while another process holds the file, or ignores the failure and simply proceeds. If you decide to wait until the file becomes available, this can have a significant impact on processing time.
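The wait-or-fail decision can be sketched as a retry loop around the open call. This is an assumption-laden Python illustration, not HFM code: on Windows, opening a file another process holds raises a sharing-violation `OSError`, and the loop below polls until the open succeeds or a timeout expires, at which point it surfaces the error instead of waiting forever:

```python
import os
import tempfile
import time

def open_when_available(path, mode="a", timeout=5.0, poll=0.25):
    """Try to open `path`, retrying until `timeout` seconds elapse.

    Waiting keeps the write from being lost but stalls the caller;
    raising immediately would be faster but drops the output. The
    `timeout` is the compromise between the two.
    """
    deadline = time.monotonic() + timeout
    while True:
        try:
            return open(path, mode)
        except OSError:
            if time.monotonic() >= deadline:
                raise  # give up: surface the error rather than wait forever
            time.sleep(poll)

# Usage: when no other process holds the file, the open succeeds at once.
path = os.path.join(tempfile.mkdtemp(), "consol.log")
with open_when_available(path) as f:
    f.write("consolidation finished\n")
```

The function name, timeout, and polling interval are all hypothetical choices; the point is only that "wait until available" has to be bounded somehow, or a stuck writer stalls every consolidation behind it.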
While I have done implementations where the overall solution required combining data across multiple scenarios, I consistently find this a cumbersome, error-prone, and poorly performing approach. For this reason, I always try to keep all of the data required in a single scenario. This is one key reason why the "DataSource" or "DataType" (or whatever you call it) approach is so popular and successful.
Finally, you cannot write to the HFM system messages. You can, on the other hand, use Calc Manager to write out to a system log.
The first point is appreciated, thanks.
Secondly, what is the command to consolidate other scenarios?
Thanks for the advice, it was helpful.
How do I write a custom message such as "Scenario XX does not have a proper calculation status" to the system messages?