The only way you'll view it through HFM (without customizing something) would be to copy it back to the original table.
Outside of HFM, it would be relatively easy to look at the data in the tables through a SQL data view tool. There are a couple of things you need to know, such as how to convert the date/time field.
cast(endtime - 2 as smalldatetime) will convert the times in the audit tables to a human-readable date/time.
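The offset arithmetic can be checked outside SQL. A minimal Python sketch, assuming endtime is stored as a 1900-epoch day serial (SQL Server's datetime day 0 is 1900-01-01, which is what the -2 adjustment implies); the function name is ours, not part of any HFM API:

```python
from datetime import datetime, timedelta

def hfm_endtime_to_datetime(endtime):
    """Mirror cast(endtime - 2 as smalldatetime) in Python.

    Assumes endtime is a day serial counted from the 1900 epoch,
    so subtracting 2 and adding to 1900-01-01 (SQL Server's day 0)
    yields the wall-clock timestamp.
    """
    return datetime(1900, 1, 1) + timedelta(days=endtime - 2)

# Example: serial 40854 lands on the date discussed in this thread.
print(hfm_endtime_to_datetime(40854))  # 2011-11-07 00:00:00
```

This is only a sanity check for the offset, not a substitute for doing the conversion in the query itself.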
The following query will count the number of activities per hour for a given date range (in this case 2011-11-07):
select datepart(HOUR, cast(endtime - 2 as smalldatetime)), count(activitycode)
from <appname>taskaudit
where cast(endtime - 2 as smalldatetime) between '2011-11-07 00:00:00' and '2011-11-07 23:59:59'
group by datepart(HOUR, cast(endtime - 2 as smalldatetime))
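For readers without SQL Server handy, the same hourly bucketing can be sketched in Python with SQLite. SQLite has no datepart(), so the serial is first converted to a timestamp through a user-defined function; the table layout and rows below are made up for illustration, and only the grouping logic mirrors the query above:

```python
import sqlite3
from datetime import datetime, timedelta

def serial_to_iso(serial):
    # Convert the 1900-epoch day serial (minus the 2-day offset) to an
    # ISO timestamp, rounding to the nearest second to absorb float noise.
    dt = datetime(1900, 1, 1) + timedelta(days=serial - 2, seconds=0.5)
    return dt.replace(microsecond=0).isoformat(" ")

conn = sqlite3.connect(":memory:")  # illustrative stand-in for the HFM schema
conn.execute("create table taskaudit (endtime real, activitycode integer)")
# Fabricated serials: activities at 01:00 (x2), 09:00 (x3), 17:00 on 2011-11-07.
rows = [(40854 + h / 24.0, 1) for h in (1, 1, 9, 9, 9, 17)]
conn.executemany("insert into taskaudit values (?, ?)", rows)

conn.create_function("serial_to_iso", 1, serial_to_iso)
counts = conn.execute(
    "select strftime('%H', serial_to_iso(endtime)) as hr, count(activitycode) "
    "from taskaudit "
    "where serial_to_iso(endtime) between '2011-11-07 00:00:00' and '2011-11-07 23:59:59' "
    "group by hr order by hr"
).fetchall()
print(counts)  # [('01', 2), ('09', 3), ('17', 1)]
```

The strftime('%H', ...) call plays the role of datepart(HOUR, ...); everything else is a one-for-one translation of the T-SQL.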
Use the Audit Extract utility bundled with HFM to export this information periodically to a CSV file that you can view offline through any text editor, or even Excel. There is a command line feature for this as well, so you could incorporate this into a batch routine.
For anyone attending Kaleidoscope in San Antonio, TX next month, I will present the various utilities that ship with HFM including this one. Hope to see you there!
No, the utility reads from the task_audit and data_audit tables, and outputs that to .csv format. Whatever you do with it from there is up to you, but the important thing is that it will convert from task codes to the activity, and decode the other fields. Certainly you could append it to another table outside the HFM schema, and use some other tool to read it.
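As a sketch of the "append it to another table outside the HFM schema" idea: the column names below are hypothetical (the utility's real CSV layout will differ), and SQLite stands in for whatever reporting database you actually use:

```python
import csv
import io
import sqlite3

# Hypothetical sample of the utility's CSV output; the real header row
# and values will differ, this only illustrates the load pattern.
sample_csv = io.StringIO(
    "username,activity,server,endtime\n"
    "admin,Consolidation,HFMWEB01,2011-11-07 09:15:00\n"
    "jsmith,Data Load,HFMWEB01,2011-11-07 09:42:00\n"
)

conn = sqlite3.connect(":memory:")  # stand-in for a reporting schema outside HFM
conn.execute(
    "create table task_audit_archive "
    "(username text, activity text, server text, endtime text)"
)
# DictReader yields one dict per CSV row, which maps directly onto
# named insert parameters.
reader = csv.DictReader(sample_csv)
conn.executemany(
    "insert into task_audit_archive values (:username, :activity, :server, :endtime)",
    reader,
)
```

From there any reporting tool that can read the archive table will do; the point is that the decoded CSV, not the raw HFM tables, is what you keep around.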
In my experience few people ever go back and read this information: it's typically regarded as an insurance policy, "in case someone might ask". I use the information during performance reviews when I'm investigating why it takes someone X hours to consolidate, but only on the 3rd day of each quarter. I can look at the task audit details and see the 50 other consolidations, plus the extended analytics extracts, that were going on at the same time.
The new command line support is very useful too. Other products such as Accelatis can help with archiving too, among many other things.
I like command line utilities since you can script them to automate things.
You can export the data from the web UI as well, but unless you write web scrapers in your free time, it wouldn't be nearly as simple to automate.
Assuming the OP's requirement is to truncate those tables (for performance) while retaining the data for potential review/audit (i.e., in a secondary table), would the CLI tool also truncate the tables, or would he still need to create a task/job to periodically clean up the tables?
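If the utility turns out not to truncate, the periodic cleanup job could be as simple as a retention-based delete. A hedged sketch against an illustrative in-memory copy of the table (the serial arithmetic mirrors the endtime convention discussed earlier in the thread; test against a copy before touching a production schema):

```python
import sqlite3
from datetime import datetime, timedelta

def to_serial(dt):
    # Inverse of the endtime conversion: days since 1900-01-01, plus the
    # 2-day offset, assuming endtime is a 1900-epoch day serial.
    return (dt - datetime(1900, 1, 1)) / timedelta(days=1) + 2

conn = sqlite3.connect(":memory:")  # illustrative copy, not the live schema
conn.execute("create table taskaudit (endtime real, activitycode integer)")
conn.executemany(
    "insert into taskaudit values (?, ?)",
    [(to_serial(datetime(2011, 1, 1)), 1),    # old row, should be purged
     (to_serial(datetime(2011, 11, 1)), 2)],  # recent row, should survive
)

# Keep roughly the last month; everything older than the cutoff goes.
cutoff = to_serial(datetime(2011, 10, 1))
conn.execute("delete from taskaudit where endtime < ?", (cutoff,))
remaining = conn.execute("select count(*) from taskaudit").fetchone()[0]
print(remaining)  # 1
```

The obvious ordering matters: run the Audit Extract (or your own CSV export) first, confirm the archive landed, and only then delete from the live tables.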