2 Replies Latest reply on Nov 14, 2012 5:15 PM by beyerch2

    Importing multiple amount columns from a single text file

      I'm sure this question has been addressed many times. I have tried to search for an answer here and other areas, but I have not been able to find a clear answer yet. I am relatively new to HFM and FDM and thus do not have a lot of experience to fall back on. I am primarily a Planning/Essbase person. That being said, here is my question:

      I have a data source (a text file) containing two amount columns that I need to load to HFM via FDM. One column holds Average exchange rates and the other holds Ending exchange rates. I have been asked to develop a process that loads both columns of data to HFM in a single pass (one Import Format), and I've been told this is possible by writing an Import DataPump script. It seems I would need to create a temporary record set based on the original source file and modify it so that it contained a duplicate set of records, where the first set would be used for the Average rate and the second set for the Ending rate. This would be a piece of cake using SQL against a relational source, but that's obviously not the case here. I do have some experience writing FDM scripts, but only from an If...Then...Else standpoint based on metadata values.

      If there is anyone out there that has time to help me with this, it would be most appreciated.

        • 1. Re: Importing multiple amount columns from a single text file
          This is relatively easy to achieve with a single import script associated with the Account source field (assuming AverageRate and EndRate are accounts in your application) in your import format.

          Essentially your first amount column, say AverageRate, would be set as the default Amount field, and those values would be loaded as if the file contained a single value column. For the second value, EndRate, you would insert the value directly into the FDM work table, which is the temporary table populated when data is imported from a file during the import process. The example code snippet below should give you guidance on how this is done:

          'Get name of temp import work table
          strWorkTableName = RES.PstrWorkTable

          'Open a recordset on the work table
          Set rsAppend = DW.DataAccess.farsTable(strWorkTableName)

          'Parse the EndRate amount from the current source line (strRecord).
          'The field position (shown here as field 4 of 16) is an assumption;
          'substitute the actual position of the EndRate column in your file.
          dblEndRate = DW.Utilities.fParseString(strRecord, 16, 4, ",")

          If IsNumeric(dblEndRate) Then
                    If CDbl(dblEndRate) <> 0 Then
                         'Create a new record, and supply it with its field values
                         rsAppend.AddNew
                         rsAppend.Fields("DataView") = "YTD"
                         rsAppend.Fields("PartitionKey") = RES.PlngLocKey
                         rsAppend.Fields("CatKey") = RES.PlngCatKey
                         rsAppend.Fields("PeriodKey") = RES.PdtePerKey
                         rsAppend.Fields("CalcAcctType") = 9
                         rsAppend.Fields("Account") = "EndRate"
                         rsAppend.Fields("Amount") = CDbl(dblEndRate)
                         rsAppend.Fields("Entity") = DW.Utilities.fParseString(strRecord, 16, 1, ",")
                         rsAppend.Fields("UD1") = DW.Utilities.fParseString(strRecord, 16, 2, ",")
                         rsAppend.Fields("UD2") = DW.Utilities.fParseString(strRecord, 16, 3, ",")
                         rsAppend.Fields("UD3") = DW.Utilities.fParseString(strRecord, 16, 16, ",")
                         'Append the record to the work table
                         rsAppend.Update
                    End If
          End If

          'Close recordset
          rsAppend.Close
          In addition, the return value of this import script should be "AverageRate", i.e. the name of the account associated with the first value field. The NZP expression also needs to be put on the Amount field in the import format to ensure that the EndRate value is always processed even if the value of AverageRate is zero.
          • 2. Re: Importing multiple amount columns from a single text file

            The only issue I have with your suggested solution is that it would be slow if there are a lot of records.

            My thought would be to 'map' the second Rate to an unused UD/attribute field.

            Then, after all the rows are processed, execute a T-SQL script to clone all the records, updating the cloned records to pull in the second amount and change the account name.

            This would be much faster for large data sets.
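
            The clone-and-update idea above boils down to one set-based INSERT...SELECT against the work table. The sketch below uses SQLite via Python purely as a runnable stand-in for T-SQL against SQL Server; the table name, the columns, the choice of UD4 as the "parked" EndRate attribute, and the sample data are all assumptions for illustration, not FDM's actual work-table schema.

            ```python
            import sqlite3

            # Hypothetical mini work table standing in for FDM's import work table.
            con = sqlite3.connect(":memory:")
            con.execute("""
                CREATE TABLE WorkTable (
                    Account TEXT, Entity TEXT, Amount REAL, UD4 TEXT
                )
            """)

            # During import, each row carries the AverageRate amount, with the
            # EndRate amount parked in an otherwise unused attribute column (UD4).
            con.executemany(
                "INSERT INTO WorkTable VALUES (?, ?, ?, ?)",
                [("AverageRate", "E100", 1.25, "1.30"),
                 ("AverageRate", "E200", 0.85, "0.90")],
            )

            # Clone every imported row in one statement, swapping in the parked
            # EndRate amount and renaming the account.
            con.execute("""
                INSERT INTO WorkTable (Account, Entity, Amount, UD4)
                SELECT 'EndRate', Entity, CAST(UD4 AS REAL), UD4
                FROM WorkTable
                WHERE Account = 'AverageRate'
            """)

            rows = con.execute(
                "SELECT Account, Entity, Amount FROM WorkTable "
                "ORDER BY Account, Entity"
            ).fetchall()
            for row in rows:
                print(row)
            ```

            Because the clone runs as a single statement over all rows, the database does the copying instead of a per-record script loop, which is where the speed advantage for large files comes from.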