
PBCS - FDMEE - Fusion Financials has '#' in data columns

elwayisgod Member Posts: 103
edited Mar 15, 2019 12:15PM in Planning and Budgeting

Hi,

We successfully connected to Fusion Financials, but the loads are failing because some of the data fields are coming over as '#'.  I tried all kinds of mappings to fix it, but nothing is working.

Any ideas on how to correct this?  I'm not sure it can be fixed in the source, so I wanted FDMEE to handle it if possible.

[screenshot: 2019-03-14_13-41-45.jpg — the source data file, with '#' in the data columns]
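In case it helps, the rows in question look something like this (a made-up illustration; the column names are just placeholders, and '#' appears wherever a period has no value):

    Account   Entity   Nov-18    Dec-18
    1110      100      3766.66   #
    1120      100      #         125.00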


Answers

  • elwayisgod Member Posts: 103
    edited Mar 14, 2019 5:31PM

    Interesting.  If I process a single period at a time, it loads every row with no rejects.  Not sure how/why that is possible.  Perhaps this should go on the Fusion Financials forum?

  • elwayisgod Member Posts: 103
    edited Mar 15, 2019 10:37AM

    I have been trying SQL in the Data Load Mapping, to no avail.  Not sure my syntax is correct, but I have tried many iterations of it and nothing changes the # symbols...

    [screenshot: sql.jpg — SQL Data Load Mapping attempt]
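    One of the iterations looked roughly like this (a sketch from memory, not the exact script, and the staging column names are my best guess):

        AMOUNTX =
            CASE
                WHEN AMOUNT = '#' THEN '0'
                ELSE AMOUNT
            END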

  • JohnGoodwin Member Posts: 30,471 Blue Diamond
    edited Mar 15, 2019 10:48AM

    SQL mappings are not going to help.  Validations happen after the data has been imported, and your data is being rejected at the import stage.

    Cheers

    John

  • elwayisgod Member Posts: 103
    edited Mar 15, 2019 11:12AM

    OK.  So is there any recourse?  I looked at Expressions but didn't see any I could add to the Import Format.  Perhaps it's back to the source Cloud Financials to determine why the #s are showing up?

  • JohnGoodwin Member Posts: 30,471 Blue Diamond
    edited Mar 15, 2019 11:38AM

    Why do you want it to load # (missing) data?  If you are running multi-period and look at the log, you should see multiple imports.

    I can see # records being rejected, which you would expect: the file downloaded from Fusion includes all the periods that were selected when running the rule, and some data records will not have values for all periods.

    If I load one period and check the source records in the workbench, then run a multi-period load and check the workbench for that same period, I still see the same record count.

  • elwayisgod Member Posts: 103
    edited Mar 15, 2019 11:38AM

    It's bonking on the entire row, not just the fields with # in them, so the entire row doesn't load.  That is my issue.  Sorry I wasn't clear on that earlier.

  • JohnGoodwin Member Posts: 30,471 Blue Diamond
    edited Mar 15, 2019 11:39AM

    Are you 100% sure?  It is not all loaded at once for all periods; if you select to load Jan/Feb you will see multiple imports of the same source file, and some records will be rejected because the values don't exist for both periods.

  • elwayisgod Member Posts: 103
    edited Mar 15, 2019 11:45AM

    I'll double-check.  But in the pic above of my data file, that 3766.66 doesn't load for Nov-18 unless I run only Nov-18.  I'm stumped.  But let me validate.  I'll take screenshots as proof.

  • JohnGoodwin Member Posts: 30,471 Blue Diamond
    edited Mar 15, 2019 11:54AM

    Say you run an extract for Jan/Feb; the downloaded file will contain both periods, for example:

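    Something along these lines, with made-up values (a real extract would carry more dimension columns):

        Account   Entity   Jan     Feb
        1110      100      100.00  200.00
        1120      100      150.00  #
        1130      100      250.00  300.00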

    DM will now import the file in multiple loads, one per period.

    First import for Jan should load 3 records

    Second import for Feb should load 2 records and reject the # because it is missing.

    That is the way I see it working and reconciling; if you don't see that, then it is Oracle you need to speak to.

  • elwayisgod Member Posts: 103
    edited Mar 15, 2019 12:15PM

    My apologies.  I cleared the entire database and re-ran the load.  Those rows do just ignore the # symbols and load the data.  I thought I had checked this.  I must have been connected to the wrong PBCS instance on my prior retrievals.

    Sorry for wasting everyone's time.  I thought I had an issue.  Thanks for the guidance.