
Error on Excel tests

When trying to run a test w/Excel, I got this error message:

"An error occurred while running the tests. Reason: The Given key was not present in the dictionary. (OPA-E00507)"

Does anyone have any idea what to do about this? I had to scrap my test file and start a new one.


  • Richard Napier
    Richard Napier Member Posts: 208 Silver Badge
    edited January 20


    Not knowing the content of the Excel file it's quite hard to say, but in the past I have sometimes had that error when testing whether inferred entity instances have been correctly created:

    List the entity instances you expect to be created for each test case on the entity sheet of the test case document, adding at least one entity attribute as an expected outcome. For example, in the ServiceDelta example policy model, "the service" (inferred entity) is an input and "the action on the service" (goal) is an expected outcome on the service worksheet of the test case document.

    Inferred entities and relationships, including containment relationships, can only be tested using what-if analysis when all possible values of the entity's identifying attribute can be defined in the test case.

    For example, the ServiceDelta example policy has an entity "the service", contained in the relationship "the services", with inferred relationships "the future services" and "the existing services".

    And when working with test cases that reference associative entities - if you declare an associative entity (i.e. an entity inferred by a relationship), you must set a value for the identifying relationship in the Excel Test Case.

    The above is an extract from the online help; check whether it applies in your case.

    Hope it helps

  • EMacAdie-ATX
    EMacAdie-ATX Member Posts: 51 Blue Ribbon

    Thanks for the response. This test does not have many inferred entities, so I don't think that is the issue, but I will look into it further.

    Sometimes this happens on entities that are not inferred. A few times I got this after adding an expected value to a second instance of an entity, and the first instance was fine. Example: after adding an expected value for PersonA's total income the test runs, but when I add PersonB's total income as an outcome, I get the error.

    I have a few questions:

    1. Some of our entities and attributes have very long names. Could this be causing problems? Is there a limit on the length of an attribute name in the tests? This was not a problem in OPA 10.
    2. Some of the docs mention what-if analysis. I never did that in 10. Can I create a what-if from the debugger? I get the impression from the docs that creating a what-if has to be done by hand. We have a lot of entities and attributes. Doing it by hand would be onerous.
    3. I created the test xlsx from the debugger. In the "Test Cases" sheet, under the "Test Case" header there is a value of "1" (there is only one test). But in all the other sheets, each of the entities is listed as being used in "all" tests. Could that cause a problem? Should I change the instances to be flagged for test case "1" instead of "all"?

  • EMacAdie-ATX
    EMacAdie-ATX Member Posts: 51 Blue Ribbon

    I think I may have found one potential issue.

    Some of our inputs are alpha-numeric codes that have 2 or 3 characters and come in as strings. Some of the codes are entirely numeric, and if the first character is "0", Excel chops off the leading "0". "05" is a valid code, but "5" is not.

    Is there a way to prevent OPA from converting these fields? Or do I have to go in manually and change them or put a single quote in front? Given how many input attributes we have and how many tests we have, this could be a huge pain.

  • Richard Napier
    Richard Napier Member Posts: 208 Silver Badge

    Hi There

    Some thoughts from your past messages

    1) Referring back to your earlier reply: we've got some very long names and have not seen any obvious issues.

    2) Yes you can create an Excel Test Case from the Debugger. Click the Export button dropdown and choose the second option.

    3) In Intelligent Advisor 12, there are no longer two types of test files; there are only Excel Test Cases. The documentation sometimes refers to what-if analysis when it simply means using the "keep column" functionality to take a static copy of an output column in the Test Case before running the test again with modified rules. You can then compare the new results with the old "kept" column and do some analysis / charting. An example of what that looks like is here

    4) The Test Case Number, 1, is indeed repeated as "all" on the other sheets. You can change it to 1 (or, of course, other numbers) as you add more Test Cases to the first tab, to assign data to particular test cases. The "all" should not cause an issue.

    5) I have seen the behavior myself where a text attribute with a value of "05" is incorrectly truncated to "5" in Excel Test Cases.

    The ways I can think of to get round it are:

    Use Excel macros to reinstate the leading zero and change the format to "Text".
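
    For the macro route, the core logic is just zero-padding numeric codes back to their expected width. Here is a minimal stdlib Python sketch of that padding logic (the 2-character code width is an assumption, as is the `restore_code` name; a VBA macro would apply the same check per cell):

    ```python
    def restore_code(value, width=2):
        """Re-pad numeric codes whose leading zeros Excel stripped.

        Excel turns the text code "05" into the number 5. Purely numeric
        values shorter than the expected code width are zero-filled back;
        alpha-numeric codes like "A5" are left untouched.
        """
        text = str(value)
        if text.isdigit() and len(text) < width:
            return text.zfill(width)
        return text

    print(restore_code(5))      # → "05" (leading zero reinstated)
    print(restore_code("A5"))   # → "A5" (alpha-numeric code untouched)
    ```

    You would loop this over the affected input columns after setting the cell format to Text, so Excel does not immediately re-convert the padded value.
    
    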

    Programmatically add or remove the ' in Intelligent Advisor rules (using SubString or concatenation) so that an apostrophe ' is sent as part of the attribute value to Excel. But that means essentially adding new rules to work around the problem just for Excel Test Cases.

    I would consider that to be worth a Service Request on the support site, because it is undesirable behavior that should not really happen.

    Hopefully that's helpful.

  • EMacAdie-ATX
    EMacAdie-ATX Member Posts: 51 Blue Ribbon

    Thanks for the responses.

    I have one question: Why were tests moved to Excel? This is causing problems and I do not see any benefit over the way tests were handled in OPM 10. If a case can run in the rules, it should run in the tests. Period. We shouldn't have to do workarounds at all.

  • Richard Napier
    Richard Napier Member Posts: 208 Silver Badge
    edited January 29


    Well, not being an Oracle spokesperson or employee I cannot comment on your question really.

    I do personally find the performance data and analysis from the Excel files to be far superior to what we could get out of version 10, and the uniform use of Excel has made it much easier to get business users to drive the testing process and increase buy-in. At one of my clients we now have 3 full-time business testers able to create and run their own Excel Test Cases and feed back to us on cases that we would never have been able to communicate to them before.

    But as I say, I'm just speaking for myself. What sort of problems are you encountering, aside from the discussion here? (Not trying to pry, just trying to be useful!)
