Discussions

PL/SQL: ORA-22905: cannot access rows from a non-nested table item

Answers

  • padders
    padders Member Posts: 1,081 Silver Trophy
    edited Nov 4, 2019 2:43AM

    > Thanks for suggestion. It doesn't work as shown below.

    "ORA-04098: trigger 'RQUSER.RATING' is invalid and failed re-validation" shows us nothing except that the trigger is invalid. Show us the trigger code and the compilation error message.

    Also, does the PL/SQL block I posted work or not, and if not, what is the error message?

    DECLARE
       dum INTEGER;
    BEGIN
       EXECUTE IMMEDIATE
          q'{SELECT * FROM TABLE (rqEval (NULL, 'SELECT 1 dum FROM dual', 'IRS'))}'
          INTO dum;
    END;
    /

    > applicant <- dbGetQuery(con,"select * from loanapplication where rownum = (select max(rownum) from loanapplication)") is to select the most recent row into the data frame applicant

    I figured that was what it was for. It is nonsense. The expression MAX(ROWNUM) does not identify the latest row in a table; it is essentially just COUNT(*). The predicate WHERE ROWNUM = n doesn't return a row if n > 1, and even if it did it wouldn't work, because ROWNUM is not the number of a row in a table; it is a pseudocolumn generated as rows are returned.
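
    A quick illustration of that behaviour, using DUAL purely as a stand-in (not your table):

    -- MAX(ROWNUM) over a table is just the row count, i.e. the same as COUNT(*)
    SELECT MAX (ROWNUM), COUNT (*) FROM dual;

    -- ROWNUM is assigned as rows are returned, so a predicate demanding
    -- ROWNUM = n for any n > 1 can never be satisfied: this returns no rows
    SELECT * FROM dual WHERE ROWNUM = 2;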

  • 3631507
    3631507 Member Posts: 45
    edited Nov 4, 2019 2:45AM

    It's the same error as in the trigger:

    PL/SQL: ORA-22905: cannot access rows from a non-nested table item.

  • Gaz in Oz
    Gaz in Oz Member Posts: 3,785 Bronze Crown
    edited Nov 4, 2019 3:25AM

    Here is the simplest anonymous PL/SQL block I could come up with for you to experiment with, given your... requirement:

    SQL> declare
      2    num number;
      3  begin
      4    execute immediate 'select 1 from dual' into num;
      5  end;
      6  /

    PL/SQL procedure successfully completed.

    SQL>

    Notice how the result of the arbitrary SELECT statement is put INTO the variable using the EXECUTE IMMEDIATE ... INTO syntax.

    You need to do something like that. When you have THAT working, put it into your trigger to test.
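
    A minimal sketch of that in trigger form, with placeholder names (the trigger and table names here are made up, not your actual objects):

    CREATE OR REPLACE TRIGGER some_table_ai   -- placeholder trigger name
       AFTER INSERT ON some_table             -- placeholder table name
       FOR EACH ROW
    DECLARE
       num NUMBER;
    BEGIN
       -- same EXECUTE IMMEDIATE ... INTO pattern as the anonymous block above
       EXECUTE IMMEDIATE 'select 1 from dual' INTO num;
    END;
    /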

  • 3631507
    3631507 Member Posts: 45
    edited Nov 4, 2019 3:43AM

    Thanks padders, your block

    DECLARE
       dum INTEGER;
    BEGIN
       EXECUTE IMMEDIATE
          q'{SELECT * FROM TABLE (rqEval (NULL, 'SELECT 1 dum FROM dual', 'IRS'))}'
          INTO dum;
    END;
    /

    works now when it didn't before. I can't explain why.

    Thanks for all who helped with inputs and suggestions.

  • L. Fernigrini
    L. Fernigrini Data Engineer Sr Consultant Member Posts: 4,095 Silver Crown
    edited Nov 4, 2019 9:00AM

    Well, good to know that you were able to solve the issue!

  • 3631507
    3631507 Member Posts: 45
    edited Nov 11, 2019 7:53PM

    I thought the issue was completely solved until I ran the trigger and encountered a quirk. It fires, but it processes the previous insert each time. For instance, I inserted rows A, B and C one at a time. The trigger processed A after B was inserted, and then B after C was inserted. The trigger is clearly defined as AFTER INSERT. I searched the web and can't find a similar encounter posted.

  • padders
    padders Member Posts: 1,081 Silver Trophy
    edited Nov 12, 2019 3:14AM

    Well, it appears that the R function creates a separate connection to the database. This separate connection will not see uncommitted transactions in other sessions (including the transaction which fired your trigger).

    I don't know if R is able to inherit or be part of the invoking connection; you could investigate that possibility.

    An alternative would be to make the R function call asynchronous. A simple 'traditional' way would be to submit (but not commit) a job via DBMS_JOB to execute the R function call; when the transaction is committed, the job is committed also. Advanced Queuing (DBMS_AQ) could be used similarly. Note, however, that these asynchronous approaches mean the job (and hence the R function) executes in its own transaction a short time after the table update is committed, so there will be a period of time (however short) where the R data is not up to date.
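
    A rough sketch of the DBMS_JOB approach (the trigger and table names are placeholders borrowed from earlier in the thread, and the rqEval call is just the test block from above, so treat this as an outline rather than working code):

    CREATE OR REPLACE TRIGGER rating                 -- placeholder trigger name
       AFTER INSERT ON loanapplication               -- placeholder table name
       FOR EACH ROW
    DECLARE
       l_job BINARY_INTEGER;
    BEGIN
       -- DBMS_JOB.SUBMIT is transactional: the job only becomes runnable when
       -- the triggering transaction commits, so the R code run by the job
       -- will see the newly inserted row.
       DBMS_JOB.SUBMIT (
          job  => l_job,
          what => q'{DECLARE
                        dum INTEGER;
                     BEGIN
                        EXECUTE IMMEDIATE
                           q'[SELECT * FROM TABLE (rqEval (NULL, 'SELECT 1 dum FROM dual', 'IRS'))]'
                           INTO dum;
                     END;}');
       -- No COMMIT here; the job is committed along with the insert itself.
    END;
    /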