I'm building a kind of "dashboard" APEX application that shows status data from disparate sources.
My current thinking is to get the data into Oracle via external tables that themselves run scripts on remote machines.
My main concern is that if I have a single APEX page with (say) six different report regions, and each of these reports accesses
the same external table, then building the page will require six separate calls out to the external scripts. The scripts may
take up to twenty seconds to return (tried this already; twenty seconds times six is unacceptable).
How can I "cache" the results of a query and make it available for the whole page?
Anyone done this sort of thing before? Prior experience would be welcomed please.
P.S. Through the APEX SQL Workshop a simple query on the external table works, but when I dress the query up in an interactive report I get some
weird error (don't have the number to hand, sorry). Is there a gotcha here that I ought to know about?
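For context, the setup I'm describing looks roughly like this. It's only a sketch: all the directory, script, table, and column names below are invented, and the preprocessor path is an assumption about my environment.

```sql
-- Sketch of an external table whose rows are produced by a shell script
-- via the ORACLE_LOADER preprocessor. The script runs on EVERY query of
-- the table, which is where the twenty-second cost comes from.
CREATE OR REPLACE DIRECTORY status_dir AS '/u01/app/status';
CREATE OR REPLACE DIRECTORY exec_dir   AS '/u01/app/scripts';

CREATE TABLE remote_status_ext (
  host_name    VARCHAR2(64),
  status_code  VARCHAR2(16),
  checked_at   VARCHAR2(32)
)
ORGANIZATION EXTERNAL (
  TYPE ORACLE_LOADER
  DEFAULT DIRECTORY status_dir
  ACCESS PARAMETERS (
    RECORDS DELIMITED BY NEWLINE
    PREPROCESSOR exec_dir:'poll_remote.sh'  -- invented script name
    FIELDS TERMINATED BY ','
  )
  LOCATION ('dummy.csv')
);
```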
It's probably wrong to reply to my own thread, but I've done a bit more digging and have stumbled upon
the "apex_collection" facility for caching my dashboard data, so that sounds promising.
Also, the problem with the nonfunctioning query might have something to do with missing privilege grants, but
I will either dig deeper or perhaps sidestep it with apex_collections.
Anyone care to confirm whether these suggestions might prove fruitful?
Thanks in advance.
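In case it helps anyone searching later, the collection approach I'm experimenting with looks something like this. The collection name is my own, and I'm assuming an external table called remote_status_ext (invented name). The idea is to run the PL/SQL block once per session (e.g. in an application process), so the slow external table is hit once, then point every report region at the collection.

```sql
-- Build the collection once per session from the slow source.
BEGIN
  IF NOT apex_collection.collection_exists('DASHBOARD_STATUS') THEN
    apex_collection.create_collection_from_query(
      p_collection_name => 'DASHBOARD_STATUS',
      p_query           => 'SELECT host_name, status_code, checked_at
                              FROM remote_status_ext');
  END IF;
END;
/

-- Each report region then queries the cached rows instead:
SELECT c001 AS host_name,
       c002 AS status_code,
       c003 AS checked_at
  FROM apex_collections
 WHERE collection_name = 'DASHBOARD_STATUS';
```

Note that collection members land in the generic c001..c050 character columns, so each report has to alias them back to meaningful names.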
External tables are a good device for staging data to bring into the database, but not a good idea to run your application from directly. I would load this data into a regular table before querying it from the application. apex_collection is good for caching session-dependent data, but it will need to be refreshed for each session.
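For completeness, a minimal sketch of the load-it-first approach, assuming an external table called remote_status_ext (all names here are illustrative, and the five-minute refresh interval is an assumption about acceptable staleness):

```sql
-- Hypothetical staging approach: copy the external table's rows into a
-- regular heap table on a schedule, and have the APEX reports query that.
CREATE TABLE dashboard_status AS
  SELECT * FROM remote_status_ext WHERE 1 = 0;  -- empty table, same shape

CREATE OR REPLACE PROCEDURE refresh_dashboard_status IS
BEGIN
  DELETE FROM dashboard_status;
  INSERT INTO dashboard_status SELECT * FROM remote_status_ext;
  COMMIT;
END;
/

-- Run the refresh in the background so no page view ever waits on it.
BEGIN
  DBMS_SCHEDULER.create_job(
    job_name        => 'REFRESH_DASHBOARD_STATUS',
    job_type        => 'STORED_PROCEDURE',
    job_action      => 'refresh_dashboard_status',
    repeat_interval => 'FREQ=MINUTELY;INTERVAL=5',
    enabled         => TRUE);
END;
/
```

This also has the advantage over apex_collections that the data is shared across all sessions rather than rebuilt per session.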