Are DB tools using a lot of resources when looking at data?
Please bear with me for a few minutes; I have a dilemma and need to figure out a good way to know exactly what is going on here.
We have some large production databases, in the 10-13 TB range, with some very large tables (2-4 billion records), and we have users (mainly developers and BAs) who use different tools to access those tables and look at some of the data. Most of these tables have a degree of parallelism of 8 or 16.
The gamut of tools used includes SAS, PL/SQL Developer, SQL Developer, DB Artisan, and others.
Here is my concern/issue: when I look into some of these databases, I see queries of the form "SELECT * FROM Table", and Grid Control tells me that the given session and all its associated parallel processes (16-32 more) are trying to retrieve all 2-4 billion records. However, the statistics do not show any I/O, just elapsed time, which keeps going up since the session stays active.
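For what it's worth, one way I thought of double-checking what Grid Control shows is to query the session statistics directly. This is just a sketch, assuming a privileged account and a placeholder SID of 123 (substitute the real session ID from v$session):

```sql
-- Sketch: pull I/O-related statistics for one suspect session.
-- SID 123 is a placeholder; find the real one in v$session first.
SELECT sn.name, ss.value
  FROM v$sesstat ss
  JOIN v$statname sn
    ON sn.statistic# = ss.statistic#
 WHERE ss.sid = 123
   AND sn.name IN ('physical reads',
                   'session logical reads',
                   'bytes sent via SQL*Net to client');
```

If 'physical reads' and 'session logical reads' stay flat while elapsed time climbs, the session is presumably not actually scanning the table, just sitting idle or waiting on the client to fetch.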