DB Time is a great metric, but I would usually refer to it in the top-level summary, the load profile, and the time model statistics.
It refers to the time spent by foreground sessions actively waiting or actively working.
In a one hour period, if there is one session constantly working, the DB time would be 1 hour.
If there are two sessions constantly working, then DB time would be 2 hours, etc.
Regarding the difference between values in the different sections: the DB time in the Instance Activity Stats appears to be in centiseconds, and rounding errors and/or different collection intervals might account for the difference.
Sounds like you're confusing DB time with CPU time.
Cores are irrelevant.
If in doubt, see your load profile.
DB time represents Average Active Sessions in a time period.
In a 1-second period...
If for 100% of that 1 second period, you have 1 session either on cpu or waiting for cpu or actively waiting for a non idle event, DB time is 1 second.
If for 100% of that 1 second period, you have 2 sessions either on cpu or waiting for cpu or actively waiting for a non idle event, DB time is 2 seconds.
If for 100% of that 1 second period, you have 5 sessions either on cpu or waiting for cpu or actively waiting for a non idle event, DB time is 5 seconds.
If for 100% of that 1 second period, you have 38 sessions either on cpu or waiting for cpu or actively waiting for a non idle event, DB time is 38 seconds.
To a certain extent, it doesn't matter whether I double or halve the number of cores: if I have 38 sessions actively working or actively waiting for 100% of that 1 second, then DB time for that 1 second is 38 seconds.
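The pattern above is just multiplication, so here is a minimal sketch of it. The function name and the example numbers are illustrative, not anything from Oracle's own tooling; the session counts are the ones used in the examples above.

```python
# Illustrative sketch of the relationship described above:
# DB time = number of fully active foreground sessions x wall-clock interval.
def db_time_seconds(active_sessions: int, interval_s: float) -> float:
    """DB time accumulated when `active_sessions` foreground sessions are
    on CPU, waiting for CPU, or in a non-idle wait for the whole interval.
    Note the core count does not appear anywhere in this formula."""
    return active_sessions * interval_s

print(db_time_seconds(1, 1))   # 1 session  for 1 second -> 1 second of DB time
print(db_time_seconds(38, 1))  # 38 sessions for 1 second -> 38 seconds
```

Dividing DB time back by the interval gives you Average Active Sessions, which is why the two are described interchangeably above.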
Thank you for the brief explanation.
OK, I understand that cores haven't got much to do with DB time. But that doesn't really answer my question.
So in the Instance Activity Stats section of the AWR report there is a metric called DB time. In what unit is that value expressed?
In the example below, is 3,848.36 in seconds per second? Or is it milliseconds, or centiseconds, or...?
Instance Activity Stats

Statistic                         Total        per Second   per Trans
CPU used by this session          78,259       21.76        60.53
CPU used when call started        188,815      52.51        146.03
CR blocks created                 1,628        0.45         1.26
Cached Commit SCN referenced      3,492,204    971.18       2,700.85
Commit SCN cached                 63,550       17.67        49.15
DB time                           13,838,024   3,848.36     10,702.26
Thanks in advance
But if in this example it's 13,838,024 centiseconds of total DB time, how do I then get to 3,848.36 per second? I'm just trying to figure out how these figures correlate to each other.
The duration of the AWR report is 1 hour. Probably more like 59.93 minutes...
It should be that
13,838,024 / exact duration of the AWR report in seconds = average DB time per second (still in centiseconds, which is what the per-Second column shows)
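The arithmetic can be checked directly from the two figures in the report. A small sketch, assuming (as suggested earlier in the thread) that DB time in Instance Activity Stats is in centiseconds:

```python
# Figures taken from the Instance Activity Stats example above.
db_time_total_cs = 13_838_024  # "Total" column, assumed to be centiseconds
per_second_cs = 3_848.36       # "per Second" column

# Recover the snapshot interval implied by the report's own numbers:
elapsed_s = db_time_total_cs / per_second_cs
print(round(elapsed_s / 60, 2))  # ~59.93 minutes, matching the duration above

# Convert centiseconds to seconds, then divide by the interval to get
# Average Active Sessions (AAS):
db_time_s = db_time_total_cs / 100
aas = db_time_s / elapsed_s
print(round(aas, 2))  # ~38.48 average active sessions
```

So 3,848.36 is centiseconds of DB time per elapsed second, which divided by 100 gives roughly 38 average active sessions, consistent with the 38-session example earlier in the thread.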