entropy_avail
There is a concept of "available entropy" on a Linux (Unix) server, which is something worth keeping an eye on.
I have a practical example:
- Some servers that are running low on entropy
- One server where it is normal
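For context, this is how I am reading the values on each server; on RHEL 7 the kernel exposes its entropy estimate through procfs (the "low" threshold of roughly a few hundred bits is a common rule of thumb, not an official limit):

```shell
# Current entropy estimate for the kernel's random pool, in bits
cat /proc/sys/kernel/random/entropy_avail

# Total pool size for comparison (typically 4096 on RHEL 7)
cat /proc/sys/kernel/random/poolsize
```

On the "low" servers the first value sits near zero, while on the normal one it stays in the thousands.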
I'm looking for an answer to both questions:
- Why is it low on some servers?
- Why isn't it low on the other? Why the difference? Shouldn't they all be low?
These servers all run the same Oracle software, on Red Hat Enterprise Linux 7.x.
It seems to be a hard question: Google turns up plenty of information, but nothing that definitively explains the difference.