evebill8 wrote: Sure, the max you can set is the max you can offer. It seems like something that is easy to test out.
We have the server running 64-bit Linux and Java. What is the highest heap size I can set? Let's say the server has 256 GB of RAM. Can I allocate 128 GB to the Java heap, or even more?
Do note that huge heaps pose their own troubles, so I would rather scale the heap to what your application needs, not to what the server has. I would research the topic further if I were you, as you will likely need to dive into the wonderful world of fine-tuning the garbage collector and such. A Google search for "java huge heap size" is probably a good place to start.
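A quick sanity check for whatever -Xmx you settle on is to ask the JVM itself what it actually granted. A minimal sketch (the class name and the flag values in the comment are just examples, not anything from this thread):

```java
// Run with e.g.:  java -Xms96g -Xmx96g HeapCheck
// -Xmx sets the maximum heap, -Xms the initial heap; on a 64-bit JVM
// both can go well beyond 4 GB, subject to what the OS will give you.
public class HeapCheck {
    public static void main(String[] args) {
        long maxBytes = Runtime.getRuntime().maxMemory();
        // maxMemory() reports the most the JVM will attempt to use for the heap,
        // which is usually close to (but not always exactly) the -Xmx value.
        System.out.printf("Max heap: %.1f GB%n",
                maxBytes / (1024.0 * 1024 * 1024));
    }
}
```

Running this under the flags you think you set is a cheap way to catch typos like `-Xmx128m` vs `-Xmx128g` before load-testing.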
Thanks! That is true. A full garbage collection can take a long time with a huge heap. I was trying to tune the GC parameters to make full GCs run more often, before memory usage reaches its peak, but I was not successful.
You might want to profile the application as well. Finding out where the memory is used might provide ideas about alternatives.
Let's assume I have 128 GB in the server and I need as much memory as possible on the Java side. What is the right ratio for splitting memory between the Java heap and the operating system? I heard it was 70% of the max heap size used. I have an app that needs to put a user's data in the cache when the user logs in. The more RAM I have, the more users I can handle. I am now allocating 96 GB to the Java heap and leaving 32 GB to the OS. Is that good enough?
evebill8 wrote: If only it were that easy. You'll find that other things will also greatly bottleneck you (I/O performance, for example). You will only solve your cache problem, which you could also solve in a more distributed way. Check out ehcache, for example.
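Whatever caching product you end up with, the core idea is a size-bounded cache that evicts old entries rather than growing without limit. A minimal in-process sketch using the JDK's `LinkedHashMap` in access order (class name and capacity are illustrative; ehcache and similar libraries add richer eviction policies, off-heap storage, and distribution on top of this basic idea):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU cache: once the map holds more than maxEntries,
// the least-recently-accessed entry is evicted automatically.
public class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    public LruCache(int maxEntries) {
        // true = iteration order follows access order, not insertion order,
        // which is what makes the eldest entry the least-recently-used one.
        super(16, 0.75f, true);
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxEntries;
    }
}
```

Bounding the cache this way lets you size it deliberately (entries times average entry size) instead of sizing the heap to "as much as possible" and hoping.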
The more RAM I have, the more users I can handle.
My point is: I rather doubt that you can handle enough parallel users to actually hit your cache-size problem without first introducing some sort of load-balancing strategy.
What is the right ratio for splitting memory between the Java heap and the operating system?
The question is meaningless. The operating system needs what it needs; you also want some headroom for whatever else is running: monitoring tools at a minimum, maybe a database server, maybe an LDAP server, maybe an HTTP server. There is no such thing as "the ratio".
I heard it was 70% of the max heap size used.
Heard it where? Are you really going to commission and allocate resources on your server based on what you 'heard'? You are supposed to test and measure. Every situation is different.
You seem to be subscribing to some pretty fallacious thinking here. To quote Fred Brooks, 'No silver bullet'.
I have an app that needs to put a user's data in the cache when the user logs in. The more RAM I have, the more users I can handle.
Up to a point. There are other limits.
I am now allocating 96 GB to the Java heap and leaving 32 GB to the OS. Is that good enough?
There's nowhere near enough information here to say for sure, but it isn't likely. I doubt your operating system really needs 32 GB, for example, or anywhere near it. I would have thought 2 GB was more than enough for a modern operating system.
evebill8 wrote: For the vast majority of business scenarios there would be something wrong with those statements.
I have an app that needs to put a user's data in the cache when the user logs in. The more RAM I have, the more users I can handle.
The only one I can think of where it might be applicable is a military application where all of the data must be encrypted and even virtual-memory writes to disk are not allowed.