Posted on 01-16-2013 05:40 AM
Hey all,
I'm running a single OS X-based JSS 8.6 with the MySQL database hosted on another OS X server. We will eventually be clustering this JSS with two additional OS X JSS boxes. Today the clients connect directly but will eventually be load balanced; we have to go through a complicated internal auditing process before we can do that, so for now we are stuck with the standalone JSS.
We have about 1100 Macs connecting to this single JSS, and while I know that's a lot, things are running fairly smoothly with the exception of the JSS web app becoming unresponsive when we try to pull up Hardware/Software History in the Details view of an enrolled Mac. When this happens, all requests to the JSS web app just hang. Other Casper apps seem to work, just slowly.
Attempts to restart Tomcat via the command line (locally or over ARD) do not work, and eventually we have to reboot the JSS. Once this is done, everything works again for a few days, or until someone clicks the HW/SW History link.
Load balancing/clustering will hopefully help with this issue, but is there anything that can be done to tune performance so this does not occur? Other than automating a restart if it hangs (rough sketch of what I mean below), I'm kind of at a loss.
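If it does come down to automating it, this is roughly what I have in mind (a rough sketch only; the Tomcat path and JSS port are placeholders for my install, adjust for yours):

#!/bin/bash
# Watchdog sketch: restart Tomcat if the JSS web app stops answering.
# TOMCAT_HOME and the port are placeholders; point them at your install.
JSS_URL="https://localhost:8443"
TOMCAT_HOME="/Library/Tomcat"

# Give the web app 30 seconds to respond before calling it hung.
if ! curl --insecure --silent --max-time 30 --output /dev/null "$JSS_URL"; then
    echo "$(date) JSS unresponsive, restarting Tomcat" >> /var/log/jss_watchdog.log
    "$TOMCAT_HOME/bin/shutdown.sh"
    sleep 15
    # Force-kill Tomcat if the graceful shutdown did not take.
    pkill -9 -f org.apache.catalina.startup.Bootstrap
    "$TOMCAT_HOME/bin/startup.sh"
fi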
Any tips would be greatly appreciated.
Posted on 01-16-2013 06:27 AM
We have the same issue. On ours, the Java process skyrockets to 100% CPU and consumes all available RAM. I've wiped and reloaded the OS countless times, replaced the RAM, and tried backing up the DB and moving it to a completely different machine. We have better luck if we run the historical report from the server itself rather than from a networked machine through the JSS, though it's not 100%.
Posted on 01-16-2013 06:40 AM
There have been discussions going way back about the amount of memory allocated to Tomcat by default. There is this article about Java freezing the JSS:
https://jamfnation.jamfsoftware.com/discussion.html?id=672#respond
And another discussion about it here:
https://jamfnation.jamfsoftware.com/discussion.html?id=3133
Basically, max out the RAM in the box running the JSS and in the box running MySQL. Then you'll need to adjust some settings for both.
A very quick Google search for upping max_allowed_packet in MySQL gives this article:
http://mprathap.blogspot.com/2008/11/increase-maxallowpacket-in-mysql.html
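The gist of it, if you don't want to wade through the article (the 16M value is just an example starting point, not something I've tested against a JSS):

# Persistent: add these lines to my.cnf (often /etc/my.cnf) under [mysqld],
# then restart MySQL:
#
#   [mysqld]
#   max_allowed_packet=16M
#
# Or change it on the running server; this takes effect for new connections
# and is lost at the next MySQL restart:
mysql -u root -p -e "SET GLOBAL max_allowed_packet = 16*1024*1024;"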
and for increasing the Tomcat memory sizes, I found this JAMF Nation KB:
https://jamfnation.jamfsoftware.com/article.html?id=139
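And the gist of the Tomcat side, as a sketch only: a setenv.sh next to catalina.sh is the standard Tomcat way to pass JVM flags, but the KB above describes where the JSS actually expects them, so follow that for the real location. The path and heap sizes below are examples; leave headroom for anything else running on the box.

# Give Tomcat a bigger Java heap (path and sizes are placeholders).
cat >> /Library/Tomcat/bin/setenv.sh <<'EOF'
export CATALINA_OPTS="$CATALINA_OPTS -Xms512m -Xmx2048m"
EOF

# Restart Tomcat so the new heap settings take effect.
/Library/Tomcat/bin/shutdown.sh && /Library/Tomcat/bin/startup.sh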
I have not tested any of these methods, so do so with care. Make sure you have backups of your systems and database before doing so.
Posted on 01-16-2013 08:25 AM
Thanks Steve... Increasing Tomcat memory seemed to do the trick.