JamfDaemon - noticeable CPU usage

planetf1
New Contributor

Hi, I'm a developer and know relatively little about Jamf.

I've noticed that in just under a day and a half, the JamfDaemon process has consumed around 75 minutes of CPU time, which is quite noticeable.

The jamf log shows just a typical hourly policy check.

How can I get more logging to determine what the daemon is doing? Any typical candidates?

A peek with 'truss' just shows a lot of stat() calls (around 1000 per second) against various files on my system - first Xcode, then git. One might think it's a scan, but it never appears to progress or calm down, even after days.
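For reference, here is a sketch of what I've been using to watch it with standard macOS tools (the tiny `proc_predicate` function is just my own wrapper, not anything from Jamf or Apple):

```shell
# Build a unified-log predicate for a single process
# (proc_predicate is my own helper name, not a Jamf or Apple tool).
proc_predicate() {
  printf 'process == "%s"' "$1"
}

# Live-stream JamfDaemon's own messages from the unified log:
#   log stream --predicate "$(proc_predicate JamfDaemon)" --level debug
#
# fs_usage is the closest macOS analogue of truss; it prints each
# file-system call together with the path being touched:
#   sudo fs_usage -w -f filesys JamfDaemon
```

The `fs_usage` output is what showed me the per-file stat() storm in the first place.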

18 REPLIES

ThijsX
Valued Contributor

@planetf1 Do you have the Nessus Agent binary installed on your system?

The combination of the Nessus Agent and an active JamfDaemon caused some CPU trouble on some of our devices.

/Library/NessusAgent/run/sbin/nessuscli

KyleEricson
Valued Contributor

I have the same issue with one user. As soon as I re-enroll it into Jamf the CPU goes through the roof. I have Nessus and Sophos Cloud AV. I have tried removing Sophos and same issue. Going to try Nessus removal next.


Hire me as an independent contractor.

ThijsX
Valued Contributor

@kericson let me know the result!

NathanH
New Contributor

Hi All,
I have experienced this issue on a handful of machines on both 10.14.X and 10.15.X.
In every case, the Nessus agent was the cause. Once it was removed, the daemon's CPU usage returned to normal.

Thank you,
Nathan

KyleEricson
Valued Contributor

@txhaflaire I removed the Nessus Agent and the issue went away.


Hire me as an independent contractor.

Madmax85
New Contributor III

We saw similar behavior as well in which the JamfDaemon process had extremely high CPU usage. At first we were struggling to figure out why. During this time we also saw the Carbon Black sensor service jump up in CPU usage. Eventually we removed the Nessus agent to resolve the issue and have since reinstalled it without the issue reoccurring for now. This only has happened on a handful of endpoints.

KyleEricson
Valued Contributor

The Nessus agent is so dumb, just another "tool".


Hire me as an independent contractor.

chase_g
New Contributor III

We just installed Nessus about a week and a half ago and started to experience this same thing. The nessus process itself never seems to get very high, but it causes JamfDaemon and some of our other security agents to spike while it runs. Have any of you tried changing the Nessus agent's process priority to low? I made that change on my machine and it seemed to make a little bit of difference: JamfDaemon was still climbing, but stayed within a range of about 15-30% CPU, and looking at the Nessus logs my scan times appeared to be about the same.

This is the command to set the process priority for nessus agent from terminal: sudo /Library/NessusAgent/run/sbin/nessuscli fix --set process_priority=low

To change it back to default just do =normal. If anyone else can try testing this out and see if they notice a difference or not I am curious.
Here's where I found that command in their documentation: NessusCLIAgent
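If you want to script the change, here's a minimal sketch around that same command. The `nessuscli` path and flags are exactly as quoted above; the validation and wrapper function names are just my own, and I only allow the two values (low/normal) mentioned in this thread:

```shell
NESSUSCLI="/Library/NessusAgent/run/sbin/nessuscli"

# Accept only the two values confirmed in this thread.
# (valid_priority / set_nessus_priority are my own helper names.)
valid_priority() {
  case "$1" in
    low|normal) return 0 ;;
    *) return 1 ;;
  esac
}

set_nessus_priority() {
  valid_priority "$1" || { echo "usage: set_nessus_priority low|normal" >&2; return 1; }
  sudo "$NESSUSCLI" fix --set process_priority="$1"
}

# Example:
#   set_nessus_priority low      # throttle the agent
#   set_nessus_priority normal   # revert to the default
```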

lashomb
Contributor II

Running into this issue. Setting process to low and even unloading the launchdaemon for nessus didn't help.

chase_g
New Contributor III

So my Nessus admin just made two changes that seem to have fixed the issue we were having. Before the changes, we were seeing JamfDaemon and some of our other AV agents spike in CPU for the length of the vulnerability scan, which according to the nessus.messages log was taking about 5900-7000 seconds to complete. After the changes, the scans were completing in 16-22 seconds. Here are the changes he said he made:

First he disabled thorough tests. Then he noticed that "Enable plugin debugging" was turned on, so he disabled that. Those were the two settings that made the biggest difference. He also made sure vulnerability scans did not include malware scans, splitting them into two separate schedules.

Hope this may help anyone else.

cbrewer
Valued Contributor II

I'm seeing significant JamfDaemon CPU usage just installing updates to Microsoft Office apps - on multiple 10.15.5 Macs (not running the nessus agent).

cbrewer
Valued Contributor II

In my environment, there seems to be a correlation between using the Restricted Software feature in Jamf and seeing JamfDaemon use a high amount of CPU.

naya
New Contributor II

I've started noticing this as well on 10.15.5 Macs, with CPU between 29% and 37%. I think it has to do with check-in. I will open a support case with Jamf.

chase_g
New Contributor III

@cbrewer I did notice my JamfDaemon spike hard yesterday during Office updates as well. I do have one or two Restricted Software policies in Jamf, but I don't think they are scoped to my computer - I will have to check.

cbrewer
Valued Contributor II

It turns out that this issue is presenting itself when Microsoft Office apps are updated from 16.37 to 16.38 using MAU. The 16.38 updates are being applied using a binary delta updater that causes a high amount of disk IO.

I can't speak to the nessus agent issue, but I believe this instance is more of a Microsoft issue than a Jamf one.
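If you want to check whether a given Mac is still on a pre-16.38 build (and therefore due for that heavy delta update), something like this works. The `CFBundleShortVersionString` key is the standard macOS bundle key; the `version_lt` comparison helper is my own rough sketch and assumes plain X.Y version strings:

```shell
# Rough version comparison assuming X.Y strings (version_lt is my own helper):
# returns success (0) when $1 is an earlier X.Y version than $2.
version_lt() {
  a_major=${1%%.*}; a_rest=${1#*.}; a_minor=${a_rest%%.*}
  b_major=${2%%.*}; b_rest=${2#*.}; b_minor=${b_rest%%.*}
  [ "$a_major" -lt "$b_major" ] ||
    { [ "$a_major" -eq "$b_major" ] && [ "$a_minor" -lt "$b_minor" ]; }
}

# On the Mac itself (CFBundleShortVersionString is the standard bundle key):
#   v=$(defaults read "/Applications/Microsoft Word.app/Contents/Info" CFBundleShortVersionString)
#   version_lt "$v" 16.38 && echo "pre-16.38: expect the heavy delta update"
```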

diegoFA
New Contributor

@cbrewer how did you find that out? I am wondering if one of our users is experiencing the issue for the same reason but would like to see a log showing the issue or at least hinting it's the restricted software feature.

Thanks

kevin_v
Contributor

We've had a couple of reports of this after the 10.30.3 upgrade on Jamf Cloud. We have software restrictions in place in addition to the Nessus scanner. A reboot seems to solve the issue, but I'm curious if there are any suggestions here for mitigation. Thanks!

frootion
New Contributor III

Maybe you would also like to have a look at this thread:

https://www.jamf.com/jamf-nation/discussions/39206/jamfdaemon-high-memory-usage

I guess it’s a similar issue.