Extension Attribute & Log4J

New Contributor

I'm preparing to deploy a scanner to Mac computers to scan for the Log4j vulnerability. The scanner outputs a .csv to the local computer. I need to find a way to collect this data so we can review it.


My thought process was to use an extension attribute that inventories the scanner's exit code and puts it into Jamf. Then we could use a smart group to find out which systems have identified vulnerabilities and require remediation.
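For reference, the shape of that idea, an EA that reports a scanner exit code the policy has stored on disk, might look something like this. The stamp path and the "exit 0 means clean" convention are assumptions; adjust them to whatever your scanner actually does.

```shell
#!/usr/bin/env bash

# Minimal sketch of an EA reporting a scanner's stored exit code.
# Assumes the scan policy wrote the code to this (hypothetical) path.
report_result() {  # $1 = path to the stored exit code
    if [[ -f "$1" ]]; then
        local code
        code="$(cat "$1")"
        if [[ "$code" == "0" ]]; then
            echo "<result>Clean</result>"
        else
            echo "<result>Vulnerable (exit $code)</result>"
        fi
    else
        echo "<result>Not Scanned</result>"
    fi
}

report_result "/Library/Management/log4j_scan_exit_code"
```

A smart group with criteria on this EA's value ("Vulnerable", "Not Scanned") then scopes the remediation policy.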


I'm not familiar with building extension attributes. Is this something that can be done? 


Contributor III

Just been dealing with this last week.

What a colleague and I did was a bit messy, but the goal was to gather as much information as possible to see what we were dealing with, since the security team wanted any machine showing failures from the 'Wolf-Tools' script to be powered off immediately!


We packaged the script to unpack in /Library/Management/AuditFolder.

We ran the script from that location in the policy so the log and JSON output were written there.

Then we had this EA to read for a pass:



#!/usr/bin/env bash

# Path to the scanner's JSON output; adjust to your environment.
output_filepath="/Library/Management/AuditFolder/scan_output.json"

if [[ -e $output_filepath ]]; then
	result="$(grep -m 1 '"result":"PASS"' "$output_filepath" | awk -F '"result":' '{print $2}' | sed 's/.PASS.*/PASS/')"
	if [[ $result == "PASS" ]]; then
		echo "<result>PASS</result>"
	else
		echo "<result>FAIL - Check Scan Report</result>"
	fi
else
	echo "<result>No Scan</result>"
fi

exit 0




We then set a policy to upload the logs for failures to the computer object under 'File Attachments', using a modified version of Joshua's script from here:




#!/usr/bin/env bash

## User variables -- fill these in for your environment
jamfProURL=""	# e.g. https://your.jamf.pro:8443
jamfProUser=""	# API account with file upload rights
jamfProPass=""
logFiles=""	# path(s) to the scan logs to upload

## Computer variables
currentUser=$(stat -f%Su /dev/console)
mySerial=$(system_profiler SPHardwareDataType | grep Serial | awk '{print $NF}')
osMajor=$(/usr/bin/sw_vers -productVersion | awk -F . '{print $1}')
osMinor=$(/usr/bin/sw_vers -productVersion | awk -F . '{print $2}')
timeStamp=$(date '+%Y-%m-%d-%H-%M-%S')
fileName="$mySerial-$timeStamp.zip"

## Log Collection
zip "/private/tmp/$fileName" $logFiles

## Look up the computer's Jamf Pro ID (the xpath syntax changed in Big Sur)
if [[ "$osMajor" -ge 11 ]]; then
	jamfProID=$(curl -sk -u "$jamfProUser":"$jamfProPass" "$jamfProURL/JSSResource/computers/serialnumber/$mySerial/subset/general" | xpath -e "//computer/general/id/text()")
elif [[ "$osMajor" -eq 10 && "$osMinor" -gt 12 ]]; then
	jamfProID=$(curl -sk -u "$jamfProUser":"$jamfProPass" "$jamfProURL/JSSResource/computers/serialnumber/$mySerial/subset/general" | xpath "//computer/general/id/text()")
fi

## Upload Log File
curl -sk -u "$jamfProUser":"$jamfProPass" "$jamfProURL/JSSResource/fileuploads/computers/id/$jamfProID" -F name=@"/private/tmp/$fileName" -X POST

## Cleanup
rm "/private/tmp/$fileName"
exit 0




We had smart groups reporting any machine that wasn't a pass, and then dug through those to read the logs. We set up another EA with a dropdown to manually mark whether machines had passed or failed, and deleted all the file attachments afterwards.

A bit manual and messy but we could easily report back some results that made everyone calm down a bit.

I later set up another EA to read the JSON log and exclude any known apps that were reporting as fails, or ones we had patches/upgrades in hand for, so we could easily see what was still outstanding. I had this as a column in a search for all managed clients.
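A sketch of that filtering EA, under the assumptions that the scanner writes one JSON object per line with a `"status":"FAIL"` key and that the allowlisted app names below are placeholders (your log format and app names will differ):

```shell
#!/usr/bin/env bash

# EA sketch: list failing entries from the scan's JSON log, excluding
# apps we already have patches/upgrades in hand for. Paths, the
# "status":"FAIL" key, and the allowlist are illustrative assumptions.

json_log="/Library/Management/AuditFolder/scan_output.json"
allowlist="KnownApp1|KnownApp2"   # hypothetical apps pending vendor patches

list_outstanding() {  # $1 = json log, $2 = extended-regex allowlist
    grep '"status":"FAIL"' "$1" 2>/dev/null | grep -Ev "$2"
}

if [[ -f "$json_log" ]]; then
    outstanding="$(list_outstanding "$json_log" "$allowlist")"
    if [[ -z "$outstanding" ]]; then
        echo "<result>No Outstanding Fails</result>"
    else
        echo "<result>$outstanding</result>"
    fi
else
    echo "<result>No Scan</result>"
fi
```

Because the EA only emits the already-filtered fails, the column in the computer search stays short enough to eyeball.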

We now have Qualys checking but this was all just a quick, dirty solution to keep security from going mental!

Valued Contributor III

We put the output of a scanning tool into an EA, and in some cases it was very large (some developers like to archive every project, so we'd see thousands of hits). If you end up with large data sets, I would take that data out of Jamf to analyze it. You might also want to use JSON if the tool supports it, since it's better encapsulated.

Contributor III

Is there any way to run a scan purely through an EA? Is there a suitable script for this?

Contributor III

@mickl089 Most tools scan the entire drive, so putting the scan itself in the EA would mean your recon stalls at that EA for as long as the scan takes; some larger drives can take over 40 minutes to complete. I find it is better to run the scan from a policy and then fetch a small version of the results with an EA. For large data results, use the upload script from above. YMMV.

New Contributor III

Take a look at this EA; just change the version to 2.17.1.
It will run once every 7 days and stop on the date in the script.
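The "run every 7 days, then stop on a date" throttling that EA uses can be sketched like this. The stamp-file path and end date are illustrative assumptions, not values from the linked script:

```shell
#!/usr/bin/env bash

# Sketch of the throttle: only run the expensive check once every 7 days,
# and stop entirely after a fixed end date. Stamp path and date are
# hypothetical -- adjust to your environment.

stamp_file="/Library/Management/.log4j_ea_last_run"
end_date="2022-06-30"
interval_days=7

should_run() {  # $1 = stamp file, $2 = end date (YYYY-MM-DD), $3 = today (YYYY-MM-DD)
    # Stop once the end date has passed (string compare works for ISO dates).
    [[ "$3" > "$2" ]] && return 1
    # Run if there is no stamp, or the stamp is older than the interval.
    if [[ -f "$1" ]]; then
        local last now
        last=$(stat -f %m "$1" 2>/dev/null || stat -c %Y "$1")
        now=$(date +%s)
        (( now - last < interval_days * 86400 )) && return 1
    fi
    return 0
}

if should_run "$stamp_file" "$end_date" "$(date +%Y-%m-%d)"; then
    touch "$stamp_file"
    # ... run the version check here and echo "<result>...</result>" ...
fi
```

The stamp file's modification time is the only state kept between runs, so recon stays fast on the days the check is skipped.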