Retrieve System Logs

Jason
Contributor II

What methods are people using to get logs off of users' systems? For example, a user will call in saying they have some issue with their system. We'll generally then take a look at system.log, install.log, jamf.log, etc., by remoting into their system, having them email us a copy, or looking at them directly.

I'm thinking it would be much better if I could script a Self Service item that would upload those files to an available share, pull them with Casper Remote (without needing to Screen Share), or something similar. Has anyone done anything like this?

12 REPLIES

davidacland
Honored Contributor II

I've seen Bomgar help from a support perspective. It gives you a simple file transfer window to get and put files, a bit like an FTP app, but using SSH/SCP.

Personally, I would just use scp with a key pair in a script behind a Self Service or Casper Remote button, so either the user or the support person could get the logs into a central place -- something along the lines of the sketch below.
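
Roughly like this, as a sketch only (the account name, host, key path, and destination directory are placeholders; the key pair would have to be deployed to the client and authorized on the server beforehand):

#!/bin/sh
# Sketch: copy common logs to a central host over scp using a key pair.
# "loguploader", logs.example.com, the key path and destination are placeholders.

keyFile="/usr/local/etc/loguploader_key"
destination="loguploader@logs.example.com:/srv/maclogs/"

/usr/bin/scp -i "$keyFile" -o BatchMode=yes -o StrictHostKeyChecking=no \
    /var/log/system.log /var/log/install.log /var/log/jamf.log "$destination"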

dan-snelson
Valued Contributor II

With inspiration from @Andrina's System Diagnostic Logs from your Users, here's what we're using to gather the logs. (We're planning on integrating with FileSend or OneDrive to automate the end-user's submission of the logs.)

#!/bin/sh
####################################################################################################
#
# ABOUT
#
#   Gather Log Files
#
####################################################################################################
#
# HISTORY
#
#   Version 1.0, 25-Mar-2015, Dan K. Snelson
#
####################################################################################################
# Import logging functions
source /path/to/logging/goes/here/logging.sh
####################################################################################################

loggedInUser=`/bin/ls -l /dev/console | /usr/bin/awk '{ print $3 }'`   # owner of /dev/console is the logged-in user
loggedInUserHome=`dscl . -read /Users/$loggedInUser | grep NFSHomeDirectory: | cut -c 19- | head -n 1`   # home folder from Directory Services
timestamp=`date '+%Y-%m-%d-%H-%M-%S'`

/bin/echo "`now` *** Gather Log Files to $loggedInUser's Desktop ***" >> $logFile
# Pipe a newline to sysdiagnose to acknowledge its interactive prompt
/bin/echo -ne '
' | /usr/bin/sysdiagnose -t -A "$loggedInUser-$timestamp" -f "$loggedInUserHome/Desktop"

message="Log Gathering Complete

Your computer logs have been saved
to your Desktop as:
$loggedInUser-$timestamp.tar.gz

Please transfer the file to your support representative.

"
/usr/sbin/jamf displayMessage -message "$message"

/bin/echo "`now` Log Files saved to: $loggedInUserHome/Desktop/$loggedInUser-$timestamp.tar.gz" >> $logFile


exit 0      ## Success

Jason
Contributor II

@dan.snelson That's great information, Dan. I didn't know about sysdiagnose. I tried running it directly on my system, and while it grabs a ton of information, I didn't see a copy of jamf.log in that tar file. I'm assuming that if I wanted one in there I'd need to gunzip it, append the file, then gzip it again (rough sketch below). Also, it seems like it's not super easy to actually get the data OFF the system. @Andrina is setting up an email with the file, and you're placing them on the desktop for the user to manually send over.
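
Something along these lines would probably handle the append -- an untested sketch that borrows the archive name and variables ($loggedInUser, $loggedInUserHome, $timestamp) from the script above:

# Untested sketch: splice jamf.log into the sysdiagnose archive created above.
archive="$loggedInUserHome/Desktop/$loggedInUser-$timestamp.tar.gz"

/usr/bin/gunzip "$archive"                              # expand .tar.gz to .tar
/usr/bin/tar -rf "${archive%.gz}" -C /var/log jamf.log  # append /var/log/jamf.log
/usr/bin/gzip "${archive%.gz}"                          # recompress to .tar.gz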

Is there any way to simply copy them to a network share? From the Windows world this is super easy; even if a drive isn't mapped I can still copy out to a UNC path somewhere, assuming the permissions are set accordingly ahead of time.

I tried out scp (I'd never used it before, coming from a Windows background), but didn't have much luck. SSH is not generally enabled on our systems, just SMB.

alexk
New Contributor III

Here's another option that we use to gather logs and upload them to an SMB share. It currently gathers logs from /private/var/log, /Library/Logs, and the logs from the backup agent we use in our environment, Druva inSync. The "Druva Logs" section can be modified for other applications or removed completely. It also collects some computer info into a text file that is included with the gathered logs.

The script mounts the SMB share using a service account that has write access to the share. The username and password are stored as parameters ($4 and $5) in the Casper policy and are passed to the script to mount the share. The share is mounted with "nobrowse" ("/sbin/mount_smbfs -o nobrowse ...") so that the end user does not see the mounted share while this is all happening.

It may not be the prettiest thing, but it has been working for us as a Self Service policy or with Casper Remote. Andrina's and dan.snelson's scripts have some good stuff I didn't think of, like sysdiagnose, that would be cool to add if you'd like.

#!/bin/sh

# eei-logs-auto-collect.sh
# Alex Kim - UC Berkeley, EEI

# Collects logs from /private/var/log, /Library/Logs, and Druva inSync logs if it exists
# Uploads to EEI SMB share

# Future Additions
## Network check to confirm network access and if on a campus network to access the SMB share
## For now, echo the IP address so that it logs the IP to check if that is the reason the mount fails

# Update 2015.03.04
## Added line to redirect output of the zip commands to null. It was reporting failure in the JSS after a 9.6x update

##########
# Variables
##########

# Current date in Year-Month-Day format
currentDate=`/bin/date +%Y-%m-%d`

# Current time in Hour:Minute:Seconds format
currentTime=`/bin/date +%H%M%S`

# Mac computer name
compName=`/usr/sbin/scutil --get ComputerName`

# User name of currently logged in user
loggedInUser=`/bin/ls -l /dev/console | /usr/bin/awk '{ print $3 }'`

# IP Address of ethernet device 0
ethIP0=`/usr/sbin/ipconfig getifaddr en0`
echo "$ethIP0"

# IP Address of ethernet device 1
ethIP1=`/usr/sbin/ipconfig getifaddr en1`
echo "$ethIP1"

# List of user profiles in the /Users directory
userProfiles=`/bin/ls /Users/`

# File name of compressed zip file of the /private/var/logs directory
zipName="logs-$compName-$currentDate-$currentTime.zip"

# File name of compressed zip file of the /Library/Logs directory
libLogsName="LibraryLogs.zip"

##########
# Druva Logs
##########

if [ -d "/Users/$loggedInUser/Library/Application Support/inSync/logs" ];
then
    echo "Druva inSync logs directory exists. Collecting inSync logs"
    insyncLogs="YES"
    if [ -f "/private/var/log/insync.zip"];
    then
        echo "Druva insync compressed logs already exists. Delete the existing one to create a new one."
        /bin/rm -f "/private/var/log/insync.zip"
    fi
    /usr/bin/zip -r "/private/var/log/insync.zip" "/Users/$loggedInUser/Library/Application Support/inSync/logs" &>/dev/null
else
    insyncLogs="NO"
fi

##########
# Check Requirements
##########

# Size of /private/var/log directory
logdirSize=`/usr/bin/du -sk /private/var/log | awk '{ print $1 }'`

# Free space remaining on the boot volume, in kilobytes (matches the du -sk units above)
freeSpace=`/bin/df -k / | sed -n 2p | awk '{ print $4 }'`

# Check if there is enough free space on the boot volume to create a copy of the logs directory
if [ "$freeSpace" -gt "$logdirSize" ];
then
    echo "$freeSpace is greater than $logdirSize"
else
    echo "$freeSpace is not greater than $logdirSize. Not enough free HD space to copy the log directory. Exiting."
    exit 1
fi

##########
# Computer Information File Creation
##########

# Check if the computer information file already exists. File should not exist but if yes, then delete it.
if [ -f "/private/var/log/00-$compName.txt" ];
then
    /bin/rm -f "/private/var/log/00-$compName.txt"
    echo "/private/var/log/00-$compName.txt already exists. Deleted."
fi

# Pipe computer information to the file
echo "Current Date: $currentDate" >> "/private/var/log/00-$compName.txt"
echo "Current Time: $currentTime" >> "/private/var/log/00-$compName.txt"
echo "Mac Computer Name: $compName" >> "/private/var/log/00-$compName.txt"
echo "Currently Logged in User: $loggedInUser" >> "/private/var/log/00-$compName.txt"
echo "en0 IP Address: $ethIP0" >> "/private/var/log/00-$compName.txt"
echo "en1 IP Address: $ethIP1" >> "/private/var/log/00-$compName.txt"
echo "List of User Profiles: $userProfiles" >> "/private/var/log/00-$compName.txt"
echo "Size of /private/var/log Directory: $logdirSize" >> "/private/var/log/00-$compName.txt"
echo "Free space remaining on Boot Volume: $freeSpace" >> "/private/var/log/00-$compName.txt"
echo "Druva inSync Logs?: $insyncLogs" >> "/private/var/log/00-$compName.txt"

# Check if the computer information file was successfully created above and now exists.
# This check is for logging purposes. The script will continue even if the file was not successfully created.
if [ ! -f "/private/var/log/00-$compName.txt" ];
then
    echo "/private/var/log/00-$compName.txt was not successfully created. Continue anyway."
else
    echo "/private/var/log/00-$compName.txt was successfully created."
fi

##########
# Compress the /Library/Logs Directory and create the compressed file in /private/var/log
##########

# Check if the /Library/Logs zip file exists already. It should not exist but if yes, then delete it.
if [ -f "/private/var/log/$libLogsName" ];
then
    /bin/rm -f "/private/var/log/$libLogsName"
    echo "/private/var/log/$libLogsName already exists. Deleted."
fi

# Compress the /Library/Logs directory to a zip file in the /private/var/log directory, redirect all output to null so casper policy does not report failure
/usr/bin/zip -r "/private/var/log/$libLogsName" "/Library/Logs" &>/dev/null

# Check if the zip file was successfully created above and now exists. If it does not then exit.
if [ ! -f "/private/var/log/$libLogsName" ];
then
    echo "/private/var/log/$libLogsName was not successfully created. Exiting."
    exit 1
else
    echo "/private/var/log/$libLogsName was successfully created."
fi

##########
# Compress and Upload the /private/var/log Directory
##########

# Check if the zip file exists already. It should not exist but if yes, then delete it.
if [ -f "/private/tmp/$zipName" ];
then
    /bin/rm -f "/private/tmp/$zipName"
    echo "/private/tmp/$zipName already exists. Deleted."
fi

# Compress the /private/var/log directory to a zip file in the /private/tmp directory
/usr/bin/zip -r "/private/tmp/$zipName" "/private/var/log" &>/dev/null

# Check if the zip file was successfully created above and now exists. If it does not then exit.
if [ ! -f "/private/tmp/$zipName" ];
then
    echo "/private/tmp/$zipName was not successfully created. Exiting."
    exit 1
else
    echo "/private/tmp/$zipName was successfully created."
fi

# Make the mount point in /Volumes
/bin/mkdir /Volumes/osx_logs

# Mount the EEI SMB network share
# Parameters $4 and $5 are the user name and password that is securely pulled from the Casper JSS
/sbin/mount_smbfs -o nobrowse "//$4:$5@your_server_name/server_path" "/Volumes/osx_logs"

if [ $? -eq 0 ];
then
    echo "osx_logs successfully mounted."
else
    echo "osx_logs not successfully mounted. Exiting."
    exit 1
fi

# Copy the zip file of the logs directory to the EEI_Uploads share
/bin/cp -f -X "/private/tmp/$zipName" "/Volumes/osx_logs"

# Compare the source and destination zip files byte-for-byte to confirm the copy completed successfully
if /usr/bin/cmp -s "/private/tmp/$zipName" "/Volumes/osx_logs/$zipName";
then
    echo "Upload of zip log file to osx_logs was successful"
else
    echo "Upload of zip log file to osx_logs was not successful"
fi

##########
# Cleanup
##########

# Delete the computer information file
/bin/rm -f "/private/var/log/00-$compName.txt"

# Delete the Druva inSync compressed logs file
if [ -f "/private/var/log/insync.zip" ];
then
    /bin/rm -f "/private/var/log/insync.zip"
fi

# Delete the compressed /Library/Logs file
/bin/rm -f "/private/var/log/$libLogsName"

# Delete the local copy of the zip logs file in /private/tmp/
/bin/rm -f "/private/tmp/$zipName"

# Unmount the osx_logs samba network share
/sbin/umount "/Volumes/osx_logs/"

echo "Logs have been successfully captured and uploaded to osx_logs. Completed."

exit 0

chris_kemp
Contributor III

Thanks for sharing, Dan - I was able to use your script as a quick jumping-off point for a system.log gathering script this morning.

skinford
Contributor III

@alexk Good afternoon, are you still using this script for your log capture?

Have a very great day!

sdagley
Esteemed Contributor II

logCollection is a tool from Jamf's @Rosko which will collect logs and upload them as an attachment to the computer record in Jamf Pro.
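
For context, the general mechanism behind a tool like that is the Classic API's fileuploads endpoint, which attaches a file to a computer record. A minimal sketch (not necessarily Rosko's exact implementation; the server URL, token, computer ID, and file path are placeholders):

# Sketch: attach a zip of logs to a computer record via the Classic API
# fileuploads endpoint. jamfProURL, bearerToken and computerID are placeholders.
jamfProURL="https://your.jamf.server"
computerID="123"

/usr/bin/curl -s -X POST \
    -H "Authorization: Bearer ${bearerToken}" \
    "${jamfProURL}/JSSResource/fileuploads/computers/id/${computerID}" \
    -F name=@"/private/tmp/logs.zip"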

skinford
Contributor III

First, thank you @sdagley for the heads-up on this great script, and thank you @Rosko for writing it.
I downloaded it, followed the PDF, and Bob's your uncle, it worked. What an awesome tool! That will really help with collecting log files when needed.

Can't thank you both enough, have a very great day today!

skinford
Contributor III

@Rosko I didn't notice your call sign on GitHub until I went back to look for the extension you were talking about in your PDF. I'm n8wb and a VE Liaison, among other things.

Thank you again for the script, my friend!

melvinp
New Contributor

[Screenshot attached: Screenshot 2023-07-19 at 1.11.36 AM.png]

Dear @sdagley,

I am getting the above error while running the script as a Jamf policy.

Can you help me with this, please?

sdagley
Esteemed Contributor II

@melvinp At first glance it looks like the script is failing because it hasn't been converted to use Bearer Token instead of Basic authentication. Here's a guide on how to do that: https://community.jamf.com/t5/tech-thoughts/how-to-convert-classic-api-scripts-to-use-bearer-token/b...
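
The gist of the conversion, as a rough sketch (the URL and API account are placeholders, and the plutil token extraction needs macOS 12 or later): request a token with the API account once, then send it in an Authorization header instead of passing -u on every call.

# Sketch: obtain a Bearer Token and use it in place of Basic authentication.
# jamfProURL, apiUser and apiPass are placeholders.
jamfProURL="https://your.jamf.server"

# Request a token (Basic auth is only used for this one call)
bearerToken=$(/usr/bin/curl -s -u "${apiUser}:${apiPass}" -X POST \
    "${jamfProURL}/api/v1/auth/token" | /usr/bin/plutil -extract token raw -)

# Use the token on subsequent API calls
/usr/bin/curl -s -H "Authorization: Bearer ${bearerToken}" -H "Accept: application/xml" \
    "${jamfProURL}/JSSResource/computers"

# Invalidate the token when finished
/usr/bin/curl -s -X POST -H "Authorization: Bearer ${bearerToken}" \
    "${jamfProURL}/api/v1/auth/invalidate-token" -o /dev/null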

melvinp
New Contributor

Hey @sdagley 

How do we do this? Should I run a Jamf policy? Can you tell me the steps, if possible?