Posted on 10-23-2014 11:23 AM
So, from time to time we get computers stolen, and more often than one might think, they don't get erased but rather just get a new admin account (using the .AppleSetupDone trick).
Our users aren't too happy about us collecting geolocation data on a daily basis, so we crafted a nifty little script that gives us the address of a stolen computer. The script gets scoped to a static group containing all computers that are stolen (or lost) and is set to run ongoing on all triggers (to get as much data as possible). As soon as the computer reports back in to the JSS, these scripts start collecting data.
Along with the address script we have another script that prints the Safari browsing history of the currently logged-in user to stdout so it ends up in the policy log to be viewed, and finally one script that prints out the external IP and DNS name of the computer.
The address script works by listing all the Wi-Fi networks seen by the computer and crafting a URL that requests data from Google. The result is the approximate address of the computer's present location, the coordinates, and finally the accuracy of the positioning.
We have recovered several computers using these scripts, as all we have to do is provide law enforcement with the location, IP/ISP, and some "proof" of who we think is using the computer (you can often tell from the browsing history, since people tend to log into sites like Facebook and email, which nicely print the name in the page title and therefore get captured by the history script).
The record shows that the script is amazingly accurate, or rather, Google is extremely good at providing a location using only Wi-Fi networks. Most of the time, Google reports an accuracy of "150", which I guess refers to 150 meters, but from experience that could easily be 15 meters, it's that accurate!
These scripts work for us; your mileage may vary, computers may catch fire, the scripts may contain bugs and so on...
Nevertheless, here are the scripts:
To get external IP:
#!/bin/sh
# Look up the external IP via OpenDNS, then reverse-resolve it to a host name
IP=$(dig +short myip.opendns.com @resolver1.opendns.com)
DNS=$(host "$IP" | awk '{print $5}' | sed "s/\.$//g")
echo "$IP - $DNS"
To log the browsing history to the JSS (NB: does not work in 10.10, as Safari now saves the history in an SQLite 3 file):
#!/bin/bash
unset LANG
#echo "START"
PREFIX="/tmp/safari-log"
# Find the user currently logged in at the console
USER=$(w | grep console | awk '{print $1}')
# Copy the binary history plist and convert it to XML for parsing
eval cp ~$USER/Library/Safari/History.plist $PREFIX-history-bin.plist
plutil -convert xml1 -o $PREFIX-history-xml.plist $PREFIX-history-bin.plist
# Split at the WebHistoryDomains.v2 marker; the first chunk ($PREFIX-aa) holds the visit entries
split -p "WebHistoryDomains.v2" $PREFIX-history-xml.plist $PREFIX-
# Keep only URLs, titles and timestamps, dropping Google redirect noise
tail -n +5 $PREFIX-aa | egrep -o "(<string>.*</string>)|(<dict>)" | sed -E "s/<\/?string>//g" | sed "s/<dict>//g" | grep -v "^http://www.google.*,d.Yms$" > $PREFIX-history.txt
OLD_IFS=$IFS
IFS=""
exec 5<$PREFIX-history.txt
while read -u 5 LINE
do
    # Purely numeric lines are visit timestamps (seconds since 2001-01-01, the Apple epoch)
    echo $LINE | egrep -s "^[0-9.]+$" > /dev/null
    if [ $? -ne 0 ] ; then
        echo $LINE
    else
        TIME=$(expr $(echo $LINE | egrep -o "^[0-9]*") + 978307200)
        date -r $TIME
    fi
done
IFS=$OLD_IFS
rm $PREFIX-*
#echo "END"
And finally the one that prints out the address (location):
#!/bin/bash
# Find the Wi-Fi interface (the line after "Wi-Fi" in the hardware port list)
INTERFACE=$(networksetup -listallhardwareports | grep -A1 Wi-Fi | tail -1 | awk '{print $2}')
STATUS=$(networksetup -getairportpower $INTERFACE | awk '{print $4}')
# Temporarily switch the radio on if it was off
if [ "$STATUS" = "Off" ] ; then
    sleep 5
    networksetup -setairportpower $INTERFACE on
fi
# Scan for nearby networks and keep the 12 strongest, as "MAC $ signal $ SSID"
/System/Library/PrivateFrameworks/Apple80211.framework/Versions/A/Resources/airport -s | tail -n +2 | awk '{print substr($0, 34, 17)"$"substr($0, 52, 4)"$"substr($0, 1, 32)}' | sort -t $ -k2,2rn | head -12 > /tmp/gl_ssids.txt
if [ "$STATUS" = "Off" ] ; then
    networksetup -setairportpower $INTERFACE off
fi
OLD_IFS=$IFS
IFS="$"
URL="https://maps.googleapis.com/maps/api/browserlocation/json?browser=firefox&sensor=false"
exec 5</tmp/gl_ssids.txt
# Append each visible access point to the geolocation request
while read -u 5 MAC SS SSID
do
    SSID=`echo $SSID | sed "s/^ *//g" | sed "s/ *$//g" | sed "s/ /%20/g"`
    MAC=`echo $MAC | sed "s/^ *//g" | sed "s/ *$//g"`
    SS=`echo $SS | sed "s/^ *//g" | sed "s/ *$//g"`
    URL+="&wifi=mac:$MAC&ssid:$SSID&ss:$SS"
done
IFS=$OLD_IFS
#echo $URL
# Ask Google for coordinates based on the visible access points
curl -s -A "Mozilla" "$URL" > /tmp/gl_coordinates.txt
LAT=`grep "lat" /tmp/gl_coordinates.txt | awk '{print $3}' | tr -d ","`
LONG=`grep "lng" /tmp/gl_coordinates.txt | awk '{print $3}' | tr -d ","`
ACC=`grep "accuracy" /tmp/gl_coordinates.txt | awk '{print $3}' | tr -d ","`
#echo "LAT: $LAT"
#echo "LONG: $LONG"
#echo "ACC: $ACC"
# Reverse-geocode the coordinates into a street address
curl -s -A "Mozilla" "http://maps.googleapis.com/maps/api/geocode/json?latlng=$LAT,$LONG&sensor=false" > /tmp/gl_address.txt
ADDRESS=`grep "formatted_address" /tmp/gl_address.txt | head -1 | awk '{$1=$2=""; print $0}' | sed "s/,$//g" | tr -d "\"" | sed "s/^ *//g"`
if [ $EA -ne 0 ] ; then
    echo "<result>$ADDRESS (lat=$LAT, long=$LONG, acc=$ACC)</result>"
else
    echo "$ADDRESS (lat=$LAT, long=$LONG, acc=$ACC)"
fi
rm /tmp/gl_ssids.txt /tmp/gl_coordinates.txt /tmp/gl_address.txt
Posted on 10-23-2014 11:39 AM
I assume your JSS needs to be open to the Internet for this to work?
Posted on 10-23-2014 12:25 PM
Very much so, but if that's not a concern (the concern being user privacy), one could easily change the script to run independently on the computer and mail its location once a day.
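For instance, a minimal sketch of that standalone variant (the script path, label, and recipient here are all hypothetical, and it assumes a working MTA so mail can actually deliver off the box):
#!/bin/bash
# Install a LaunchDaemon that runs the address script once a day and
# mails the output. All names/paths below are placeholders.
cat > /Library/LaunchDaemons/com.example.locate.plist <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key><string>com.example.locate</string>
    <key>ProgramArguments</key>
    <array>
        <string>/bin/bash</string>
        <string>-c</string>
        <string>/usr/local/bin/locate_address.sh | mail -s "Location: $(hostname)" security@example.com</string>
    </array>
    <key>StartInterval</key><integer>86400</integer>
</dict>
</plist>
EOF
launchctl load /Library/LaunchDaemons/com.example.locate.plist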
Posted on 10-28-2014 07:09 AM
I love it, but our works council isn't really happy about it :D Really looking forward to getting that approved as a procedure for our stolen machines. Great one Pat! (y)
Cheers,
Fab
Posted on 10-28-2014 07:33 AM
Very nice. I love the grepping of the physical address! I've done something similar for the external IP address, but find it helps to test with a couple of different services. We also check checkip.dyndns.org and checkmyip.com.
Not as elegant as yours, but for anyone interested...
GetIP() {
    case ${1} in
        DynDNS )
            Foo=`curl -s http://checkip.dyndns.org`
            Bar=${Foo#*:}
            Baz=${Bar%%<*}
            echo $Baz
            ;;
        Checkmyip )
            Foo=`curl -s http://checkmyip.com | fgrep 'div id="ip'`
            Bar=${Foo#*br>}
            Baz=${Bar%%<*}
            echo $Baz
            ;;
        * )
            # No such check defined; return nothing
            echo ""
            ;;
    esac
}
Posted on 10-08-2015 06:45 AM
This is great, but has anyone modified the surf history script to be compatible with 10.10? Or to gather history data from Chrome and/or Firefox?
Posted on 10-08-2015 09:52 AM
This is fabulous!
Posted on 10-15-2015 08:00 AM
For Safari you can just download the SQLite file and analyze it.
It's located in ~/Library/Safari/History.db
Posted on 01-26-2017 05:26 AM
For Safari 9 and 10, this will now print the history file to the policy log:
#!/bin/bash
unset LANG
#echo "START"
PREFIX="/tmp/safari-log"
# Find the user currently logged in at the console
USER=$(w | grep console | awk '{print $1}')
cp /Users/$USER/Library/Safari/History.db $PREFIX-history-bin.db
# Safari stores visit times as seconds since 2001-01-01; adding 978307200 converts to Unix time
sqlite3 $PREFIX-history-bin.db "SELECT id,datetime(visit_time+978307200, 'unixepoch', 'localtime'),title FROM history_visits;"
# rm $PREFIX-*
#echo "END"
Posted on 01-26-2017 08:35 AM
Getting error: no such table: history_visits
Posted on 01-26-2017 08:46 AM
Are you running these as an extension attribute? That's how I just got it set up, and the street address is NOT being populated when a recon is done.
Posted on 01-27-2017 03:20 AM
@ooshnoo , if you make the scripts into extension attributes, the output must be formatted with <result> ... </result>
There is a mistake in the 'address script': $EA is not defined. I changed the last lines to check $ACC instead (I am not sure if that is useful, but it should at least have a value), so it looks like this:
if [ $ACC -ne 0 ] ; then
    echo "<result>$ADDRESS (lat=$LAT, long=$LONG, acc=$ACC)</result>"
else
    echo "$ADDRESS (lat=$LAT, long=$LONG, acc=$ACC)"
fi
and my extension attribute works (you need to run jamf recon...)
Posted on 01-27-2017 08:51 AM
Thank you sir. Yep...figured that out last night with help from Jamf support. All good here.
Posted on 01-30-2017 04:12 AM
No, I don't run these as EAs, just as a standard policy, and I look in the policy log for the information. It's sensitive, and this is only run on confirmed stolen computers.
Posted on 01-30-2017 04:17 AM
@JayDuff, are you sure there is a Safari log to print? It's not populated until you've actually done some surfing in Safari ;)
Posted on 01-30-2017 06:54 AM
@bollman I went to a few sites before I ran it. I was trying to run it as a .sh on the computer. When I set it as a policy, it worked.
Posted on 01-30-2017 03:18 PM
@bollman Thank you so much for these scripts. I am definitely going to save them... it is always good to be prepared!
Posted on 02-28-2017 08:15 AM
Have you tried using imagesnap to capture a picture of the thief, then uploading it somewhere?
It looks like there is no way to store images in the JSS (feature request for an EA of type Picture, coming right up!). What do you think the best way to get that image would be? It would need to be clandestine, of course, so if we use SFTP or SCP, the credentials need to be pre-populated.
I know School Messenger uses PHP and Java to upload files to their central server, so it's possible. Any ideas?
I set up a hidden share on my NAS, and opened it up to FTP. Then I made this script:
#!/bin/bash
# $2 = computer name; $4-$6 are script parameters set in the policy (see below)
now=`date '+%Y_%m_%d__%H_%M_%S'`
imagesnap -w 1 -q /tmp/$2.png
ftp -u ftp://$4:$5@$6/$2-$now.png /tmp/$2.png
rm /tmp/$2.png
exit 0
Option 4 is the FTP User name
Option 5 is the FTP password
Option 6 is the FTP path (my.ftp.server/directory - no trailing /)
Apply it via policy, and it sends pictures to the FTP server every time it runs. It sends the user name and password in the clear, though, and it's using FTP, so I am open to suggestions. Is there a way to do this more securely? sftp doesn't have a way to upload to a URL like ftp does, right? scp is not available on my NAS and doesn't have that capability either. I don't know diddly about certificate-based authentication, but am willing to learn!
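One possibility, sketched under the assumption that the NAS speaks SSH/SFTP and that a private key has been deployed to the Macs beforehand (the key path, user, host, and share below are all hypothetical): sftp can run non-interactively from a batch file, so no credentials go over the wire in the clear:
#!/bin/bash
# Key-based sftp upload; assumes the key was pre-deployed (e.g. via policy)
# and is readable by root only. $2 = computer name, as before.
KEY="/var/root/.ssh/nas_upload_key"
DEST="uploader@nas.example.com"
now=$(date '+%Y_%m_%d__%H_%M_%S')
imagesnap -w 1 -q /tmp/$2.png
# The batch file tells sftp what to do without prompting
printf 'put /tmp/%s.png /hidden-share/%s-%s.png\n' "$2" "$2" "$now" > /tmp/sftp_batch
sftp -o IdentityFile="$KEY" -o StrictHostKeyChecking=no -b /tmp/sftp_batch "$DEST"
rm -f /tmp/$2.png /tmp/sftp_batch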
Posted on 03-06-2017 11:09 AM
@JayDuff I tried running this script as a policy and am only getting "Error: no such table: history_visits". Is there something else I need to change in the script? Anything different I need to do to the policy?
Posted on 03-06-2017 02:23 PM
I tried running this script as a policy and only getting "Error: no such table: history_visits" Is there something else I need to change in the script? Anything different I need to do to the policy?
@bfrench Did you surf with Safari on the target device? I got the same result when I ran it on a device that had no history.
Posted on 03-08-2017 08:17 AM
@JayDuff Yes - I did go to several sites to work up some history before I sent the policy.
Posted on 03-08-2017 09:21 AM
Since I know the logged in user I was able to just run the sqlite command without pulling the username and creating the tmp file - that worked for me.
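In other words, something like this (the username here is hardcoded and hypothetical; note that the original copies the database first, which avoids locking trouble if Safari is running):
sqlite3 /Users/jappleseed/Library/Safari/History.db "SELECT datetime(visit_time+978307200, 'unixepoch', 'localtime'),title FROM history_visits;"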
Posted on 03-08-2017 11:59 AM
@bfrench That's what I saw as well. Just the titles of the pages, no URLs.
Also, the script relies on the user being logged in when the script runs. If no one was logged on when you ran it the first time, that may be why it failed.
Posted on 04-12-2017 11:50 AM
I've set up your location extension attribute on all of my 1:1 MacBook Airs. Lately, however, most of the MBAs have been returning null results. I converted the EA to a script, ran it on one of the MBAs in question, and found that I am getting a status of "OVER_QUERY_LIMIT".
Do you know how Google is tracking the queries? Is it by machine address, IP address (in which case all of my computers are going out the same Public IP), or some other criteria?
Also, I did a little research on the error, and Google suggests putting the query into a retry loop, like this:
import time
import urllib

url = "MAPS_API_WEBSERVICE_URL"
attempts = 0
success = False
while success != True and attempts < 3:
    raw_result = urllib.urlopen(url).read()
    attempts += 1
    # The GetStatus function parses the answer and returns the status code
    # (out of scope for this example; you can use an SDK)
    status = GetStatus(raw_result)
    if status == "OVER_QUERY_LIMIT":
        time.sleep(2)
        # retry
        continue
    success = True
if attempts == 3:
    # send an alert, as this means the daily limit has been reached
    print "Daily limit has been reached"
I'm not sure how to put that into your EA.
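A rough shell equivalent, wrapped around the curl call in the original script (just a sketch; the "status check" is simply grepping the JSON reply for the error string):
# Retry the geolocation lookup a few times on OVER_QUERY_LIMIT
ATTEMPTS=0
STATUS="OVER_QUERY_LIMIT"
while [ "$STATUS" = "OVER_QUERY_LIMIT" ] && [ $ATTEMPTS -lt 3 ] ; do
    curl -s -A "Mozilla" "$URL" > /tmp/gl_coordinates.txt
    STATUS=$(grep -o "OVER_QUERY_LIMIT" /tmp/gl_coordinates.txt)
    ATTEMPTS=$((ATTEMPTS + 1))
    [ "$STATUS" = "OVER_QUERY_LIMIT" ] && sleep 2
done
if [ "$STATUS" = "OVER_QUERY_LIMIT" ] ; then
    # three attempts failed; most likely the daily limit has been reached
    echo "Daily limit has been reached"
fi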
Also, a cautionary tale:
The reason I first implemented this was because we had 3 MacBook Airs go missing. One of them started checking into the JSS from an address outside of our district, and students aren't allowed to bring the devices home. So I put in the EA and got a lat/lon (within 150m). I gave the address to the police, who were also requesting the address of the IP from AT&T. Apparently, they started knocking on doors around the lat/lon address I gave them. Scuttlebutt is that word got back to the thief that the cops were looking, so he tossed all 3 of the laptops into the Des Plaines River. Lesson learned: don't hand the cops your information until they already have their own.
Posted on 04-24-2017 09:03 PM
These scripts are great. I just tested them on my own machine and the location info and Safari history were spot on. Does anyone know of a script to retrieve the browsing history for Chrome?
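Not tested against every Chrome version, but as a sketch: Chrome keeps its history in an SQLite file too, with timestamps in microseconds since 1601-01-01 (the WebKit epoch), so something along these lines should work:
#!/bin/bash
# Copy first: Chrome locks the live database while it's running
USER=$(w | grep console | awk '{print $1}')
cp "/Users/$USER/Library/Application Support/Google/Chrome/Default/History" /tmp/chrome-history.db
# Convert WebKit-epoch microseconds to local time and list visited URLs
sqlite3 /tmp/chrome-history.db "SELECT datetime(last_visit_time/1000000-11644473600, 'unixepoch', 'localtime'),url,title FROM urls ORDER BY last_visit_time;"
rm /tmp/chrome-history.db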
Posted on 05-01-2017 09:23 AM
I have set up the stolen static group, assigned the stolen machine to it, added the scripts, and created the policy to push the scripts to the machine... My question is: where will the data appear? I'm still fairly new to Jamf Pro.
Many thanks,
Allen
Posted on 05-01-2017 09:31 AM
If you are using the Safari history script - it will show in the log for the Policy.
Posted on 05-01-2017 09:41 AM
Thank you!
Posted on 05-30-2017 08:21 AM
EDIT
never mind. figured it out.
Posted on 07-31-2017 12:46 PM
Just implemented @bollman's location script in my JAMF instance - amazing how accurate it seems to be!
One thing to be aware of is that this script relies on Wi-Fi being enabled and actively turns it on (and then back off) if it finds it in the off state. I'm betting most users won't dig seeing their Wi-Fi turn on by itself (and then back off, which looks even MORE suspicious), so consider fiddling with the if statements so that, if the script finds the Wi-Fi on, it goes ahead and does its thing, but if it's off, it simply returns a message like "WiFi was disabled during last check-in; location not acquired."
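Something like this at the top of the script, for instance (a minimal sketch of that change):
INTERFACE=$(networksetup -listallhardwareports | grep -A1 Wi-Fi | tail -1 | awk '{print $2}')
STATUS=$(networksetup -getairportpower $INTERFACE | awk '{print $4}')
if [ "$STATUS" = "Off" ] ; then
    # Don't touch the radio; just report and bail out
    echo "WiFi was disabled during last check-in; location not acquired."
    exit 0
fi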
Also, reading @JayDuff's comment about hitting an OVER_QUERY_LIMIT message from Google (which makes sense, considering how much this will hammer their servers, depending on the size of your inventory), I'm thinking you could store the name of the active Wi-Fi network as a text file in /tmp when a lookup is performed, then have the script compare that to the current Wi-Fi network the NEXT time it runs and only proceed with a geo-lookup if it's changed (see the sketch after the next reply). That should minimize the number of lookups.
Posted on 07-31-2017 01:22 PM
@ChrisJScott-work What you'd want to look at to see if it changed is the BSSID. You might have a Wi-Fi network available over a large area, so the SSID might not change, but the BSSID will, as it's unique to each base station.
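Combining the two ideas, a rough sketch (the cache path is hypothetical): compare the current BSSID with the one saved at the last lookup and exit early if it hasn't changed:
AIRPORT="/System/Library/PrivateFrameworks/Apple80211.framework/Versions/A/Resources/airport"
CACHE="/tmp/gl_last_bssid.txt"
CURRENT_BSSID=$($AIRPORT -I | awk '/ BSSID/ {print $2}')
if [ -f "$CACHE" ] && [ "$CURRENT_BSSID" = "$(cat $CACHE)" ] ; then
    # Same base station as last time; skip the Google lookup
    exit 0
fi
echo "$CURRENT_BSSID" > "$CACHE"
# ...then continue with the geo-lookup from the original script...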
Posted on 07-31-2017 01:30 PM
Great suggestion, @StoneMagnet! You kind of read my mind... one thing that would be an issue w/ my solution is that, for example, my company has offices all over the world that all have the same Wi-Fi name - any user traveling from one office to the next w/o connecting to another network would not get their location updated. Not a matter of life or death, but a hiccup.
Anyhow, your suggestion would resolve that - thanks!
One other question: the accuracy - what is that a measurement of? Feet? Meters? Smoots (https://en.wikipedia.org/wiki/Smoot)?
Posted on 08-29-2017 08:01 AM
This script doesn't seem to work anymore.
I keep getting the following errors when running the script
line 40: [: -ne: unary operator expected
(lat=, long=, acc=)
Posted on 08-30-2017 09:56 AM
I was just about to come and post the same thing as @ooshnoo. I noticed the other day that I get the same error also.
Posted on 08-31-2017 04:48 AM
@bollman Any thoughts on why it no worky???
Posted on 08-31-2017 06:59 AM
@JayDuff On the imagesnap/screencapture and logging, I have used this with ARD before:
/tmp/imagesnap - | base64 | pbcopy
The other half is pbpaste | base64 -D -o ~/Desktop/shot.jpg;open ~/Desktop/shot.jpg
One might simply pipe the base64 encode into the logs and decode as needed.
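In a Jamf policy script, that last idea might look something like this (just a sketch; the decode step happens later on your own machine):
#!/bin/bash
# Dump the shot base64-encoded to stdout so it lands in the policy log
imagesnap -w 1 -q /tmp/shot.png
base64 /tmp/shot.png
rm /tmp/shot.png
# Later, copy the blob from the log into your clipboard and decode:
#   pbpaste | base64 -D -o shot.png && open shot.png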
Posted on 08-31-2017 04:27 PM
Not working; getting a 404 error from Google. Looks like the API is gone.
Posted on 09-01-2017 09:40 AM
I think it's still there; they just changed it so you have to insert your API key into the URL with each request.
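If so, the fix is presumably just appending a key parameter to the requests (the key value below is a placeholder; you'd generate one in the Google API console):
API_KEY="YOUR_API_KEY"
curl -s -A "Mozilla" "https://maps.googleapis.com/maps/api/geocode/json?latlng=$LAT,$LONG&key=$API_KEY" > /tmp/gl_address.txt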