Browser History - Shell Script copying old version of File

New Contributor III

Issue: I have a shell script that, when doing a "ditto" command, seems to copy an old version of a file even when the source file has a newer modified date. Is there anything that would cause 'ditto' (or any copy command) to get an old file? Does the drive cache copied files, given that the source and destination filenames are the same each time (the destination folder is different)?


I currently have a script that does the following:

  • Create a temp folder to work in, prefixed with the current date/time to ensure it is different every time the script is run.
  • Copy the History file from each Google Chrome profile to the temp folder using "ditto" (the script actually does this for all present profiles, plus Safari and Opera).
  • Parse the files out to CSV using SQL.
  • Delete the DBs to save space.
  • Zip the CSV files and upload them to the Jamf inventory record of the device.
  • Remove the temp folder using "rm -rf".

This process was working great until the past month or two. Whenever the script is run, it uploads a file that was last modified in April and is therefore not the latest history from the device.

Devices are MacBook Airs running anything from 10.15 to 12.4, with APFS-formatted drives.

Attempted steps to resolve:

  • I have tried a different computer - it works the first time, but subsequent runs get the old file.
  • I have tried using 'cp' to copy the file, and various other commands and options.
  • I have uploaded a log file with an "ls -lh" of the directories in question, and I can see that the file in the profile location has been modified recently.
  • I have left the db files in the uploaded zip, and they have a modified date of April (same as the last entry in the CSV after parsing the history data).
  • Killing Chrome before copying the file worked on one computer but not on others.

The issue looks to be in the copy process, as the parsing of the file works correctly when given the newer file. The process is run weekly on a remote target via a Jamf policy.
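One thing worth checking (an assumption on my part, not something confirmed from your logs): the browser history databases are SQLite files, and SQLite keeps recent writes in `-wal`/`-journal` sidecar files next to the main database. If only the main `History` file is copied, the copy can be missing weeks of visits even though it "succeeds". A minimal sketch of copying the sidecars along with the database, using made-up demo paths rather than the real profile paths:

```shell
# Hypothetical diagnostic (demo paths, not the real profile paths):
# recent writes may live in a -wal/-journal sidecar, so copy those too.
src="/tmp/demo_profile/History"
dest="/tmp/demo_tmp/History.db"
mkdir -p "$(dirname "$src")" "$(dirname "$dest")"
printf 'main db pages\n' > "$src"
printf 'recent visits\n' > "${src}-wal"   # stand-in for a real WAL file

# Copy the database AND any sidecars it may have
for suffix in "" "-wal" "-journal" "-shm"; do
    if [ -f "${src}${suffix}" ]; then
        cp "${src}${suffix}" "${dest}${suffix}"
    fi
done

cmp -s "$src" "$dest" && echo "copy matches live source"
```

If the WAL turns out to be the culprit, an even cleaner option is the sqlite3 shell's `.backup` command (e.g. `sqlite3 /path/to/History ".backup '/tmp/copy.db'"`), which checkpoints pending WAL pages into the copy, though I haven't verified that against this exact workflow.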


Script being used:




#!/bin/zsh
# Note: $jamfCreds, $jamfProUrl, $logTmpLocation, $dbLocationSafari, $validfile,
# $filename and the per-browser counters are set in portions of the script not shown here.
echo "Setting up Variables"
loggedInUser=$(scutil <<< "show State:/Users/ConsoleUser" | awk '/Name :/ && ! /loginwindow/ { print $3 }')
mySerial=$(system_profiler SPHardwareDataType | awk '/Serial Number/{print $4}')
deviceID=$(curl -sku ${jamfCreds} -X GET -H "accept: application/xml" "${jamfProUrl}/computers/serialnumber/${mySerial}" | xmllint --xpath '//general/id/text()' -)
uploadDate=$(date +"%y%m%d%H%M%S") # %H (24-hour) avoids ambiguous 12-hour times from %I

dbLocationGoogle=/Users/${loggedInUser}/Library/Application\ Support/Google/Chrome
dbLocationOpera=/Users/${loggedInUser}/Library/Application\ Support/com.operasoftware.Opera

mkdir -p "$logTmpLocation"


echo "Creating SQL Files for use"
#Create Chromium Database SQL Query File
cat > "$logTmpLocation/Chromium.sql" << EOF
SELECT
  datetime(last_visit_time/1000000-11644473600, "unixepoch") as last_visited,
  title,
  visit_count,
  url
FROM urls
ORDER BY last_visit_time DESC;
EOF

#Create Safari Database SQL Query File
cat > "$logTmpLocation/Safari.sql" << EOF
SELECT
  datetime(hv.visit_time + 978307200, 'unixepoch', 'localtime') as last_visited,
  hv.title,
  hi.visit_count,
  hi.url
FROM
  history_visits hv,
  history_items hi
WHERE
  hv.history_item = hi.id
ORDER BY
  hv.visit_time DESC;
EOF

# Process Safari profiles found
echo "Processing Safari Profiles"
osascript -e 'display dialog "Safari will now close for maintenance." with title "Heights College Device Maintenance" buttons {"OK"} default button 1'
killall "Safari"
for file in ${dbLocationSafari}/History.db(N.); do
    if [[ $file =~ $validfile ]] {
        echo "  Copying file to tmp location $(unknown)"
        ditto $file $(unknown).db
        sqlite3 -header -csv $filename.db < $logTmpLocation/Safari.sql > $filename.csv # process DB to CSV
        echo "  Converted Safari DB $safaricount to CSV"
    }
done

# Gather and process Google Chrome profiles found
echo "Processing Google Chrome Profiles"
osascript -e 'display dialog "Google Chrome will now close for maintenance." with title "Heights College Device Maintenance" buttons {"OK"} default button 1'
killall "Google Chrome"
for file in ${dbLocationGoogle}/**/*History(N.); do
    filefolder=$(dirname "$file")                      # Get folder path
    profilename=$(basename "$filefolder")              # Get last item in the path
    comp_profilename=$(sed "s/ //g" <<< $profilename)  # Remove spaces from the profile name
    if [[ ${comp_profilename} != 'GuestProfile' && ${comp_profilename} != 'SystemProfile' ]] {
        echo "  Processing ${file}"
        echo "      Copying file to tmp location $(unknown)"
        ditto $file $(unknown).db
        sqlite3 -header -csv $filename.db < $logTmpLocation/Chromium.sql > $filename.csv
        echo "      Converted $comp_profilename DB to CSV"
    }
done

# Gather and process Opera profiles found
echo "Processing Opera (GX & non GX) Profiles"
for file in ${dbLocationOpera}*/History(N.); do
    if [[ $file =~ '[GX]' ]] {
        ditto $file $(unknown).db
        sqlite3 -header -csv $filename.db < $logTmpLocation/Chromium.sql > $filename.csv
    }
done

echo "

Browser History Process Complete

    Run Date & Time: $uploadDate
    Logged on User:  $loggedInUser

    Processed the following files:
    Google: $googlecount
    Opera:  $operacount
    Safari: $safaricount
"

echo "Removing Parsed DB files"
rm -rf $logTmpLocation/*.sql
rm -rf $logTmpLocation/*.db*

echo "Zipping and Uploading CSV's"
cd $logTmpLocation/ && zip -r $logTmpLocation/$ ./* && cd -

# Upload zip of csv's to jamf Inventory Record - Attachments
curl -sku ${jamfCreds} "${jamfProUrl}/fileuploads/computers/id/${deviceID}" -F name=@${logTmpLocation}/$

# Cleanup files and remove tmp directory
echo "Cleaning Up"
rm -rf $logTmpLocation
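As an aside, for anyone reading the Chromium.sql query above: Chrome stores `last_visit_time` as microseconds since 1601-01-01 (the Windows/WebKit epoch), which begins 11,644,473,600 seconds before the unix epoch. The conversion can be checked with plain shell arithmetic; the sample value below is made up:

```shell
# Chrome timestamp -> unix epoch: divide microseconds by 1,000,000
# to get seconds, then subtract the 1601-to-1970 gap of 11644473600 s.
chrome_time=13100000000000000    # hypothetical sample value
unix_time=$(( chrome_time / 1000000 - 11644473600 ))
echo "$unix_time"                # prints 1455526400 (an early-2016 date)
```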




Valued Contributor


I know it's been some time, but I saw this thread and have a similar workflow, so I thought I'd comment for anyone following along.

The main differences are:
- I don't close the applications before the copy; I use 'cp', but rename the files in the process, e.g. Chrome's 'History' >> chromeDB and Safari's 'History.db' >> safariDB.
- I'm using 'sqlite3' directly to process the data; I created two functions:

exportsqlSafari() {
sqlite3 $userTMP/SafariDB <<!
.headers on
.mode csv
.output $userTMP/SafariDB.csv
SELECT datetime(hv.visit_time + 978307200, 'unixepoch', 'localtime') as 'Visit Time', hv.title as 'Webpage Title', hi.visit_count as 'Times Visited', hi.url as 'URL Link' FROM history_visits hv, history_items hi WHERE hv.history_item = hi.id ORDER BY hv.visit_time DESC;
!
}

exportsqlChrome() {
sqlite3 $userTMP/ChromeDB <<!
.headers on
.mode csv
.output $userTMP/ChromeDB.csv
select datetime(last_visit_time/1000000-11644473600,'unixepoch','localtime') as 'Visit Time', title as 'Webpage Title', visit_count as 'Times Visited', hidden as 'Incognito Mode', url as 'URL Link' from urls order by last_visit_time desc;
!
}

- I also created another function that searches Chrome for additional local user profiles, then processes the found DBs, naming them according to their profile, e.g.:

find "$userChrome" -maxdepth 2 -type d -iname "Profile*"

plutil -extract raw -o - - <<< "$jsonPreference"

I then have another policy which picks up the logs, so I add devices to it as needed for the upload attachment.
This has been working for quite some time, so maybe try renaming on the copy, or even processing with sqlite3 directly.
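To make the rename-on-copy idea concrete, here is a rough sketch; the folder layout and variable values are throwaway placeholders I made up for the demo, not the exact code from either post:

```shell
# Demo: find Chrome-style profile folders and copy each History db out
# under a unique, space-free name (all paths here are demo placeholders)
userChrome="/tmp/demo_userChrome"
userTMP="/tmp/demo_userTMP"
mkdir -p "$userChrome/Default" "$userChrome/Profile 1" "$userTMP"
printf 'db\n' > "$userChrome/Default/History"
printf 'db\n' > "$userChrome/Profile 1/History"

find "$userChrome" -maxdepth 2 -type d \( -iname "Profile*" -o -iname "Default" \) |
while read -r dir; do
    name=$(basename "$dir" | tr -d ' ')    # "Profile 1" -> "Profile1"
    if [ -f "$dir/History" ]; then
        cp "$dir/History" "$userTMP/chromeDB_${name}"
    fi
done
ls "$userTMP"
```

Because each copy gets a distinct name (chromeDB_Default, chromeDB_Profile1, ...), there is no chance of one profile's copy silently clobbering, or being mistaken for, another's.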

Good luck! Otherwise, it's been working for you close to a year now! :D