Question

Browser History - Shell Script copying old version of File

  • July 7, 2022
  • 1 reply
  • 1 view


Issue: I have a shell script that, when running a "ditto" command, seems to copy an old version of a file even though the source file has a newer modified date. Is there anything that would cause 'ditto' or any copy command to pick up an old file? Does the drive cache copied files, since the filenames of the source and destination are the same each time (the destination folder is different)?

Explanation:

I currently have a script that does the following:

  • Create a temp folder to work in, prefixed with the current date/time to ensure it is different every time the script is run.
  • Copy the History file from a Google Chrome profile to the tmp folder using "ditto" (the script actually does this for all present profiles, plus Safari and Opera).
  • Parse the files out to CSV using SQL.
  • Delete the DBs to save space.
  • Zip the CSV files and upload them to the Jamf inventory record of the device.
  • Remove the temp folder using "rm -rf".

This process was working great until the past month or two. Whenever the script is run, it uploads a file that was modified in April and is therefore not the latest history from the device.

Devices are MacBook Airs running anything from 10.15 to 12.4, with APFS-formatted drives.

Attempted steps to resolve:

  • I have tried a different computer: it works the first time, but subsequent runs get the old file.
  • I have tried using 'cp' to copy the file, and various other commands and options.
  • I have uploaded a log file with an "ls -lh" of the directories in question, and I can see the file in the profile location has been modified recently.
  • I have left the db files in the uploaded zip, and they have a modified date of April (the same as the last entry in the CSV after parsing the history data).
  • Killing Chrome before copying the file worked on one computer but not on others.

The issue looks to be in the copy process, as the parsing of the file works correctly if given the newer file. The process is run weekly on a remote target via a Jamf policy.
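One plausible culprit (an assumption, not something confirmed in the thread): Chromium and Safari history files are SQLite databases, and while the browser is running, recent writes can sit in the "-wal"/"-journal" sidecar files next to the main file, so copying only the bare History file can capture stale data. sqlite3's online ".backup" command checkpoints through those sidecars and produces a consistent snapshot. A self-contained sketch using a stand-in database (not the real Chrome paths):

```shell
#!/bin/sh
# Sketch: snapshot a SQLite database with sqlite3's online backup instead of
# copying the bare file. In WAL mode, recent writes live in the "-wal"
# sidecar, so a plain copy of the main file alone can be stale.
workdir=$(mktemp -d)

# Build a stand-in "History" DB in WAL mode (schema is illustrative only).
sqlite3 "$workdir/History" <<'EOF'
PRAGMA journal_mode=WAL;
CREATE TABLE urls (url TEXT, last_visit_time INTEGER);
INSERT INTO urls VALUES ('https://example.com', 13300000000000000);
EOF

# .backup reads through the -wal file, giving a consistent snapshot.
sqlite3 "$workdir/History" ".backup '$workdir/snapshot.db'"

sqlite3 "$workdir/snapshot.db" "SELECT COUNT(*) FROM urls;"   # prints 1
```

In the real script this would replace the `ditto` of each History file; note that sqlite3 may report "database is locked" if the browser holds an exclusive lock, in which case quitting the browser first is still needed.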

 

Script being used:

 

#!/bin/zsh

jamfProUrl=https://your.jamf.server:8443/JSSResource
jamfCreds=apiuser:apipassword

echo "Setting up Variables"
loggedInUser=$(scutil <<< "show State:/Users/ConsoleUser" | awk '/Name :/ && ! /loginwindow/ { print $3 }')
mySerial=$(system_profiler SPHardwareDataType | awk '/Serial Number/{print $4}')
deviceID=$(curl -sku ${jamfCreds} -X GET -H "accept: application/xml" "${jamfProUrl}/computers/serialnumber/${mySerial}" | xmllint --xpath '//general/id/text()' -)
uploadDate=$(date +"%y%m%d%I%M%S")
dbLocationGoogle=/Users/${loggedInUser}/Library/Application\ Support/Google/Chrome
dbLocationOpera=/Users/${loggedInUser}/Library/Application\ Support/com.operasoftware.Opera
dbLocationSafari=/Users/${loggedInUser}/Library/Safari
logTmpLocation="/var/tmp/${uploadDate}_${loggedInUser}_BrowserHistory"
mkdir $logTmpLocation
googlecount=0
operacount=0
safaricount=0

echo "Creating SQL Files for use"

# Create Chromium Database SQL Query File
cat > "$logTmpLocation/Chromium.sql" << EOF
SELECT datetime(last_visit_time/1000000-11644473600, "unixepoch") as last_visited, url, title, visit_count
FROM urls
ORDER BY last_visit_time DESC;
EOF

# Create Safari Database SQL Query File
cat > "$logTmpLocation/Safari.sql" << EOF
SELECT datetime(hv.visit_time + 978307200, 'unixepoch', 'localtime') as last_visited, hi.url, hv.title
FROM history_visits hv, history_items hi
WHERE hv.history_item = hi.id
ORDER BY hv.visit_time DESC;
EOF

# Process Safari profiles found
echo "Processing Safari Profiles"
osascript -e 'display dialog "Safari will now close for maintenance." with title "Heights College Device Maintenance" buttons {"OK"} default button 1'
killall "Safari"
for file in ${dbLocationSafari}/History.db(N.); {
    validfile="(.db)$"
    if [[ $file =~ $validfile ]] {
        filename=${logTmpLocation}/Safari-History${safaricount}
        echo "    Copying file to tmp location $filename"
        ditto $file $filename.db
        sqlite3 -header -csv $filename.db < $logTmpLocation/Safari.sql > $filename.csv   # process DB to CSV
        echo "    Converted Safari DB $safaricount to CSV -------"
        ((safaricount++))
    }
}

# Gather and Process Google profiles found
echo "Processing Google Chrome Profiles"
osascript -e 'display dialog "Google Chrome will now close for maintenance." with title "Heights College Device Maintenance" buttons {"OK"} default button 1'
killall "Google Chrome"
for file in ${dbLocationGoogle}/**/*History(N.); {
    filefolder=$(dirname "$file")                      # Get folder path
    profilename=$(basename "$filefolder")              # Get last item in the path
    comp_profilename=$(sed "s/ //g" <<< $profilename)  # Compress the profile name to remove spaces
    filename=${logTmpLocation}/Chrome-${comp_profilename}-History
    if [[ ${comp_profilename} != 'GuestProfile' && ${comp_profilename} != 'SystemProfile' ]] {
        echo "    Processing ${file}"
        echo "    Copying file to tmp location $filename"
        ditto $file $filename.db
        sqlite3 -header -csv $filename.db < $logTmpLocation/Chromium.sql > $filename.csv
        ((googlecount++))
        echo "    Converted $comp_profilename DB to CSV -------"
    }
}

# Gather and Process Opera profiles found
echo "Processing Opera (GX & non GX) Profiles"
for file in ${dbLocationOpera}*/History(N.); {
    GX=""
    if [[ $file =~ '[GX]' ]] {
        GX="GX"
    }
    filename=${logTmpLocation}/Opera$GX-${comp_profilename}-History   # note: reuses comp_profilename left over from the Chrome loop
    ditto $file $filename.db
    sqlite3 -header -csv $filename.db < $logTmpLocation/Chromium.sql > $filename.csv
    ((operacount++))
}

echo "
____________________
Browser History Process Complete
Run Date & Time: $uploadDate
Logged on User: $loggedInUser
Processed the following files:
Google: $googlecount
Opera:  $operacount
Safari: $safaricount
_____________________
"

echo "Removing Parsed DB files"
rm -rf $logTmpLocation/*.sql
rm -rf $logTmpLocation/*.db*

echo "Zipping and Uploading CSV's"
zipfilename=${uploadDate}_${loggedInUser}_BrowserHistory
cd $logTmpLocation/ && zip -r $logTmpLocation/$zipfilename.zip ./* && cd -

# Upload zip of csv's to jamf Inventory Record - Attachments
curl -sku ${jamfCreds} "${jamfProUrl}/fileuploads/computers/id/${deviceID}" -F name=@${logTmpLocation}/$zipfilename.zip

# Cleanup files and remove tmp directory
echo "Cleaning Up"
rm -rf $logTmpLocation
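To confirm whether the copy step itself is returning stale data, the script could compare checksums of the source and destination immediately after the copy. A generic, self-contained sketch (stand-in files rather than the real History paths, and `cksum` rather than a macOS-specific tool so it runs anywhere):

```shell
#!/bin/sh
# Sketch: verify that a copy is byte-for-byte identical to its source,
# to rule out a stale copy. The files here are stand-ins for the browser
# DB and its tmp-folder copy.
workdir=$(mktemp -d)
printf 'new data\n' > "$workdir/source"
cp "$workdir/source" "$workdir/copy"

# Identical checksums prove the destination matches the current source.
src_sum=$(cksum "$workdir/source" | awk '{print $1}')
dst_sum=$(cksum "$workdir/copy"   | awk '{print $1}')

if [ "$src_sum" = "$dst_sum" ]; then
  echo "copy matches source"
else
  echo "STALE COPY: checksums differ" >&2
fi
```

Logging the two sums alongside the existing "ls -lh" output would show definitively whether ditto/cp produced an old file or whether the staleness was already in the source.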

 

 

1 reply

Bol
  • Contributor
  • 276 replies
  • June 4, 2023

@jonathan_mcc 

I know it's been some time, but I saw this thread and have a similar workflow, so I thought I'd comment for anyone following along.

Main differences would be:
- I don't close applications before copying; I use 'cp' but rename the files in the process, e.g. Chrome's 'History' >> chromeDB and Safari's 'History.db' >> safariDB
- I'm using 'sqlite3' to process the data; I created two functions:

exportsqlSafari() { 
sqlite3 $userTMP/SafariDB <<!
.headers on
.mode csv
.output $userTMP/SafariDB.csv
SELECT datetime(hv.visit_time + 978307200, 'unixepoch', 'localtime') as 'Visit Time', hv.title as 'Webpage Title', hi.visit_count as 'Times Visited', hi.url as 'URL Link' FROM history_visits hv, history_items hi WHERE hv.history_item = hi.id order by hv.visit_time desc;
!
}

exportsqlChrome() {
sqlite3 $userTMP/ChromeDB <<!
.headers on
.mode csv
.output $userTMP/ChromeDB.csv
select datetime(last_visit_time/1000000-11644473600,'unixepoch','localtime') as 'Visit Time',title as 'Webpage Title', visit_count as 'Times Visited', hidden as 'Incognito Mode', url as 'URL Link' from urls order by last_visit_time desc;
!
}

- I also created another function which searches Chrome for additional local user profiles, then processes the found DBs, naming them according to their profile, e.g.

find "$userChrome" -maxdepth 2 -type d -iname "Profile*"

plutil -extract profile.name raw -o - - <<< "$jsonPreference"
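The two snippets above can be combined into a loop along these lines. This is a sketch, not the reply's actual function: it builds a mock profile tree (so it runs anywhere) in place of the real `$userChrome` directory, and since plutil is macOS-only, a hypothetical python3 fallback is included for the JSON extraction:

```shell
#!/bin/sh
# Sketch: find Chrome profile folders, then read each profile's display
# name from its Preferences JSON (key path "profile.name").
# The mock tree below stands in for the real Chrome support directory.
userChrome=$(mktemp -d)
mkdir -p "$userChrome/Profile 1"
printf '{"profile":{"name":"Work"}}' > "$userChrome/Profile 1/Preferences"

find "$userChrome" -maxdepth 2 -type d -iname "Profile*" | while IFS= read -r dir; do
  if command -v plutil >/dev/null 2>&1; then
    # macOS: plutil reads the JSON Preferences file directly
    name=$(plutil -extract profile.name raw -o - "$dir/Preferences")
  else
    # fallback for non-macOS environments (illustrative only)
    name=$(python3 -c 'import json,sys; print(json.load(open(sys.argv[1]))["profile"]["name"])' "$dir/Preferences")
  fi
  echo "$dir -> $name"
done
```

In the real workflow, `$name` (with spaces stripped) would feed the per-profile export filename, much like the original script's `comp_profilename`.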

I then have another policy which picks up logs, so I add devices as needed for the upload attachment.
This has been working for quite some time, so maybe try renaming on the copy, or maybe even processing with sqlite3.
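The rename-on-copy suggestion is simple to sketch: copy the browser's 'History' file to a destination with a different name so each run produces a fresh, unambiguously named file. Paths and names here are stand-ins for the real Chrome/Safari locations:

```shell
#!/bin/sh
# Sketch of rename-on-copy: the destination filename differs from the
# source, so there is no chance of colliding with a previous run's file.
# A mock source tree stands in for the real Chrome profile directory.
workdir=$(mktemp -d)
mkdir -p "$workdir/Chrome/Default"
printf 'fake sqlite payload\n' > "$workdir/Chrome/Default/History"

userTMP="$workdir/tmp"
mkdir -p "$userTMP"

# Chrome's 'History' >> chromeDB, as described in the reply
cp "$workdir/Chrome/Default/History" "$userTMP/chromeDB"
ls "$userTMP"   # prints: chromeDB
```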

Good luck! Otherwise, hopefully it's been working for you for close to a year now! :D
