Script to rsync Documents and Desktop Files help

GabeShack
Valued Contributor III

We are looking to have a script run at logout that rsyncs the data from a user's (AD user) local Documents and Desktop folders to their shared drive, which is already mounted on their system. The script I got working in Terminal does well: it rsyncs the data one way, from the local folders to the user's share. I think, though, that I need to modify it to use the $3 variable so Casper understands to run it at user logout rather than system logout, because it keeps failing as-is. I may also try having it rsync the contents of Desktop from the share back to the local Desktop at login. We are doing this specifically for our elementary school students, so it's almost like folder redirection without using redirection, lol.

Any help would make me very grateful.
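From what I understand, Casper passes $1 = mount point, $2 = computer name, $3 = username to every script it runs, so presumably the sync paths would be built from $3. A sketch of what I mean (the "Shared" volume name is made up, and the function just prints the rsync line it would run):

```shell
#!/bin/sh
# Sketch only: Casper/Jamf hands policy scripts $1 = mount point,
# $2 = computer name, $3 = username, so a logout sync keys off $3
# rather than whoami (which would return root).
# "Shared" below is a hypothetical volume name.
build_sync_cmd() {
    # $1 here = the username Casper would pass the script in $3
    echo "rsync -avz /Users/$1/Desktop/ /Volumes/Shared/$1/Desktop/"
}
build_sync_cmd "student1"
```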

Here is what I got with help from Ben Greisler:
#!/bin/sh
#By Ben Greisler ben@kadimac.com July 23, 2012
klistuser=$(whoami)

#echo $klistuser

pathb=$(echo $(dscl localhost -read "/Search/Users/$klistuser" dsAttrTypeStandard:OriginalHomeDirectory) | awk 'BEGIN { FS = ".internal" }; { print $2 }' | awk 'BEGIN { FS = "<path>" }; { print $1 }' | awk '{ sub("</url>",""); print }')

#echo $pathb

localpath=/Users/$klistuser

#echo $localpath

rsync -avz "$localpath/Desktop/" "/Volumes/$pathb/Desktop/"
rsync -avz "$localpath/Documents/" "/Volumes/$pathb/Documents/"

Gabe Shackney
Princeton Public Schools

nessts
Valued Contributor II

First thought: why not turn on synchronization in the MCX stuff and use what's built in with portable home directories? Second thought: a logout hook.

jamie_ivanov
New Contributor

On user login, rsync /remote/path /local/path.
On logout, rsync /local/path /remote/path.

Know your tools before you use them. You are looking for two-way synchronization; rsync itself synchronizes one way, but it can be scripted to do a two-way sync as I've mentioned above.

You can launch the script with "&" to send it to the background, or run it within a "screen" session. On logout, "screen -S rsync rsync /local/path /remote/path" will launch the command in a virtual terminal named "rsync", which you can attach to (if necessary) to watch the status, locally or remotely, with "screen -r rsync", etc. Screen will exit upon successful completion.

J.I.

GabeShack
Valued Contributor III

After much debate and research, and after speaking with Apple as well as other consultants, it was recommended not to use MCX redirection, as there are usually many problems associated with it. Since we are only looking to sync very small text and picture files, I think this script could work very well.

Gabe Shackney
Princeton Public Schools

Andrina
Contributor

We've got a couple of similar things in place. This is what's working for us with rsync. I also have an on-demand AppleScript droplet that lets the end user double-click to have their files synced to the server. Our source is a set folder that's the same on all machines; you could probably change the source to something like "/Users/$USER/" with some testing. Obviously, some customization for your server and share will be required...

#!/bin/bash

# Script to sync files to Server User Folder - must be run while user is logged in to allow for authenticated mount of Server

# Change SOURCE path as needed for individuals machine
SOURCE="/Users/username"
USER=`whoami`
CADIR=$(dscl "/Active Directory/PRODUCTION/All Domains" -read "/Users/$USER" RealName | sed 's/RealName://g' | cut -c 2-50 | sed 's/ /./g' | tr '\n' ' ' | sed 's/ //g')

# Make sure someone is logged in first!
who | grep -q console
if [ $? -eq 1 ]; then
    exit 0
else
    echo "continue"
fi

# Check to see if Server is currently mounted - if not, mount it
df | grep -q /Volumes/ShareName
if [ $? -eq 0 ]; then
    echo $CADIR
else
    mkdir /Volumes/ShareName
    mount -t smbfs cifs://server.example.com/ShareName /Volumes/ShareName
fi

# Roll the logs
mv ~/Library/Logs/lastsync.log.7 ~/Library/Logs/lastsync.log.8
mv ~/Library/Logs/lastsync.log.6 ~/Library/Logs/lastsync.log.7
mv ~/Library/Logs/lastsync.log.5 ~/Library/Logs/lastsync.log.6
mv ~/Library/Logs/lastsync.log.4 ~/Library/Logs/lastsync.log.5
mv ~/Library/Logs/lastsync.log.3 ~/Library/Logs/lastsync.log.4
mv ~/Library/Logs/lastsync.log.2 ~/Library/Logs/lastsync.log.3
mv ~/Library/Logs/lastsync.log.1 ~/Library/Logs/lastsync.log.2
mv ~/Library/Logs/lastsync.log ~/Library/Logs/lastsync.log.1

# Sync the data
rsync -ah --progress --compress-level=0 --stats --inplace "$SOURCE" "/Volumes/ShareName/Users/$CADIR/Backup/" >> ~/Library/Logs/lastsync.log

# Get rid of sync logs older than 7 days
find ~/Library/Logs/ -iname 'lastsync.log*' -mtime +7 -exec rm {} \;
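Side note: the eight log-rolling mv lines could also be collapsed into a loop. A self-contained sketch, using a temp directory in place of ~/Library/Logs:

```shell
#!/bin/bash
# Same rotation as the eight mv lines above, written as a loop.
# A temp directory stands in for ~/Library/Logs in this demo.
logdir=$(mktemp -d)
touch "$logdir/lastsync.log" "$logdir/lastsync.log.1"
# Shift .7 -> .8, .6 -> .7, ... .1 -> .2, skipping any that don't exist yet
for i in 7 6 5 4 3 2 1; do
    [ -f "$logdir/lastsync.log.$i" ] && mv "$logdir/lastsync.log.$i" "$logdir/lastsync.log.$((i+1))"
done
[ -f "$logdir/lastsync.log" ] && mv "$logdir/lastsync.log" "$logdir/lastsync.log.1"
ls "$logdir"
```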

GabeShack
Valued Contributor III

Well, I'm not looking for two-way sync per se, but it may be something I want to play with. As I stated above, our original script is a one-way sync. So I do understand the tools, but I need more help with my scripting.

Gabe Shackney
Princeton Public Schools

Andrina
Contributor

And if you want an on-demand Applescript droplet...

on open droppedItems

    set mountedDisks to list disks
    if mountedDisks does not contain "ShareName" then

        try
            mount volume "smb://server.example.com/ShareName"
            delay 5
        on error
            delay 1
            try
                mount volume "smb://server.example.com/ShareName"
                delay 5
            on error
                display dialog "There was an error mounting the Volume." & return & return & ¬
                    "The server may be unavailable at this time." & return & return & ¬
                    "Please inform the Network Administrator if the problem continues." buttons {"Okay"} default button 1
            end try
        end try
    end if

    set backup to do shell script "/usr/local/bin/folder"
    set rsyncpath to quoted form of text 1 through -2 of POSIX path of (item 1 of droppedItems)
    do shell script "rsync -ah --progress --compress-level=0 --stats --inplace --delete " & rsyncpath & " /Volumes/ShareName/Users/" & backup & "/Backup/ >> ~/Library/Logs/lastsync.log"

end open

This, of course, refers to a bash script that I'm storing in /usr/local/bin on users' workstations, with the following contents:

#!/bin/bash
dscl "/Active Directory/PRODUCTION/All Domains" -read "/Users/$USER" RealName | sed 's/RealName://g' | cut -c 2-50 | sed 's/ /./g' | tr '\n' ' ' | sed 's/ //g'

GabeShack
Valued Contributor III

So I assume this is not being run by Casper as a policy, but on individual machines based on need? I was really hoping to get this working on a per-user basis with some sort of logout hook.

Gabe Shackney
Princeton Public Schools

Andrina
Contributor

The AppleScript is "on-demand"; the first, all-bash script could be run by policy...

GabeShack
Valued Contributor III

I think our big problem is making this happen at logout, and the timing. By the time it runs, it's too late to grab the data. I have to either force the script to run before the logout command actually happens, or have the script save the data to a file that gets picked up by the root user to use for the command.

Gabe Shackney
Princeton Public Schools

mm2270
Legendary Contributor III

I was wondering if running a backup at the Logout trigger would even work. Since the script would presumably run after the account has logged out, it may have trouble determining the current user. Perhaps using:

last | head -1 | awk '{print $1}'

to get the last user name would work? All it should really need is the path to the home folder, since it runs as root, yes? I don't use scripts run by Logout hooks much with Casper, so I'm unclear on that.
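For example, against a canned line (so it doesn't depend on this machine's actual login history), the awk pull would look like:

```shell
#!/bin/sh
# Demo of the field extraction against a canned `last`-style line;
# the username and timestamps below are made up.
sample="gshack    console                   Wed Jul 25 14:02   still logged in"
last_user=$(echo "$sample" | awk '{print $1}')
echo "$last_user"    # gshack
```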

Andrina
Contributor

The next issue will be the share that you're copying data to: my script relies on the user being logged in and uses their Kerberos ticket. At logout, mounting the volume will fail. You kind of need to catch all of this before the user's credentials are lost... *thinking*...

jescala
Contributor II

Have you considered Unison?

http://www.cis.upenn.edu/~bcpierce/unison/

GabeShack
Valued Contributor III

So I changed my original script as posted above, replaced $klistuser with $3, and it worked perfectly with a policy run at logout. I then modified it to do the reverse and rsync the files back from the network drive to the respective Desktop or Documents folder, keeping both folders current with the most recent files.

So, a few things: this saves files amazingly well and gets all the most recent changes. However, if you rename a file, it gets duplicated, and if you throw a file away, it gets recreated unless you trash it from every machine you logged into as that user. Lol, so I'm on the right track, but it may need a few more modifications.

#!/bin/sh
#By Ben Greisler ben@kadimac.com and Gabe Shackney July 25, 2012


#echo $3

pathb=$(echo $(dscl localhost -read "/Search/Users/$3" dsAttrTypeStandard:OriginalHomeDirectory) | awk 'BEGIN { FS = ".internal" }; { print $2 }' | awk 'BEGIN { FS = "<path>" }; { print $1 }' | awk '{ sub("</url>",""); print }')

#echo $pathb

localpath=/Users/$3

#echo $localpath

rsync -avz "/Volumes/$pathb/Desktop/" "$localpath/Desktop/"
rsync -avz "/Volumes/$pathb/Documents/" "$localpath/Documents/"

And the logout script

#!/bin/sh
#By Ben Greisler ben@kadimac.com and Gabe Shackney July 25, 2012


#echo $3

pathb=$(echo $(dscl localhost -read "/Search/Users/$3" dsAttrTypeStandard:OriginalHomeDirectory) | awk 'BEGIN { FS = ".internal" }; { print $2 }' | awk 'BEGIN { FS = "<path>" }; { print $1 }' | awk '{ sub("</url>",""); print }')

#echo $pathb

localpath=/Users/$3

#echo $localpath

rsync -avz "$localpath/Desktop/" "/Volumes/$pathb/Desktop/"
rsync -avz "$localpath/Documents/" "/Volumes/$pathb/Documents/"

Thanks again for all your suggestions and help. Let me know if you have any ideas about the deleting or renaming issues. I'd love to keep refining this!

I also think the only site-specific piece that others would need to change to run this script is where it's set to FS = ".internal", which is what our internal network uses. Otherwise it's pretty generic.
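To show what that pipeline actually pulls out, here it is run against a canned OriginalHomeDirectory value (the server and share names below are made up, but the record has the same shape as dscl's output):

```shell
#!/bin/sh
# Canned demo of the dscl/awk extraction. The pipeline keeps everything
# between ".internal" and the closing </url>, i.e. the share path that
# gets appended to /Volumes.
sample='OriginalHomeDirectory: <home_dir><url>smb://fileserver.district.internal/students/gshack</url><path></path></home_dir>'
pathb=$(echo "$sample" \
    | awk 'BEGIN { FS = ".internal" }; { print $2 }' \
    | awk 'BEGIN { FS = "<path>" }; { print $1 }' \
    | awk '{ sub("</url>",""); print }')
echo "$pathb"    # /students/gshack -> used as /Volumes/$pathb
```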

Gabe Shackney
Instructional Technology Specialist
Princeton Public Schools

Gabe Shackney
Princeton Public Schools

greatkemo
Contributor

@gshackney I know this is a couple of years too late, but I thought I would chime in: you needn't rsync the files back to the user; you just need to create symlinks. Here are my login and logout scripts (not 100% yet; ignore the du part in the logout script):

Login:

#!/bin/bash

sleep 3
# Find current logged in user
CRT_USR=$(ls -l /dev/console | awk '{print $3}')

# Notify the user of mounting the network share
JMF_HLP="/Library/Application Support/JAMF/bin/jamfHelper.app/Contents/MacOS/jamfHelper"
TTL="Finder"
HED="Mounting Network Home"
MNT_DSC="Please Wait While Your Network Home is being Mounted"
MNT_ICN="/System/Library/CoreServices/CoreTypes.bundle/Contents/Resources/GenericFileServerIcon.icns"
"${JMF_HLP}" -windowType hud -title ${TTL} -heading "${HED}" -alignHeading center -description "${MNT_DSC}" -alignDescription natural -icon ${MNT_ICN} -iconSize 100 &

# Clear any files that may have been left over by accident
rm -rf /Users/$CRT_USR/Desktop/*
rm -rf /Users/$CRT_USR/Documents/*
rm -rf /Users/$CRT_USR/Downloads/*
rm -rf /Users/$CRT_USR/Movies/*
rm -rf /Users/$CRT_USR/Music/*
rm -rf /Users/$CRT_USR/Pictures/*
rm -rf /Users/$CRT_USR/.Trash/*

# Mount the user share
osascript -e "try" -e "mount volume \"cifs://path/domain/user/$CRT_USR\"" -e "end try"
sleep 3
osascript -e 'tell application "System Events"' -e 'keystroke "q" using command down' -e 'end tell'

# Notify user of the sync
JMF_HLP="/Library/Application Support/JAMF/bin/jamfHelper.app/Contents/MacOS/jamfHelper"
TTL="Finder"
HED="Syncing Files"
SYC_DSC="Please Wait While Your Files Are Being Sync'd"
SYC_ICN="/System/Library/CoreServices/CoreTypes.bundle/Contents/Resources/Sync.icns"
"${JMF_HLP}" -windowType hud -title ${TTL} -heading "${HED}" -alignHeading center -description "${SYC_DSC}" -alignDescription natural -icon ${SYC_ICN} -iconSize 100 &

# Make symlinks to all files in My Documents and Desktop
VOL_PTH=$(df | grep qatar | awk '{print $NF}')
if [ -e $VOL_PTH/MyDocuments/Downloads/ ]; then
    ln -s $VOL_PTH/MyDocuments/Downloads/* /Users/$CRT_USR/Downloads/
else
    mkdir -p $VOL_PTH/MyDocuments/Downloads/
    chown -R $CRT_USR:"YOURDOMAIN\Domain Users" $VOL_PTH/MyDocuments/Downloads/
    ln -s $VOL_PTH/MyDocuments/Downloads/* /Users/$CRT_USR/Downloads/
fi
ln -s $VOL_PTH/windowsoverhead/Desktop/* /Users/$CRT_USR/Desktop/
ln -s $VOL_PTH/MyDocuments/* /Users/$CRT_USR/Documents/
ln -s "$VOL_PTH/MyDocuments/My Videos/"* /Users/$CRT_USR/Movies/
ln -s "$VOL_PTH/MyDocuments/My Music/"* /Users/$CRT_USR/Music/ 
ln -s "$VOL_PTH/MyDocuments/My Pictures/"* /Users/$CRT_USR/Pictures/ 

# Remove unwanted files
rm -rf /Users/$CRT_USR/*/*.db
rm -rf /Users/$CRT_USR/*/*.ini
rm -rf /Users/$CRT_USR/*/*.lnk
rm -rf /Users/$CRT_USR/*/*RECYCLE*
sleep 3
osascript -e 'tell application "System Events"' -e 'keystroke "q" using command down' -e 'end tell'

exit 0

Logout

#!/bin/bash

# Find current logged in user and volume path
CRT_USR=$(ls -l /dev/console | awk '{print $3}')
VOL_PTH=$(df | grep YOURDOMAIN | awk '{print $NF}')

# Delete the symlinks 
find /Users/$CRT_USR/Documents/ -type l -delete
find /Users/$CRT_USR/Downloads/ -type l -delete
find /Users/$CRT_USR/Desktop/ -type l -delete
find /Users/$CRT_USR/Movies/ -type l -delete
find /Users/$CRT_USR/Music/ -type l -delete
find /Users/$CRT_USR/Pictures/ -type l -delete

# Check remaining space on drive
# VOL_SIZ=`du -skh $VOL_PTH | awk {'print $1'}`

# Notify the user of the sync
JMF_HLP="/Library/Application Support/JAMF/bin/jamfHelper.app/Contents/MacOS/jamfHelper"
TTL="Finder"
HED="Syncing Files"
SYC_DSC="Please Wait While Your Files Are Being Sync'd"
SYC_ICN="/System/Library/CoreServices/CoreTypes.bundle/Contents/Resources/Sync.icns"
"${JMF_HLP}" -windowType hud -title ${TTL} -heading "${HED}" -alignHeading center -description "${SYC_DSC}" -alignDescription natural -icon ${SYC_ICN} -iconSize 100 &

# Sync the folders back to the share
if [ -e $VOL_PTH/MyDocuments/Downloads ]; then
    rsync -avzu /Users/$CRT_USR/Downloads/ $VOL_PTH/MyDocuments/Downloads/
else
    mkdir -p $VOL_PTH/MyDocuments/Downloads/
    rsync -avzu /Users/$CRT_USR/Downloads/ $VOL_PTH/MyDocuments/Downloads/
fi
rsync -avzu /Users/$CRT_USR/Desktop/ $VOL_PTH/windowsoverhead/Desktop/
rsync -avzu /Users/$CRT_USR/Documents/ $VOL_PTH/MyDocuments/
rsync -avzu /Users/$CRT_USR/Movies/ "$VOL_PTH/MyDocuments/My Videos/"
rsync -avzu /Users/$CRT_USR/Music/ "$VOL_PTH/MyDocuments/My Music/"
rsync -avzu /Users/$CRT_USR/Pictures/ "$VOL_PTH/MyDocuments/My Pictures/"
sleep 3
osascript -e 'tell application "System Events"' -e 'keystroke "q" using command down' -e 'end tell'

# Notify the user clean up
JMF_HLP="/Library/Application Support/JAMF/bin/jamfHelper.app/Contents/MacOS/jamfHelper"
TTL="Finder"
HED="Clean Up"
CLN_DSC="Cleaning Left Over Files"
CLN_ICN="/System/Library/CoreServices/CoreTypes.bundle/Contents/Resources/FullTrashIcon.icns"
"${JMF_HLP}" -windowType hud -title ${TTL} -heading "${HED}" -alignHeading center -description "${CLN_DSC}" -alignDescription natural -icon ${CLN_ICN} -iconSize 100 &

# Remove all files in docs desktop and downloads
rm -rf /Users/$CRT_USR/Documents/*
rm -rf /Users/$CRT_USR/Desktop/*
rm -rf /Users/$CRT_USR/Downloads/*
rm -rf /Users/$CRT_USR/Movies/*
rm -rf /Users/$CRT_USR/Music/*
rm -rf /Users/$CRT_USR/Pictures/*
rm -rf "/Users/$CRT_USR/Library/Saved Application State/"*
rm -rf /Users/$CRT_USR/.Trash/*
defaults write /Users/$CRT_USR/Library/Preferences/com.apple.loginwindow TALLogoutSavesState -bool false
chmod -R a-w "/Users/$CRT_USR/Library/Saved Application State/"
chmod a-w /Users/$CRT_USR/Library/Preferences/com.apple.loginwindow.plist
chown root /Users/$CRT_USR/Library/Preferences/com.apple.loginwindow.plist
sleep 3
osascript -e 'tell application "System Events"' -e 'keystroke "q" using command down' -e 'end tell'

JMF_HLP="/Library/Application Support/JAMF/bin/jamfHelper.app/Contents/MacOS/jamfHelper"
TTL="Finder"
HED="Clean Up"
CLN_DSC="All Files Have Been Removed"
CLN_ICN="/System/Library/CoreServices/CoreTypes.bundle/Contents/Resources/TrashIcon.icns"
"${JMF_HLP}" -windowType hud -title ${TTL} -heading "${HED}" -alignHeading center -description "${CLN_DSC}" -alignDescription natural -icon ${CLN_ICN} -iconSize 100 &
sleep 3
osascript -e 'tell application "System Events"' -e 'keystroke "q" using command down' -e 'end tell'

# Eject the volume
JMF_HLP="/Library/Application Support/JAMF/bin/jamfHelper.app/Contents/MacOS/jamfHelper"
TTL="Finder"
HED="Eject Disk"
MNT_DSC="Ejecting Network Drive & Logging Out"
MNT_ICN="/System/Library/CoreServices/CoreTypes.bundle/Contents/Resources/EjectMediaIcon.icns"
"${JMF_HLP}" -windowType hud -title ${TTL} -heading "${HED}" -alignHeading center -description "${MNT_DSC}" -alignDescription natural -icon ${MNT_ICN} -iconSize 100 &
hdiutil detach $VOL_PTH -quiet
sleep 3
osascript -e 'tell application "System Events"' -e 'keystroke "q" using command down' -e 'end tell'

exit 0

JPDyson
Valued Contributor

@greatkemo It's worth noting that your approach is not to sync the local data to a network share, but rather to move all data to the network share and work only from that. What happens when a user takes their laptop home? It seems like they wouldn't be able to take their work with them, and that changes made offline would be destroyed.

greatkemo
Contributor

@JPDyson This is aimed at lab computers, not user laptops. Our campus uses box.com for user file sharing and sync, and we also manage CrashPlan for all user file backups. The above is only meant for classroom computers, to mimic our Windows roaming-profile desktop experience. So what happens is: once a user logs in, symlinks to the files in their Windows home directory are created in all the corresponding local folders. Then, when the user logs out, their files are rsync'd back to their Windows home (only files newly created locally are copied).
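The round trip in miniature (a temp-dir sketch): link the share's files in at login, then delete only the links, never real files, at logout before rsyncing the leftovers back:

```shell
#!/bin/sh
# Temp-dir sketch of the login/logout round trip: symlinks in, links-only out.
work=$(mktemp -d)
mkdir "$work/share" "$work/Documents"
touch "$work/share/essay.txt"
ln -s "$work/share/essay.txt" "$work/Documents/"   # login: link share files in
touch "$work/Documents/new-local.txt"              # student saves a new file
find "$work/Documents/" -type l -delete            # logout: remove links only
ls "$work/Documents"                               # new-local.txt remains
```

Because `find -type l -delete` matches only symlinks, the student's real files survive for the rsync-back pass, and the originals on the share are never touched.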