I use a script to download and install the latest version of Chrome and WebEx.
When running the curl command, it seems to fail when creating the local file.
Here is the command:
/usr/bin/curl -O https://dl.google.com/chrome/mac/stable/GGRO/googlechrome.dmg > /private/tmp/googlechrome.dmg
Here is the result:
Script result:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
Warning: Failed to create the file googlechrome.dmg: Read-only file system
  0 79.5M    0  1388    0     0  30173      0  0:46:04 --:--:--  0:46:04 30173
curl: (23) Failed writing body (0 != 1388)
Anybody know what's wrong? It works fine on Mojave. I've even changed the directory from /private/tmp to /Users/Shared.
Again, it only happens when running from Self Service. If I run the policy from Terminal, it works fine.
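For what it's worth, one likely culprit in the command above: `-O` tells curl to save the file into the *current working directory*, while the `> /private/tmp/googlechrome.dmg` redirection only creates an empty file at that path (curl's progress goes to stderr, not stdout). When Self Service runs the policy as root on Catalina, the working directory can land on the read-only system volume, which would explain the "Read-only file system" warning. A minimal sketch of the fix, using `--output` with an explicit destination (`fetch` is a hypothetical helper; the URL and path are the ones from the post):

```shell
#!/bin/bash
# Hedged sketch: write the download to an explicit path with --output
# instead of -O, so the (possibly read-only) working directory never
# matters. fetch() is a hypothetical helper, not from the original script.
fetch() {
    local url="$1" dest="$2"
    /usr/bin/curl --silent --show-error --fail --location \
        --output "$dest" "$url"
}

# As the policy would call it:
# fetch "https://dl.google.com/chrome/mac/stable/GGRO/googlechrome.dmg" \
#       "/private/tmp/googlechrome.dmg"
```

`--fail` also makes curl return a nonzero exit code on an HTTP error instead of saving an error page as a .dmg, which is worth having in any unattended install script.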
@dan-snelson Thanks for the idea. It still fails, but gives a slightly different result:
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
Warning: Failed to create the file googlechrome.dmg: Permission denied
I have seen the same thing. It looks to be something to do with the encrypted user data partition: when I run it from Terminal, the policy is triggered as the local user, whereas when it's called from Self Service it runs as root and is trying to traverse across to the user level.
This is from Self Service:
Script exit code: 0

Script result: downloading
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
Warning: Failed to create the file googlechrome.dmg: Read-only file system
  0 79.5M    0  1388    0     0   3540      0  6:32:43 --:--:--  6:32:43  3540
curl: (23) Failed writing body (0 != 1388)
Thu Oct 24 19:19:55 PDT 2019: Mounting installer disk image.
hdiutil: attach failed - No such file or directory
Thu Oct 24 19:19:55 PDT 2019: Installing...
cp: /Volumes/Google Chrome/Google Chrome.app: No such file or directory
Thu Oct 24 19:19:55 PDT 2019: Unmounting installer disk image.
hdiutil: eject failed - No such file or directory
And this is when run from Terminal:
Script result: downloading
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
  0     0    0     0    0     0      0      0 --:--:-- --:--:-- --:--:--     0
  6 79.5M    6 5231k    0     0  4705k      0  0:00:17  0:00:01  0:00:16 4700k
 41 79.5M   41 32.6M    0     0  15.4M      0  0:00:05  0:00:02  0:00:03 15.4M
 76 79.5M   76 60.5M    0     0  19.4M      0  0:00:04  0:00:03  0:00:01 19.4M
100 79.5M  100 79.5M    0     0  20.9M      0  0:00:03  0:00:03 --:--:-- 20.9M
Thu Oct 24 19:21:15 PDT 2019: Mounting installer disk image.
/dev/disk2          Apple_partition_scheme
/dev/disk2s1        Apple_partition_map
/dev/disk2s2        Apple_HFS               /Volumes/Google Chrome
Thu Oct 24 19:21:16 PDT 2019: Installing...
Thu Oct 24 19:21:24 PDT 2019: Unmounting installer disk image.
"disk2" ejected.
@dan-snelson No bueno on /Users/Shared/ either... I saw the webex.dmg drop in there, but it is still failing...
  0:01:09  0:00:02 1875k
 99  127M   99  126M    0     0  1829k      0  0:01:11  0:01:10  0:00:01 1868k
100  127M  100  127M    0     0  1830k      0  0:01:11  0:01:11 --:--:-- 1872k
 echo 'date: Mounting installer disk image.'
date: Mounting installer disk image.
 echo 'date: Installing...'
date: Installing...
 /usr/sbin/installer -pkg '/Users/Shared/Cisco Webex Meetings.pkg' -target /
installer: Error - the package path specified was invalid: '/Users/Shared/Cisco Webex Meetings.pkg'.
 /usr/bin/hdiutil eject -force '/Volumes/Cisco Webex Meetings.pkg'
hdiutil: eject failed - No such file or directory
 rm -rf /webexapp.dmg
+ rm -rf '/Users/Shared/Cisco Webex Meetings.pkg'
There are a few issues with directly using curl to install software.
AutoPkg helps alleviate most of these issues. Including VirusTotalAnalyzer as a post-processor isn't much work and gets you fully virus-scanned, verified installer packages. It also means you have a chance to think through the software update and vetting process for your org. Something like JSSImporter, implemented well, means Mac client software will be kept up to date directly from a trusted source (your distribution point) in an automated fashion.
@cbrewer SSL verification/validation isn't really anything that protects from a MITM attack. The only thing that prevents a MITM attack is cert pinning (see this link). Something like 95% of servers out there do not pin their certs, and there are many reasons for this: anything that interrupts traffic from source to destination will break against a pinned cert. This means MITM is possibly a good attack vector against curl scripts, and the number of curl scripts listed on Jamf Nation tells me that if I were an attacker specifically targeting your org, and you used Jamf, there's a good chance I could MITM you. I know that is a far-fetched attack, as it would require attackers to specifically target you or your org. However, it is still a risk that SSL verification/validation does zero to protect against.
Then you have all the quality issues, rollback issues, and lack-of-testing issues. With AutoPkg, even if I YOLO it to prod, I still have all the packages in my package repo to roll back to from any broken version.
So, yes, the risks are all there, and they are much, much lower with an AutoPkg workflow.
Your 2nd and 3rd points are valid, but I'm not convinced your 1st one is. curl will correctly validate the certificate chain, and returns a bad SSL certificate chain error when validation fails.
If you don't want to curl because you don't trust CDNs, then make that specific point. If you don't want to curl because version control is an issue, say that. But those issues aren't the same as a MITM attack. Saying every HTTPS connection is vulnerable to a network-level MITM attack is basically saying HTTPS is fully broken and you shouldn't ever trust it.
I don't want to discount the upsides of AutoPkg, but I'm still not seeing where curl is the man-in-the-middle security risk it's being presented as. All that said, I'm not a security expert and am just making conversation here.
The simple fact that it can be done, and that existing tools already automate this in better ways, is why it is a security risk. Now add in the fact that you may not know every network your devices are on, connect to, or peer through via ISP network peering. You can MITM SSL: it can be done via a proxy with two separate sessions, and the cert chain would still return as valid, unless the destination cert is pinned.
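To make the pinning point concrete: curl does support pinning via its `--pinnedpubkey` option, which takes a base64 SHA-256 hash of the server's public key and fails the transfer if the key doesn't match, even when a proxy presents an otherwise valid chain. A hedged sketch; `pin_for_cert` is a hypothetical helper, and the pin must be computed from the real server certificate ahead of time:

```shell
#!/bin/bash
# pin_for_cert: compute the "sha256//" public-key pin that curl's
# --pinnedpubkey option expects, from a PEM certificate file.
# (Hypothetical helper, not from any script in this thread.)
pin_for_cert() {
    local pem="$1"
    printf 'sha256//%s\n' "$(openssl x509 -in "$pem" -pubkey -noout \
        | openssl pkey -pubin -outform DER 2>/dev/null \
        | openssl dgst -sha256 -binary \
        | base64)"
}

# Usage sketch (host from the thread; capture the cert once, then pin):
#   openssl s_client -connect dl.google.com:443 </dev/null 2>/dev/null \
#       | openssl x509 > /tmp/server.pem
#   /usr/bin/curl --pinnedpubkey "$(pin_for_cert /tmp/server.pem)" \
#       --output /private/tmp/googlechrome.dmg \
#       "https://dl.google.com/chrome/mac/stable/GGRO/googlechrome.dmg"
```

The obvious trade-off, as noted above, is that the pin breaks the moment the vendor rotates keys, which is exactly why so few people do this.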
Also, executing code as root on a client endpoint that goes out to the internet, then downloads and runs things as root, is also a risk. I am not sure how much more it can be explained; it is up to you to decide if you want to ignore the risks. AutoPkg plus some static policies/code can accomplish a much more controlled and secure automation path.
I have yet to see anyone's curl script do any of the checks AutoPkg does.
Whether the MITM attack is an issue for your environment or not, the other points are still valid and are enough of a reason to be cautious about using scripts that just pull down and install a product over the internet. I say this as someone who used to spend a lot of time building such scripts myself, years ago. Some of them are still on my GitHub page (which I need to get around to removing).

I abandoned these scripts a while back, not so much because of the potential security issues, but because of the lack of any version control or validation of the installation's effects beforehand. While one can take some measures to minimize those issues, the fact is, you really don't have any control over what a vendor posts as their "latest" version, assuming a script is always pulling down that latest release. I've seen too many cases of Oracle Java or Adobe Flash Player updates causing problems under specific circumstances to feel comfortable doing this anymore. And that's not even getting into larger product installations like Office, et al.

It's just too risky, in my opinion, to blindly push out these installations without first downloading them, testing them, and retaining a library of them for possible rollback cases. Even setting aside a possible security issue, like inadvertently installing a compromised product, there is the potential to break some functionality on your Mac fleet, which can make for a very bad day indeed.
But everyone's tolerance levels are different, so I'm not going to judge anyone who still wants to do this. Just as long as they go into it understanding the risk they're engaging in.
If anyone really wants to use scripts like this, they should at the very least consider adding some functionality to the script that, in the case of .pkg-style installers, validates the vendor signature:
pkgutil --check-signature /path/to/installer.pkg
I think it's somewhat similar to what AutoPkg does, though likely not anywhere near as comprehensive, and AutoPkg does more than just that. But adding that check to the script prior to installation should at least provide a little more security.
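A hedged sketch of wiring that check into a script: `signature_has_team` greps pkgutil's output for a Developer ID Installer line carrying the expected Team ID, and the install is gated on it. Both function names and the Team ID are placeholders I made up, and the exact output format of `pkgutil --check-signature` may vary by macOS version:

```shell
#!/bin/bash
# signature_has_team: reads `pkgutil --check-signature` output on stdin
# and succeeds only if a Developer ID Installer certificate with the
# expected Team ID appears. (Hypothetical helper; Team ID is a placeholder.)
signature_has_team() {
    grep -q "Developer ID Installer: .*(${1})"
}

# verify_then_install: refuse to run the installer unless the package
# signature carries the expected Team ID.
verify_then_install() {
    local pkg="$1" team="$2"
    if /usr/sbin/pkgutil --check-signature "$pkg" | signature_has_team "$team"; then
        /usr/sbin/installer -pkg "$pkg" -target /
    else
        echo "Signature check failed for $pkg; refusing to install." >&2
        return 1
    fi
}

# verify_then_install "/private/tmp/vendor-installer.pkg" "TEAMID1234"
```

It's not AutoPkg's CodeSignatureVerifier, but it's a cheap guard against installing an unsigned or wrongly signed package that a curl script pulled down.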
I still wouldn't do it anymore myself though.
@mm2270 Yup, exactly what AutoPkg does. AutoPkg can also pass a URL to VirusTotal to add an extra layer of checking. It can also parse ETag metadata, do code-sign checks, and display code changes in repos; it takes a zero-trust model now (meaning if code in a repo changes, it will display the actual delta in the terminal and then force me to read it and trust it), and so on.
AutoPkg is just a series of processors that run, so you can add your own on top of that. The automation is there; the method is just different. I think I have beaten this horse to death, been called a necromancer, raised the horse, and then beaten it to death a second time, so I will slow my roll there.
Also, as mentioned, if you are willing to take those risks, that is fine, but I would encourage your org and your security team to also understand those risks.