Creating a 10.11 image

AVmcclint
Honored Contributor

I realize El Capitan is a game changer, but how does this affect creating new images? I just got a brand new MBP with 10.11 pre-installed. Before I did anything else to it, I tried to capture a virgin image of it by booting from an external drive running 10.10.5 and the Casper 9.81 tools. When I launched Composer and chose OS Package, I selected Macintosh HD and Recovery HD and picked a location on the external drive to save the image (which I've done many times before with 10.9 and 10.10), and then it gave an error saying I don't have permission. I tried many times, and even chose only Macintosh HD, but it kept telling me I don't have permission. Permission to do WHAT? I gave up on that, so I rebooted from the internal drive, created a local admin account, set up SSH and ARD permissions, and made sure it was updated to 10.11.1. After that I booted from the external drive again, and Composer STILL wouldn't let me make an image of the drive. What am I missing? Does the external drive also need to be running 10.11?

11 REPLIES

mpermann
Valued Contributor II

@AVmcclint I would expect it was complaining about permission to write to the location where you were trying to save the Composer image, but I could be wrong. I use AutoDMG to create my base OS images. If I were going to create a base 10.11.1 image, I would use a computer that already has OS X 10.11.1 installed and download AutoDMG and the latest El Capitan installer from the App Store, then use AutoDMG to create the fresh base OS. The benefit of using AutoDMG is that you get the recovery partition as well, and it will even download any updates released after the OS installer and add them to the image. If you haven't given it a try, you really should.
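For what it's worth, here's a quick way to test the write-permission theory from Terminal before digging further. A minimal sketch, where /Volumes/External/Images stands in for wherever you are saving the image:

# Show owner, mode, and any ACLs on the save location itself
ls -led /Volumes/External/Images

# External volumes often mount with "ignore ownership" set,
# which changes how permissions behave
diskutil info /Volumes/External | grep -i owners

# Simple write test as the user Composer runs as
touch /Volumes/External/Images/.writetest && echo writable || echo "not writable"
rm -f /Volumes/External/Images/.writetest

If the touch fails, the problem is the folder (or the volume's ownership setting), not Composer.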

ant89
Contributor

+1 for autoDMG. It works great and includes the recovery partition.

hkabik
Valued Contributor

+2 for autoDMG; I only go the Composer route if there's a forked OS I have to capture.

AVmcclint
Honored Contributor

I hear many, many people jump up and shout autoDMG for any and all image creation problems. I've tried it out and I get it. I understand it. I really do. I can see definite value in what it does. The problem is that it doesn't fit our needs. The list of reasons is very long, but the main reason is that we need to customize a lot of things on the base image before it is put down on other computers. "But you can script all that!", you may say. My reply is that I'm not a script god, and even though there are a lot of helpful script snippets here on Jamf Nation and elsewhere, they still don't cover everything. It takes me a grand total of 5 minutes to do every last bit of customizing we need before committing the image. It would take me many hours down the scripting route to find, comprehend, modify, test, and deploy each and every script we would need. If Apple changes anything in the OS, I have to go through the whole process again until the scripts work; doing it in the GUI, I can simply adjust my mouse clicks to the new way Apple wants it done. I understand that the latest mantra is that monolithic imaging is a thing of the past, but the secondary reason I would rather do all the customization before imaging is that if I manually set everything ONE TIME, I know for a fact it will work every time I image with it. If I rely on a single all-encompassing script, or even 50 smaller scripts, to make those same changes, troubleshooting becomes a major time waster. The more actions it takes to get from point A to point B, the greater the likelihood of something going wrong.

Please understand that this is my personal opinion, and I definitely would not argue with you if autoDMG works in your environment. I just felt the need to respond to the call for autoDMG since I see it in most of the replies here.

BTW, it did turn out to be a permissions issue on the target folder I was trying to save the image to. Thanks for the reminder of the obvious answer. :) chmod 777 took care of it.
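For anyone who finds this later: 777 works, but a slightly tighter fix is to hand the save folder to the account that runs Composer. A sketch, where adminuser and the path are placeholders for your own account and folder:

# Make the imaging account the owner of the save folder
sudo chown -R adminuser /Volumes/External/Images

# Owner gets full access; everyone else can read and traverse only
sudo chmod -R 755 /Volumes/External/Images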

mpermann
Valued Contributor II

@AVmcclint glad you were able to resolve your issue. I understand AutoDMG/modular workflow isn't for everyone. You have to use what works for you in your environment. I'll know not to mention it to you in the future if you ask a question and I think I have something that might be useful. ;-)

One thing I realized during my time with monolithic imaging was that I always made the image on the exact same hardware it was going to be deployed to. I wouldn't make an image on an iMac and then deploy it to a MacBook, for instance; I had an image for each of the different models in our fleet. I ran into some issues with mouse/trackpad settings not being set properly when I used an image from one model on another. I had a nice checklist of the things I would always do to my monolithic images, so reproducing them accurately wasn't difficult. I continued using monolithic images with Casper for over a year before I started investigating modular imaging.
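If anyone hits the mouse/trackpad issue I mentioned, those settings live in per-user defaults domains, which is part of why they don't survive moving an image between models. A sketch of the kind of post-image reset that can help; the values are only examples, and the commands need to run as the logged-in user:

# Tap to click for the built-in and Bluetooth trackpads
defaults write com.apple.AppleMultitouchTrackpad Clicking -bool true
defaults write com.apple.driver.AppleBluetoothMultitouch.trackpad Clicking -bool true

# Tracking speed; System Preferences writes floats in roughly the 0 to 3 range
defaults write -g com.apple.trackpad.scaling -float 1.5
defaults write -g com.apple.mouse.scaling -float 1.5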

mking529
Contributor

Don't worry @AVmcclint, you aren't totally alone. We're a smaller school district with three image configurations. Three. I tried to go with the package/script/etc. approach, and all it did was give me a headache. Having to make separate packages for browser settings for staff and students, app data, and so on felt like a chore compared to just setting up an image, laying it down with the JSS, and blam, done. Putting everything in Self Service for after-the-fact deployment seems to defeat the point of imaging to me. Plus, it all images faster when it's part of the block copy process. Additionally, the more packages I have in an imaging config, the more likely it is that something goes wrong on the DP end during imaging: NetBoot hangs, AFP/SMB hangs, etc. OS X Server is not what it used to be, and it was never awesome to start with. :/ Not to mention the upwards of an hour it takes for an AAMEE CS6 package to install on first boot.

Now, if I was having to deal with 20 configurations I'd be singing a different tune.

AVmcclint
Honored Contributor

@mking529 I agree. However, I don't put EVERYTHING on the base image. I do have a few apps that install after enrollment, or that I put in Self Service to be installed later under controlled conditions, because their versions get updated so frequently. Apps and settings that are pretty much etched in stone go into the base image and stay forever. And if upper management says "change this one thing", it is trivial for me to do manually.

Chris_Hafner
Valued Contributor II

@AVmcclint Right on, my friend! I also prefer to slightly modify my base OSs before imaging (modularly, BTW, unless I'm temporarily compiling said configuration for mass imaging). However, the concept of moving toward a zero-touch solution for OSs via something like AutoDMG is becoming easier and easier for folks like us. Personally, I really don't mind diving into scripting, though since I have much left to learn, and very little time to REALLY test the advanced stuff, I prefer fully supported, understood, and roadmapped processes. I've created some really great scripted processes, yet I keep finding new, easy methods for dealing with most of these challenges without them.

That said, using policies coupled with profiles and a generally simple first boot script, I've gotten pretty close to moving over to AutoDMG-built OSs in production, working as advertised. I'd love to be able to simply download the latest OS and drop it into the configuration without having to touch anything at all (except for testing, of course).
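In case it helps the note-trading, the "generally simple first boot script" I mean is just a run-once LaunchDaemon that fires a script at startup, and the script deletes itself when done. A minimal sketch, run as root; com.example.firstboot, the paths, and the script body are all placeholders, not anything Casper-specific:

#!/bin/bash
# Stage a run-once first boot script plus the LaunchDaemon that fires it.

cat > /usr/local/bin/firstboot.sh <<'EOF'
#!/bin/bash
# Site-specific setup goes here: defaults writes, binding, jamf policy calls, etc.

# Clean up so this only ever runs once
rm -f /Library/LaunchDaemons/com.example.firstboot.plist
rm -f "$0"
EOF
chmod 755 /usr/local/bin/firstboot.sh

cat > /Library/LaunchDaemons/com.example.firstboot.plist <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.example.firstboot</string>
    <key>ProgramArguments</key>
    <array>
        <string>/usr/local/bin/firstboot.sh</string>
    </array>
    <key>RunAtLoad</key>
    <true/>
</dict>
</plist>
EOF
chown root:wheel /Library/LaunchDaemons/com.example.firstboot.plist
chmod 644 /Library/LaunchDaemons/com.example.firstboot.plist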

Care to trade notes on what you're doing to your OSs?

pblake
Contributor III

@AVmcclint

AutoDMG is a great tool and I highly recommend it. That being said, it doesn't cover all situations. Not everything can be scripted, and some things need to be on the base OS. It's not ideal, but necessities are necessities. We have a complicated LDAP connection that uses a service account for binding; we could not get it scripted, so it had to be baked into the base OS, which eliminated AutoDMG as an option.
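For anyone who wants to attempt the scripted route anyway, the usual starting point for a plain bind is Apple's dsconfigldap; a sketch with placeholder values is below. It was exactly the service-account wrinkles in our setup that this simple form didn't cover:

# Bind to an LDAP server with a basic configuration; all values are placeholders
dsconfigldap -v -a ldap.example.com \
  -n "Example LDAP" \
  -c "$(scutil --get ComputerName)" \
  -u svc_binder -p 'service-account-password'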

My only recommendation, other than checking the permissions on the drives, is to always capture an OS from a boot drive running that same OS. I find it helps my success rate.

AVmcclint
Honored Contributor

The cause of my original problem was wrong permissions on the target location; I fixed it with chmod 777. I just went off on a side rant about AutoDMG because of the suggestions for it here and in many other posts that day, and I just had to say something about it. All is fine now :)

corbinmharris
Contributor

Testing building a new 10.11.2 image with AutoDMG, twice now with no luck, using a mid-2015 retina 15" MBP running 10.11.2 as well. I created a NetBoot image from the resulting AutoDMG image and that works fine. No problems with the 10.10.5 build. Good thing it is a slow day to get it all figured out :(

Corbin