General Mac Admin question: How many OSX images do you keep on hand for deployment?

MD56
New Contributor

Hi Guys,

Mac administration at our shop has been run a little wild-west for a long time now. Wanted to see if I could get a little advice from the pros!

So here we have just about every flavor of Mac hardware (MBP / MBA / MP / MM / iM), going back anywhere from a silver-key MBP to the new Retina display models. That presents an interesting problem for us: how many OSX images should we have? Currently, we have 3 reliable images. A 10.6.8 image captured from a 2010 MacBook Pro, which we use on just about all pre-2012 hardware. We also have a 10.7.5 and a 10.8 image that I honestly can't tell you what hardware they were captured from. We use those images on newer systems, or for any request for an upgraded OS.

So the question is, how deep does your archive of deployment images go? What criteria do you use to determine if a new image is necessary?

Any input is appreciated!

15 REPLIES

don_cochran
New Contributor III

We manage about 2400 Macs of all flavors. We keep a 10.6.8, 10.7.5, and 10.8.3 image. The images were all made on the newest hardware we have available so they can be a "universal" image and will work on any machine. It's important to include the Combo updates before you capture the image so it will work on all hardware.
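That pre-capture Combo update step can be done from the command line against the build volume; a minimal sketch, assuming you've downloaded the matching Combo updater from Apple (the pkg path, version, and volume name here are placeholders, not don_cochran's actual setup):

```shell
# Apply the Combo updater to the build volume before capturing the image,
# so the resulting image boots the newest hardware Apple supports on that OS.
# Path, version, and volume name are placeholders -- substitute your own.
sudo installer -pkg ~/Downloads/OSXUpdCombo10.8.3.pkg -target "/Volumes/Build HD"

# Verify the build volume is on the expected OS version before capture.
defaults read "/Volumes/Build HD/System/Library/CoreServices/SystemVersion" ProductVersion
```

Running `installer` against the mounted volume rather than booting it keeps the build from picking up hardware-specific state before capture.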

franton
Valued Contributor III

We InstaDMG our images from the latest available App Store download of OS X.

Currently we only have 10.8.4, as it's a relatively new deployment. I imagine they'll keep the last of each major release. Why "imagine"? Well, I'm leaving at the end of the month for pastures new.
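For anyone new to that workflow: InstaDMG builds its never-booted image from the InstallESD.dmg inside the App Store installer app. Locating and mounting it looks roughly like this (the installer app name varies by OS release):

```shell
# The App Store installer bundles the actual OS installer disk image
# under Contents/SharedSupport. InstaDMG consumes this InstallESD.dmg.
ESD="/Applications/Install OS X Mountain Lion.app/Contents/SharedSupport/InstallESD.dmg"

# Mount it read-only first to confirm it's the build you expect.
hdiutil attach "$ESD" -readonly -nobrowse
```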

Chris_Hafner
Valued Contributor II

We run three main images: 10.8.latest, 10.7.latest, and 10.6.8, much like don.cochran describes. Built on the newest hardware with the latest Combo update. Technically there are four, but one is simply a copy of the 10.8.latest image with a few quick security changes that I didn't want to script into the imaging process.

alan_trewartha
New Contributor III

We've just about dumped all pre-10.8-capable hardware, so ready to go in Casper are just 10.8-based builds. But the rubbish I keep around "just in case" is ridiculous. I am actually looking at a 240MB file called OS9_ASR_ready.dmg!

alexjdale
Valued Contributor III

Although we still keep 10.6 and 10.7 images available, they are not in use. We keep one production image on hand, which is 10.8.3 at the moment.

For new hardware that we do not have image support for yet, I have an "out of box" method of configuring a Mac to our standard config using the shipped OS. I refuse to build separate forked OS images just to support one type of hardware, because we have dozens of distribution points globally and it would confuse our technicians. I wait until the builds are unforked and rebuild my base image.
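An "out of box" pass like that usually boils down to a short script run against the factory-shipped OS; a hedged sketch of what one might look like (names, time zone, and naming scheme are illustrative placeholders, not alexjdale's actual script):

```shell
#!/bin/bash
# Bring a factory-shipped Mac up to a standard config without reimaging.
# All values below are illustrative placeholders.

# Derive a standard name from the serial number.
NAME="LAB-$(system_profiler SPHardwareDataType | awk '/Serial Number/ {print $NF}')"

# Standard naming.
scutil --set ComputerName "$NAME"
scutil --set LocalHostName "$NAME"
scutil --set HostName "$NAME"

# Standard settings: time zone and network time.
systemsetup -settimezone "America/New_York"
systemsetup -setusingnetworktime on

# From here, enroll with the management server (QuickAdd pkg, etc.)
# and let policies lay down the rest of the standard software.
```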

All of my images are built with InstaDMG; I am very against building an image on hardware and capturing it.

mm2270
Legendary Contributor III

Ours is very similar to most here. We have a 10.6.8 and a 10.7.5 image, but our standard is 10.8.4 (was 10.8.2 until recently). The only reason we keep 10.6 and 10.7 around is in case we need to image a test Mac back to those earlier versions, since we still have managed clients on those OSes and still need to test processes on them.

@alan.trewartha - I think it might be safe to trash that OS9 image you've got there :)

MD56
New Contributor

Thank you all for your input.

So when you say that you have 3 images, all from the newest hardware available... does that mean that you are using a 2013 MBP to capture your 10.6 image? Or do you just use the latest hardware that was available during that OS's run?

We had an issue with our 10.7 image kernel panicking frequently on older 2009-2010 MBPs; I had sort of assumed that whatever OSX build we had captured wasn't jiving with the hardware we were deploying it on.

Is there a better way of creating these images rather than using an out of the box system to build it?

analog_kid
Contributor

I also use InstaDMG to create my images (like a couple of the posters above). It takes a bit of time and effort to get it configured but allows you to create clean images that have never been booted on any hardware. This eliminates many of the problems you encounter by using the traditional method of capturing an image from a build machine. However, you still have to deal with new hardware with custom builds of the OS that you have to separately create an image for (the mid-2013 MacBook Airs for example) until Apple folds the support of that model back into the main OS fork.

We actively maintain images for 10.6-10.8 because we don't have a site license for the OS and older machines may periodically need to be reimaged. InstaDMG allows me to crank out images of every OS flavor + our software stack that are nearly identical to one another which is also a nice feature.

Good luck!

donmontalvo
Esteemed Contributor III

Whenever Apple releases a Combo Updater, we create a new agnostic image. We do this for 10.6, 10.7, and 10.8...and soon 10.9.

The agnostic image supports all models released before the Combo Updater was released. When new models are released we create a model specific image, and we chant the "Forked by Apple again" mantra.

Then when a new Combo Updater is released for an OS, we retire the model specific images, and are back to using a single agnostic image for that OS.

Rinse, lather, and repeat...

It's a simple, reliable workflow (*), and it protects you from those "Apple made an unannounced change" scenarios, important if you're in a business that has mandates on tracking changes.

Folks who do "thin imaging", or whatever the proper name is for layering onto an existing OS, don't have to worry about it.

I'll reserve my criticisms regarding relying on "Internet Recovery" in enterprise environments. ;)

(*) Well, before JAMF removed their excellent Base Image and Restore Image articles. Luckily some of us kept copies of the old articles. ;)

Don

--
https://donmontalvo.com

donmontalvo
Esteemed Contributor III

@alan.trewartha wrote:

I am actually looking at a 240MB file called OS9_ASR_ready.dmg !

Back in the late '90s I worked in one of the oldest branding firms in NYC. My boss wasn't very Apple savvy; he was used to dragging folders around to "build" Macs. There were never-ending issues, so he was let go. The next day we had a new boss (George Spiese) who showed us how to create an agnostic image for OS8/9. We carried a CD for each of the two OSes, each fully updated, and used Extension Sets for specific models. We used ASR to restore the image to the Mac, then booted into Extensions Manager, selected the appropriate set, rebooted, and ran one script that set up printers, IP, etc. Users were back up in short order, and new computers were a breeze to deploy. George Spiese showed us what it meant to put technology to work...so you don't have to. Isn't it amazing how "slim" the OS was back then?! :)

--
https://donmontalvo.com

charles_hitch
Contributor II

We have an image for each model family (MBP, MBA, iM, Mm, MP) for each top OS version (10.6.8, 10.7.5, 10.8.4). Then we also create a target image for new hardware that comes with a "special build" of the OS. We always reinstall the OS from scratch (using a restored InstallESD.dmg from the newest installer from the App Store). For "special builds" we do an internet restore over top of the factory image. Then we snap an image using Composer. This has worked really well for us.

We tried the hardware-agnostic/universal image and found we could get better, more consistent results if we create an image on the target model family's hardware. This can be an expensive concept, because it means you are buying at least one of every new piece of hardware. The alternative is to wait for the next Combo updater before you support the newest hardware. We'll likely stick to the same process in 10.9.

Thin imaging is great if you trust that the software coming from the factory hasn't been tampered with in any way (or if you just don't care) and you have a nice fast open network...
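The "restored InstallESD.dmg" step charles_hitch describes can be done with `asr`; roughly like this (the source path and target volume name are placeholders):

```shell
# Restore the vanilla installer ESD onto a target volume, erasing it first.
# From there you boot the installer, run the OS install on the target
# model family, then snap the finished image with Composer.
sudo asr restore \
    --source "/path/to/InstallESD.dmg" \
    --target "/Volumes/Target" \
    --erase --noprompt
```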

donmontalvo
Esteemed Contributor III

@charles.hitch wrote:

We always reinstall the OS from scratch (using a restored InstallESD.dmg from the newest installer from the App Store).

+1

--
https://donmontalvo.com

acdesigntech
Contributor II

I capture a factory DMG of every new model that comes in the door. Reimaging a Mac is done by way of netbooting it and auto-running a script that generates a name and checks the model identifier of the Mac. Logic in the script then uses ASR to restore the appropriate DMG I captured earlier.

The script first gives the tech a choice to image to 10.8 or 10.6. Only 10.6 uses Casper imaging to image to 10.6.8. Our standard is 10.8 now, so Casper imaging is rarely used these days.

Older models like iMac9,1 and such get a "universal" image restored to them that is basically just a factory DMG that they can boot. Then I apply the latest Combo updater to them.

The non-CI reimage then drops a QuickAdd.pkg that installs a script and launch daemon to run on restart. The script then calls a post-image setup policy. I got tired of Apple forking the OS for darn near every model, so I switched to a thin-image model. As new OS builds bring the disparate firmware models into the fold, as it were, I thin the herd of unnecessary DMGs. It's a little more work to script and capture these DMGs up front, but soooooo much easier on the deployment and support techs (and therefore easier on me :) ).
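The model-to-image logic in a NetBoot script like that can be as simple as a case statement on the model identifier; a minimal sketch (the model identifiers and image paths are examples, not acdesigntech's actual table):

```shell
#!/bin/bash
# Map a Mac's model identifier to the factory DMG captured for it.
# Models and paths below are illustrative placeholders.

image_for_model() {
    case "$1" in
        MacBookAir6,*)  echo "/Images/10.8.4_mba_mid2013.dmg" ;;  # forked build
        iMac9,1)        echo "/Images/10.6.8_universal.dmg"   ;;  # older hardware
        *)              echo "/Images/10.8.2_universal.dmg"   ;;  # default
    esac
}

# On a real NetBoot client you'd read the identifier from the hardware:
#   MODEL=$(sysctl -n hw.model)
MODEL="iMac9,1"
IMG=$(image_for_model "$MODEL")
echo "Restoring $IMG to $MODEL"
# sudo asr restore --source "$IMG" --target "/Volumes/Macintosh HD" --erase --noprompt
```

Keeping the mapping in one function makes "thinning the herd" a one-line change when a new OS build folds a model back into the universal image.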

Right now we have a total of 4 DMGs (one 10.6.8 and three 10.8.2).

tlarkin
Honored Contributor

Hello Everyone,

I would like to take off my JAMF hat and put my sys admin hat back on. I would also like to state that this was my personal method of imaging, and by no means the end-all-be-all answer that would work for everyone. At my last job we managed 6,000 laptops and around 2,000 Mac desktops. Some of the Macs we didn't image, like the Mac minis and such that played a specific role in our deployment. We also had some older PPC hardware lying around that we didn't really image either; we sort of just provisioned those on an as-needed basis.

That being stated, the solution we came up with was to have one standard image for every single Mac. It did not matter if it was a MacBook Air, MacBook, iMac, and so forth. Every end-user Mac in production got the same base image. This was a compiled image of the OS plus any needed standard apps. Things like MS Office, Adobe Flash, Flip4Mac, and so forth were seen as a standard set of apps/software every end user would or might need.

We compiled the base image so we could asr block-copy it at imaging time. Every summer we would mass-reimage the Macs, so being able to block-copy an image file to 6,000 laptops came in handy. After imaging took place, package-based deployment came into play via policies and Self Service. Our images had no users in them and no management at all; MCX/profiles and local accounts were deployed either at imaging time or post-image. A post-image shell script also ran and set a few system-level preferences.
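One detail worth knowing if you go the block-copy route tlarkin describes: `asr` will only do a fast block-level restore of an image that has been scanned first. A hedged sketch (paths are placeholders):

```shell
# Scan the compiled image once after building it; this checksums and
# prepares it so asr can block-copy instead of doing a slow file copy.
sudo asr imagescan --source "/Images/base_10.8.dmg"

# At imaging time, block-restore it onto each Mac's target volume.
sudo asr restore \
    --source "/Images/base_10.8.dmg" \
    --target "/Volumes/Macintosh HD" \
    --erase --noprompt
```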

We did it this way to have the flexibility to deploy software at any level of management. When I first inherited this deployment, the people who initially set it up had created user groups for every possible user. When I took over, I found myself replicating a lot of things over and over again for groups that had the exact same needs. So, instead of creating policies for every single group of users or computers, I started to put everything into groups based on management needs.

So, since some computers needed little to no management, others needed to be locked down as kiosk/lab machines, and the rest fell somewhere in between, we came up with this method so we could deploy and maintain one single image for every Mac, regardless of its purpose. Then, post-imaging or through Casper policy (or Self Service), the Macs would be provisioned with the software and settings they needed on a management basis.

So long story short I went from managing dozens of user/computer groups to only a few, and maintaining 10+ images to only maintaining one single image.

Thanks,
Tom

ifbell
Contributor

At my last job we supported n and n-1, and tried to patch everything else. We were moving from monolithic to thin imaging. Our reasoning was that with the way the hardware changes, even mid-year, and the one-off builds for each specific hardware change, it became a nightmare to manage. Also, having to wait until Apple aligned all of the code into the next OS release was not a good fit for us. So we developed a thin-imaging process that laid down what was needed and let Casper do the heavy lifting.

I have taken this same idea and am attempting to integrate it into my new situation, so we do not have to juggle images and can minimize the actual work a tech has to do on a build.