Hi, I have been doing a bit of reading on the new terminal-based approach to OS deployment. It's all very well and nice, but I'm curious what other organisations are doing for large fleets of Macs and larger images.
For instance, our sound image requires Logic, Pro Tools and Ableton, with all the libraries. That's over 80 GB of data, which isn't something we want to push over our network multiplied by hundreds of Macs. Prior to this we created a master image with everything required, then deployed it with a combination of Jamf Pro Imaging and pen drives. The machines were coming back ridiculously fast (destroying our SCCM PXE and USB times) and required pretty much no interaction: named, on the domain, all first-run screens bypassed, and ready to go, literally coming back to a login screen. The beauty of it was we could kick it off and walk away (i.e. kick off the labs and go home), knowing a student could just log in as normal when it was done.
I like the idea of OS package deployment and scripting the partition and install, but as near as I can tell it doesn't give the machine a meaningful name, nor does it get rid of the first-run process. It suddenly becomes a massive manual job and a huge step back.
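To be concrete, below is roughly the kind of post-install step I'd want to be able to script at the end of a deploy: a rough sketch only, assuming it runs as root on first boot (e.g. from a package postinstall or a LaunchDaemon). The NAMES table and the fallback name are placeholders for however you'd map serials to lab names; the scutil and /var/db/.AppleSetupDone bits are the standard macOS mechanisms.

```python
#!/usr/bin/env python3
# Sketch: name the Mac from its serial number and skip the Setup Assistant.
# Assumes root on first boot; NAMES is a placeholder lookup (ours would come from a CSV).
import pathlib
import subprocess

NAMES = {
    "C02ABC123XYZ": "SOUND-LAB-01",  # placeholder serial -> machine name mapping
}

def serial_number() -> str:
    # ioreg exposes the hardware serial under IOPlatformExpertDevice.
    out = subprocess.run(
        ["ioreg", "-rd1", "-c", "IOPlatformExpertDevice"],
        capture_output=True, text=True, check=True,
    ).stdout
    for line in out.splitlines():
        if "IOPlatformSerialNumber" in line:
            return line.split('"')[-2]
    raise RuntimeError("could not read serial number")

def set_names(name: str) -> None:
    # scutil sets the three name fields the GUI would normally set.
    for key in ("ComputerName", "HostName", "LocalHostName"):
        subprocess.run(["scutil", "--set", key, name], check=True)

def skip_setup_assistant() -> None:
    # Touching this file is the long-standing way to suppress the first-run screens.
    pathlib.Path("/var/db/.AppleSetupDone").touch()

if __name__ == "__main__":
    name = NAMES.get(serial_number(), "UNNAMED-MAC")
    set_names(name)
    skip_setup_assistant()
```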
Perhaps I have missed something and other folks are doing cleverer things? If anyone is doing something similar, could you take two minutes to drop a post here? I would be very grateful.
Many thanks in advance.
Paulo
