4 Mac deployment techniques for the Enterprise

amsys.co.uk http://www.amsys.co.uk/2014/blog/4-mac-in-the-enterprise-deployment-techniques/

4 “Mac in the enterprise” deployment techniques

Posted by David Acland on Wednesday, December 3rd, 2014 - Blog

There are a number of ways you can deploy Mac OS X. The tools and techniques used have evolved rapidly over the past few years. In this blog post I will summarize each deployment technique, explain our view on scenarios where you would use one over another, and how new options such as DEP have moved things along.

The main methods we will discuss are:

Monolithic (traditional) imaging

Modular imaging (base OS image + packages and settings)

Thin imaging (just packages and settings)

User self-service

1. Monolithic (traditional) Imaging

This method has been around for some time. Back in the heyday of NetRestore, this was the cool new way to deploy Macs (iOS didn’t exist!). You would get your hands on a model Mac, typically the highest spec that had the most hardware features, install all of the software packages you needed and configure machine-level settings, such as the Login Window layout and sharing preferences.

Once you were happy with the setup, you would create a disk image of the hard drive using hdiutil, Disk Utility or another tool, scan the image for block restoration and then deploy it to the rest of the Macs that you needed to set up. The end result was a set of identically configured Macs, so from that perspective it was a working process.
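For anyone who never ran this workflow, here is a minimal sketch of what the capture-and-restore steps typically looked like from the command line. The volume names and image file name are placeholders, not taken from the original post:

    # Capture the configured model Mac (run while booted from a
    # different volume; "ModelMac" is a placeholder volume name)
    sudo hdiutil create -srcfolder "/Volumes/ModelMac" masterimage.dmg

    # Scan the image so asr can perform fast block-level restores
    sudo asr imagescan --source masterimage.dmg

    # Restore the image to a target Mac, erasing the destination
    sudo asr restore --source masterimage.dmg \
        --target "/Volumes/Macintosh HD" --erase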

The downside, however, is when you either spot a problem with the configuration or an update is released just as you finish. I had lots of situations where I would spot a minor imperfection in the image, meaning hours of work to deploy the image to the model Mac, correct the flaw, and then create a new image.

Each time I did this, the chance of unwittingly introducing a new flaw was high. Updates released just as you finished rolling out the image were common as well. There was nothing worse than creating your great new 10.2.3 OS X image with everything just as you need it, only for Apple to release the 10.2.4 update the next day.

This highlights an obvious flaw in the patch management processes of the time, which were often non-existent.

We could, of course, add in a software update server to handle the Apple updates, but what about Office, database apps, Silverlight, Flash, etc.?
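For the Apple side at least, repointing clients was a one-liner. A sketch, assuming a hypothetical internal software update server on the standard port of the era:

    # Point this client at an internal Apple Software Update server
    # ("sus.example.com" is a hypothetical hostname); this covers
    # Apple updates only - Office, Flash and friends still need
    # their own patching mechanism
    sudo defaults write /Library/Preferences/com.apple.SoftwareUpdate \
        CatalogURL "http://sus.example.com:8088/index.sucatalog"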

In many cases, organisations just froze in time. They deployed their image, and that was it until the hardware was due to be refreshed. Good from a change management point of view, not good from a functionality or security standpoint.

2. Modular Imaging (base OS image + packages and settings)

Modular imaging has also been around for a while, although adoption has been slower. The basic idea is to separate out each part of your intended build into a base OS (with any necessary updates), the applications the users need, and finally any settings you would like to be configured from the start. Each aspect of the final build is stored as either a package installer or a script that would run when the target Mac first boots.
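To make that concrete, here is a minimal sketch of the kind of first-boot settings script such a workflow might include. The specific settings are illustrative examples, not from the original post:

    #!/bin/bash
    # Illustrative first-boot settings script: applies machine-level
    # settings that would previously have been baked into a
    # monolithic image (values below are examples only)

    # Show the hostname at the Login Window
    defaults write /Library/Preferences/com.apple.loginwindow \
        AdminHostInfo HostName

    # Enable Remote Login (SSH) for remote support
    systemsetup -setremotelogin on

    # Point the Mac at a specific network time server
    systemsetup -setnetworktimeserver "time.apple.com"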

There are three key benefits to this approach:

It’s easier to update or fix one part of a build than to recreate the whole thing

It’s easier to update part of the build if a patch for a particular bit of software is released

You can create multiple “workflows” without having to store multiple monolithic images

For these reasons, you would assume this would always be the preferred method over monolithic imaging. So why has adoption been slow?

The first (and probably the main) reason is an increase in technical difficulty. When you’re creating a monolithic image you can ‘see’ what you are doing; it’s just like setting up a normal Mac and then taking a snapshot of its state. With modular imaging, you have to learn a few new skills, including scripting and software packaging.
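The packaging side is less daunting than it sounds. As a sketch, Apple’s built-in pkgbuild tool can wrap an application into a deployable installer package; the app name, identifier and version below are examples:

    # Wrap an application bundle into an installer package that a
    # deployment workflow can push out (names are examples)
    pkgbuild --component "/Applications/ExampleApp.app" \
             --identifier "com.example.exampleapp" \
             --version "1.0" \
             --install-location "/Applications" \
             ExampleApp-1.0.pkg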

The second reason is that it’s newer. There are some techs out there who know how to create a monolithic image and are happy with the results. And, from a time investment perspective, they don’t want to spend time learning a new way to achieve the same goal.

At Amsys, we switched to modular imaging a few years ago and saw the benefits almost immediately. Once we had worked out how to package some of the trickier apps and some of the scripts that were needed, we could create customised builds for our clients in much less time.

3. Thin Imaging (just packages and settings)

Thin imaging is one of the newest techniques. It is quite similar to modular imaging, just without an OS. The assumption here is that Macs from Apple come with a perfectly good, pre-installed OS, so why spend time wiping it, only to put the same thing back on the machine before adding the apps and settings?

With thin imaging, you take a Mac out of the box and run a workflow that installs the apps you have packaged and adds any settings that you need.
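Stripped of the management tooling, that workflow boils down to installing each package against the Mac’s existing boot volume. A sketch, with a placeholder staging directory (in practice a tool like Casper or Munki would drive this):

    # Install every staged package onto the current boot volume
    # (/Users/Shared/deploy is a placeholder staging directory)
    for pkg in /Users/Shared/deploy/*.pkg; do
        sudo installer -pkg "$pkg" -target /
    done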

Some of the benefits for thin imaging are:

Time saved as you aren’t capturing / packaging a base OS

Time saved as you aren’t deploying an entire OS

You are less likely to introduce issues by replacing the OS (incorrect hardware extensions, etc.)

With this style of imaging, there are some other added benefits. For example, you can take a machine that has already been set up by the user and deploy your company apps and configuration to it. As you’re not wiping the drive, there isn’t a risk of upsetting the user by deleting all of their data!

A potential negative, however, is the lack of a proper “imaging” option. “Re-imaging” has long been seen as a way to eradicate problems from machines, as it can return them to a known working state. As thin imaging only adds to the target machine, it wouldn’t be a suitable option for removing a pre-existing problem.

That being said, thin imaging and modular imaging can co-exist. At Amsys, we quite often set up both options. Once we have created a modular imaging workflow that can lay down an OS, it is only a few minutes’ work to create a separate workflow that performs all the same actions, just without a base operating system.

If the option of erasing the machines is a requirement, but you’d rather not “re-image” in the traditional sense, you can create an OS X installation package using a tool like createOSXinstallPkg. This script generates a package that can be installed as part of your thin imaging workflow, but performs a standard OS X installation. If you include a step to erase the target drive before installing, the result will be very similar to a modular build.
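A typical invocation looked roughly like this; the paths are examples, and the exact options are worth checking against the project’s own documentation:

    # Build a deployable OS X installer package from Apple's
    # installer app (paths are examples)
    sudo ./createOSXinstallPkg \
        --source "/Applications/Install OS X Yosemite.app" \
        --output "InstallYosemite.pkg"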

4. User self-service

The final deployment method I would like to talk about is user self-service. The first three methods I have described are quite similar. Some of the tools and techniques are different, but the underlying processes are the same, as are the results.

User self-service takes a different approach entirely and simply provides a mechanism for the user to install the apps and settings they need. Some organisations I have worked with that have very large numbers of Macs (usually over 1,000 devices) are using this method. It could be that it took that quantity of machines to force them to think of more efficient ways to get the machines out to the users.

One of the major benefits is the lack of IT involvement. The IT team need to ensure that the catalog of packages and settings is tested and functional, and that there is a simple way to present these to the users (such as JAMF Software’s Self Service), but once this is done, the user only needs to enrol their device, launch the app and choose what they need.

This can be extremely handy if a user is in a remote location. If they have a major hardware breakdown, they can go to their nearest Apple Store, buy a new Mac, enrol with the management system and open up Self Service to get going. No IT involvement needed.

With Apple’s DEP (Device Enrollment Program), users now don’t even need to enrol themselves. They unbox their new Mac, complete the Setup Assistant and they are ready to go.

Conclusion

There are some projects we have been working on recently that I simply couldn’t imagine finishing without some of the newer deployment methods. Tools like Casper and Munki have created some new and interesting workflows that are really helping to reduce the manual effort needed to deploy large numbers of machines consistently.

While monolithic imaging is rarely used, I couldn’t really say that any one of the other techniques described is the best; it really just depends on the scale of the deployment project, the location of the devices and users, and what you want from the final setup.

If you are thinking about deploying a new fleet of Macs or iOS devices and require Apple consultancy or advice, please contact our expert team today. Call 0208 660 9999 or email [email protected].
