Why Apple doesn't need to worry about virtualization of Mac OS X

There have been many (many!) articles, blogs, podcasts, and tweets over the past several years discussing whether Apple should allow Mac OS X to be virtualized. There have even been rumors that this was going to be allowed (or even that Apple might include some form of virtualization) in the upcoming "Lion" release of Mac OS X (which is due next month), but now that we're getting close to that date, it looks like virtualization is not going to be part of the story.

But all these calls to virtualize Mac OS X are misguided. So are the calls for Apple to take OS X into the enterprise. And so are the calls asking for Mac VDI. (Well, at least from Apple. I'm fine if other people want to give it a shot.)

First, some background

When we talk about Mac OS X and virtualization, we really need to clarify exactly what we're talking about. Some people just want to be able to run OS X as a virtual machine guest. Others want Apple to create a multi-VM virtualization server based on OS X.

For those who want Mac OS X to be able to run in a VM, from a pure technical standpoint, that's possible today. (You basically just hackintosh OS X into a VM.) And from a legal standpoint, it appears it's also okay to run OS X in a VM as long as the VM host is running on Apple-branded hardware. There's even talk that the next version of VMware vSphere will officially support OS X (which we assume would only be legal if vSphere were running on Apple hardware).
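For the curious, here's a rough sketch of what "OS X as a guest" looks like in practice. These are the kinds of settings a .vmx file would need; the values below come from VMware's desktop products and are illustrative guesses, since nobody knows yet what official vSphere support would actually look like.

```
# Illustrative .vmx fragment for an OS X guest. Values are assumed
# from VMware's desktop products; official vSphere support may differ.
guestOS = "darwin10-64"   # 64-bit Mac OS X 10.6 ("Darwin") guest type
smc.present = "TRUE"      # expose an Apple SMC to the guest; OS X
                          # checks for this chip at boot
firmware = "efi"          # OS X boots via EFI, not legacy BIOS
```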

Other people suggest that Apple should allow OS X to be virtualized on any brand of hardware, even suggesting that Apple should sell a special license for this case. But this is a bad idea. Just selling the OS separately to run as a VM... what's the point of that? So people can run it without buying Apple hardware? What's in it for Apple? Then you'd have all these people who can't afford "real" hardware clogging up the Apple support ecosystem because their piece-of-shit machine doesn't paint the graphics as well as a MacBook.

Other folks have suggested that Apple should sell these virtual hardware licenses so Mac OS X can run in a datacenter on real datacenter (i.e. non-Apple) hardware, which could be used to drive massive OS X-based VDI deployments. Again, this would be a colossally bad thing to do. The reasons people use VDI and the reasons they use Macs are not the same. (In fact, they're almost opposites.) And all of those beautiful Apple graphics can't be remoted except over the fastest LANs, so if you're only going to use an OS X-based VDI environment on a fast LAN... really... what's the point?
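To put some back-of-envelope numbers behind that remoting claim (the screen size, frame rate, and compression ratio below are all my own assumptions, not measurements):

```python
# Rough bandwidth math for remoting a single Mac display.
# All inputs here are illustrative assumptions.
width, height = 1440, 900   # one MacBook Pro-class screen
fps = 30                    # smooth-enough motion
bits_per_pixel = 24         # 8 bits each for R, G, B

raw_bps = width * height * fps * bits_per_pixel
print(f"Uncompressed: {raw_bps / 1e6:.0f} Mbps")      # ~933 Mbps

# Even at an optimistic 50:1 compression ratio, that's still ~19 Mbps
# per user -- LAN territory, not home-broadband territory.
print(f"At 50:1 compression: {raw_bps / 50 / 1e6:.1f} Mbps")
```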

Some people have even suggested that the Apple TV could be used as a sort of $99 thin client to front-end this whole thing. Unfortunately, the people who suggest that are server virtualization people and Mac people. They are not desktop virtualization people who know there's no way in hell you're going to deliver an experience worth a shit to home users via Mac OS X running in some far-off datacenter.

The real reason why virtualizing OS X is stupid

But beyond all these tangentially stupid reasons Apple could enable virtualization, the mother of all reasons is that Mac OS X is a Desktop OS. (That's "Desktop" with a capital "D." If you don't know what I mean, read the article I wrote a few months ago about the difference between a "Desktop" and a "desktop.")

Apple shouldn't virtualize the Mac OS X Desktop operating system because they don't need to. Everything they're doing with iCloud and iTunes music everywhere and WiFi sync and the App Store--all of that stuff means that you don't have to resort to shitty 1995-era display remoting technology to deliver "your" Mac experience to you. Your "desktop" is more than an instance of OS X. You can just pick up whatever device you have, be it your iPad or your iPod or your iPhone or your Apple TV (or even an OS X MacBook), and once you sign in, you have your complete desktop (small "d") experience.

In fact I use the whole Apple cloud/iPhone/iPad ecosystem as my example of how the small d desktop is replacing the Big D Desktop. So why would Apple f*** that up on purpose for a desktop OS?

But surely virtualization will come to Mac OS X?

I'm sure at some point Apple will acknowledge that people want to run OS X in a VM. Their current policy of only allowing that when the host is running on physical Apple hardware is fine now, and it will still be fine then. In fact, Apple doesn't really need to change anything here.

As for adding virtualization capabilities to OS X itself: why should they bother? There are plenty of hypervisor options out there, and all of those projects and products will trip over each other to ensure compatibility with Apple hardware, so really Apple has zero incentive to do anything on their own.
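Case in point: VirtualBox already exposes a Mac OS X guest type today. Here's a quick illustrative example from the command line (the VM name is a made-up placeholder, and the usual run-it-on-Apple-hardware licensing caveat applies):

```
# Show the Mac guest OS types VirtualBox already knows about.
VBoxManage list ostypes | grep -i mac

# Create and register a VM using the 64-bit Mac OS X guest type.
# ("osx-test" is just a placeholder name for this example.)
VBoxManage createvm --name "osx-test" --ostype MacOS_64 --register
```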

So in the future, yeah, we'll see people running OS X in VMs. And we'll see companies like Citrix and AquaConnect allowing users to connect to remote OS X instances (physical or virtual) with advanced remoting protocols. But to this whole idea that Apple needs to get into the enterprise and that virtualization is the way to do it: I say no way!

When plenty of people (well, me at least) are predicting that the era of the Desktop (Big D) is over, Apple should press on as planned and not do anything to drag that 1990s model of computing into the 21st century.
