We've been writing a lot in the past year or so about how VDI has started to turn the corner and people are actually starting to use it in a big way now. But I have to wonder—is this because VDI actually got better, or is it because our once-lofty expectations have come down?
Case 1: VDI is all grown up
Last year I wrote the article "2013 is the year that two of the biggest showstoppers to VDI adoption are finally solved. Here's how." My two main points were (1) modern storage can support persistent images at a decent price, and (2) GPU virtualization means VDI can be used with many more applications (and general purpose desktop and web use) than before.
Since that article, I've started talking about a third point: Moore's Law (which is more effective in the datacenter than on the desktop) means that $500 spent per user on VDI hardware buys a really great experience today versus in 2006, when we first started talking about VDI.
And now we also have the ability to do non-persistent desktops with near-100% application compatibility, something that was absolutely not possible even five years ago.
Combine that with the general improvement of VDI software over the past eight years, and we really have a much more mature, solid, and capable VDI platform today versus 2006.
Case 2: Our expectations have come down
When I was 23, I moved to Washington DC. Most of the friends I made there were also in their early 20s and had moved there right out of college. They all worked for the government, a politician, or a non-profit, and they were all wide-eyed and excited about being at the epicenter of the US government, full of idealized notions about how they were going to make a difference in the world.
Unfortunately, as they aged, reality set in. I used to joke that the typical 22-year-old in DC wanted to change the world. By 24 they wanted to change their home state. By 26 they wanted to change their local city. By 27 they just wanted to get a high-paying job. And by 29 they were so fed up with the bureaucracy that they gave up and moved away. (Side note: I left DC six years after I got there.) All that initial giddy excitement turned into "meh."
Perhaps the same is true with VDI? When we first heard about it, we thought it was the future. We were going to transform all of our desktops and all of our users and all of our applications. After a few years, it became, "Well, maybe we can take our Terminal Server users and give them VDI." Then it became something we were just going to use for certain apps. Then only certain offices.
Finally, eight years later, "meh" has set in and we've started to realize that yes, VDI is awesome—but only for 5% of our users.
2014: VDI's growth and our expectations have finally aligned!
I guess the reality is somewhere in the middle, like when a MOSFET firing pulls your ground reference up. (Where my EEs at??) VDI has gotten better for sure, but eight years of "meh" mean that we all have much lower (yet far more realistic) expectations about what VDI can actually do.