We all know that the Windows operating system was designed in the early 90s, when computers were used in ways that were nothing like how they're used today. Even today's latest, greatest Windows 8 touch-based OS still carries the 20-year legacy of the registry, DLLs, user profiles, COM objects, and all the other "stuff" that makes Windows what it is. That's pretty much a necessary evil though. Barring some breakthroughs in technologies like WINE, you still need Windows to run Windows desktop applications.
Of course we've tried to deliver Windows desktop applications without really delivering Windows. The original Citrix products of the late-90s MetaFrame / Terminal Server era got us thinking about how we could deliver Windows desktop applications as a service. It put the idea in our heads that the endpoint didn't matter and that we could deliver traditional Windows desktop applications to "any, any, any."
While that datacenter-hosted Windows application delivery model made sense in a lot of cases, it wasn't right for every application in every situation. Running Windows desktop applications in a datacenter is a lot more expensive than running them directly on a user's Windows laptop or desktop, and the sub-par graphics and peripheral support—not to mention the lack of offline support—meant that datacenter-hosted Windows applications were always going to be a niche.
Fast forward to today. While many of our desktop applications are now web-based, most of us still have lots of traditional Windows desktop applications. The historical limitations of desktop remoting technologies like RDSH and VDI have meant we only host Windows desktop applications in our datacenters when we have to. Most of our users still have Windows-based laptops running locally-installed Windows desktop applications (since "Windows needs Windows," it makes sense for us to install, manage, and maintain those applications locally on a user's laptop rather than trying to deliver them all remotely).
But imagine a future with no Windows applications. Are we still going to install and manage the user's client OS for them? Are we still going to deal with all the headaches of Windows on the endpoint if the user isn't running any Windows apps? We'll definitely get to that future at some point, even if it's 50 years from now. But thinking about that world has a direct effect on what happens between now and then in our current environment.
To understand this, think back to 1995, when all we had were Windows desktop applications. So 1995 was 100% Windows desktop applications, and 2050 (or whatever year you're using) will be 0% Windows desktop apps. Today in 2013 we might be about halfway there.
Given that mindset, what's the minimum number of Windows desktop applications a user must have for us to manage the Windows instance (physical or VM, layered or old traditional)? How many Windows desktop applications does a user need before you say, "Ugh! Fine.. whatever.. I'll give that user a full Windows desktop so he or she can run those Windows apps." Back in 1995, managing Windows was obviously the right way to do things since all our apps were Windows apps. And in 2013, with a 50/50 split, it makes sense to deploy and manage Windows too. But what happens when you look forward? In a future world where you have only one remaining Windows desktop application, are you really going to manage Microsoft Windows on the endpoint—complete with the registry and DLLs and the user profile and the browser and the viruses and the spyware—all that Windows "gunk" for just one measly Windows desktop application?
I think not!
If it were me and I only had one Windows desktop app to manage, I'd probably throw it on an RDSH server or serve it up as a seamless VDI application. That way I only have to manage the "Windows-ness" of everything in some datacenter, and I don't have to make sure the user's endpoint meets a huge list of requirements.
Okay, so if you're with me so far, then you can see how if you have just one Windows desktop application, it makes sense to deliver it remotely. That's the only way you can truly not worry about the endpoint. Now what if you only have two Windows apps? Do you manage and deliver all the Windows gunk just for two apps, or do you just remote them? What about three apps? Or four?
Delivering and managing a whole local Windows environment is a huge pain and completely unnecessary for delivering a few remote Windows apps. So when do you decide to make the cutover? I imagine it will be well before "all" our Windows desktop apps are gone.