Brian and Chetan Venkatesh are missing the point about the future of VDI. It's about the data.

Comments on "Deconstructing Brian's Paradox: VDI is here, like it or not."

After sitting engrossed in the breakout session that Brian Madden and Chetan Venkatesh presented at BriForum 2010 ("Deconstructing Brian's Paradox: VDI is here, like it or not."), both Brian and Chetan independently approached me and commented that I seemed to be shaking my head through parts of the session, but I didn’t speak up.  I wanted to speak up, but what I had to say wasn’t a quick 30-second question or even what Steve Ballmer refers to as a “commentary with a question mark appended to the end,” so I stayed quiet.  At their prodding, I agreed to write up my thoughts on the topic.

The Background

Brian has already posted about the session in which he and Chetan discussed the original “Madden’s Paradox” article. They each had very different views.

To pull a quote from the original article, the Paradox refers to Brian’s contention that “You only choose VDI because you want server-based computing but TS can’t cut it, and then to manage your VDI environment, you implement a shared master image system that makes your VDI behave just like Terminal Server”. Brian has further implied that this means client-based VDI will succeed better than the server-based VDI now garnering all of the hype (but that we need offline VDI capabilities for client-based VDI to work), and that maybe server-based VDI getting all of the hype doesn’t make any sense. (Forgive me, Brian, if I got you wrong, but that is how I interpreted it.)

In the presentation, Chetan made a potent case for the other side. In his world, the PC as we know it is dead and server-based VDI is the future. I can’t possibly do his points justice, so hopefully you were at BriForum and can watch the video so you can understand his points accurately. [Unrelated to the rest of this post, I’ll nitpick and point out that Chetan’s proposal--which stated that 65-70% of all desktops will be server-based VDI in 5 years--does not pass my smell test. (Nor would a claim that all VDI flavors combined will be 90% within 5 years.) I find it totally unsupportable due to Newton’s first law (a body at rest tends to stay at rest); however, this is really just a side (or snide) comment and I really appreciated hearing his points of view.]

In the session the two made their points about why they thought their view of the future was the correct one. For those not at the session, the most telling difference between their viewpoints was the ratio of VDI deployments each expects to see five years down the road. Chetan believes 70% of all desktops will be server-based VDI. Brian thinks most VDI deployments (he was less specific about a number) will be client-based. So why did they notice me shaking my head? Because I think they were both wrong about the problem. Here I will try to explain why. (It might take more than 30 seconds.)

Re-writing History

In some ways, Brian is correct, in my view, in his assertion that the limitations of server-based VDI (at least as practiced today) produce a user experience no better than TS at a higher cost. And Chetan may also deserve some credit for asserting that it just doesn’t matter, because the old assumptions that caused “fat client” domination for the last (almost) 20 years no longer hold. But I have a different view, and one that screams out “You’re both wrong!!!”

A different way to view the last 20 years of computing history is to not consider it the era of the PC, but to consider it the era of the Application. The concept of “Information Technology” (IT) was born in the desire to increase business productivity of employees and companies by making Applications available to users. Citrix may be known for calling it “Access, to anything from anywhere”, but we all know that this was “access to applications.” From making applications available to making them stable by controlling how they become available to controlling access to the applications (both license management and security), it has all been about the application. The PC has been nothing more than a tool.

I believe, however, that the era of the application is coming to an end. In my mind this era will be replaced by another. It isn’t all about the application anymore. It’s all about the data.

It’s All About The Data

I think Tom Kyte (not the golfer Tom Kite) said it best in his blog: “Go ahead, erase all of my .exe files, but don't you dare touch my data. Take Word away from me, leave me my .doc files. I'll be able to find something that can process the data eventually”.  Given that hardware no longer increases capacity at such a high rate, we may soon find that the main reason for a desktop refresh is to get a clean copy of the operating system because ours has bogged down with too much junk. Extract my stuff off, give me a clean copy, plop it back on, and I’m good to go for another year.

At BriForum 2008, I presented a session called “The Data Problem” in which I discussed the need to identify the various types of data so we can build tools to successfully extract the right data, only the right data, and all of the right data. I think it would be fair to characterize the response as “Yeah, but that is way too big of a problem for me to get involved with”. 

Incidentally, because server-based VDI--when using a common image--attempts extraction (at logoff) and injection (at next logon), it has some additional benefit here.  Of course today’s tools miss out on the “only the right data” part, storing too much in the deltas, but a server-based VDI user should experience less OS performance degradation over time. Not that this is a major reason to pick server-based VDI: it takes time to degrade, an occasional client-based refresh is an option, and if this becomes important enough the client-based world can implement the same type of solution.

But I sense a change in the air that is more fundamental when it comes to data. We are moving into a more dynamic world. The issues we face in IT are much larger than just handling a new employee’s work and home applications at the same time. We need more than barriers between the applications and operating systems; more than layers, extractions, and injections. Don’t get me wrong: virtual machines, virtual applications, security appliances, and methodologies are all great tools that we need; but these are tools and not solutions.

The primary change forcing an end to the era of the application is this: users no longer equate to devices on a one-to-one basis. Users expect, and are increasingly receiving, connectivity from anywhere.

Just this week I witnessed an extraordinary event, at least to me. I saw a user unplug a 10 Mbps LAN connection and use his 3G phone data connection to connect his laptop to the internet, because it was faster than the service provided by the building. With BlackBerries, smart phones, and iPads becoming standard fare, people expect to connect from multiple devices at different times and locations. We expect to access our work life from the road or home, and our home life (what is left of it anyway) when we are at work. TS, server- or client-based VDI--those are just implementation tools. In the end, we want access. But while we have been trained to think of it as access to applications, I firmly believe it is access to what the applications do for us. And in the end, what applications do for us is allow us to access and manipulate data. In the new era approaching, it will all be about the data.

Data used to be stuff in a file. Or maybe stuff in a database. It isn’t any more. My data is stuff in a whole bunch of files and a whole bunch of databases. On a whole bunch of servers. Stored at all sorts of companies. My applications work with all sorts of odd collections of data, not only my data but sometimes other people's data (that they choose to expose). When I read my email, accessing my stuff means I pull data from multiple Exchange servers (one for work and one or more for personal use). But it also links me in to data from places like LinkedIn, and Facebook, and presence-tracking data apps.

Wherever I am, and whatever device I have in front of me, what I want from IT today is access to all of my stuff. (Note: one must make an irrelevant passing reference to George Carlin when using that word.) My view is that we have evolved the technology to the point where we can say that not only are the platforms just tools, but even the applications themselves are becoming tools. Tools to access our stuff.

The application that is appropriate when I am on a large display device may not be my choice when on a small display device. Available input devices also dictate application choice. (You’ll never run word processing on a device designed with one button.) Network speed and latency also matter.

The job of Information Technology in the future will be to manage all of this data. Give me access to my data wherever and however. Turn all of those bits of data into something useful to me: Information.

Back to Head Shaking

Of Brian and Chetan’s views as expressed at BriForum, I am drawn more to Brian’s, but only because I'm very selfish. If server-based VDI wins out (as Chetan proposed), all of my work stuff will only be available when I connect into work--and it will remain separate from my personal stuff, walled off and segregated. If client-based VDI wins out (as Brian proposed), I’ll have access to all my stuff on the laptop I’m now carrying, and probably in a relatively seamless way to me.

But ultimately, I shake my head because I think they are both wrong. It isn’t about what platform technology prevails, or where the apps will run. It’s all about the data. And we need to start our thinking of the future from there. I might not be able to express my views as elegantly as others, but hopefully this makes sense to at least a few people out there. Maybe they can say it better.

Join the conversation



While I agree that data is important, there are too many barriers currently in existence which would prevent your 'pervasive data' model from becoming reality, at least in the short term.

Your closing comment encapsulates the main problem:-

"all of my work stuff will only be available when I connect into work--and it will remain separate from my personal stuff, walled off and segregated"

In my experience, this is what a huge percentage of companies actually want and strive to realise through the use of server and client based VDI. They don't trust anyone with their corporate data and make use of virtualisation to centralise and secure delivery of the tools (desktops/apps) which make this possible.

We have learned through painful lessons that we can't trust the security of our personal data to our government departments, law enforcement agencies, social networking sites, financial institutions, or friends, so why would any company allow the security barriers they have built up over many years to be dissolved and a more free and simple data access model adopted?

This is the major problem I see with Cloud Computing, but that's another discussion altogether.

You are probably right that making the data disjoint from the tools which are used to access it would make all of our lives easier, but I don't see this changing for a huge percentage of companies in the very near future.

Just to close, I actually think that server- and client-based VDI are a dinosaur in the making...they have been designed to fix one simple problem...."Windows", the most monolithic, bloated, wasteful piece of rubbish ever conceived.  It might have worked 10 years ago, but now it is driving us to design ridiculously complex solutions to fix the very problems it introduces.  I really thought that server-based VDI was the solution (I still think it presents the most common-sense approach) and hoped beyond hope that the clever people like Citrix would make it work for a larger set of use cases, but I think those days are gone...a huge missed opportunity.  Let's hope that some clever people out there come up with a less complex model which provides us with the tools to access our data in a less convoluted and more environmentally friendly way!!


And I also meant to add....

Whilst treating 'the data' as the object which needs to be managed, and allowing 'the tools' to be provided from anywhere, you are opening yourself up to a plethora of possible security problems.

Take for example 'a tool' which allows you to manipulate your corporate data.  If 'the tool' has been compromised because it is not corporately managed and delivered, it could be logging every change to the data and sending those changes somewhere you do not wish them to go, or could even be intercepting your changes and making ones of its own.

My point is that unless 'the tools' are centrally managed, secured, and deployed, there is no way you can trust the end state of the data they manipulate.


@Brian and Gabe, this is the second article that makes frequent references to a BriForum 2010 presentation by Brian and Chetan ("In the future, will datacenter-hosted VDI desktops be two-thirds of all use cases?" being the first one).

Why not release the video to the public?


@help4ctx Isn't RIM similar to cloud computing in that all your email traffic, from high-profile corporations and government agencies, goes through a third-party hosted service to be 'consumed' by endpoints? They may say the data is there for only a fraction of a second until it reaches the radio network and gets pushed to the devices, but the same security risks apply, like someone hacking their network (cloud provider) and getting access to all your confidential emails (data). I am not saying that just because people 'accepted' the risk with RIM they will embrace Cloud Computing. I never allowed RIM at any of my companies and never will. :-)

Maybe a three-tiered model with data --> front-end --> devices, where the provider takes care of the middle and accesses the data over a high-speed, low-latency connection to your 'datacenter' (in this case a literal use of the word, as all you would have in your 'datacenter' would be your data), will be the way to go in the future.

I do agree people will always be concerned about the security of their data/information (again, why do they use BlackBerries to start with?), which may slow down how cloud efforts move down the road.

@Everyone As I mentioned yesterday on Twitter and Tim just confirmed it with this post, the WAN is the new LAN. Soon users will not care or want to distinguish between 'at the office' and 'somewhere else'. They want the same experience no matter where they are. Home, Hotel, Airport. Does not matter. This means WAN is now completely uncontrolled. How to guarantee experience? Minimize impact caused by latency, bandwidth starvation, packet loss? It becomes way more complex than the current landscape.

That leads to my post coming out today: Are LAN-only protocols DOA? Stay tuned.


@Claudio. Good point; in the case of RIM you must trust the hosting provider, and I suppose that emails and attachments can always be intercepted and manipulated or used for other purposes. However, in the case of RIM, they are not strictly 'storing' your data, just providing a transport for it, but 'trust' is the key here.

I suppose this just raises the question of risk: how you categorize the different types of data, where you store each category based upon its risk profile, and which of these you trust to third-party hosting providers.

This raises another question, and for me the most fundamental one. If you have a data category which is corporately sensitive and has a high risk profile (I'm not aware of any companies who do not have a large amount of this type of data) you will almost always manage this in-house.  If you are managing this large data volume yourself you will be providing the tools to manipulate that data in a secure way and you will already be incurring a large overhead in maintaining that toolset, so why not keep *ALL* of your data in-house and manage the toolsets for all data in the same way?

Whatever the solution, I don't see that we will ever see data fully divorced from the location of the toolset which manages it.



To answer your question about not releasing it to the public: first, the videos aren't done yet; it takes us a few weeks to get them all encoded, synchronized, and uploaded. Second, the first article I did pretty much summed up the whole thing. And third, my guess is that the next two months of content will be about BriForum, so we'll just have to do our best to paraphrase the best we can!


Tons of great viewpoints all around, a great opportunity to get different opinions and new perspectives.

Maybe I am missing the point (because I have not seen the video), but IMO the Brian & Chetan debate is based off of a simple premise:

Looking to the future, what is the preferred execution environment for the OS/apps? Either it's server-based (SBC) or client-based (CBC); there can only be these two choices, and whether it's mixed/dedicated per user really doesn't matter.

Everyone wants access to desktops, apps, and data anytime/anywhere on any device. This means that SBC doesn't cut it for all use cases and CBC is required for offline.

However that doesn't mean CBC can't be used where SBC can, it will be a choice given to each customer to choose which method is preferred in which situations.

CBC still needs to mature (with the introduction of XenClient and Synchronizer, as well as future development of others such as Virtual Computer, Moka5, etc.), but over time I see it as an extension to SBC that will complete the execution-environment aspect of Desktop Virtualization.

I believe the client can play more than just a KVM extension to your desktop in the datacenter.

I don't really go by percentages, but my opinion is that in the future the execution environment of choice will be at the client (for the majority). Whether it's 55% - 45% or 75% - 25% it doesn't matter. The number will also fluctuate because I may want to work locally on one computer, work remotely from another, then work locally again.

The reason why SBC exists today is because we have mature tools to manage it centrally. Once we have mature tools to manage CBC centrally then that's when we will see the use cases for it.

The world as we know it currently is CBC for the majority (traditional PCs). If we can manage and secure CBC the same way as SBC why would we go through the hassle of migrating all of those CBC PC's to SBC while converting our servers into workstations?

As I keep reiterating: IF SBC is the preferred execution environment, THEN thin clients will become more than a niche, OR there will be an over-abundance of under-utilized fat PCs.

IF thin clients are the winners then this will highly impact HP, DELL, IBM, etc. So if I were any of those companies I would push client hypervisors like crazy.







All simply plumbing

Tim hits it on the head - the INFORMATION  is the key to allow users to do the jobs we pay them to do.




Simply plumbing.

Content creation and consumption is what we all seek to figure out in a more elegant way.

All awesome points....really eager to watch data as another great pillar we wrestle to figure out.


Here is another take on the points that were brought up in that BriForum session. Check out my perspective at


Agree with help4ctx that this comment is the most important in this argument - the perception that with server-based VDI: "all of my work stuff will only be available when I connect into work"

The most important thing to replicate between online and offline use cases is the data. There's been a wave of synchronization technologies emerging which synchronize the entire virtual desktop between endpoint and datacenter. It seems to me the only thing that really needs to be replicated is the data, and then you just need to push new apps and patches to the "offline" desktop.

Sure, you need to have a way to rapidly recover a remote user's working environment and applications, but those technologies already exist - snapshots, app virtualization, Deep Freeze ...


Moving forward, the centralized management of client based computing will bring the ability to properly manage and secure the data.

Being able to centrally manage the combination of a centralized and distributed execution environment will be the necessary step in order to deliver and secure the desktop/app/data to any device, anywhere.

The desktop OS (Windows) will also have to evolve to support a more robust delivery method.


@t.rex  "Simply plumbing", I like the analogy. The users don't care about the pipework, just the water that comes out of the tap/faucet :-)

I kind of agree with Tim in some ways, but to me the problem is this:-

"The Data" - All my users care about.

"The Data and The Toolset" - A problem for my IT organisation to manage.

The complexity of providing the toolset and securing the data will always remain, and therefore I think that both data and applications are equally important, as one cannot exist without the other.

Of course the data is paramount, but unless well managed, accessing it will remain cumbersome and exposed to risk.

It's all about everything!!!


I agree with the point made about data being crucial, but I don't think the importance of the applications used to access and use that data should be underestimated. Take a look at the Tom Kyte quote in this article, where it was stated "Take Word away from me, leave me my .doc files. I'll be able to find something that can process the data eventually". I think the particular application(s) used are more significant than that, especially when user acceptance of any VDI solution is considered. Although other solutions exist to process .doc files, users who are accustomed to Word may (most likely :-) scream when it is taken away.

Certainly, the data consumed by the applications to me is the end goal.  Applications exist to consume data in some way.  However, to me, VDI is as much in the presentation as anything else.  Users will want to be able to access their "stuff" from anywhere, but they will want a consistent and familiar environment in regard to that virtualized desktop environment.



I agree with your opinion that the importance of data will increase. But applications will remain equally important, as they interpret the data for us. There are some standardized formats that can be viewed by a variety of applications, making the data more important, e.g. PDF, MP3, AVI, etc. But aside from these data formats it's a proprietary world. Databases are a very prominent example, as only the application brings meaning to the stored data. And there are many more examples of how companies will have to worry about applications in the future.

For consumers the case may be different, as it is more about communication and networking. Most of the data processed by consumers can be interpreted and displayed by a wide range of applications running on different devices. I can view mail and photos everywhere on every device, I can listen to my music on every device, and so on. In such a situation my primary concern is the data and the access to it.

Still, companies are bound to applications. In my opinion, the planning and design phase of a project needs to focus more on the data and have it affect application delivery. Where is the data stored? From where can access be trusted? How can this data be processed and transferred, and by whom? These questions have a huge impact on the resulting architecture and will become more important in cloud environments. But this is an entirely different point.

All the best,



I completely and wholeheartedly concur. It IS about the data. But it is also about how that data is secured. Frankly I believe that you can only trust the task of keeping it secure to yourself.

Servers in the cloud are only as secure as you make them and even then you are captive by the providers security in place as to whether your data is really secure.

Allowing your organization's sensitive data to be stored out in "the cloud" can be compared to duck hunting. Once it is found and flushed out, it is easily shot down.


Interesting argument.  Here is my take: 2/3 of the users at my company use desktops.  We have some call-center employees who work from home and use TS for a specific app; they use a broadband connection, it works great, and we secure the data. The rest use laptops without client-side hypervisors.

When the acquisition cost of VDI is less than that of a desktop, 2/3 of our users will have virtual desktops.  The other caveat is that the virtual desktops need to be managed with the same tools we use for the laptops.


I have to agree that often the data gets lost in the rush to make application decisions. I work in healthcare, and we're still seesawing between concepts like "best of breed" and "best of suite"; as a result, we got so caught up in the dream of centralizing applications that we created huge inefficiencies in how data is handled. Inefficiencies that make more work for care providers while IT fights over application turf. I think virtualization could at least make it a cozy village, if not one magical "solution". We're on our way to a VDI future, but that won't solve app wars.

The data is what we should be rallying around, choosing whatever will best serve the patients and providers. Even if it doesn't fit our dream scenario of one set of data, under one app, indivisible, with liberty and access for all.