I recently wrote a paper about VDI and when it can be used versus when server-based computing can be used. I think the paper was generally well received, but based on some of the comments it's clear that I didn't do a good enough job articulating how SBC and VDI relate to each other. What I mean is that in that paper I talked about when VDI is the right fit and when SBC is the right fit, but I analyzed each solution by itself in a vacuum. In the real world, a complete application delivery solution will be made up of both SBC and VDI (and most likely, some application streaming as well).
In the paper I only included one small section called "When does VDI make sense?" I explored this issue in more detail in an article I wrote last month for TechTarget. I'd like to revisit that here.
Server-based computing: The solution for 80% of our apps
As you know, VDI technology will never replace local desktop computing 100%. But even after we moved most of the mundane applications off the desktop and into the datacenter, delivered via SBC, that only got us so far. (Maybe 80 percent?)
So why only 80 percent? Why can't you bring the rest of your applications into the datacenter via SBC? Possible reasons include:
- Users need offline access (traveling laptops, etc.)
- The applications are not terminal-server compatible.
- The applications are resource hogs that "kill" a terminal server.
- The applications are graphics-intensive and don't work well over a thin-client remote display protocol like RDP or ICA.
- The effort to make the apps work in the SBC environment isn't worth the benefit.
How to address the "other" 20%
VDI is not the be-all and end-all of application delivery. Terminal Server-based SBC is a good foundation. From there, look at the above list and think about VDI. Items 2, 3, and 5 can be solved with VDI solutions. This means that VDI technology is useful in any scenario where you have power users or users who need strange, non-terminal-server-compatible applications, but where the users still need the flexibility associated with traditional SBC environments. (Examples include connecting to applications from anywhere, over slow connections, etc.)
If VDI can solve issues 2, 3, and 5, then what about 1 (offline access) and 4 (graphical apps that don't work via RDP or ICA)? Since those application use cases cannot be solved by any SBC-based solution (traditional SBC or VDI), we have to look elsewhere. And that's where application streaming comes in.
By streaming apps, we can get them to run locally on a device while centrally managing them. This solves problems 1 and 4 from the list above. Of course it also potentially solves issues 2, 3, and 5! So if app streaming can solve issues 1-5 above, then why not use streaming for everything?
The reason is that the list above contains only the downsides of SBC. But SBC has many advantages over streaming. (Again, the term "SBC" here refers to the remote presentation architecture, be it "real" SBC or VDI.) A short list of SBC advantages might include:
- Access an application from any client platform.
- All execution happens in the datacenter, which means better performance for 3-tier apps.
- No wait to begin using the application.
- Instantaneous application upgrades.
- Use applications over slow connections.
- Applications stay running as you switch devices.
- And so on...
The point here is that a true application delivery solution will be composed of multiple technologies, including SBC, VDI, and streaming. Where one is weak, another is strong. By combining all of these technologies together, you can come up with a solution that will work for all your apps and all your users.