The title of today’s article is a quote from Dane Young, our guest on yesterday’s podcast. We were talking about one of Dane’s BriForum 2015 London presentations, where he covered the state of GPUs in VDI, and this is how he summed it up:
“I’m trying to convince people that GPUs should not be optional for VDI.”
I never thought about it quite like this before, but I wholeheartedly agree. I mean, look at regular desktops and laptops. How many of them can you buy without a GPU? And if you price out a business desktop and then have to shave off some cost, do you ever do that by removing the GPU? No! You dial back the CPU, cut down on memory, or maybe skip the SSD.
So by that logic a GPU should be 100% required for VDI, and if you don’t like the “added” cost, you can offset it by putting 10% more users on each server. Sure, that means each user gets 10% less CPU and RAM, but again, that’s a tradeoff we make every day with physical desktops, so why wouldn’t we do it with VDI too? (Plus, having a GPU might actually take some load off your CPU, meaning you can fit even more users on a box.)
While we’re on the topic of GPUs and costs, I’m still hearing a lot of pushback from people saying that adding GPUs is expensive. Those Nvidia K1 cards are $3k for only 32 users! (Actually, here’s one on Amazon for $1,850. Good for them!) So $1,850 for 32 users works out to about $58 per user. How much are you spending on VDI hardware and storage already? Probably $500 a user? So adding a real GPU adds roughly 10% to your total hardware build cost.
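If you want to sanity-check that math yourself, here’s a quick back-of-the-envelope sketch in plain Python. The $1,850 card price, 32 users per card, and $500-per-user hardware baseline are the figures from the paragraph above; everything else is just illustration, not a sizing tool.

```python
# Back-of-the-envelope GPU cost math using the figures from the paragraph above.
# The $1,850 card price, 32 users per card, and $500/user hardware baseline are
# the article's own numbers; this is an illustrative sketch, not a sizing tool.

CARD_PRICE = 1850          # street price for one Nvidia K1 card (per the article)
USERS_PER_CARD = 32        # users sharing that one card
BASELINE_PER_USER = 500    # assumed existing VDI hardware/storage cost per user

gpu_cost_per_user = CARD_PRICE / USERS_PER_CARD               # ~$58 per user
added_percent = gpu_cost_per_user / BASELINE_PER_USER * 100   # ~11-12%, i.e. roughly 10%

print(f"GPU cost per user: ${gpu_cost_per_user:.0f}")
print(f"Added to the hardware build: ~{added_percent:.0f}%")
```

Run it and you land in the same ballpark the article describes: call it $60 a user, or about a tenth of what you’re already spending per seat on VDI hardware and storage.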
Seriously, how is the GPU not just a standard part of every VDI environment? These are desktops, running desktop applications! The GPU isn’t just for CAD and 3D rendering anymore; it’s involved in everything Windows does. When it comes to VDI, the GPU is not optional. Get with it, man. It’s 2015.