Head-to-head performance analysis of App-V, SVS, ThinApp and XenApp

XPnet has created a whitepaper that analyzes and describes the performance characteristics of Altiris SVS Pro, Citrix XenApp, Microsoft App-V and VMware ThinApp. Although the tested application set is limited, it still gives a nice view of the performance characteristics of the different Application Virtualization solutions.


VMware commissioned the study but did not aid or influence the results. VMware chose XPnet to study application virtualization performance because XPnet developed Clarity Suite, a fairly well-known Office benchmarking application, and because the suite is freely available anyone can reproduce the results. This is the first real whitepaper that investigates the performance impact of the various Application Virtualization solutions. My personal feeling is that there are more truly independent tests to come.

The complete test report can be downloaded here.

UPDATE: The topic of performance analysis is very interesting. The research organization XPnet, however, is rather less so; read more here: http://www.zdnet.com/blog/btl/why-we-dont-trust-devil-mountain-software-and-neither-should-you/31024 and

http://www.osnews.com/story/22902/XPNet_CTO_a_Fraud_Editor_Fired_from_InfoWorld


Join the conversation

15 comments


The whitepaper does not discuss how they obtained the CPU overhead graph in the conclusions. I also miss a discussion of why there would be CPU overhead to start with: a pure CPU performance test shouldn't show any difference (there is no CPU virtualization in application virtualization), which was the case with SoftGrid 4.1/4.2 (I was asked to write a report for a customer on SoftGrid performance in 2006, and back then there was simply no statistically significant difference -- I cannot imagine that has changed).


Also, the network overhead graph makes it seem that streaming with App-V is very inefficient. The whitepaper doesn't state clearly that App-V was used with RTSPS streaming (RTSP tunneled over SSL), which is obviously less performant, but it is also more secure than the plain SMB streaming used for ThinApp (the proper test would have been streaming over "secure SMB", or using RTSP instead).


Yet I do believe their general conclusion that ThinApp is a bit more snappy and performant in the broad sense. That is to be expected of course, since App-V includes an entire management framework & local caching mechanism... both of which are missing in ThinApp.



"The whitepaper doesn't state clearly that App-V was used with RTSPS streaming (RTSP over SSL tunneling) which obviously is less performant. It is also more secure than plain SMB streaming like used for ThinApp (the proper test would have been streaming over "secure SMB" or using RTSP instead)."


Whoops, I missed the section that stated they explicitly used RTSP.


That makes for an interesting case: evaluating the more performant SMB streaming against the local caching mechanism. I know what to do this evening ;).



They should have included Xenocode (www.xenocode.com) in the testing, as I believe it has even lower overhead than ThinApp.



I meant Thinstall



It would have been nice to see InstallFree included in the list...



With all the people sending this to me, I may have to actually take it apart.   For the moment, just one thing I feel compelled to correct straight away.


> ThinApp ... onto 64-bit versions of Windows XP,


The article says that ThinApp supports 64-bit versions of Windows XP and nobody else does. Citrix App Streaming has supported streaming on 64-bit systems since its inception, and this is something we've been very proud of.


More details another day; I should probably hook up with the authors and get some facts clarified offline.



This whitepaper lost all credibility on page 3.  I just wonder how much VMware paid these hacks to write this.  To the authors - next time you conduct some "thorough analysis", you might want to actually *research* the products you are writing about.  Application Streaming supported 64-bit before Microsoft and VMware even thought about it - just shows how much experience you have with these products.  


To Brian/Ruben - I'm surprised crap like this makes it onto your *independent* website.  You should pull this until they do a revision.  The reason people love your site is because it's (usually) fairly objective.  Let's keep it that way.



I think this paper is "interesting".  While it does not pass my smell test overall, it makes a starting point for what should be done in this kind of testing.  


In particular, there is not enough detail provided about how the testing was done for us to determine whether the tests are valid or to attempt to reproduce them. For the kind of detail I'm talking about, look at the increased level of detail Ruben and Jeroen provided in their recent virtual server testing (www.brianmadden.com/.../VDI-and-TS-performance-on-ESX-Hyper_2D00_V-Xen-and-bare_2D00_metal-head_2D00_to_2D00_head-results-are-here-via-Project-Virtual-Reality-Check.aspx).


The last time I did tests on virtual application overhead was back in 2000, so this kind of testing is way overdue, especially with the kind of results claimed in this paper. Needless to say, we did not see the kind of numbers in this paper! I recall one app that actually ran faster with virtualization (although that was because of heavy registry use by the app and Windows NT's horribly slow registry system -- taking all those calls away from that bottleneck caused the speedup, and Microsoft fixed it in Windows 2000). There will be overhead to provide the benefits, just not on the scale in this paper. Depending on how the tests are done (beyond the detail provided), I am sure that one could construct tests to show all sorts of results.



I truly believe this information can be valuable in different ways. First of all, it's a good topic in the Application Virtualization market space; there isn't much information in the field about it at all. I made clear that this article isn't independent, and if the results don't hold up they will get trashed in public. Each vendor and community member has the ability to respond to this blog entry, so at the end of the day we all receive more (in-depth) information, form our opinions and use whatever we like to use.



Looks like another product of VMware's misinformation marketing machine.


In order to prop up an inferior product, VMware resorted to hiring a company with little credibility on the subject. As mentioned by others, VMware conveniently omits other similar products in the industry.


The sad thing is, you might see a press release in the next week or two from VMware telling the world how App-V sucks.



I don't know about the claimed numbers for ThinApp, but the overhead for App-V and Citrix App Streaming falls somewhere within the range seen in the two (sadly not publishable) benchmarking exercises I have been involved with for customers.


There is an indirect virtualization overhead on CPU use ... every time you look for a file/registry key/COM object you are executing additional instructions to check the user store, the virtual package and finally the base OS to find the desired object.  
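To illustrate the idea, here is a minimal sketch of that layered lookup concept in Python. This is purely hypothetical code with made-up layer contents; it is not how any of these products are actually implemented, it only shows why every request costs extra probes.

```python
# Hypothetical sketch of the layered lookup a virtualization client performs
# for every file / registry key / COM object request. Layer names and
# contents are made up for illustration only.
def virtual_lookup(name, user_store, virtual_package, base_os):
    """Probe each layer in priority order until the object is found."""
    for layer in (user_store, virtual_package, base_os):
        if name in layer:
            return layer[name]
    raise KeyError(name)

user_store = {}                                          # per-user changes, empty here
virtual_package = {r"HKCU\Software\App\Setting": 1}      # objects captured in the package
base_os = {r"C:\Windows\System32\kernel32.dll": "<native dll>"}

# An object that only exists in the base OS costs three probes instead of one:
print(virtual_lookup(r"C:\Windows\System32\kernel32.dll",
                     user_store, virtual_package, base_os))
```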


By living entirely in user mode, ThinApp may be more efficient at this process, avoiding a whole load of context switches to and from kernel-mode components.


That said, in non-CPU bound applications I would expect the overhead to be more towards the lower end of my previous benchmarks (maybe 10-20% transaction time extension ... with higher CPU use during the transaction).


They may have hit the issue noted in the VRC project with Office 2007 SP1 that consumes a lot of CPU time with Outlook open.


When you are already CPU bound by the application every extra function call and context switch related to virtualization directly adds onto the transaction time, significantly exaggerating the impact.  In the worst case benchmark (100% CPU bound financial app) we saw transaction time overhead of over 300% for SoftGrid 4.2.
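As a back-of-the-envelope illustration (my own made-up numbers, not figures from the report or from our benchmarks): if the virtualization layer stretches only the CPU portion of a transaction, the very same per-call penalty is dramatic for a CPU-bound transaction and almost invisible for a wait-bound one.

```python
# Back-of-the-envelope model with made-up numbers: virtualization inflates
# only the CPU portion of a transaction, so the visible overhead depends
# heavily on how CPU bound that transaction is.
def transaction_time(cpu_ms, wait_ms, cpu_penalty):
    """Total transaction time when CPU work is stretched by `cpu_penalty`."""
    return cpu_ms * (1 + cpu_penalty) + wait_ms

PENALTY = 0.30  # assume virtualized lookups stretch CPU work by 30%

# Fully CPU-bound transaction: the whole 30% shows up in the transaction time.
cpu_bound_native = transaction_time(200, 0, 0.0)
cpu_bound_virt = transaction_time(200, 0, PENALTY)

# Mostly wait-bound transaction: the same penalty barely registers overall.
wait_bound_native = transaction_time(20, 180, 0.0)
wait_bound_virt = transaction_time(20, 180, PENALTY)

print(f"CPU bound:  +{100 * (cpu_bound_virt / cpu_bound_native - 1):.0f}% transaction time")
print(f"Wait bound: +{100 * (wait_bound_virt / wait_bound_native - 1):.0f}% transaction time")
```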



Well, I for one welcome the publication of articles like this through the site. I don't see any need for it to be pulled, or that it implies a lack of independence. It's been made clear that it is a funded report, and anyone with a reasonable amount of intelligence can see the implications of that, as is borne out by the comments! If we can all see how biased it is, then we can take what we want from it and ignore the rest. At the very least, it's given me some thoughts about how to test and measure app virtualization solutions (as have some of the comments). Thanks for posting it.



mattevans Wrote: "There is an indirect virtualization overhead on CPU use ... every time you look for a file/registry key/COM object you are executing additional instructions to check the user store, the virtual package and finally the base OS to find the desired object."


Very well put. That is precisely why you need to make a clear distinction between a CPU test, a memory test and an I/O test (the latter covering both registry and disk I/O).


In a (non-publishable) comparison I did for a customer (already referred to above), we found native CPU performance for SoftGrid (and, as Tim Mangan pointed out, even one case where the application actually ran faster -- we blame it on a statistical deviation, because I honestly wouldn't know why that would be the case for a purely CPU-based application). The disk I/O penalty was much higher (up to 50% in some specific situations: writing, large blocks, ... I would have to look up the exact details), and the same held for registry I/O.
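For anyone who wants to separate those dimensions themselves, here is a rough sketch of the kind of isolated micro-benchmarks I mean. The sizes and iteration counts are arbitrary examples, not the methodology from that customer study.

```python
# Rough sketch of keeping CPU, memory and disk I/O micro-benchmarks separate,
# so a virtualization penalty in one dimension isn't hidden by the others.
# Workload sizes are arbitrary; run natively and virtualized, then compare.
import os
import tempfile
import time

def timed(label, fn):
    start = time.perf_counter()
    fn()
    print(f"{label}: {time.perf_counter() - start:.3f}s")

def cpu_test():
    # Pure arithmetic: should show no statistically significant virtualization overhead.
    sum(i * i for i in range(2_000_000))

def memory_test():
    # Large allocation and copy: exercises memory only.
    data = bytearray(100 * 1024 * 1024)
    _ = bytes(data)

def disk_test():
    # Large sequential writes: the dimension where the ~50% penalty showed up.
    with tempfile.NamedTemporaryFile(delete=False) as f:
        for _ in range(50):
            f.write(os.urandom(1024 * 1024))
    os.remove(f.name)

for label, fn in [("CPU", cpu_test), ("Memory", memory_test), ("Disk I/O", disk_test)]:
    timed(label, fn)
```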


The disadvantage with these synthetic tests that measure only CPU, memory or disk performance is of course that they do not reflect a real world situation. That is in my opinion the added value of this report: it uses a tool that automates a series of "real life" tasks and measures the performance.


In such a scenario, I can indeed imagine that ThinApp is more performant given its extremely low overhead. In a highly packed situation such as VDI, this can accumulate to being able to squeeze an extra desktop or two onto a single server. Yet I do wonder how many companies will consider that to be a key decision point for an application virtualization product, when there are much more differences in the management & packaging aspects of all products that are compared.



Shame you couldn't include us too Ruben :-)


Hopefully catch up at VMWorld.


Nick.


Endeavors Technologies


Application Virtualization | we invented it


VMWorld Stand 147



Thanks to Ruben for posting this report in the right spirit, and in one of the few places that will provide a diverse set of opinions and comments. I have made several comments in response to this report over at the Citrix Blogs.


community.citrix.com/.../viewpage.action


Please check it out.


Community review of this report and all of the vendors' various responses should prove beneficial to all of us.


