Tim Mangan has been writing about the “perceived performance” of Citrix servers for almost two years. (Read his original paper about the perceived performance of Citrix servers, and a newer paper about the perceived performance of virtual machines.) Basically, his theory is that it doesn’t really matter what values the various Performance Monitor counters show. What matters is how users perceive the performance of the server. Perceived performance is typically measured as latency (the “delay” in performing some action), and it needs to incorporate server load, applications, network latency, etc.
Ultimately, a server with 100% CPU load and 100% memory load that has happy users is much better than a server with 20% CPU load and 30% memory load where users think the server is too “slow.”
Tim presented his perceived performance research at BriForum 2005, and one of the visual aids he used was a set of graphical data called the Perceived Performance Profile for a particular server. (example below)
After the conference, Tim received many questions about how he created these charts, so he decided to create the Perceived Performance Toolkit. The toolkit (available from Tim’s website, www.tmurgent.com; look for the link in the lower-left corner) is a ZIP file that contains a few different components.
The main components are a server-side EXE and a client-side EXE that actually test and record the performance. You’ll need your own mechanism for generating load (or “stress”) on the server itself, but that can be something as simple as CPUstres.exe from the Windows Resource Kit, calc.exe running a huge factorial calculation, or your own AutoIT script that simulates user activity. What’s great about the Perceived Performance Toolkit is that neither of the two test executables (client or server-side) requires installation. You just copy them in place and run them.
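If you don’t have a stress tool handy, a load generator can be almost anything that keeps a CPU busy. Here’s a minimal Python sketch in the spirit of the “huge factorial calculation” idea above — purely an illustration, not part of Tim’s toolkit (the function name and the choice of factorial size are my own):

```python
import math
import time

def burn_cpu(seconds: float) -> int:
    """Keep one core busy for roughly `seconds` by repeatedly
    computing a large factorial. Returns the number of iterations
    completed, which doubles as a crude throughput figure."""
    deadline = time.monotonic() + seconds
    iterations = 0
    while time.monotonic() < deadline:
        math.factorial(2000)  # arbitrary CPU-heavy integer workload
        iterations += 1
    return iterations

if __name__ == "__main__":
    # Run several of these in parallel sessions to load up a server.
    print(burn_cpu(10.0))
```

In practice you’d launch one of these per simulated user (or just use CPUstres.exe), since the point is only to put the server under pressure while the toolkit measures how users would perceive it.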
In simple terms, to use the toolkit you publish the server-side EXE as an application, and then you run the client-side EXE on a workstation with the ICA client installed. The client throws sessions at the server and everything is recorded. In many ways this is similar to other performance-testing tools, except that this toolkit works with delay timers, processor ready queues, and other metrics that focus purely on how fast the server can respond to an application that wants to do some work (as opposed to the “standard” perfmon counters).
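The delay-timer idea is worth a quick illustration. A process asks to wake up after a fixed interval; under load, the scheduler’s ready queue stretches the actual wake-up time, and that stretch is exactly the kind of lag a user perceives. Here is a hedged Python sketch of the concept — not Tim’s implementation (his toolkit is a set of Windows executables), and the function name and parameters are my own:

```python
import time
import statistics

def measure_timer_delay(interval_ms: float = 10.0, samples: int = 50):
    """Request short sleeps and record how much later than requested
    the process actually resumes. On a loaded server, these wake-up
    delays grow, approximating user-perceived sluggishness.
    Returns (mean extra delay in ms, worst extra delay in ms)."""
    delays = []
    for _ in range(samples):
        start = time.perf_counter()
        time.sleep(interval_ms / 1000.0)
        actual_ms = (time.perf_counter() - start) * 1000.0
        delays.append(actual_ms - interval_ms)  # extra delay beyond request
    return statistics.mean(delays), max(delays)

if __name__ == "__main__":
    mean_d, worst_d = measure_timer_delay()
    print(f"mean extra delay: {mean_d:.2f} ms, worst: {worst_d:.2f} ms")
```

Run this on an idle box and again while the server is under stress, and the difference between the two readings gives you a feel for what the toolkit is capturing far more rigorously.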
When the test is finished, all of the results are dumped into an Excel spreadsheet where you can study them or generate graphs like the one pictured previously.
Tim’s Perceived Performance Toolkit will undoubtedly come to be regarded as one of the “must have” tools for performance-tuning a Citrix environment, and I’ve added it to my list of tools in the Brian Madden Toolkit. (www.brianmadden.com/toolkit)