I had a chance to sit down with Brian and ask him a few questions about the details of these two new technologies.
Multimedia Streaming via ICA
Citrix's “Rave” technology allows multimedia content to be streamed from a MetaFrame Presentation Server through an ICA session to an ICA client. Using this technology, the content is as sharp and crisp as ever, and CPU utilization on the server is lower than when content is streamed via the current version of MetaFrame. Citrix developed this technology after recognizing the increased business role that multimedia content is playing in today's corporate workplace.
To understand how this new technology works, it's important to understand how multimedia streaming works in current MetaFrame environments.
Traditionally, if you have multimedia content (a DivX video, for example) that you want to integrate with your MetaFrame servers, you install a media player and the appropriate codecs on your MetaFrame server. Your client devices need nothing more than the standard ICA client. When content is streamed from the server, the server's locally installed media player software uses its codec to decode the stream and pass it to the display. Terminal Server's MultiWin components intercept the display and audio data, and Citrix's server-side ICA components encode the data into an ICA stream. The content is sent to the client device as ICA data, and the client device's ICA software decodes the ICA stream and displays it on the screen.
Unfortunately, there are inefficiencies in this model that cause poor video quality, dropped frames, and choppy sound. First is the fact that ICA was designed to handle graphics associated with Windows applications—not multimedia content. The ICA protocol just can't handle the amount of data needed for watching videos. There are also scalability issues when watching videos via ICA, since the server must spend valuable processor time decoding the original content stream and then immediately re-encoding it for ICA.
Citrix's new “Rave” technology completely changes this model. In this model, multimedia content is streamed to the client device via the ICA protocol in its originally encoded state. This means that the server does not have to spend processor time decoding and re-encoding the content. Also, since the content's original format is almost certainly better suited to streaming than ICA, the content can be sent to the client device in the most efficient way possible. This means that a client device can watch videos from a MetaFrame Presentation Server at full quality.
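To make the contrast concrete, here's a minimal sketch (hypothetical names, not Citrix code) of the two server-side paths: the traditional model decodes every frame and re-encodes it as ICA graphics, while the Rave-style model simply forwards the originally encoded bytes for the client's local codec to handle.

```python
# Toy counter tracking how much "codec work" the server performs.
codec_calls = {"server": 0}

def decode(frame):
    codec_calls["server"] += 1   # server burns CPU decoding the stream
    return frame.upper()         # stand-in for real codec output

def encode_ica(raw):
    codec_calls["server"] += 1   # ...and burns more CPU re-encoding for ICA
    return raw.lower()           # stand-in for ICA graphics encoding

def traditional_path(encoded_frames):
    """Old model: decode each frame on the server, then re-encode it as ICA."""
    return [encode_ica(decode(f)) for f in encoded_frames]

def rave_style_path(encoded_frames):
    """Rave-style model: forward the original stream untouched inside the
    ICA session; the client's local player and codec do all the decoding."""
    return list(encoded_frames)

frames = ["divx-frame-1", "divx-frame-2", "divx-frame-3"]

assert rave_style_path(frames) == frames   # bytes pass through unchanged
traditional_path(frames)
print(codec_calls["server"])               # 6: two codec passes per frame
```

The sketch only illustrates the architectural point from the article: the passthrough path costs the server zero codec passes per frame, while the traditional path costs two.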
One ramification of this architecture is that the client device must have a media player and the appropriate codecs installed locally in order to view a content stream from the server. In today's world, however, this is not usually a problem. Even so-called “thin” devices have local processing capabilities, and media players are available for all operating systems, including Windows CE and Linux.
So what's the point of this if a local media player is needed on the client? Citrix is not trying to make streaming content any better than it is without MetaFrame. The “Rave” technology simply allows MetaFrame Presentation Servers to provide multimedia content to client devices in addition to traditional Windows applications. As their new marketing message states, they're striving to provide the “access infrastructure” that supports whatever a user needs to access—be it multimedia content or Windows applications.
Another upcoming technology that Citrix showed off at iForum was a JPG-based compression technology that allows certain images to load much faster on client devices. Brian Nason led a very powerful demonstration where he had two client devices side-by-side, one with the new technology and one without. He simultaneously opened a photo-laden Word document from both client sessions, and the images popped into the session with the new compression technology in about one-tenth the time they took in the other session.
When I spoke with Nason, my question was simple: “How does it work?”
Nason explained that traditionally, Citrix has focused on “lossless” compression. In current versions of MetaFrame Presentation Server, screen data on the client device is a pixel-for-pixel, 100% exact match to the source data on the server. While the ICA components of MetaFrame do compress the data, the lossless requirement effectively limits the amount of compression that can be applied.
In the new MetaFrame Presentation Server technologies, Citrix will give administrators the option of enabling “lossy” compression. Lossy compression will allow for much greater compression ratios because the software on the client device will interpolate some of the pixel information based on compression algorithms.
In the real world, lossy compression is used every day without anyone noticing. The difference between the two approaches is the fundamental distinction between bitmap files (lossless) and JPEG files (lossy). This is why a 3MB bitmap image can be “saved as” a JPG file that's only 200KB. The sophisticated compression algorithms mean that in most cases, people won't be able to see any difference between the two images.
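A toy sketch (not Citrix's algorithm, and far cruder than real JPEG) shows why relaxing the lossless requirement buys so much compression: lossless compression must preserve every pixel, including noisy low-order bits it can't compress, while a lossy pass may quantize those bits away first, leaving far more repetitive data behind.

```python
import random
import zlib

# Fake "screen data": a gradient with noisy low bits, one byte per pixel.
random.seed(0)
pixels = bytes(min(255, (x % 256) + random.randrange(8)) for x in range(16384))

# Lossless: zlib round-trips to an exact, pixel-for-pixel match, but it
# cannot compress the random low-order bits away.
lossless = zlib.compress(pixels, 9)
assert zlib.decompress(lossless) == pixels

# "Lossy": quantize to 16 gray levels before compressing. The reconstruction
# is only approximate, but the payload shrinks dramatically because the
# noise is gone.
quantized = bytes(p & 0xF0 for p in pixels)
lossy = zlib.compress(quantized, 9)

print(len(lossless) > len(lossy))   # lossy mode compresses much smaller
```

The design choice mirrors the article's point: the client reconstructs an image that's visually close but not a pixel-for-pixel match, in exchange for a much smaller stream.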
Of course, it's important to note that the lossy compression capabilities of the new version of MetaFrame can be turned on and off, which is a good thing if you're using MetaFrame to serve something like a medical imaging application.
All in all, these two new technologies are only a small subset of what Citrix has on tap for their new version of MetaFrame Presentation Server. I'll keep you posted on the inner workings of all the new capabilities over the next few months.