
Tech Focus: Optimising for the Cloud

Digital Foundry on understanding the challenges of cloud gaming development

The furore last week over GameStop's decision to remove OnLive redeem codes from US PC versions of Deus Ex: Human Revolution overshadowed what must surely be one of the most gutsy, impressive marketing moves we've seen in the industry for quite some time.

The notion of including another SKU of the game within the PC release not only adds a great deal of value for the end-user but is also a superb promotional idea from OnLive. It challenges sceptics to play the exact same game both locally and via the cloud and invites them to draw their own conclusions about the quality of the service on offer.

The inclusion of Deus Ex: Human Revolution is a canny choice for many reasons. For a start, it's the biggest game we've seen released in months, establishing OnLive in users' minds as a service that can attract big-name games day and date with existing platforms - even though EA and Activision releases remain conspicuous by their absence. On a technical level, Human Revolution is a game whose visual make-up and basic gameplay style are a relatively good fit with a system that exchanges fidelity for convenience. In short, OnLive has limitations, but the make-up of this game won't highlight them at anything like their worst.

The question is, what are these limitations and based on what we know about cloud streaming systems in the here and now, what should games developers be aware of? To what extent is it actually possible to optimise for OnLive?


I spent some time with the service recently, connecting via Eurogamer Networks' mind-bogglingly fast 100Mbps leased line at its Brighton offices, intrigued to see what had changed and improved since the system's launch just after E3 in 2010, when we put it through its paces on a 25Mbps internet connection.

In both these tests we were essentially looking at something approaching a best-case scenario for the system. OnLive caps out at around 5Mbps for its video and audio, and this won't change by having a vastly over-spec internet connection. Typically, however, the more bandwidth available, the lower latencies are likely to be. While there has been discussion of improvements to "the algorithm" and talk of "tuning latency", the overall experience then and now was remarkably similar.
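To put that cap into perspective, here's a quick back-of-the-envelope calculation in Python. The 1280x720 resolution and 60 frames per second are assumptions chosen for illustration rather than confirmed OnLive figures, but they give a sense of just how little data each frame has to work with.

```python
# Rough, illustrative arithmetic only: the per-frame budget at a ~5Mbps cap.
# Resolution and frame rate below are assumptions for the sake of the example.

BITRATE_BPS = 5_000_000      # approximate OnLive audio/video cap
FPS = 60                     # assumed frame rate
WIDTH, HEIGHT = 1280, 720    # assumed stream resolution

bits_per_frame = BITRATE_BPS / FPS
raw_bits_per_frame = WIDTH * HEIGHT * 24          # uncompressed 24-bit RGB

print(f"Budget per frame:   {bits_per_frame / 8 / 1024:.1f} KB")
print(f"Raw 24-bit frame:   {raw_bits_per_frame / 8 / 1024 / 1024:.2f} MB")
print(f"Compression needed: ~{raw_bits_per_frame / bits_per_frame:.0f}:1")
```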

There has previously been talk of OnLive's proprietary video compression but, in truth, it seems to behave very much like industry-standard H.264, and almost certainly uses a variation on a concept dubbed "periodic intra refresh".

The usual way to compress video is to kick off with a reference frame (an intra frame, or I-frame), with subsequent frames describing only the differences until a new reference frame comes along. Put in really basic terms, cloud video streaming systems like OnLive and Gaikai work differently: they split the image into rectangles and send reference data more frequently on a "per rectangle" basis rather than relying on the traditional intra frame. Clever as it is, it doesn't fundamentally change the basics of video compression: the more difference there is between frames, the more changes need to be crammed into the available bandwidth. When there isn't enough bandwidth to describe the changes, ugly compression artefacts appear.
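To make the distinction a little more concrete, here's a minimal Python sketch of the scheduling idea. The stripe count and frame layout are invented for illustration - this is not a description of OnLive's actual encoder, just the general "periodic intra refresh" pattern versus a traditional GOP.

```python
# Illustrative sketch only: how "periodic intra refresh" spreads reference
# (intra) data across frames instead of sending whole I-frames periodically.
# Stripe count and resolution are assumptions, not OnLive's real parameters.

FRAME_WIDTH = 1280          # assumed stream width
STRIPE_COUNT = 16           # assumed number of vertical refresh stripes
GOP_LENGTH = STRIPE_COUNT   # one full refresh cycle spans this many frames


def traditional_schedule(frame_index: int) -> str:
    """Classic GOP: a full I-frame, then difference (P) frames."""
    if frame_index % GOP_LENGTH == 0:
        return "I-frame (whole picture)"
    return "P-frame (differences only)"


def intra_refresh_schedule(frame_index: int) -> str:
    """Periodic intra refresh: every frame carries differences plus one
    intra-coded stripe, so reference data arrives a little at a time."""
    stripe = frame_index % STRIPE_COUNT
    left = stripe * (FRAME_WIDTH // STRIPE_COUNT)
    right = left + (FRAME_WIDTH // STRIPE_COUNT)
    return f"P-frame + intra stripe covering x={left}..{right}"


if __name__ == "__main__":
    for frame in range(4):
        print(f"frame {frame:2d} | traditional: {traditional_schedule(frame):26s}"
              f" | intra refresh: {intra_refresh_schedule(frame)}")
```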

Here's a comparison between the PC version of Batman: Arkham Asylum and its OnLive counterpart. We've slowed the video down and hand-encoded the file to ensure that picture quality is as pristine as possible. As you can see, the OnLive representation of this game isn't bad at all - the game aesthetic is a good match for the compression technology.

In a fast-moving, colourful game where the majority of the screen is in motion (Disney's Split/Second, available on OnLive, is a good example), you will undoubtedly see a big reduction in video quality compared to the local experience - to the point where, as a games developer, you might wonder why you invested so much time creating intricate, detailed visuals that never get fully resolved on the user's screen. On the flipside, a game along the lines of Rocksteady's Batman: Arkham Asylum - dark, muted, with limited motion - can look quite decent indeed.

Away from video quality, the other element to bear in mind is latency. OnLive's performance here surprises many. Play Split/Second - the very game I griped about above - via OnLive and it is clearly playable. Move to a shooting game, however, and the latency is more pronounced, particularly when it comes to precision aiming.

OnLive has recently talked about "tuning latency", but the bottom line is that decoding video carries a set cost and transmission from the server has, at best, a minimum lag - which basically leaves encoding on the host server as the tweakable variable. In theory a less efficient encode can be completed more quickly, but it's difficult to imagine OnLive wanting to compromise picture quality still further. I'd really like to know on what basis latency can be "tuned", but technical discussion is rarely forthcoming from OnLive.
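For illustration, here's how such a latency budget might break down. Every figure below is an assumption picked to show the shape of the problem rather than a measured OnLive value, but it does make clear why encode time is one of the few levers left to pull.

```python
# Purely illustrative click-to-photon latency budget for a cloud-streamed
# frame. All figures are assumptions for the example, not measured values.

budget_ms = {
    "input upstream to server":    15,  # assumed one-way network latency
    "game simulation + render":    17,  # roughly one frame at 60fps
    "video encode on server":       8,  # the main "tunable" stage discussed above
    "video downstream to client":  15,  # assumed one-way network latency
    "video decode on client":       8,  # broadly fixed for a given decoder
    "display scan-out":             8,  # about half a 60Hz refresh on average
}

total = sum(budget_ms.values())
print(f"Estimated click-to-photon latency: {total}ms")
for stage, ms in budget_ms.items():
    print(f"  {stage:28s} {ms:3d}ms")
```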

So what can be done to get the best results from a system that inevitably must compromise core components of the gameplay experience? From a picture quality perspective, there's very little that can be done, short of hoping for some kind of video encoding miracle (or for a significant increase in bandwidth in future, which is perhaps more likely). The bottom line is that a slow-paced game will retain much more quality than an ultra-fast, colourful racing title - emphasising why Deus Ex: Human Revolution is perhaps such a good title to use in promoting OnLive.

You can imagine that, from a picture quality perspective at least, casual games, puzzle games, RPGs and real-time strategy games would be a much better fit for the system than high-speed racing and shooting games. However, video compression was never really designed for crisp, precise computer-generated visuals, so there may well be advantages in working with a post-process filter along the lines of NVIDIA's FXAA, which seeks to reduce jaggies and aliasing by applying a heuristic blur to the image. An offshoot of this kind of work could be useful improvements to both picture quality and compression efficiency for cloud-based delivery.
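As a rough illustration of the principle - and emphatically not NVIDIA's actual FXAA shader - here's a toy Python/NumPy sketch that detects high-contrast luminance edges and blurs only those pixels, the kind of pre-smoothing that could make an image more amenable to video encoding.

```python
import numpy as np

# Toy sketch of an FXAA-style heuristic: find high-contrast luminance edges
# and blur only those pixels. This is not NVIDIA's FXAA algorithm, just an
# illustration of why pre-smoothing edges can help a video encoder.

def edge_aware_blur(rgb: np.ndarray, threshold: float = 0.1) -> np.ndarray:
    """rgb: float array of shape (H, W, 3) with values in the 0..1 range."""
    luma = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]

    # Contrast against horizontal and vertical neighbours.
    dx = np.abs(np.diff(luma, axis=1, prepend=luma[:, :1]))
    dy = np.abs(np.diff(luma, axis=0, prepend=luma[:1, :]))
    edges = (dx + dy) > threshold

    # Simple 3x3 box blur via shifted copies (a cheap stand-in for FXAA's
    # direction-aware filtering).
    blurred = np.zeros_like(rgb)
    for sy in (-1, 0, 1):
        for sx in (-1, 0, 1):
            blurred += np.roll(np.roll(rgb, sy, axis=0), sx, axis=1)
    blurred /= 9.0

    out = rgb.copy()
    out[edges] = blurred[edges]   # blur only the detected edge pixels
    return out
```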

Another thing to bear in mind with streaming video is that the precision of HDMI/DVI's 24-bit RGB is lost in favour of a pixel format known as YV12, which effectively halves the data even before any conventional video compression algorithms are applied. Chroma resolution is lost, which means that even if bandwidth isn't a problem, picture detail rendered in pure reds or blues will look much blockier than it does locally. Steering clear of these colours is typically a very smart move indeed, particularly on persistent HUD elements.
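The arithmetic behind that halving is straightforward. A short sketch (using an assumed 1280x720 frame) shows the difference between 24-bit RGB and a YV12-style 4:2:0 layout, where the two chroma planes - which carry the detail in saturated reds and blues - are stored at a quarter of the luma resolution.

```python
# Illustrative only: how a YV12-style 4:2:0 layout halves per-frame data
# compared with 24-bit RGB, before any video compression is applied.

def frame_sizes(width: int, height: int) -> None:
    rgb_bytes = width * height * 3                   # 24 bits per pixel
    y_bytes = width * height                         # full-resolution luma plane
    chroma_bytes = 2 * (width // 2) * (height // 2)  # U and V at quarter resolution
    yv12_bytes = y_bytes + chroma_bytes              # 12 bits per pixel on average
    print(f"RGB 24-bit frame: {rgb_bytes / 1024 / 1024:.2f} MB")
    print(f"YV12 4:2:0 frame: {yv12_bytes / 1024 / 1024:.2f} MB")

frame_sizes(1280, 720)  # assumed stream resolution
```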

When it comes to latency, there is more that the developer can do from a performance perspective. We've taken a look at a number of OnLive titles and sought to compare their settings and performance with the same titles on a range of conventional PCs. Our educated guess is that, in the here and now, OnLive servers dedicate a set amount of CPU and GPU power to each game instance: perhaps a modest Core 2 Duo's worth of processing, combined with an entry-level enthusiast graphics card along the lines of a 9800GTX or 9800GT.

Richard Leadbetter: Rich has been a games journalist since the days of 16-bit and specialises in technical analysis. He's commonly known around Eurogamer as the Blacksmith of the Future.