In Theory: Can Cloud Gaming Replace Console?
Exploring the challenges facing the rise of gameplay over IP
Time is being called on video gaming as we know it. No less an authority than id Software's John Carmack claims that consoles will fall to Moore's Law, and that realistically we're looking at two more generations of bespoke gaming machines before the curtain is drawn. The future is the cloud: gameplay over IP. The question is, what exact form will it take?
We have two working systems to mull over in the here and now: OnLive and Gaikai, both based on the principle of beaming your control inputs to a remote server that transmits gameplay back in the form of compressed video. The advantages are numerous: with no game code on the client there is no piracy, while loading is kept to a minimum (and masked rather well within OnLive's excellent front-end interface) and there are no lengthy installs. As the server holds all the game code, the player needs only a simple dumb terminal and a game controller, which - in theory - will never need to be upgraded. Both cloud services have been deployed on a vast array of devices, from iOS and Android to Blu-ray players and even TVs. We've even seen Gaikai running on Xbox 360.
There's also the matter of convenience and access. Your whole game library is instantly available, you can play your games on multiple devices (anything with a controller input and video decoding facilities, basically) and integration options are mouth-watering: how about organising a multiplayer session within Facebook? Better yet, why not recommend games to your friends by letting them play the actual game within the browser? Both OnLive and Gaikai plan to have you playing real games launched directly from your Facebook feed - a killer concept that Xbox LIVE and the PlayStation Network would have real issues matching. The potential to expand the audience here is phenomenal.
Of course, there are downsides too. Concerns over lag - the time it takes for your controller inputs to be reflected by the action on-screen - will never go away completely, but so long as there is a reasonable amount of consistency, the maths works out such that the experience can be eminently playable. In a recent interview with Eurogamer, OnLive's Steve Perlman argued that his system could even outperform a console:
"Video games today, when they're built for Xbox 360, PS3 or even PC, they have pre-render queues. In order to get as much realism as they can with the processing hardware they have, they introduce multi-frame lag in games. There is some period of delay before the result hits the screen," he says.
"We're able to compensate for that because we have state of the art servers with very high performance GPUs. A 2005 class Xbox or PS3 game, when you put it on a 2011 class server, we don't have to have that pre-render queue. Instead, we use that time for the network delay. The algorithm keeps getting better and better."
It's an interesting argument, but the fact is that all games have latency built in to different extents, and short of extensive re-engineering, the sort of lag Perlman is talking about can only be mitigated by throwing more power at the existing code - it certainly won't be completely eliminated, as he appears to be suggesting. For example, we measured latency on the Xbox 360 version of Bulletstorm, running at 30Hz, to be in the region of 133ms, without factoring in additional lag from the display. Conversely, the same game running on an i7 with a GTX580 at 60Hz comes down to 84ms (with v-sync enabled).
In an environment where the server is running several virtualised game instances, it is unlikely that OnLive servers can allocate the same resources to Bulletstorm as our i7 set-up - indeed, at GDC 2010, Perlman talked about twice the GPU resource per instance of an Xbox 360, nowhere near our GTX580 - but let's assume that this latency can be matched. That means OnLive has to compress the video, transmit it and decompress it in just 50ms: a tough task. As it stands right now, we've seen a baseline lag of 150ms on certain OnLive titles - just one frame off Bulletstorm on Xbox 360 - but on other games where performance is variable, response is correspondingly less solid, and occasionally downright lousy.
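To put numbers on that budget, here's a quick back-of-the-envelope sketch. The latency figures are our measurements quoted above; the arithmetic and variable names are purely illustrative:

```python
# Back-of-the-envelope latency budget for cloud streaming.
# Figures come from the measurements in the article; everything else
# is an assumption for illustration.

XBOX_360_LATENCY_MS = 133   # Bulletstorm on Xbox 360 at 30Hz (measured)
SERVER_LATENCY_MS = 84      # the same game on an i7/GTX580 at 60Hz (measured)

# If the cloud server can match the fast PC's input-to-photon latency,
# everything else (encode + transmit + decode) must fit in the difference
# just to equal the console experience.
streaming_budget_ms = XBOX_360_LATENCY_MS - SERVER_LATENCY_MS
print(f"Encode/transmit/decode budget: {streaming_budget_ms}ms")  # 49ms (the article rounds to 50ms)

# How far behind the local console is OnLive's observed 150ms baseline?
onlive_observed_ms = 150
print(f"Gap vs console: {onlive_observed_ms - XBOX_360_LATENCY_MS}ms")  # 17ms
```

In other words, matching the console's 133ms total leaves under 50ms of headroom for the entire video pipeline plus the round trip - every millisecond of encoder latency comes directly out of that budget.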
"We don't tune the system by some sort of scientific measurement on latency," Perlman continues. "We tune the game system from a human perceptual point of view to try to make it so the game plays as good as possible."
It's a bit of an odd comment (Perlman doesn't expand on how latency can be "tuned"), but then cloud gaming is, to an extent, all about smoke and mirrors - making the experience playable and enjoyable even if the actual input lag is a touch on the high side. Discerning latency without a precision controller like a mouse is quite difficult unless you're comparing side-by-side with the equivalent local experience. To a certain extent, it either feels "right" or "wrong".
Can the lag situation be improved? Well, game developers can certainly improve latency through more efficient code: Guerrilla Games reduced Killzone's latency from 150ms down to around 116ms between sequels, while Criterion Games managed to get Need for Speed: Hot Pursuit's lag down to 83ms on a 30Hz title (which we believe is a first), and reduced it still further to a phenomenal 50ms in the PC version of the game. The lower the core latency in the game code itself, the less noticeable the encoding/decoding/transmission overhead added by cloud services will be.
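It's worth expressing those figures in frames rather than milliseconds, since that is how developers tend to think about lag. The measurements are the ones quoted above; the conversion is ours:

```python
# Convert the measured input latencies quoted above into frames of lag
# at each game's update rate (measurements from the article; the
# frames-of-lag conversion is our own illustration).

def frames_of_lag(latency_ms: float, hz: float) -> float:
    """Latency expressed as a multiple of the game's frame time."""
    frame_time_ms = 1000 / hz
    return latency_ms / frame_time_ms

measurements = [
    ("Killzone (original)", 150, 30),
    ("Killzone (sequel)", 116, 30),
    ("NFS: Hot Pursuit (console)", 83, 30),
    ("NFS: Hot Pursuit (PC)", 50, 60),
]
for name, ms, hz in measurements:
    print(f"{name}: {ms}ms = {frames_of_lag(ms, hz):.1f} frames at {hz}Hz")
# Killzone's 150ms is 4.5 frames at 30Hz; Criterion's 83ms is under 2.5
```

Seen this way, Criterion shaved two full frames off the typical 30Hz response time - headroom a streaming service could spend on encoding and transmission instead.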
It's also reasonable to assume that infrastructure deals such as the one struck by OnLive with BT Internet will help: cutting out as much of the internet as possible and talking as directly to the server as you can is bound to reduce latency.
But what of the fluctuating picture quality? OnLive appears to operate on the same general principle as Gaikai: both systems use the same h.264 video codec featured in streaming Flash video, but the keyframes, or "intra" frames (reference points from which future frames are derived), are replaced with what is called periodic intra refresh. Here, the frame is split into sections whose reference information is updated in sequence, cutting down latency and improving compression efficiency in an environment where every millisecond counts.
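A toy model shows why this matters for streaming. With conventional keyframes, a full intra frame is many times larger than a predicted frame, producing a bitrate spike that must either be buffered (adding latency) or dropped. Periodic intra refresh spreads the same intra data evenly across frames. All the numbers below are invented for illustration and bear no relation to real codec output:

```python
# Toy model of why periodic intra refresh smooths bitrate.
# SLICES, INTRA_COST, INTER_COST and GOP are invented illustrative values.

SLICES = 8       # sections ("slices") per frame
INTRA_COST = 10  # relative size of a slice coded without references
INTER_COST = 1   # relative size of a predicted slice
GOP = 8          # conventional scheme: one full keyframe every 8 frames

def keyframe_sizes(frames):
    # Every GOP-th frame is entirely intra: a large spike on the wire.
    return [SLICES * INTRA_COST if f % GOP == 0 else SLICES * INTER_COST
            for f in range(frames)]

def intra_refresh_sizes(frames):
    # One slice per frame is refreshed as intra; the rest are predicted,
    # so every frame costs the same and the bitrate stays flat.
    return [(SLICES - 1) * INTER_COST + INTRA_COST for _ in range(frames)]

spiky = keyframe_sizes(16)
flat = intra_refresh_sizes(16)
print("keyframes:    ", spiky)                      # peaks of 80 every 8 frames
print("intra refresh:", flat)                       # constant 17 per frame
print("peak frame:", max(spiky), "vs", max(flat))   # 80 vs 17
print("total per GOP:", sum(spiky[:GOP]), "vs", sum(flat[:GOP]))  # identical
```

The total data per refresh cycle is the same in both schemes; what changes is the worst-case frame, which is what dictates buffering and therefore latency.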
However, the fact remains that there's only so much bandwidth available. The more detailed an image is and the more motion there is on-screen, the more data needs to be sent to the user in order to sustain the same image quality. The result can be really poor picture quality in the heat of the action and a direct impact on the gameplay experience. Just the basic make-up of a game's visuals can make a huge difference to compression efficiency. A dark, slow-moving game like Alan Wake is eminently suitable for some serious compression, while a game with a lot of detail/noise like Enslaved is far more difficult to compress, even with relatively slow-moving motion.
Constricted bandwidth is always going to be a problem to a certain extent, and short of throwing more bandwidth at the issue, or introducing more efficient coding technologies, not much is going to change that. However, a potential route to improving picture quality may well be to offload the issue of video compression from the cloud service itself and instead make it part of the actual rendering process in-game. In the here and now, engine architects are dealing with a high-def display as their final output and calibrate accordingly, but what if this situation changed? What if they targeted h.264 output as well?
The way things stand, the Gaikai and OnLive compression systems don't have the ability to judge which parts of the image are important and which are not, so bandwidth allocation is based on movement rather than where the player's focus is at any given point. Crytek has already made some fleeting references in its technical presentations to a "points of interest" focus on video compression - where a core part of the engine tech intelligently decides where bandwidth should be utilised in order to maximise image quality where it matters to the player.
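What might such a "points of interest" scheme look like? Crytek hasn't published details, so the sketch below is entirely hypothetical: the idea is simply that the engine tags screen regions with importance weights and the rate controller divides the bit budget proportionally, rather than reacting to motion alone. All names and weights are invented:

```python
# Hypothetical sketch of engine-driven "points of interest" bit
# allocation. Region names and importance weights are invented; no real
# Crytek or OnLive interface is being described.

def allocate_bits(total_bits, regions):
    """regions: list of (name, importance_weight) pairs.
    Returns a per-region share of the frame's bit budget."""
    total_weight = sum(weight for _, weight in regions)
    return {name: round(total_bits * weight / total_weight)
            for name, weight in regions}

# A 5Mbps stream at 60fps gives roughly 83,000 bits per frame to share out.
frame_budget = 5_000_000 // 60
budget = allocate_bits(frame_budget, [
    ("crosshair_area", 5.0),   # where the player is actually looking
    ("characters", 3.0),       # moving actors the eye tracks
    ("hud", 1.5),              # text must stay legible
    ("background", 0.5),       # sky and distant scenery can smear
])
print(budget)
```

The point of the sketch is that the encoder no longer has to guess: the engine already knows where the crosshair, the HUD and the enemies are, so quality can be spent where the player will notice it.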
Over and above that, there's the basic fact that optimising for a specific target platform - be it console or cloud-based - will almost always produce a better result. Should cloud gaming take off, we may well see the basic make-up of art assets being influenced by how well they compress, for example. The other element to factor in is the price of bandwidth itself. Right now, OnLive targets 5Mbps for its HD "720p60" streaming. Double that data throughput and many of the issues to do with picture quality would be resolved pretty quickly.
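Some quick arithmetic shows just how hard the codec is being asked to work at that bitrate. Assuming 720p60 with 24 bits per pixel before any chroma subsampling (an assumption for illustration):

```python
# Rough arithmetic on what OnLive's 5Mbps "720p60" target implies.
# The raw format (24 bits per pixel, no chroma subsampling) is an
# assumption for illustration.

WIDTH, HEIGHT, FPS, BPP = 1280, 720, 60, 24
raw_bps = WIDTH * HEIGHT * FPS * BPP  # uncompressed bitrate
stream_bps = 5_000_000                # OnLive's HD streaming target

print(f"Raw bitrate: {raw_bps / 1e6:.0f} Mbps")              # 1327 Mbps
print(f"Compression ratio: {raw_bps / stream_bps:.0f}:1")    # 265:1
```

A sustained compression ratio in the region of 265:1, under a latency budget of tens of milliseconds, is why fast-moving, detailed scenes fall apart first: the encoder simply has nowhere left to hide the shortfall.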
The notion of a cloud-based delivery system doesn't need to be restricted to just video either. John Carmack predicts a day when rendering technology is so powerful that we have a single device which can dock anywhere and provide gameplay in the lounge, on the move, or wherever you like. At this point, the notion of having a far-off server rendering each individual frame and beaming it across the internet could be replaced by a different system, where gameplay assets are streamed instead, with rendering and gameplay logic handled locally. Cloud-based systems like this wouldn't necessarily compete with the established providers either - there's nothing to stop OnLive repurposing its datacentres for a different kind of traffic, for example.
This way of utilising the cloud wouldn't have the immediacy of the Gaikai/OnLive experience (games would need to be downloaded) but it would resolve the issues of latency and image quality - the device would still be generating the image, and response would be as instant as a local video game. However, the bandwidth requirement in background-loading gameplay assets would be considerable and making it work convincingly could require a fundamental re-think of the way games are designed.
There's plenty of time for that to happen, of course, and for a whole range of alternative solutions to become apparent. If cloud gaming is to gain traction over two console generations, the transformation we could expect to see in that time would be quite phenomenal. Quite where the PC would sit within the new ecosystem is an interesting question: OnLive and Gaikai are both based on PC architecture, suggesting a rosy future for hardcore gamers if these services prosper. Of course, the real unknown is how the existing major platform holders - Microsoft, Sony and Nintendo - would respond to seeing their business models effectively replaced...