Tech Focus: Next-Gen Cloud vs. Console
Can "Gaming over IP" ever match the local gaming experience?
The recent announcement of GFACE - a social media network with Cloud gaming support, backed by Crytek - demonstrates that the concept of "gaming over IP" is likely to gain further traction as viable competition to both current and next-gen consoles. OnLive is out now with a full service, and Gaikai is set to follow suit this year. While there's a strong argument that these emerging technologies are not really a match for the local console experience, it's only fair to remember that these are first-gen technological products. They will improve, and even in the here and now they do work, even if the experience is quantifiably sub-optimal compared to local gaming.
"Picture quality and input latency are the key elements holding back Cloud gaming performance. To what extent can they be improved over time?"
The question is, to what extent can Cloud gaming services improve? Much has been made of the fact that even with ultra-fast fibre-optic networks, latency will always be an issue. Similarly, lossless video quality requires so much bandwidth (uncompressed 720p60 weighs in at well over 100 megabytes per second) that it's simply not viable for gaming services. Given that video will always be compressed, to what extent can picture quality - which varies dramatically from one instant to the next - get better?
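That "well over 100 megabytes per second" figure is easy to sanity-check with some back-of-envelope arithmetic. The sketch below assumes uncompressed 24-bit RGB frames; real lossless codecs would shave some of this off, but not by orders of magnitude.

```python
# Data rate of uncompressed 720p60 video, assuming 24-bit RGB pixels.

WIDTH, HEIGHT = 1280, 720
FPS = 60
BYTES_PER_PIXEL = 3  # 24-bit RGB

bytes_per_second = WIDTH * HEIGHT * BYTES_PER_PIXEL * FPS
megabytes_per_second = bytes_per_second / 1_000_000
megabits_per_second = bytes_per_second * 8 / 1_000_000

print(f"Uncompressed 720p60: {megabytes_per_second:.0f} MB/s "
      f"({megabits_per_second:.0f} Mbps)")
```

That works out to around 166MB/s - over 1.3 gigabits per second - which puts the scale of the compression challenge into perspective.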
In this article we'll be tackling both of these issues, presenting the argument that improvements to internet infrastructure combined with optimising the current latency pipeline could well be enough to bridge the gap between Cloud services and what you might call a standard console experience.
The picture quality issue is perhaps the easier aspect to address. At the moment the standard appears to be to transmit a 720p video stream at 60 frames per second using around 5mbps of bandwidth. Having worked extensively with h.264 video - the same system used by Gaikai and OnLive - we'd say that's a very low level of bandwidth for a 60FPS stream, and it's safe to say that this encoding technique performs extremely poorly with certain games: playing titles such as Space Marine on OnLive is a pretty hideous experience all told. The bottom line is that the more motion and colour in play, the more bandwidth is required to sustain quality. In short, some games will look great, others will look terrible.
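To put a number on just how aggressive a 5mbps 720p60 stream is, we can work out the average bit budget per pixel and the implied compression ratio against raw 24-bit video. The figures are illustrative arithmetic, not measurements of either service's encoder.

```python
# Bit budget per pixel and compression ratio for a 5 Mbps 720p60 stream.

WIDTH, HEIGHT, FPS = 1280, 720, 60
STREAM_BITRATE = 5_000_000  # 5 Mbps, the approximate standard today

pixels_per_second = WIDTH * HEIGHT * FPS
bits_per_pixel = STREAM_BITRATE / pixels_per_second

raw_bitrate = pixels_per_second * 24  # uncompressed 24-bit RGB
compression_ratio = raw_bitrate / STREAM_BITRATE

print(f"{bits_per_pixel:.3f} bits per pixel, "
      f"~{compression_ratio:.0f}:1 compression required")
```

Under a tenth of a bit per pixel, and a compression ratio in the region of 265:1 - little wonder that fast motion and rich colour cause such visible breakdown.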
Improvements in compression technology can help of course. While both OnLive and Gaikai use h.264 compression, the encoders behind them at the datacentres are very, very different. The recent addition of Assassin's Creed: Brotherhood on Gaikai gives us an excellent opportunity to compare the two Cloud services in terms of picture quality. Gaikai's use of the widely acclaimed x264 encoder - used by YouTube and Facebook amongst a multitude of others - demonstrates that at approximately the same level of bandwidth, clear image quality gains can be seen.
Similarly, the introduction of Crytek into the Cloud scene is intriguing. Previously the company has revealed some of its thoughts on optimising video compression by integrating it into the game itself, allocating bandwidth according to the most important elements of the scene rather than treating the frame as a simple 2D image to be compressed and beamed over the internet. In short, the game engine knows more about the make-up of the scene than an external compressor, so in theory it could do a better job of encoding it.
However, at the end of the day, I suspect that this is a problem that will simply be overcome by brute force alone. As infrastructure radically improves, so the available bandwidth will increase and the cost of utilising it will be forced down. If OnLive works at 5mbps - albeit with plenty of issues - imagine what could be done by upping that to 10mbps. With BT recently announcing 330mbps trials in the UK, and arch-competitor Virgin already offering end-users a 120mbps product, why stop there? Why not go up to 20mbps or even beyond?
Both Gaikai and OnLive have demonstrated 1080p and stereoscopic 3D support using this extra bandwidth, but I would suggest that employing it to improve image quality is perhaps a better way forward. To illustrate, here's an excerpt of what I would deem to be pretty challenging video, encoded using x264 (the same technology behind Gaikai) at 5, 10, 15 and 20mbps. While the actual encoding settings won't be identical to what Gaikai is using, it does demonstrate the basic problem and the effectiveness of the brute-force solution. At its heart, video compression is all about using data efficiently to describe the difference between frames. On a fundamental level, the more bandwidth available, the more accurately the changes can be rendered from one image to the next.
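The brute-force case can be framed in per-frame terms: at 60FPS, every doubling of bandwidth doubles the data available to describe the changes between consecutive frames. A quick sketch of the per-frame budget at the bitrates tested above:

```python
# Per-frame bit budget at each tested bitrate for a 60FPS stream.

FPS = 60

for mbps in (5, 10, 15, 20):
    kilobits_per_frame = mbps * 1_000 / FPS
    print(f"{mbps:2d} Mbps -> ~{kilobits_per_frame:.0f} kbit per frame")
```

At 5mbps the encoder has roughly 83 kilobits to describe each frame's changes; at 20mbps it has four times that to spend on the same motion.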
So if we're looking at a combination of better encoders and more bandwidth to overcome the image quality issues, what about the response issue? Latency will always be the key challenge with Cloud-based services - the speed of light is a bit of a pain in the arse in that regard - but even here we could see tangible improvements in the coming years. Let's take a look at how latency is tackled in the here and now and factor in potential elements that could see it improved.
The current methodology we've seen with OnLive in making games playable over the Cloud is pretty straightforward: even without factoring in the internet, all games have their own internal latency - the so-called input lag. This varies from title to title, depending on how they are programmed, but to give a couple of examples, we've measured Modern Warfare 3 (60FPS) on Xbox 360 as having a 50ms latency from button press to resulting on-screen action, whereas Battlefield 3 (30FPS) on the same console seems to be in the region of 116ms.
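It's worth expressing those measurements in frames rendered rather than milliseconds, since that's how the games themselves accumulate lag. A small conversion using the figures quoted above:

```python
# Convert button-to-pixel latency into frames at a given frame-rate.

def latency_in_frames(latency_ms: float, fps: int) -> float:
    """How many frames are rendered during the measured input lag."""
    return latency_ms / (1000 / fps)

print(f"Modern Warfare 3 (60FPS): {latency_in_frames(50, 60):.1f} frames")
print(f"Battlefield 3 (30FPS): {latency_in_frames(116, 30):.1f} frames")
```

Both games sit at roughly three to three-and-a-half frames of internal lag - it's the length of each frame that differs, which is exactly what the Cloud services look to exploit.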
"In the here and now, we are being asked to take the hit in fidelity and response in favour of a more convenient experience. Going forward, can we have our cake and eat it? In theory it's possible."
The theory is that by running a console 30FPS game at twice the frame-rate - OnLive targets 720p60, remember - input latency is significantly reduced. For the sake of argument, let's say that 50ms is shaved off the response time. That's enough time to encode a frame, decode it and also take a bite into the transmission time over the internet. That's the theory, so how does it stand up in practice? This is very easy to test: measure OnLive latency, and compare it with the PC and Xbox 360 versions. The PC metric gives us the base latency of the game itself (the same code is used with OnLive remember), while the Xbox 360 figure represents the target latency in providing a console-like response.
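The budget argument can be laid out explicitly. The sketch below uses the measured figures from this article where available; the encode and decode costs are assumed placeholder values, not measurements of OnLive's pipeline.

```python
# Latency budget sketch: running the game at 60FPS server-side buys
# headroom that the Cloud-specific stages then have to fit inside.

console_30fps_latency = 116    # ms: e.g. Battlefield 3 on Xbox 360
cloud_60fps_game_latency = 66  # ms: same class of game at twice the rate

headroom = console_30fps_latency - cloud_60fps_game_latency  # 50ms

# Assumed (illustrative) costs for the extra stages in the Cloud pipeline:
encode_ms, decode_ms = 10, 10
network_budget = headroom - encode_ms - decode_ms

print(f"{headroom}ms headroom; {network_budget}ms left for network "
      f"transit before response falls behind the 30FPS console version")
```

In other words, under these assumptions the round trip to the datacentre needs to fit inside around 30ms for the Cloud version to feel no worse than the console original.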
| Game | PC | OnLive | Xbox 360 | Latency Deficit |
| --- | --- | --- | --- | --- |
| Unreal Tournament 3 | 66ms | 150ms | 116ms | 33ms |
The tests were carried out on a 50mbps Virgin fibre optic connection with no other traffic running on the line at the time, and we chose areas where the game was operating at its highest frame-rate - in short, as close to best-case conditions as we could manage. On other connections we were quite alarmed at how variable this measurement could be, and we were also quite startled at the variance caused by drops in game performance, server-side. Inconsistency appears to exaggerate the impact of lag.
Regardless, in these tests, the difference between PC and OnLive is in the region of 83 to 116ms, but the gap closes when compared to the console versions. In our experience, it's usually playable but noticeably laggy - to varying degrees. We've also noticed across a number of tests that there's something of a latency threshold around the 150ms level: below that, control typically isn't flagged as an issue, but above it, people do complain. Bulletstorm on console comes in at around 133ms and there are few complaints about its response level. Compare and contrast with Killzone 2 at 150ms, where lag became something of a major issue (though perhaps because frame-rate drops had a further impact on response).
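Applying that rule of thumb to the figures in this piece gives a rough picture of where each experience lands. The 150ms cutoff is an empirical observation from our testing, not a hard law, and real perception is a spectrum rather than a binary.

```python
# The ~150ms playability threshold applied to the latencies measured above.

THRESHOLD_MS = 150

measurements = {
    "Bulletstorm (console)": 133,
    "Killzone 2 (console)": 150,
    "UT3 (OnLive, best case)": 150,
}

for game, latency in measurements.items():
    verdict = "ok" if latency < THRESHOLD_MS else "complaints likely"
    print(f"{game}: {latency}ms -> {verdict}")
```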
At its best, OnLive dances around the threshold - noticeable but not an enormous issue. At its worst, it's well over it and unsatisfactory. Improving this is the key challenge. For a start, lowering latency as much as possible within the game itself will claw back precious milliseconds - something we've discussed already in our Optimising for the Cloud article we ran a while back. Perhaps if Cloud gaming gains traction, this will become more of a focus during development.
Similarly, ensuring a solid 60FPS will help, meaning that Cloud providers will actually have to deliver on that front: OnLive proves to be rather variable here, and the more drops from 60FPS there are, the higher the latency. In essence then, developers need to be more mindful of the target platform, or else the platform itself needs more powerful hardware.
Secondly, the deployment of more datacentres closer to the client is an obvious move. David Perry has been talking a lot recently about ultra-low latencies between his various locations and Gaikai datacentres - specifically 8ms between his home and what we'll assume is the Californian server, and even as low as 5ms between his London hotel and the local Gaikai facility. Right now, OnLive UK is actually serviced by a facility in Luxembourg. Typically the further afield the datacentre, the higher the latency and the more opportunity there is for packet loss, so being more local has obvious advantages. Player-side, as infrastructure is upgraded more towards fibre, latency should also decrease there too.
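The advantage of local datacentres has a hard physical floor that's easy to estimate: light in optical fibre travels at roughly two-thirds of its vacuum speed, or about 200km per millisecond. The distances below are rough straight-line assumptions, not actual route lengths, and real connections add switching and routing overhead on top.

```python
# Lower bound on round-trip latency from pure signal propagation in fibre.

SPEED_IN_FIBRE_KM_PER_MS = 200  # ~2/3 of the speed of light in vacuum

def min_round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip time over the given one-way distance."""
    return 2 * distance_km / SPEED_IN_FIBRE_KM_PER_MS

print(f"London to Luxembourg (~500km): {min_round_trip_ms(500):.1f}ms")
print(f"London to a local datacentre (~50km): {min_round_trip_ms(50):.1f}ms")
```

Even in the best case, a Luxembourg-hosted service gives away several milliseconds before a single packet is routed - which is why Perry's 5ms London figure is only achievable with a genuinely local facility.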
But can any more be done? Gaikai's association with LG in getting its service available via Smart TVs may present further opportunities. Connect up an Xbox 360 to an HDTV and even in "Game" mode - which disables a lot of the set's lag-inducing post-processing - you can still be looking at latencies of 50ms just from beaming a digital signal from the console and getting it displayed on the screen. Even when measuring latencies on fast gaming PC monitors - with an advertised 8ms lag - the actual end-to-end measurements we're getting can be up to 33ms. Bearing in mind how much the Cloud services can achieve in the same time period, this is remarkably inefficient, and obviously another vector of attack that Gaikai could use in bridging the gap between console and cloud latency levels. If you're already integrated into the TV itself, why not try to make the most of it?
I would suggest that the kind of experience console users enjoy now with Modern Warfare 3 - in all its 60FPS, ultra-low latency glory - simply cannot be achieved with a cloud gaming service. But an additional 50-83ms gives us the same kind of response that we get from console FPS titles like Borderlands, Battlefield 3, Killzone 3 and Bulletstorm: all eminently playable, and none of which have been criticised for being too laggy. At the moment it's theory, but the relentless march of technological innovation has a habit of turning theory into palpable fact sooner rather than later.
It's almost three years now since I wrote about why OnLive couldn't possibly work, and it's interesting to look back and see which of the points made then were eventually borne out in subsequent analysis (latency, picture quality) - and how, fundamentally, despite the very significant, quantifiable drawbacks, the service is actually playable.
In the here and now, we are being asked to take the hit in fidelity and response in favour of a more convenient experience. Going forward, can we both have our cake and eat it with a service that captures the essence of the modern day console experience, and perhaps even exceeds it by using more powerful PC hardware, server-side? With next-gen consoles the best part of two or even three years ahead of us, it'll be interesting to see if the Cloud-based engineering breakthroughs and infrastructure improvements can arrive before new local hardware gains the kind of critical mass Xbox 360 and PS3 enjoy now.