
Digital Foundry

Tech Focus: Next-Gen Cloud vs. Console

Wed 08 Feb 2012 7:50am GMT / 2:50am EST / 11:50pm PST
Digital Foundry

Can "Gaming over IP" ever match the local gaming experience?

The recent announcement of GFACE - a social media network with Cloud support, backed by Crytek - demonstrates that the concept of "gaming over IP" is likely to gain further traction as viable competition to both current and next-gen consoles. OnLive is out now with a full service, Gaikai is set to follow suit this year, and while there's a strong argument that these emerging technologies are not really a match for the local console experience, it's only fair to remember that these are first-gen technological products. They will improve, and even in the here-and-now they do work, even if the experience is quantifiably sub-optimal compared to local gaming.

"Picture quality and input latency are the key elements holding back Cloud gaming performance. To what extent can they be improved over time?"

The question is, to what extent can Cloud gaming services improve? Much has been made of the fact that even with ultra-fast fibre-optic networks, latency will always be an issue. Similarly, lossless video quality requires so much bandwidth (720p60 is well over 100 megabytes per second) that it's not going to be viable on gaming services. Given that video will always be compressed, to what extent can picture quality - which varies dramatically from one instant to the next - get better?
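The uncompressed bandwidth claim is simple arithmetic to sanity-check (the 24 bits-per-pixel assumption is mine; actual framebuffer formats may differ):

```python
# Uncompressed bandwidth for a 720p stream at 60 frames per second.
width, height, bytes_per_pixel, fps = 1280, 720, 3, 60
bytes_per_second = width * height * bytes_per_pixel * fps
print(f"{bytes_per_second / (1024 * 1024):.0f} MB/s")  # ~158 MB/s
```

Well over 100 megabytes per second, as stated - roughly 250 times the data rate of a 5mbps stream, which is why lossless delivery is a non-starter.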

In this article we'll be tackling both of these issues, presenting the argument that improvements to internet infrastructure combined with optimising the current latency pipeline could well be enough to bridge the gap between Cloud services and what you might call a standard console experience.

The picture quality issue is perhaps the easier aspect to address. At the moment the standard appears to be to transmit a 720p video stream at 60 frames per second using around 5mbps of bandwidth. Having worked extensively with h.264 video - the same system used by Gaikai and OnLive - I can say that this seems a very low level of bandwidth for a 60FPS stream, and it's safe to say that the encoding technique performs extremely poorly with certain games: playing titles such as Space Marine on OnLive is a pretty hideous experience all told. The bottom line is that the more motion and colour in play, the more bandwidth is required to sustain quality. In short, some games will look great, others will look terrible.

Improvements in compression technology can help of course. While both OnLive and Gaikai use h.264 compression, the encoders behind them at the datacentres are very, very different. The recent addition of Assassin's Creed: Brotherhood on Gaikai gives us an excellent opportunity to compare the two Cloud services in terms of picture quality. Gaikai's use of the widely acclaimed x264 encoder - used by YouTube and Facebook amongst a multitude of others - demonstrates that at approximately the same levels of bandwidth, image quality gains can be seen.

Similarly, the introduction of Crytek into the Cloud scene is intriguing. Previously the company has revealed some of its thoughts on optimising video compression by integrating it into the game itself, allocating bandwidth according to the most important elements of the scene rather than treating the frame as a simple 2D image to be compressed and beamed over the internet. In short, the game engine knows more about the make-up of the scene than an external compressor, so in theory it could do a better job of encoding it.
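To illustrate the idea - purely a hypothetical sketch, not Crytek's actual scheme - an engine could weight each frame's bit budget by the importance of screen regions it already understands, before handing hints to the encoder. The region names and importance values below are invented for illustration:

```python
# Hypothetical sketch of engine-assisted bit allocation: the engine knows
# which screen regions matter most, so it can weight the frame's bit
# budget by importance instead of treating the frame as a flat 2D image.
def allocate_bits(regions, frame_budget_bits):
    """regions: (name, area_fraction, importance) tuples supplied by the engine."""
    total_weight = sum(area * imp for _, area, imp in regions)
    return {name: frame_budget_bits * area * imp / total_weight
            for name, area, imp in regions}

frame_budget = 5_000_000 // 60  # a 5mbps stream leaves ~83,000 bits per frame
scene = [  # region names and importance weights are invented for illustration
    ("player and crosshair", 0.10, 5.0),
    ("mid-ground action",    0.40, 2.0),
    ("sky and distant fog",  0.50, 0.5),
]
for name, bits in allocate_bits(scene, frame_budget).items():
    print(f"{name}: {bits:,.0f} bits")
```

The sky, despite covering half the screen, gets fewer bits than the tenth of the frame the player is actually looking at - precisely the kind of trade an external compressor working from raw pixels cannot make with confidence.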

However, at the end of the day, I suspect that this is a problem that will simply be overcome by brute-force alone. As infrastructure radically improves, so the available bandwidth will increase and the cost of utilising it will be forced down. If OnLive works at 5mbps - albeit with plenty of issues - imagine what could be done by upping that to 10mbps. With BT recently announcing 330mbps trials in the UK, and arch-competitor Virgin already offering end-users a 120mbps product, why stop there? Why not go up to 20mbps or even beyond?

Both Gaikai and OnLive have demonstrated utilising this extra bandwidth for 1080p and stereoscopic 3D support, but I would suggest that employing it to improve image quality is perhaps a better way forward. To illustrate, here's an excerpt of what I would deem to be pretty challenging video, encoded using x264 (the same technology behind Gaikai) at 5, 10, 15 and 20mbps. While the actual encoding settings won't be identical to what Gaikai is using, it does demonstrate the basic problem and the effectiveness of the brute-force solution. At its heart, video compression is all about using data efficiently to describe the difference between frames. On a basic, fundamental level, the more bandwidth available, the more accurately the changes can be rendered from one image to the next.
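The brute-force argument comes down to bits per frame. A sketch of the arithmetic at the four bitrates tested, assuming a 720p60 stream with bits spread evenly across frames (real encoders allocate unevenly, but the orders of magnitude hold):

```python
# Per-frame data budget for a 1280x720, 60fps stream at various bitrates.
width, height, fps = 1280, 720, 60
for mbps in (5, 10, 15, 20):
    bits_per_frame = mbps * 1_000_000 / fps
    bits_per_pixel = bits_per_frame / (width * height)
    print(f"{mbps:2d} mbps -> {bits_per_frame / 8 / 1024:5.1f} KB/frame, "
          f"{bits_per_pixel:.3f} bits/pixel")
```

Even at 20mbps each pixel gets well under half a bit per frame, which is why inter-frame prediction - spending data only on what changed - carries the load, and why every extra megabit directly improves how accurately those changes are described.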

So if we're looking at a combination of better encoders and more bandwidth to overcome the image quality issues, what about response? Latency will always be the key challenge with Cloud-based services - the speed of light is a bit of a pain in the arse in that regard - but we could still see tangible improvements in the coming years. Let's take a look at how latency is tackled in the here and now and factor in potential elements that could see it improved.

The current methodology we've seen with OnLive in making games playable over the Cloud is pretty straightforward: even without factoring in the internet, all games have their own internal latency - the so-called input lag. This varies from title to title, depending on how they are programmed, but to give a couple of examples, we've measured Modern Warfare 3 (60FPS) on Xbox 360 as having a 50ms latency from button press to resulting on-screen action, whereas Battlefield 3 (30FPS) on the same console seems to be in the region of 116ms.

"In the here and now, we are being asked to take the hit in fidelity and response in favour of a more convenient experience. Going forward, can we have our cake and eat it? In theory it's possible."

The theory is that by running a console 30FPS game at twice the frame-rate - OnLive targets 720p60, remember - input latency is significantly reduced. For the sake of argument, let's say that 50ms is shaved off the response time. That's enough time to encode a frame, decode it and also take a bite into the transmission time over the internet. That's the theory, so how does it stand up in practice? This is very easy to test: measure OnLive latency, and compare it with the PC and Xbox 360 versions. The PC metric gives us the base latency of the game itself (the same code is used with OnLive, remember), while the Xbox 360 figure represents the target latency in providing a console-like response.
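The 50ms figure is easy to sanity-check: if the input-to-display pipeline buffers a few frames, halving the frame time halves that portion of the lag. The three-frame pipeline depth here is an illustrative assumption, not a measured value:

```python
# Latency reclaimed by running a 30FPS game at 60FPS, assuming the
# pipeline holds roughly three frames between input and display.
frame_ms_30 = 1000 / 30  # ~33.3ms per frame at 30FPS
frame_ms_60 = 1000 / 60  # ~16.7ms per frame at 60FPS
pipeline_frames = 3      # assumed input-to-display buffering depth
saving_ms = pipeline_frames * (frame_ms_30 - frame_ms_60)
print(f"{saving_ms:.0f}ms reclaimed")  # 50ms
```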

Game                  PC      OnLive   Xbox 360   Latency Deficit
Unreal Tournament 3   66ms    150ms    116ms      33ms
Borderlands           50ms    166ms    133ms      33ms
DiRT 3                100ms   216ms    116ms      100ms
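Recomputing the deficits from the table's raw figures (the measurements are quantised to 60Hz frame steps, so a recomputed value can differ from the tabled one by a millisecond of rounding):

```python
# Latency deficit = OnLive latency minus Xbox 360 latency, per the table.
measurements = {  # game: (pc_ms, onlive_ms, xbox360_ms)
    "Unreal Tournament 3": (66, 150, 116),
    "Borderlands": (50, 166, 133),
    "DiRT 3": (100, 216, 116),
}
for game, (pc, onlive, x360) in measurements.items():
    print(f"{game}: {onlive - x360}ms deficit vs console, "
          f"{onlive - pc}ms of cloud overhead vs PC")
```

The OnLive-minus-PC column is where the "83 to 116ms" range discussed in the text comes from, give or take a millisecond of rounding.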

The tests were carried out on a 50mbps Virgin fibre optic connection with no other traffic running on the line at the time, and we chose areas where the game was operating at its highest frame-rate - in short, as close to best-case conditions as we could manage. On other connections we were quite alarmed at how variable this measurement could be, and we were also quite startled at the variance caused by drops in game performance, server-side. Inconsistency appears to exaggerate the impact of lag.

Regardless, in these tests, the difference between PC and OnLive is in the region of 83 to 116ms, but the gap closes when compared to the console versions. In our experience, it's usually playable but it's noticeably laggy - to varying degrees. We've also noticed on a number of tests that there's something of a latency threshold around the 150ms level - if you're below that, typically control isn't flagged as an issue but if you're above it, people do complain. Bulletstorm on console comes in at around 133ms and there are few complaints about its response level. Compare and contrast with Killzone 2 at 150ms, which turned into something of a major issue (though perhaps because frame-rate drops did have a further impact on response).

At its best, OnLive dances around the threshold - noticeable but not an enormous issue. At its worst, it's well over it and unsatisfactory. Improving this is the key challenge. For a start, lowering latency as much as possible within the game itself will claw back precious milliseconds - something we've discussed already in our Optimising for the Cloud article we ran a while back. Perhaps if Cloud gaming gains traction, this will become more of a focus during development.

Similarly, ensuring a solid 60FPS will help, meaning that Cloud providers will actually have to deliver in that regard: OnLive proves to be rather variable here, and the more drops from 60FPS there are, the higher the latency. In essence then, developers need to be more mindful of the target platform, or else the platform itself needs more powerful hardware.

Secondly, the deployment of more datacentres closer to the client is an obvious move. David Perry has been talking a lot recently about ultra-low latencies between his various locations and Gaikai datacentres - specifically 8ms between his home and what we'll assume is the Californian server, and even as low as 5ms between his London hotel and the local Gaikai facility. Right now, OnLive UK is actually serviced by a facility in Luxembourg. Typically the further afield the datacentre, the higher the latency and the more opportunity there is for packet loss, so being more local has obvious advantages. Player-side, as infrastructure is upgraded more towards fibre, latency should also decrease there too.
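Distance matters because physics sets a hard floor: light in optical fibre travels at roughly two-thirds of c, so a minimum round-trip time falls out directly. The distances below are my rough estimates, not measured routes:

```python
# Physical lower bound on round-trip latency over fibre (~200,000 km/s).
def min_round_trip_ms(distance_km, fibre_speed_km_s=200_000):
    return 2 * distance_km / fibre_speed_km_s * 1000

for label, km in (("London to Luxembourg, ~500km", 500),
                  ("London to California, ~8700km", 8700)):
    print(f"{label}: at least {min_round_trip_ms(km):.0f}ms round trip")
```

Real routes add switching and queuing on top, which is why a nearby datacentre with a measured 5ms ping is entirely plausible while a transatlantic one can never get close.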

But can any more be done? Gaikai's association with LG in getting its service available via Smart TVs may present further opportunities. Connect up an Xbox 360 to an HDTV and even in "Game" mode - which disables a lot of the set's lag-inducing post-processing - you can still be looking at latencies of 50ms just from beaming a digital signal from the console and getting it displayed on the screen. Even when measuring latencies on fast gaming PC monitors - with an advertised 8ms lag - the actual end-to-end measurements we're getting can be up to 33ms. Bearing in mind how much the Cloud services can achieve in the same time period, this is remarkably inefficient and obviously another vector of attack that Gaikai could use in bridging the gap between console and cloud latency levels. If you're already integrated into the TV itself, why not try to make the most of it?

I would suggest that the kind of experience console users enjoy now with Modern Warfare 3 - in all its 60FPS, ultra-low latency glory - simply cannot be achieved with a cloud gaming service. But an additional 50-83ms gives us the same kind of response that we get from console FPS titles like Borderlands, Battlefield 3, Killzone 3 and Bulletstorm: all eminently playable, and none of which have been criticised for being too laggy. At the moment it's theory, but the relentless march of technological innovation has a habit of turning theory into palpable fact sooner rather than later.

It's almost three years now since I wrote about why OnLive couldn't possibly work, and it's interesting to look back and see which of the points made then were eventually borne out (latency, picture quality) in subsequent analysis - and yet how, despite the very significant, quantifiable drawbacks, the service is actually playable.

In the here and now, we are being asked to take the hit in fidelity and response in favour of a more convenient experience. Going forward, can we have our cake and eat it with a service that captures the essence of the modern day console experience, and perhaps even exceeds it by using more powerful PC hardware, server-side? With next-gen consoles the best part of two or even three years ahead of us, it'll be interesting to see if the Cloud-based engineering breakthroughs and infrastructure improvements can arrive before new local hardware gains the kind of critical mass Xbox 360 and PS3 enjoy now.

19 Comments

Bruce Everiss Marketing Consultant

Jagex have been doing this sort of stuff for over a decade with Runescape.
Optimising what is done by the client so as to minimise the amount of data transmission. And using very clever real time compression to further minimise the volume of bits fired down the wire.
More clever stuff with the camera and draw distance also played its part.

All this wasn't done to bring a console level experience. Rather it was done to enable it to work at all on the tediously slow interwebs we had way back in 2001 and also to enable owners of even below averagely powered PCs to enjoy the game.

One of the biggest latency problems is just how slow a domestic TV can be between receiving a signal and displaying it. They aren't designed to be fast. So the latency of the most commonly used display can often be several times the latency of the interwebs when playing an online game.

Edited 1 times. Last edit by Bruce Everiss on 8th February 2012 9:49am

Posted:2 years ago

#1
I suppose it's a case of a gamer wanting to drive a reasonably fast road car, vs being an F1 pilot.
Inevitably, there can be advances in the future, but probably there will be sufficient capacity to cater to both.

This optimisation, and the increases in tech and bandwidth, all have to be significantly improved to tap into the market (the UK especially so - we are nowhere near ready yet)

Posted:2 years ago

#2

Kingman Cheng Illustrator and Animator

I feel that in the UK we're not ready for something like this.

Posted:2 years ago

#3

Brian Smith Artist

My concern for this tech is that the companies looking to prosper from it will need the market to fund it before it's truly viable. I've given OnLive a go so far on a 50mbps connection and impressive though it is to see it stream to your desktop, nothing was playable imo. Playable for free, yeah... but would I pay for these? Not a chance. My gut feeling is that most gamers will avoid it until it's as good as that local experience. Can these companies prop up this tech until it's good enough is the question for me. Can they afford to just wait on the bandwidth catching up?

Posted:2 years ago

#4
Well, why do you think it's in their interest to bang on the drum?

Posted:2 years ago

#5

Brian Smith Artist

@Dr Chee Ming Wong - They'll need a bigger drum then. haha.

Posted:2 years ago

#6

John Donnelly Quality Assurance

There are many ways to reduce the data that has to go down the wire.
Smart encoders that send only the changes from frame to frame would help with the display rates and quality.
There are ways to lower latency on the wire as well, from setting up the best interconnects in the datacentre to the location.

You don't always need more bandwidth if you are not making the best use of your wire.

I just know that on my internet connection at 10mbps (max), most of these services would be unplayable during the times I want to play, which is more of an issue for me.
ISPs will traffic-shape data anyway unless you are willing to pay more for a 'gamer' package.

Posted:2 years ago

#7

Klaus Preisinger Freelance Writing

I am very surprised to see current cloud streaming operators encode the entire picture. There would be leaps in quality if the UI were rendered locally, while the cloud streamed only the moving picture behind the UI.

h.264 wastes massive amounts of bandwidth expressing all those tiny details of the UI, the small text and so on. To some extent the codec tries to separate the UI from the moving parts, but the sheer number of precise borders between unmoving UI and moving picture utterly ruins any gains.

Try it yourself. Encode a video once, then encode it again with one line of black pixels anywhere on the screen. At the bitrates cloud services have to operate, you will notice a drop in quality. Now imagine a one pixel line around every UI element. That is a massive amount of noise introduced.

If a game is optimized for cloud operators, it will not make such rookie mistakes. But it also won't be as cheap as slapping the PC version on some server.

Posted:2 years ago

#8

Neil Millstone Director, White Bat Games

My feeling is that OnLive and the rest are being very smart. They're jumping on this market now while they're the only ones doing it, so that once the general network performance improves to the point where it's more practical to more people, they will have all the experience and tech necessary to do well.

At the moment, I'm sure that they're not charging enough to cover their costs, but the cost always comes down. When they do, OnLive will be in a great position to capitalise on it.

Posted:2 years ago

#9

Vitalii Moskalets Game Designer, GameLoft

At least for now, Cloud gaming must be at least the SAME price as if I buy the game, because when I buy it, I don't need to pay a subscription to play single-player. If I need to buy the game and then pay a subscription to play through the Cloud, then it MUST be EQUALLY IDENTICAL to what I have now on consoles and PC.

Another issue is multiplayer. It would be interesting to read some analysis of how multiplayer games fare in the Cloud. Will latency get bigger? Or will it really be the same as single-player?

In general, the problem in fact is the money :) If I bought a PS3, then I can play my game for several years (especially if it is multiplayer) without any subscription, with good quality. Will Cloud gaming be able to provide me that opportunity? To play games without a subscription payment? I guess not, and that is why games will become more expensive through such a service.
In terms of "but you will not need to buy an expensive PC/console":
- Console = $200-300 for 6 years easily :) And then only pay for games
- PC = that can depend on region, and if you don't include the monitor, then I think $1500-2000 per 3 years easily, ONLY if you are a pro gamer and want to play ALL the latest games at 2500x1600 with 16x anti-aliasing and so on. In general, if you buy the top available tech now, it will easily work for 5 years, until you feel that new games start to run slowly on medium GFX.

Also, in the perfect situation for Cloud companies, you want to get all console players onto your service - that means maybe 20 million players could start to play games. The load on servers? OK, maybe locally it will not exceed 5-10 million per datacentre, but anyway, have they calculated all the tech costs? And we return to the point that it could be expensive and will increase the cost of the game.

It will be interesting to see how they develop all this, especially when there are rivals. OnLive and Gaikai and all the others will have separate datacentres I suppose, but a player won't subscribe to all the services - probably he will choose only one.

Posted:2 years ago

#10

Manoel Balbino Programmer, Playlore

@Klaus: the problem is that rendering the UI in the client would require some of the game code to execute in the client as well. This would greatly increase the system complexity and reduce portability, as well as requiring much more work to port the games (the cloud services are still several years away from getting exclusive games).

A more general solution would be to use two overlaid video streams: the in-game stream (business as usual) and the UI stream (using completely different codec settings). Developers would then only need to render the UI into a separate render target/display device instead of dealing with all the complications of sending executable code to the clients.

Posted:2 years ago

#11

Dominic Jakube Student

Well, the 8th-gen consoles are already on their way, so they have about 5-8 years to get their stuff together. It's hard to imagine what the net infrastructure will be like then; I would say 90% of the western world will have 50Mbit fibre or faster accessible. But that 10% who don't have it available, and the others who have no need for the internet but might want to play an occasional game, will still be a market.

In a way the streaming game services might always be chasing the dragon: more customers, more general net usage, more ISP shaping/throttling, more detailed games. When they can handle 7 million Call of Duty players on release day, the hardware companies might play hardball and restrict licensing on their hardware. This could put the start-ups out of business if, say, Sony demanded a timed streaming exclusive on PSN for a physical media release.

Cloud-based gaming might be the future, but it won't be OnLive and Gaikai at the helm - it will be Microsoft/Sony/Nintendo and maybe Apple or Google, either through hardball tactics or buy-outs.

Edited 1 times. Last edit by Dominic Jakube on 8th February 2012 6:00pm

Posted:2 years ago

#12

Curt Sampson Software Developer
I found the first sentence of the summary very interesting.

"In the here and now, we are being asked to take the hit in fidelity and response in favour of a more convenient experience."

Doesn't that exactly describe why so many of us play consoles on televisions rather than PCs?

That, more than anything else, makes me think that cloud gaming has a chance. (Though not with me, of course; I'm sticking with my console....)

Posted:2 years ago

#13

Felix Leyendecker Senior 3D Artist, Crytek

As a developer, the thought of cloud gaming becoming the dominant model sounds intriguing at first glance, because it could mean unlimited hardware power at your disposal. No more catering to console or low-spec PC hardware limitations.
You can't help but wonder what this would mean for the business side of things, though. With AAA devs one-upping each other constantly, development costs would quickly spiral out of control, and with them, development times. So it might be hard to justify a subscription-based model with release dates stretched thin for every publisher except the very largest ones.

Posted:2 years ago

#14

Kieren Bloomfield Software Engineer, EA Sports

@Curt Agreed, all you have to do is hit a level of quality that is 'good enough' for the mainstream consumer. It happened with audio: the transition from vinyl to CD and then to MP3 was a drop in quality for audiophiles but acceptable or unnoticeable to the average consumer. Another parallel can be made with the final output device; it doesn't matter how good the content is if the screen or speakers are only average at best.

The only challenge for me in this is the internet providers themselves. I have a fast enough connection for this already, but how quickly will I rip through my monthly usage allowance using services like this and what will happen to my connection speeds when everyone starts playing?

Posted:2 years ago

#15

Klaus Preisinger Freelance Writing

@Manoel
I agree with you, rendering the UI costs CPU time on the target device and requires an engine to receive and process the data. But the jump in quality is just too good to ignore this approach in the long run.

Layering two streams is problematic at best. Devices might struggle to decode two h.264 streams at the same time, because they are not made for it. Codecs with alpha channel support aren't exactly where they were supposed to be a few years back. Worst case is the codec implementation on the target device screwing up and drawing giant black boxes.

Special cloud versions of games, with support for a UI drawing engine that is part of the cloud gaming app running on the target device, seem more reasonable to me.

Posted:2 years ago

#16

Kevin Patterson musician

As long as ISPs want to cap bandwidth usage, I cannot see cloud gaming as the future, but more as an option.
I prefer to have a console where I do not have to be online to play a game.

Posted:2 years ago

#17

Andrew Benton

Good read, loved everyone's comments too!

Posted:2 years ago

#18

Jeffrey Kesselman CTO, Nphos

Despite Onlive's claims, Einstein proved that instantaneous communication over distance is impossible.

Latency will always be an issue, as will bandwidth to a lesser degree.

With respect, Runescape and other MMOs are *not* "cloud" games. They are local games with a communication and back-end component. All rendering is done locally. All user response is handled locally in order to hide the inherent latencies.

There is a world of difference between the very practical art of online game architecture, social communication networks like GFACE, and the pig-in-a-poke OnLive is selling: an effectively dumb client with all rendering and game logic at the server.

Edited 2 times. Last edit by Jeffrey Kesselman on 14th February 2012 5:46pm

Posted:2 years ago

#19
