
Digital Foundry

Tech Focus: The battle against latency

Wed 06 Jul 2011 6:45am GMT / 2:45am EDT / 11:45pm PDT
Digital Foundry

Measuring input lag, and how devs have reduced its impact on the gameplay experience

An element we often discuss in Digital Foundry articles is controller lag - the delay between a player's pad input and the resultant action kicking off on-screen. It's fair to say that over the last few years, as displays and rendering technology have increased in sophistication, so the responsiveness of our games has diminished. This is not a good thing. Typically, the more responsive our games are, the more we feel connected to the experience, and the more immersive and satisfying the game feels to play.

Compare an old sprite-based Mega Drive shooting game played on a CRT TV of the era with one of today's FPS games on a flat panel display and the difference is quite remarkable. We have been conditioned to accept a growing level of latency - and it's exactly this kind of perception that makes cloud gaming systems like OnLive and Gaikai work. The lag is definitely there; gamers have simply grown accustomed to it. Even the design of video game controllers has added to the problem of less responsive games: PlayStation Move is a beautifully designed device, but it's still slower to track movements than the DualShock 3's analogue sticks, and the less said about Kinect's latency issues, the better.

However, there's undoubtedly an increasing realisation in the development community that understanding and lowering input lag leads to a better gameplay experience, and several studios have been in touch with us to discuss the methodology involved in measuring latency - as good a reason as any for this article.

Typically, the more responsive our games are, the more we feel connected to the experience, and the more immersive and satisfying the game feels to play.

While latency measurement can be carried out from within code, a growing number of developers are using outside measurement as a tool in identifying areas where controller response can be improved. The techniques for latency measurements were initially published by Neversoft's Mick West, who has written a couple of excellent articles for Gamasutra on the reasons why we have latency in games and indeed how to measure it - required reading.

West's technique is very simple: film the screen using a high-speed camera (he used a Canon PowerShot point-and-shoot camera with a 60FPS video mode), but make sure that the controller and the display are in the same shot. You can then simply count the number of frames between the moment a button is pressed and the resulting action occurring on-screen. At 60FPS, each frame lasts 16.67ms, so a delay of, say, six frames equates to 100ms of latency.
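
To make the arithmetic concrete, here is a minimal sketch of the frame-counting conversion in Python; the frame counts and camera rates are illustrative values rather than measurements from our own tests.

```python
def latency_ms(frames_counted: int, camera_fps: float = 60.0) -> float:
    """Convert a count of camera frames between button press and
    on-screen response into milliseconds of latency."""
    frame_duration_ms = 1000.0 / camera_fps
    return frames_counted * frame_duration_ms

# Six frames counted on a 60FPS recording:
print(f"{latency_ms(6):.1f}ms")          # 100.0ms
# The same delay captured on a 120FPS camera spans twelve frames,
# but is resolved in finer ~8.3ms steps:
print(f"{latency_ms(12, 120.0):.1f}ms")  # 100.0ms
```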

Complicating matters somewhat is the presence of latency in another area: the display itself. An old-skool CRT display produces an image free of lag. However, modern flat-panel screens all suffer from latency to one degree or another: indeed, the popular "game mode" found on many HDTVs is designed to turn off as much latency-inducing post-processing as possible, to give a faster, crisper response.

West's solution was ingenious. He hooked up a CRT to his source and measured the responsiveness of the PS3 XMB in order to produce a baseline measurement of 50ms - or three frames. He then repeated the same experiment on his LCD, with anything above three frames being a direct result of additional latency within the display itself. This extra lag can then be factored out of any additional testing you may undertake on your LCD or plasma display.

What is worth adding at this point is that further measurements may be required according to the resolution of the source. Feed a 720p image to a 1080p/1200p display and you can expect more lag compared to presenting it with a native "full HD" signal: the screen has to perform an additional processing step, scaling the 720p image up to its native resolution. On our "oldie but goodie" office Dell 2405FPW display, we found that the screen added an additional three frames or 50ms (!) of lag when given a 720p input, but "only" two when fed 1080p.
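
As a small, hedged sketch of how that display baseline is factored out, the per-resolution lag values below are the Dell 2405FPW figures quoted above; any other screen would need its own CRT-referenced baseline measurement.

```python
CAMERA_FPS = 60.0
FRAME_MS = 1000.0 / CAMERA_FPS

# Extra frames of lag this particular display adds over a CRT baseline,
# dependent on the resolution of the input signal.
DISPLAY_LAG_FRAMES = {"720p": 3, "1080p": 2}

def game_latency_ms(raw_frames: int, input_resolution: str) -> float:
    """Subtract the display's own lag from a raw end-to-end measurement,
    leaving only the latency attributable to the game and console."""
    return (raw_frames - DISPLAY_LAG_FRAMES[input_resolution]) * FRAME_MS

# Nine frames measured camera-to-screen on a 720p source:
print(f"{game_latency_ms(9, '720p'):.1f}ms")  # 100.0ms down to the game itself
```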

When putting together the original Digital Foundry article on console lag, we did come across a couple of problems in using Mick West's methodology. Firstly, filming both controller and screen simultaneously wasn't particularly comfortable, and secondly, there was an element of doubt about the exact point at which the button was pressed. As this is an exercise in measurement, the introduction of any uncertainty into the results wasn't particularly welcome, and the nature of filming both controller and screen at the same time could make extended filming sessions similarly imprecise: what if your hand moves out of shot, for example, or what if the camera's auto-focus drifts away from what you really need to see?

The issue was resolved by Call of Duty makers Infinity Ward, who commissioned modsmith extraordinaire Benjamin Heckendorn to produce a custom Xbox 360 controller connected to a board that lit LEDs corresponding to the player's inputs.

"We commissioned Ben to make us the light board after a programmer saw me spending a lot of time filming myself pressing buttons in front of a CRT to test input latency," Drew McCoy of Infinity Ward (now Respawn) told us during our initial experiments.

"He, being a programmer, was obviously frustrated that such an imprecise method was used to test something that he and the rest of the engineers here at Infinity Ward spend a great deal of time and energy on - reducing input latency."

In short, Mick West's methodology was improved by taking out the uncertainty over when the button was being pressed, giving a straight digital indication of the "zero frame" event - the point at which you start measuring. Now it's possible to simply film the screen and board in one easy shot, and Heckendorn's custom controllers are available to anyone willing to buy them, with both an Xbox 360 version and a more difficult to make PlayStation 3 version on offer.

The only way to improve the testing still further is to increase the sample rate of the camera itself. In line with West's suggestions, we used the 60FPS video mode of a cheap Kodak Zi6 camera. However, choosing a faster camera would help ensure a better measurement, and there are a number of appropriate, cost-effective models out there: the Nikon Coolpix P100, Casio Exilim EX-FH100 and Fujifilm HS10 will all film 480p at 120FPS, for example, while the Casio ZR100 and ZR10 yield a 432x320 image at 240FPS. Switching to a faster camera would also mitigate issues where the LED on the controller monitor board activates partway through the exposure of a particular frame while filming.
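
The reason a faster camera helps comes down to measurement granularity: frame counting can only resolve latency to within one camera frame, as this small illustrative calculation shows.

```python
# How measurement granularity improves with camera capture rate: the true
# latency can only be pinned down to within one frame of the recording.
for camera_fps in (60, 120, 240):
    step_ms = 1000.0 / camera_fps
    print(f"{camera_fps}FPS capture: latency resolved in {step_ms:.2f}ms steps")

# 60FPS capture: latency resolved in 16.67ms steps
# 120FPS capture: latency resolved in 8.33ms steps
# 240FPS capture: latency resolved in 4.17ms steps
```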

So what do the results typically tell us about latency in console games? Well, there is a clear link between frame-rate and controller response. For 60FPS titles, in optimum conditions we typically see response at four frames, or 66ms, with titles such as Call of Duty 4, Guitar Hero and Ridge Racer 7 offering this exceptional level of response. However, games such as WipEout saw an increase to 83ms, while our experiments with LEGO Batman saw the game respond fairly sluggishly at well over 100ms.

For games that operate in the 30FPS range, latencies increase significantly. Up until recently, it was impossible to find any game that offered a sub-100ms response. Titles such as Unreal Tournament III hit 100ms, but other Unreal Engine 3 games such as Mirror's Edge and Bulletstorm exhibited 133ms between button press and on-screen response.

It's interesting to note that Bulletstorm received no criticism that we were aware of for its controller response, and yet Guerrilla Games' Killzone 2 did. We measured the latter at a baseline latency of 150ms - just one frame more than Bulletstorm. The question here is intriguing: is there a threshold at which latency becomes noticeable and detrimental to the gameplay experience?

Going forward, hammering down latency is going to become increasingly important if developers are looking for the best performance when readying a game for a service like OnLive.

It's worth bearing in mind that these measurements do not include the additional lag of the display. In the case of Killzone 2, married up to our Dell 2405FPW with its own three-frame lag at 720p, we would be seeing a minimum of 200ms - a fifth of a second. The issue with Killzone 2 is that the 150ms is just the baseline. Frame-rate drops can add two or three more frames to the delay, and it is perhaps this inconsistency that is one of the major problems: players cannot adjust to the controls because the level of response is not consistent. Guerrilla Games patched the title, but the only real adjustments seemed to be in the "dead zones" of the analogue sticks rather than addressing the level of response itself.

Things changed radically with the release of Killzone 3, with our measurements indicating that the baseline 150ms latency had dropped down to 116ms - a substantial achievement.

"There were a lot of changes to all systems related to input, movement, gunplay and player response," Guerrilla's technical director Michiel van der Leeuw told us during an in-depth GamesIndustry.biz inquisition on the development of Killzone 3.

"We fixed a few bugs where we inadvertently added latency, but we also tightened up deadzones and the responsiveness of particle effects to show your bullet hit something. Anything that wouldn't change the weight, but improve the feel and control of the player we polished up."

The high-speed camera technique was also utilised by the tech team in helping to reduce unwanted lag.

"We did a lot of testing with high-speed cameras to verify our changes and see if we were still on track. We also added a system to the engine that would track an event through the engine from the initial button press as it would travel through code, all the way down to when the RSX would send it to the TV," van der Leeuw added.

"So you can see when the button was pressed, when the game code responded to it, when the bullet was fired, when the particle was spawned and so on. If the high-speed camera then indicated we had long latency or jitter, we could pin-point the exact problem."

An example of latency measurement in action. Here we're testing Criterion Games' Need for Speed: Hot Pursuit - a title that has been optimised for ultra-fast response times at 30FPS (console) and 60FPS (PC).

Criterion's superb, multi-award-winning Need for Speed: Hot Pursuit managed to cut latencies to what we believe is an all-time low for a state-of-the-art current generation console game. Having worked in the 60Hz realm with all its previous Burnout titles, the developer was keen to make its first 30FPS game as responsive as possible, and this required an extensive reworking of the engine, binning the traditional approach of running game code and rendering on different threads in favour of a single thread.

The result is quite extraordinary. Criterion's last major game, Burnout Paradise, ran at 60FPS and offered 66ms of input lag. Need for Speed: Hot Pursuit, capped at 30FPS, offers an 83ms response: just one additional frame of latency for a game running at half the frame-rate of its predecessor. The PC version was faster still, measured at an incredible 50ms of end-to-end lag.

"The way the architecture works is to run the game simulation internally at 60FPS, and it's polling the controller once for every simulation step so you get as up-to-date inputs as possible," Criterion tech mastermind, Alex Fry told us.

"The render code (building display lists for the GPU to consume the next frame) immediately follows the two controller poll/simulate loops, and then it waits for v-sync. Thus on CPU we get two 60FPS updates and one 30Hz render in a total of 33ms. When the CPU is done and v-sync is hit, the GPU kicks off and renders the scene while the next simulate/render frame happens in parallel on CPU. Once done the GPU also waits for v-sync (which also syncs it to the CPU), thus adding another 33ms. Then TV scan-out happens to get the final image to screen, which adds a final 16.6ms. On the PC version, it's obviously able to go to 60FPS so we don't clamp it to two simulate loops per render."

Going forward, hammering down latency is going to become increasingly important if developers are looking for the best performance when readying a game for a service like OnLive. The lower the baseline latency, the lower the impact of the additional, completely unavoidable lag caused by video encoding, transmission of data and video decoding client-side. But on a general level, the tighter the controller response, the better the game experience typically becomes...

12 Comments

Neil Alphonso Lead Designer, Splash Damage Ltd

Excellent, excellent article!

Posted:3 years ago

#1

Lewis Brown Snr Sourcer/Recruiter, Electronic Arts

Great work. I've played most of the games discussed except Killzone 1 & 2, and only noticed it a couple of times on Bulletstorm, but not enough for me to find it annoying; otherwise it's not something I have ever noticed...

Posted:3 years ago

#2

Klaus Preisinger Freelance Writing

Try Super Meatboy with a laggy controller and you won't have a snowball's chance in hell. Same level with a different controller having less lag and the level suddenly is not a problem at all.

There really is something to those ultra-polling peripherals sold for the PC.

Posted:3 years ago

#3

John Donnelly Quality Assurance

This is another area where there is a level of crossover between games and what I do now.
In my case it's all network-based latency, but if cloud-based gaming is to happen, both network and input/display latency need to be tackled.

Posted:3 years ago

#4

Justin King Creative Digital Producer, mattel

Wait until services like OnLive go mainstream; between data caps and net latency issues, I don't think people will be focusing too much on the controller issues :o)

Posted:3 years ago

#5

Glen Cooley Community Representative

It would be good to see this sort of analysis on sports games like FIFA or Madden where high fidelity of control and latency really do shape the experience.

Posted:3 years ago

#6

Jeffrey Kesselman CTO, Nphos

It is important to understand that lag and latency are not the same thing. Lag is perceptual: it is a description of what the user experiences. Lag is complex and has many component causes. The only proper measure of lag is user experience.

Latency, on the other hand, is a physical measure of the time between two events. Communication latency is a reality on the internet and is not going away as long as Einstein's physics holds true. This is not to say that it could not be reduced with internet improvements, but it can never be eliminated, because instantaneous communication of information over a distance is impossible: it is limited by the speed of light.

MMORPGs and other well written internet games lessen lag within the confines of network latency by using perceptual tricks such as dead reckoning and correction smoothing. These are based deeply in the design of the individual game and thus unavailable to services like OnLive that attempt to take games not designed for such latencies and play them over the net.

Edited 3 times. Last edit by Jeffrey Kesselman on 6th July 2011 6:02pm

Posted:3 years ago

#7

Jeffrey Kesselman CTO, Nphos

@justin
"wait until services like onlive go mainstream"

Unfortunately we will have no light by which to measure these things, the sun having gone nova many years before...

OnLive is at best a fad. It is a bad idea that technically could never work well. The right answer is digital distribution with local execution: it is both the highest quality and the most economically viable method for distributing content.

Edited 1 times. Last edit by Jeffrey Kesselman on 6th July 2011 5:59pm

Posted:3 years ago

#8

Andrew Ihegbu Studying Bsc Commercial Music, University of Westminster

I can see a small obsession with OnLive here, and I'm not sure if it's within the industry or just this site, but I am adamant that cloud gaming's only real advantage is a higher level of DRM, and while this was a good article, that spoiled it for me.

Posted:3 years ago

#9

Tony Johns

When all the controllers went wireless, that is when controller lag started to come in, from what I saw.

I would be happy to go back to cords any day, but it all depends on what sort of game you are playing, how fast the action is, and what is needed in order to overcome that lag.

Posted:3 years ago

#10

Mitchell Ribar Co-Founder & CEO, HeroPing

Latency is a topic very close to my heart. Critically, this article discusses only half the equation. As already raised in previous comments, it is key to manage Internet latency too. Every game has online multiplayer at some point - or all the time, like MMOs - so managing both Internet latency and controller latency are the two sides of making sure the game experience is optimized.

Controller latency is a correctable known, but Internet latency is an unknown - a HUGE number of factors are in play. I'm biased, but I wish more developers would embed my company's platform into their games because at the end of the day, people will blame the developers for the laggy experience.

Key example: Do a search on the World of Warcraft forums for Internet latency based problems and feel the anger these gamers post with... it doesn't go to the ISPs... the burden lies with the developer.

Edited 1 times. Last edit by Mitchell Ribar on 8th July 2011 8:43am

Posted:3 years ago

#11
Great article. I was uninterested in the first page, but the second is above all my expectations.
Congrats.

Posted:3 years ago

#12
