Tech Focus: The battle against latency
Measuring input lag, and how devs have reduced its impact on the gameplay experience
An element we often discuss in Digital Foundry articles is controller lag - the delay between a player's pad input and the resultant action kicking off on-screen. It's fair to say that over the last few years, as displays and rendering technology have increased in sophistication, so the responsiveness of our games has diminished. This is not a good thing. Typically, the more responsive our games are, the more we feel connected to the experience, and the more immersive and satisfying the game feels to play.
Compare an old sprite-based Mega Drive shooting game played on a CRT TV of the era with one of today's FPS games on a flat panel display and the difference is quite remarkable. We have been conditioned to accept a growing level of latency - and it's exactly this kind of acclimatisation that makes cloud gaming systems like OnLive and Gaikai work: the lag is definitely there, it's just that gamers have learned to live with it. Even the design of video game controllers has added to the problem of less responsive games: PlayStation Move is a beautifully designed device, but it's still slower to track movements than a DualShock 3's analogue sticks, and the less said about Kinect's latency issues, the better.
However, there's undoubtedly an increasing realisation in the development community that understanding and lowering input lag leads to a better gameplay experience, and several studios have been in touch with us to discuss the methodology involved in measuring latency - as good a reason as any for this article.
While latency measurement can be carried out from within code, a growing number of developers are using outside measurement as a tool for identifying areas where controller response can be improved. The techniques for latency measurement were initially published by Neversoft's Mick West, who has written a couple of excellent articles for Gamasutra on why we have latency in games and how to measure it - required reading.
West's technique is very simple: film the screen using a high-speed camera (he used a Canon PowerShot point-and-shoot with a 60FPS video mode), making sure that the controller and the display are in the same shot. You can then simply count the number of frames between the moment a button is pressed and the resulting action occurring on-screen. At 60FPS, each frame represents 16.67ms, so a delay of, say, six frames equates to 100ms of latency.
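The frame-count arithmetic is trivial to express. A quick sketch (the function name is ours, not West's):

```python
def frames_to_latency_ms(frame_count, camera_fps=60):
    """Convert a counted frame delay into milliseconds of latency."""
    return frame_count * 1000.0 / camera_fps

# Six frames counted at 60FPS equates to 100ms, as described above.
print(frames_to_latency_ms(6))  # 100.0
```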
Complicating matters somewhat is the presence of latency in another area: the display itself. An old-skool CRT display produces an image free of lag. However, modern flat-panel screens all suffer from latency to one degree or another: indeed, the popular "game mode" found on many HDTVs is designed to turn off as much latency-inducing post-processing as possible, to give a faster, crisper response.
West's solution was ingenious. He hooked up a CRT to his source and measured the responsiveness of the PS3 XMB in order to produce a baseline measurement of 50ms - or three frames. He then repeated the same experiment on his LCD, with anything above three frames being a direct result of additional latency within the display itself. This extra lag can then be factored out of any further testing you undertake on your LCD or plasma display.
What is worth adding at this point is that further measurements may be required according to the resolution of the source. Feed a 720p image to a 1080p/1200p display and you can expect more lag compared to presenting it with a native "full HD" signal: the screen needs to add an additional processing element: scaling the 720p image to the native resolution of the display. On our "oldie but goodie" office Dell 2405FPW display, we found that the screen added an additional three frames or 50ms (!) of lag when given a 720p input, but "only" two when fed 1080p.
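Putting the two steps together, the display's own lag - established against the CRT baseline at the relevant input resolution - is simply subtracted from the total measured delay. A minimal sketch (a 60FPS camera and the example frame counts are our assumptions):

```python
CAMERA_FPS = 60
FRAME_MS = 1000.0 / CAMERA_FPS  # 16.67ms per captured frame

def game_latency_ms(measured_frames, display_lag_frames):
    """Isolate the game's own latency by subtracting the display's lag
    (measured against a CRT baseline at the same input resolution)."""
    return (measured_frames - display_lag_frames) * FRAME_MS

# e.g. ten frames measured in total on a display that adds two frames at 1080p
print(game_latency_ms(10, 2))  # ~133.3ms attributable to the game itself
```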
When putting together the original Digital Foundry article on console lag, we did come across a couple of problems with Mick West's methodology. Firstly, filming both controller and screen simultaneously wasn't particularly comfortable, and secondly, there was an element of doubt about the exact point at which the button was pressed. As this is an exercise in measurement, the introduction of any uncertainty into the results wasn't particularly welcome, and the nature of filming both controller and screen at the same time could make extended filming sessions similarly imprecise: what if your hand moves out of shot, for example, or what if the camera auto-focuses away from what you really need to see?
The issue was resolved by Call of Duty makers Infinity Ward, who commissioned modsmith extraordinaire Benjamin Heckendorn to produce a custom Xbox 360 controller, connected to a board that lights LEDs corresponding to the player's inputs.
"We commissioned Ben to make us the light board after a programmer saw me spending a lot of time filming myself pressing buttons in front of a CRT to test input latency," Drew McCoy of Infinity Ward (now Respawn) told us during our initial experiments.
"He, being a programmer, was obviously frustrated that such an imprecise method was used to test something that he and the rest of the engineers here at Infinity Ward spend a great deal of time and energy on - reducing input latency."
In short, Mick West's methodology was improved by taking out the uncertainty of when the button was being pressed, giving a straight digital indication of the "zero frame" event - the point at which you start measuring. Now it's possible to simply film the screen and board in one easy shot, and Heckendorn's custom controllers are available to anyone willing to buy them, with both the Xbox 360 version and the more difficult-to-make PlayStation 3 version available.
The only way to improve the testing still further is to increase the sample rate of the camera itself. In line with West's suggestions, we used the 60FPS video mode on a cheap Kodak Zi6 camera. However, a faster camera ensures a more precise measurement, and there are a number of appropriate, cost-effective models out there: the Nikon Coolpix P100, Casio Exilim EX-FH100 and Fujifilm HS10 will all film 480p at 120FPS, for example, while the Casio ZR100 and ZR10 yield a 432x320 image at 240FPS. Switching to a faster camera mitigates the issue whereby the LED on the controller monitor board activates partway through the exposure of a particular frame while filming.
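The benefit of a faster camera falls straight out of the arithmetic: because the LED can light at any point within a single frame's exposure, the worst-case quantisation error is one full frame interval. A quick sketch:

```python
def measurement_resolution_ms(camera_fps):
    """Worst-case quantisation error of the filming technique: the LED may
    activate at any point within one frame's exposure, so the uncertainty
    is a single frame interval."""
    return 1000.0 / camera_fps

for fps in (60, 120, 240):
    print(f"{fps}FPS camera: +/-{measurement_resolution_ms(fps):.2f}ms")
```

At 240FPS the uncertainty shrinks from a 60FPS camera's 16.67ms to around 4.17ms - comfortably below a single frame of game output.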
So what do the results typically tell us about latency in console games? Well, there is a clear link between frame-rate and controller response. For 60FPS titles, in optimum conditions we typically see response at four frames, or 66ms, with titles such as Call of Duty 4, Guitar Hero and Ridge Racer 7 offering this exceptional level of response. However, games such as WipEout saw an increase to 83ms, while our experiments with LEGO Batman saw the game respond fairly sluggishly at well over 100ms.
For games that operate in the 30FPS range, latencies increase significantly. Up until recently, it was impossible to find any game that offered a sub-100ms response. Several titles, such as Unreal Tournament III, hit 100ms, but other Unreal Engine 3 games such as Mirror's Edge and Bulletstorm exhibited 133ms of latency between button press and on-screen response.
It's interesting to note that Bulletstorm received no criticism that we were aware of for its controller response, and yet Guerrilla Games' Killzone 2 did. We measured the latter at a baseline latency of 150ms - just one frame more than Bulletstorm. The question here is intriguing: is there a threshold at which latency becomes noticeable and detrimental to the gameplay experience?
It's worth bearing in mind that these measurements do not include the additional lag of the display. In the case of Killzone 2, married up to our Dell 2405FPW with its own three-frame lag at 720p, we would be seeing a minimum of 200ms - a fifth of a second. The issue with Killzone 2 is that the 150ms is just the baseline: frame-rate drops can add two or three more frames to the delay, and it is perhaps this inconsistency that is the major problem - players cannot adjust to the controls because the level of response is not consistent. Guerrilla Games patched the title, but the only real adjustments appeared to be to the "dead zones" of the analogue sticks rather than to the level of response itself.
Things changed radically with the release of Killzone 3, with our measurements indicating that the baseline 150ms latency had dropped down to 116ms - a substantial achievement.
"There were a lot of changes to all systems related to input, movement, gunplay and player response," Guerrilla's technical director Michiel van der Leeuw told us during an in-depth GamesIndustry.biz inquisition on the development of Killzone 3.
"We fixed a few bugs where we inadvertently added latency, but we also tightened up deadzones and the responsiveness of particle effects to show your bullet hit something. Anything that wouldn't change the weight, but improve the feel and control of the player we polished up."
The high-speed camera technique was also utilised by the tech team in helping to reduce unwanted lag.
"We did a lot of testing with high-speed cameras to verify our changes and see if we were still on track. We also added a system to the engine that would track an event through the engine from the initial button press as it would travel through code, all the way down to when the RSX would send it to the TV," van der Leeuw added.
"So you can see when the button was pressed, when the game code responded to it, when the bullet was fired, when the particle was spawned and so on. If the high-speed camera then indicated we had long latency or jitter, we could pin-point the exact problem."
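Van der Leeuw's description suggests a simple pattern: timestamp the same logical event at each stage of the pipeline, from pad input down to GPU submission. A minimal Python sketch of that idea (class, method and stage names are our own invention, not Guerrilla's):

```python
import time

class InputEventTracker:
    """Track an input event through successive engine pipeline stages -
    a hypothetical sketch in the spirit of the system Guerrilla describe."""

    def __init__(self):
        self.stamps = []

    def mark(self, stage):
        # Record a high-resolution timestamp for this pipeline stage.
        self.stamps.append((stage, time.perf_counter()))

    def report(self):
        # Express each stage as milliseconds elapsed since the first mark.
        t0 = self.stamps[0][1]
        return [(stage, (t - t0) * 1000.0) for stage, t in self.stamps]

# One event's journey through the frame, from pad input to GPU submission:
tracker = InputEventTracker()
for stage in ("button_press", "game_logic", "bullet_fired",
              "particle_spawned", "gpu_submit"):
    tracker.mark(stage)

for stage, ms in tracker.report():
    print(f"{stage}: +{ms:.3f}ms")
```

Cross-referencing a report like this against the high-speed camera footage is what lets you pinpoint exactly where any unexpected latency or jitter creeps in.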
Criterion's superb, multi-award-winning Need for Speed: Hot Pursuit managed to cut latencies to what we believe is an all-time low for a state-of-the-art current-generation console game. Having worked in the 60Hz realm with all its previous Burnout titles, the developer was keen to make its first 30FPS game as responsive as possible, and this required an extensive reworking of the engine, binning the traditional approach of running game code and rendering on different threads in favour of a single thread.
The result is quite extraordinary. Criterion's last major game, Burnout Paradise, ran at 60FPS and offered 66ms of input lag. Need for Speed: Hot Pursuit, capped at 30FPS, offers an 83ms response: just one additional frame of latency for a game running at half the frame-rate of its predecessor. The PC version was faster still, measured at an incredible 50ms of end-to-end lag.
"The way the architecture works is to run the game simulation internally at 60FPS, and it's polling the controller once for every simulation step so you get as up-to-date inputs as possible," Criterion tech mastermind Alex Fry told us.
"The render code (building display lists for the GPU to consume the next frame) immediately follows the two controller poll/simulate loops, and then it waits for v-sync. Thus on CPU we get two 60FPS updates and one 30Hz render in a total of 33ms. When the CPU is done and v-sync is hit, the GPU kicks off and renders the scene while the next simulate/render frame happens in parallel on CPU. Once done the GPU also waits for v-sync (which also syncs it to the CPU), thus adding another 33ms. Then TV scan-out happens to get the final image to screen, which adds a final 16.6ms. On the PC version, it's obviously able to go to 60FPS so we don't clamp it to two simulate loops per render."
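Fry's breakdown can be sanity-checked with some simple arithmetic: one 30Hz CPU frame, one 30Hz GPU frame and one 60Hz scan-out period add up to the 83ms we measured. A worked example (assuming, as a best case, that input is polled right at the start of the CPU window):

```python
FRAME_60HZ = 1000.0 / 60.0  # 16.67ms per 60Hz period
FRAME_30HZ = 1000.0 / 30.0  # 33.33ms per 30Hz period

cpu_ms = FRAME_30HZ      # two 60Hz simulate steps plus render submission
gpu_ms = FRAME_30HZ      # GPU renders while the next CPU frame runs in parallel
scanout_ms = FRAME_60HZ  # TV scan-out of the finished image

total = cpu_ms + gpu_ms + scanout_ms
print(round(total, 1))  # 83.3 - five 60Hz frames, matching the measured 83ms
```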
Going forward, hammering down latency is going to become increasingly important if developers are looking for the best performance when readying a game for a service like OnLive. The lower the baseline latency, the lower the impact of additional, completely unavoidable lag caused by video encoding, transmission of data and video decoding client-side. But just on a general level, the tighter controller response, typically the better the game experience becomes...