
Digital Foundry

Tech Focus: Game Graphics vs. Movies

Wed 11 Jan 2012 8:00am GMT / 3:00am EST / 12:00am PST

Does resolution really matter? Developers discuss bringing a filmic look to next-gen titles

When thoughts turn to next-gen console technology, we seek to quantify the leap forward with absolute metrics - and resolution inevitably gets prominence. Looking at the current generation of consoles, technical requirements for Xbox 360 games at launch (which were quickly overlooked) suggested that games should run at a minimum of native 720p with 2x multi-sampling anti-aliasing. A reasonable expectation for next-gen is full-on 1080p with the equivalent of 4x MSAA, but to what extent does resolution actually matter?
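
As a very rough illustration of the gap between those two baselines - a back-of-the-envelope sample count only, ignoring bandwidth, tiling and compression details - the raw numbers look like this:

```python
# Rough coverage-sample counts for the two baselines above (illustrative only).
launch_era = 1280 * 720 * 2    # native 720p with 2x MSAA
next_gen = 1920 * 1080 * 4     # full 1080p with the equivalent of 4x MSAA

print(launch_era)              # 1,843,200 samples per frame
print(next_gen)                # 8,294,400 samples per frame
print(next_gen / launch_era)   # 4.5 - roughly four and a half times the raw samples
```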

An interesting discussion kicked off on the blog of NVIDIA's Timothy Lottes recently, where the creator of FXAA (an anti-aliasing technique that intends to give games a more filmic look) compared in-game rendering at 1080p with the style of visuals we see from Blu-ray movies.

"The industry status quo is to push ultra-high display resolution, ultra-high texture resolution, and ultra sharpness," Lottes concluded.

Do 1080p games super-sample compared to Blu-ray movies? Is the current focus on high contrast, high detail artwork the right approach for a more filmic next-gen?

"In my opinion, a more interesting next-generation metric is can an engine on an ultra high-end PC rendering at 720p look as real as a DVD quality movie? Note, high-end PC at 720p can have upwards of a few 1000s of texture fetches and upwards of 100,000 flops per pixel per frame at 720p at 30Hz."

Comparing screengrabs of a game (Skyrim running with a super-sampled anti-aliasing hack) with the Robert Downey Jr Iron Man movie, the NVIDIA man reckons that even at native 1080p with no MSAA, game rendering is still effectively super-sampling compared to the quality we see in theatrical presentations, and suggests that game developers could pursue a more filmic look using fewer pixels in concert with other processing techniques. Lottes noted that there is little or no single pixel-width detail in 1080p Blu-ray movies - the kind of detail we see in spades in ultra-precise PC presentation - suggesting that the same level of perceived detail could be resolved in games without recourse to a 1080p framebuffer, or alternatively by rendering at 1080p with a lot of filtering that effectively gives the look of a lower resolution.

The notion was endorsed by many games developers, with DICE's rendering architect Johan Andersson saying that "It's not about the amount of pixels, it is about the quality of the pixels and how the overall (moving!) picture looks like. Less aliasing = less noise for your brain to interpret = more pleasing and easier to see visuals."

"Something else that bothers me in most games these days is how much contrast there is in the textures," added Prey 2 lead graphics programmer Brian Karis.

"Having physically based material guidelines help but the artists seem to try everything they can to create higher contrast. The result in my opinion is crunchy, noisy and often nasty looking images. I'd call the status quo ultra sharpness, ultra contrast."

A good example of the approach being described here can be seen in a great many Unreal Engine 3 titles - with the likes of the Gears of War games and Enslaved cramming a phenomenal amount of high frequency detail into a 720p framebuffer.

"We do what is essentially MSAA. Then we do a lens distortion that makes the image incredibly soft (amongst other blooms/blurs/etc). Softness/noise/grain is part of film and something we often embrace. Jaggies we avoid like the plague and thus we anti-alias the crap out of our images," added Pixar's Chris Horne, adding an interesting CG movie perspective to the discussion - computer generated animation is probably the closest equivalent gaming has in Hollywood.

"In the end it's still the same conclusion: games oversample vs film. I've always thought that film res was more than enough res. I don't know how you will get gamers to embrace a film aesthetic, but it shouldn't be impossible."

To achieve a level of detail equivalent to today's full HD Blu-ray movies, it therefore follows that the processing requirements could fall quite dramatically from what was previously thought. You don't need MSAA, you don't need super-sampling. While the image may fall foul of pixel-counting in that it doesn't hit native 1080p, the precious rendering resources saved can be deployed elsewhere.

"A game could render at native 1080p resolution (with extra optional hardware AA or SSAA) then use a fast post effect filter (optionally combined with post AA to "estimate" shaded samples which don't physically exist) to apply a film-like depth of field and get a result similar to a 1080p BR film," Timothy Lottes added. "And that this process would have a filter kernel which would be at least two to three (of 1080p) pixels wide (and likely larger in practice to get a good window)."

The ambition of what is being suggested here shouldn't be underestimated. It would involve a lot of buy-in not just from the game-makers, but also from an audience that has become conditioned to a particular aesthetic. The perception of video game graphics in the HD era is that there's a purity to the look, a certain pristine edge to the visuals that has only been challenged by a very small number of releases (Limbo and Killzone 2 are two examples that spring to mind).

Even in some existing console titles, there's already evidence that the focus on detail is not so important and that rendering resources can be better deployed elsewhere

To truly work, it would also need to be accompanied by technologies that make the most of the "freed up" processing budget. Lighting technologies in particular would need to make a step up, but other elements such as animation and direction would also need to match. Achieving the filmic aesthetic is as much about how a game moves as how it looks.

However, in terms of the quality vs. quantity argument, there is already a range of evidence within the existing console generation that the focus on detail is not so important and that rendering resources can be better deployed elsewhere. Call of Duty is perhaps the most spectacular example, if not quite along the lines being suggested by Timothy Lottes. The biggest franchise in triple-A gaming does not render at a recognised high definition resolution: a typical 720p game offers an additional 50 per cent of resolution over the 1024x600 framebuffer utilised in Modern Warfare 3, with the developers using the resources available to hit a target 60 frames per second instead.
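
For reference, the 50 per cent figure falls straight out of the pixel counts - a simple check, nothing more:

```python
# Pixel counts behind the "additional 50 per cent" comparison above.
native_720p = 1280 * 720    # 921,600 pixels
mw3_buffer = 1024 * 600     # 614,400 pixels in Modern Warfare 3's framebuffer
print(native_720p / mw3_buffer)   # 1.5 - native 720p carries 50 per cent more pixels
```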

Other examples do tie in more closely with the ethos being suggested by Lottes: that it's not the pixel count that matters to image quality but rather the way those pixels are used. Namco Bandai's Japanese teams have experimented a great deal with resolution on current generation systems, most notably in Tekken 6. Without motion blur active, the Xbox 360 version renders natively at 1365x768, downscaling to 720p - using super-sampling to provide a degree of anti-aliasing. However - perhaps surprisingly - with motion blur active, resolution drops to 1024x576, yet the game demonstrably looks better. Not only does the blur add more realism to the movement, but more detail appears to be resolved in the textures even though base resolution is effectively halved.

Talking about Tekken 6 in particular, RedLynx's Sebastian Aaltonen - the main technical mind behind the brilliant Trials HD - discusses how the lower resolution mode frees up system resources for other forms of graphical processing:

"Both configurations fit well inside the 10MB eDRAM. The 1024x576 is kind of a strange choice, as it's only around half the pixels of the 1365x768 and the cost of the blur filter comes nowhere close to the performance gained from the resolution decrease, and they are not eDRAM limited either," Aaltonen observes.

"The resolution reduction itself is not something I consider strange, but a reduction this large means they have something else going on than just the motion blur. The better texture detail you are seeing could mean they have enabled anisotropic filtering for the lower resolution."

What Timothy Lottes is suggesting is something a whole lot more ambitious - a new approach to rendering from the ground upwards, encompassing both the engine and the creation of the core art assets.

Alan Wake is an excellent current gen example of how pixel count has been reduced with processing resources directed elsewhere. Despite a low 960x544 resolution, the game looks beautiful with very few of the visual artifacts associated with 'sub-HD' gaming.

A more dramatic current generation example would be Remedy's Alan Wake. The game actually operates with an even bigger resolution penalty than Tekken 6, with a native 960x544 framebuffer - but the combination of 4x MSAA, phenomenal lighting effects and post-processing work ensures that the game does not suffer for it: aliasing and the dreaded "jaggies" are not especially an issue (though screen-tear definitely is). Remedy itself says that Timothy Lottes' FXAA is utilised in the American Nightmare Xbox Live Arcade sequel, most likely in place of the 4x MSAA of the original - so it will be interesting to see if resolution is increased, or if the processing resources are instead used on cleaning up the intrusive lack of v-sync.
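
For context on just how far below the usual HD targets that framebuffer sits, the raw pixel counts (simple arithmetic rather than figures from Remedy):

```python
# Alan Wake's 960x544 framebuffer compared with the standard HD targets.
alan_wake = 960 * 544      # 522,240 pixels
hd_720p = 1280 * 720       # 921,600 pixels
hd_1080p = 1920 * 1080     # 2,073,600 pixels

print(alan_wake / hd_720p)     # ~0.57 - just over half the pixels of native 720p
print(alan_wake / hd_1080p)    # ~0.25 - roughly a quarter of full 1080p
```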

As game developers transition across to a new breed of architecture, exciting possibilities open up - the ability to present gameplay in ways we've never seen before. There's been a lot of discussion about how to make the next-gen matter - this is just one discussion, one possibility. Developers like Crytek have already indicated in their SIGGRAPH and GDC presentations that the ability to innovate with new rendering paradigms is limited on current generation platforms. New hardware means new opportunities, and the notion of key developers, artists and decision makers already discussing the possibilities freely and publicly is another credit to the openness and the collaborative spirit that flows through the video game industry.

16 Comments

Graham McIntyre Software Engineer, Games Warehouse

Great article

Posted:2 years ago

#1

Drew Northcott Senior Artist

So it's not just me that thinks that Skyrim looks horrible?

Posted:2 years ago

#2

Mihai Cozma Indie Games Developer

Great article, with one caveat. Games need to have more contrast and be sharper than movies because of what you need to see in a game. I recently played Red Dead Redemption and had to move closer to the TV to see enemies in the background so I could hit them properly, while at the same distance I could watch Blu-ray movies just fine. Watching a game is not the same as watching a movie: often very small parts of the screen need your attention, so they need to pop out, while in movies the thing that needs your attention usually covers a big part of the screen.

Posted:2 years ago

#3
I agree with Mihai's statement. - Joel Payne http://www.DigitalBacklot.com

Posted:2 years ago

#4

Brian Smith Artist

Good article. I'd like to see more industry speculation on the phrase 'making the next gen matter'. It seems to me that we're going to see less of a contrast between this and the next gen than we've ever seen before. Yeah, under the bonnet we'll have specs that suggest a similar hike in processing power as usual but it's going to translate to much more subtle things than it has in the past. Is it going to wow the public and not just the tech-heads.... I'm not so sure. Could be a bumpy ride.

Posted:2 years ago

#5

Dustin Sparks Interactive Developer / Gaming Blogger

@Mihai Cozma I agree that in order to make it feel like a game smaller things need to pop better. However, I don't think this entirely eliminates the film aesthetic goal. There may be a bit of rethinking the way the game mechanics work but they would have decades of film and camera tricks to take advantage of in order to reach the same goal.

Posted:2 years ago

#6

Mihai Cozma Indie Games Developer

@Dustin - Indeed aesthetic is always welcome, as long as the gameplay is not hindered by it, and it all comes down to the developer in the end as you said.

Posted:2 years ago

#7

Rui Martins Senior Software Developer

Don't forget the dreaded reviews that will take screenshots and compare them pixel by pixel.
The audience must first be educated to understand the difference, or they will revert to the old saying that "more is better".

What I think is lacking in current games is good gameplay. There is too much concern with visuals, physics and the like, and gameplay is somewhat left behind.
This industry needs more artists, in the true sense of the word, to balance the huge tendency the industry already has toward the technological aspects of a game - and I say this as a tech guy myself.

A good game brings balance to all the parts involved, like a good film does (plot, visuals, photography, music, ambience, etc.).

Posted:2 years ago

#8

Jonathan Cooper Animator, Naughty Dog

I'd have a locked 30 (or even 24) fps framerate with consistent motion blur as a necessity. The pursuit of 50+ frames per second is counter-productive to a filmic aesthetic, and is beneficial only in high-frequency games like driving or multiplayer FPS.

Posted:2 years ago

#9

Greg Wilcox Creator, Destroy All Fanboys!

^+1 Jonathan, although TOO much motion blur can be a detriment in some cases where it's overdone and combines with depth of field blurring. At least to my poor old eyeballs...

Posted:2 years ago

#10

Mihai Cozma Indie Games Developer

Depth of field blurring is one of the most unrealistic effects added to games, because our eyes already do it. If I look at a big 24" monitor (or bigger) from up close while playing a game, my eyes already blur the image at the corners and edges of the screen. Why there must be an additional blur, I don't understand. Switching focus from one object to another in the real world is very fast, while doing it by hand (as this is how it is driven in games) is not. In movies it is a great effect, as it is a tool for the director to emphasise certain parts of the screen, but in games it's just plain bad.

Motion blur is great as long as it is not overdone. However, 24fps can be tricky: in a movie they have complete control over the camera, and fast action sequences are shot at a higher frame rate (you can feel the transitions between the two), while in games the player controls the camera, and rapidly spinning it at 24fps doesn't work in movies or in games.

Posted:2 years ago

#11

Tim Carter Designer - Writer - Producer

If you want to bring a filmic "look" to games, you have to put the keys of the car in the hands of creative.

The "filmic look" was created by talent - not suits.

Posted:2 years ago

#12

Christopher McCraken CEO/Production Director, Double Cluepon Software

If only this much effort was put into actual game play... and game play innovation.

Posted:2 years ago

#13

Greg Wilcox Creator, Destroy All Fanboys!

I think more devs just need to study better movies and maybe go for cinematography over trying to "be like Avatar" (ugh). I'd rather play a game that has the visual/emotional impact of Powell/Pressburger's A Matter of Life and Death or Metropolis than something that's eye candy packed with an overuse of lighting and other effects that just makes me nauseous after two minutes of explosions and screaming...

Posted:2 years ago

#14

Minjie Wu Director, Ubisoft Montreal

agree

Posted:2 years ago

#15

Heinz Schuller Art Director / Artist

I've always found it somewhat strange that in games, image presentation qualities like contrast and sharpness are often thought of as constants. Some publishers really beat art directors over the head with "image quality" as measured by how 'clear' or 'sharp' the game looks in any given screenshot, to the point that high contrast becomes the sought-after goal.

But in film, contrast and affinity are part of a set of tools for supporting the emotional narrative of a given scene (along with the chaos vs. order of lines, colour choices, etc.). I look forward to the era where we're judged not so much on how well we can "see all the money on-screen", but rather on how effectively we present the visuals in support of the story/action.

Posted:2 years ago

#16
