
Tech Focus: MLAA heads for 360 and PC

How morphological anti-aliasing is transitioning across from PS3 to Xbox 360 and PC

One of the most important elements of image quality and stability in the modern age is anti-aliasing, which smoothes away jaggy edges and minimises the distortion that occurs when a detailed scene is sampled at a limited screen resolution.

The traditional solution is to employ multi-sample anti-aliasing (MSAA), a hardware feature in all modern graphics accelerators, and of course an element of the Xenos and RSX GPU cores in today's HD consoles. However, despite excellent results (particularly in terms of sub-pixel detailing), it is a relatively expensive effect, heavy on RAM and bandwidth - resources that are at a premium on the PlayStation 3 in particular. In terms of cross-platform development, the number of console titles we've seen with anti-aliasing in effect on Xbox 360 but absent on PS3 is considerable.

The response from Sony's Advanced Technology Group (ATG) was remarkable. It took on research from Intel into morphological anti-aliasing and created its own version of the tech, which debuted in God of War III and has regularly featured in both first and third-party games since - an excellent example of how Sony rolls out technological innovations from its central HQ to all PlayStation developers.

MLAA is a post-process anti-aliasing technique that scans the framebuffer, pattern-matching edges and applying a blending filter, providing edge-smoothing that goes well beyond the traditional 2x and 4x MSAA we see in console titles. It's also very expensive from a computational perspective: ATG's MLAA is believed to require 3-4ms of rendering time spread across five SPUs.
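Neither ATG's SPU code nor the Jimenez shaders are reproduced here, but the basic idea is easy to sketch. Below is a deliberately simplified CPU-side toy in C++ (all names are our own): it performs the first stage of any morphological technique - finding luminance discontinuities in the finished frame - and then applies a naive blend. Production MLAA goes much further, walking each edge to classify L, Z and U-shaped patterns and weighting the blend by the coverage area of the geometric edge those shapes imply.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

// Toy single-channel "MLAA-style" pass: detect discontinuities in the
// final image, then blend across them. Real MLAA replaces the naive
// blend with pattern classification and coverage-weighted filtering.
struct Image {
    int w, h;
    std::vector<float> px; // luminance, row-major
    float at(int x, int y) const { return px[y * w + x]; }
    float& at(int x, int y) { return px[y * w + x]; }
};

Image naiveMorphologicalAA(const Image& src, float threshold = 0.1f) {
    Image dst = src;
    for (int y = 1; y < src.h - 1; ++y) {
        for (int x = 1; x < src.w - 1; ++x) {
            float c = src.at(x, y);
            // Step 1: edge detection against the top and left neighbours.
            bool edgeH = std::fabs(c - src.at(x, y - 1)) > threshold;
            bool edgeV = std::fabs(c - src.at(x - 1, y)) > threshold;
            if (!edgeH && !edgeV) continue;
            // Step 2: blend across the discontinuity. A real implementation
            // derives per-pixel weights from the detected edge shape; here
            // we just average with the neighbours that triggered the test.
            float sum = c; int n = 1;
            if (edgeH) { sum += src.at(x, y - 1); ++n; }
            if (edgeV) { sum += src.at(x - 1, y); ++n; }
            dst.at(x, y) = sum / n;
        }
    }
    return dst;
}

int main() {
    // 8x8 test image with a hard diagonal "staircase" edge.
    Image img{8, 8, std::vector<float>(64, 0.0f)};
    for (int y = 0; y < 8; ++y)
        for (int x = 0; x < 8; ++x)
            if (x > y) img.at(x, y) = 1.0f;
    Image out = naiveMorphologicalAA(img);
    for (int y = 0; y < 8; ++y) {
        for (int x = 0; x < 8; ++x) std::printf("%.2f ", out.at(x, y));
        std::printf("\n");
    }
}
```

Because the whole pass operates on the finished 2D image, it needs no extra samples, no tiling and no knowledge of the geometry - which is exactly why it maps so well onto spare SPU time or a cheap GPU post-process.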

It's not without its issues either: sub-pixel detail can be erroneously picked up as an edge and amplified, actually exaggerating pixel-popping artifacts rather than minimising them.


Of course, ATG's tech is also exclusive to PlayStation 3, but in an age where console developers are looking to extract every last ounce of performance from the architecture, MLAA-style implementations have value on Xbox 360, and the case for post-process anti-aliasing for PC titles is growing too. While Sony got there first with its MLAA work, the basic concepts are hardly proprietary. Jorge Jimenez, Jose I. Echevarria and Diego Gutierrez are working on a GPU-based implementation that works very well indeed on Xbox 360 and PC.

The team not only spent time talking with Digital Foundry about their tech, but actually handed over demo code, giving us a chance to check out its results. Because MLAA is a post-process technology that analyses the image as a 2D object, we could run the filter on our own lossless HDMI video captures from existing console titles and test the quality of the technique independently…

Bad Company 2, Borderlands, Castlevania and Enslaved are good examples of HD titles with no anti-aliasing. Here we have a direct A to B comparison of the original footage alongside the same video processed with Jimenez MLAA.
Digital Foundry: Why MLAA? Why now? Is the trend in games development making traditional multi-sample anti-aliasing too costly for the current console platforms? Why isn't MSAA suited to deferred rendering techniques?
Jorge Jimenez: Filter-based anti-aliasing in general appeared as a consequence of many factors. Speaking of previous graphics technology, the first that comes to mind is that it allows developers to overcome the limitations found on some platforms. For example, it's not possible to use MSAA in conjunction with multiple render targets (MRT) on DirectX 9, a feature required to output the input data used by post-processing effects like ambient occlusion or object motion blur. But the more important reason may be that deferred shading rules out MSAA on some platforms, as the technique requires both MRT and the ability to read individual samples of an MSAAed buffer. Also, on Xbox 360, anti-aliasing usually forces the use of tiling, a technique that renders the frame in tiles in order to fit within the memory limits of the eDRAM. Tiling forces re-transforming meshes that cross multiple tiles and also introduces restrictions that may increase the complexity of the engine architecture.

In current PC (DX10+) and probably future console technology you can easily mix MRT with MSAA, so you no longer have problems generating the buffers that post-processing effects usually require. And you can also access individual samples. So, you may be thinking, what's the problem then? The first problem is the huge memory consumption: for 8x MSAA and a regular G-buffer you require 506MB just to generate the final frame (not counting models, textures or temporary framebuffers). The second is that everywhere there is an edge, the pixel shader needs to be supersampled. In other words, by using 16x MSAA in a deferred engine you have to calculate the colour 16 times for each pixel on an edge, instead of once, which can lead to a huge performance penalty.
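Jimenez doesn't spell out the resolution or G-buffer layout behind that 506MB figure, but it is consistent with, for example, four 64-bit (RGBA16F) render targets at 1920x1080 with 8x MSAA - a plausible, if assumed, configuration. A quick back-of-the-envelope check:

```cpp
#include <cstdio>

int main() {
    // Hypothetical layout consistent with the 506MB quoted above:
    // 1920x1080, 8x MSAA, four RGBA16F render targets (8 bytes each).
    const long long width = 1920, height = 1080;
    const long long samples = 8;          // 8x MSAA
    const long long targets = 4;          // G-buffer MRT count (assumed)
    const long long bytesPerSample = 8;   // RGBA16F, per target
    long long total = width * height * samples * targets * bytesPerSample;
    std::printf("%lld bytes = %.2f MB\n", total, total / (1024.0 * 1024.0));
    // Prints: 530841600 bytes = 506.25 MB
}
```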

Battlefield: Bad Company 2 remains a state-of-the-art video game, and the lack of anti-aliasing is one of the few issues we had with its excellent image quality. It's interesting to see how Jimenez MLAA works on the image, especially in light of DICE's decision to use MLAA on the PS3 version of Battlefield 3.

From a quality point of view, the maximum MSAA sample count usually reachable on the current generation of consoles is 2x, while filter-based approaches can exceed the equivalent of 16x in terms of gradient steps. So it's a big quality upgrade on consoles. Furthermore, filter-based anti-aliasing allows developers to easily overcome the usual pre-tonemapping resolve problems found in HDR engines.
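That pre-tonemapping resolve problem is worth unpacking. A hardware MSAA resolve averages samples in linear HDR space, but tonemapping operators are non-linear, so averaging before tonemapping produces edge gradients that don't match the displayed colours - bright samples dominate and the anti-aliasing visually breaks down. A post-process filter runs on the final tonemapped image and sidesteps the issue entirely. A minimal illustration in C++, using a Reinhard-style curve as a stand-in for whatever operator an engine actually uses:

```cpp
#include <cstdio>

// Simple Reinhard-style tonemap: non-linear, so it doesn't commute with
// averaging. Any realistic tonemapping curve has the same property.
float tonemap(float hdr) { return hdr / (1.0f + hdr); }

int main() {
    // Two MSAA samples on an edge: a bright sky pixel and a dark silhouette.
    float a = 16.0f, b = 0.1f;

    // Hardware MSAA resolve: average in HDR space, then tonemap.
    float resolveThenTonemap = tonemap((a + b) * 0.5f);          // ~0.89, far too bright

    // What the displayed gradient should be: tonemap each sample, then average.
    float tonemapThenResolve = (tonemap(a) + tonemap(b)) * 0.5f; // ~0.52

    std::printf("resolve-then-tonemap: %.3f\n", resolveThenTonemap);
    std::printf("tonemap-then-resolve: %.3f\n", tonemapThenResolve);
    // The mismatch is why HDR engines need custom per-sample resolves -
    // or a filter applied after tonemapping, as with MLAA.
}
```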

And last, but not least, filter-based anti-aliasing usually yields a significant speed-up with respect to MSAA. For example, our GPU implementation (Jimenez's MLAA) runs 1180 per cent faster than MSAA. So these techniques are very appealing even for forward rendering approaches.

So, to summarise, MLAA (and post-processing filters in general) exist now because of the limitations of the current generation of consoles and of previous PC technology, and because of the performance speed-up with respect to MSAA. The widespread usage of deferred shading warrants their usage in the future, even if MSAA performance improves on the GPUs to come in the next few years.


Richard Leadbetter

Technology Editor, Digital Foundry

Rich has been a games journalist since the days of 16-bit and specialises in technical analysis. He's commonly known around Eurogamer as the Blacksmith of the Future.
