
Tech Focus: MotorStorm Apocalypse

From first impressions of the PS3 to stereoscopic 3D and 1080p

Digital Foundry: Dynamic destruction and deformation are key elements in Apocalypse - what were the key challenges here and how did you overcome them?
Matt Southern

From a production standpoint the key challenge was to define and create an entirely new team - the Events Team - who could deliver the quality animation and VFX solutions we needed in high volumes. Nick Sadler joined us from Studio Liverpool (who were invaluable throughout the project) to help run things. I'll let Nick elaborate...

Nick Sadler

Continual refinement of gameplay and visuals was invaluable in achieving final results, so it was key to have a process which allowed frequent reviews and frequent iteration.

The biggest issues faced in preparing the events were pretty fundamental design challenges: how would the event affect gameplay before, during and after it had triggered? The events were designed with fairness in mind from the beginning: they should be triggered far enough away so that the fastest vehicle with a clear run at full boost would experience as much of the event as possible and not be disadvantaged.

No dynamic object could have sufficient mass to radically affect the gameplay negatively, or take out the player unfairly. We knew that the player could learn the motion of events, and how far they would intrude onto the racing line, but wouldn't know whether an event would trigger: there's definitely a need for twitchy reactions and skill in adapting to the order of events.

We also had to make significant alterations to the modelling/animation pipeline for interactive events. There was a good amount of 'chicken versus egg' when moving between interactive pre-vis-quality events and interactive production-quality artwork and collisions: approving gameplay when the interactive blockout might differ from the production version, versus having artists commit to a production-quality asset without a blockout to prove its worth.

Another issue came about because of the sheer number of contributions necessary in preparing the artwork: from modelling, to animation, to VFX/particles, back to modelling, then to audio, and so on. Getting the right person working on the right aspect of the event at the right time was initially a pain, but we were well-managed, and the team was amazing in producing 280+ events on a tight schedule.

Andy Seymour

Key Tech Art challenges focused on timing and triggering these events effectively. We developed a Timeline tool to direct these events outside of the art pipeline and give control to the Designers and Events Team. In addition, with a very linear pipeline it becomes very awkward to make large changes to geometry once it has been animated, so we developed scripted tools to enable the animators to overcome these inherent DCC limitations.

Digital Foundry: Racing games generally take place in relatively static areas. How did the design team and the production process adapt to handle levels that can change so radically?
Nick Sadler

The events were designed in fairly small brainstorming sessions, centred on playable 'blockout' versions of the tracks. We soon developed an intuition for where events would look their best: locations were chosen based on how visible a collapse, plane-crash or road-shear might be, but also on how disruptive an event stood to be - we couldn't really make a natural bottleneck even narrower.

The timing and scheduling of events was handled exclusively by designers and had randomness and flexibility built in. For example, the larger-scale events will only trigger (in the single-player festival experience) if the player is in a qualifying position.
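As a rough illustration of the kind of gate Sadler describes - a sketch with invented names, not Evolution's actual code - a large-scale event might only be allowed to fire in the single-player festival while the player holds a qualifying position, with a random roll providing the built-in unpredictability:

```cpp
// Hypothetical sketch of a per-event trigger gate. All names are invented for
// illustration; in the real game these decisions were driven by the designers'
// Timeline tool.
#include <random>

struct RaceState {
    bool singlePlayerFestival; // are we in the single-player festival mode?
    int  playerPosition;       // 1 = first place
    int  qualifyingPositions;  // e.g. top three positions qualify
};

struct DynamicEvent {
    bool  largeScale;          // big set-pieces get the extra gating
    float triggerChance;       // built-in randomness, 0..1
    bool  triggered = false;
};

bool ShouldTrigger(const DynamicEvent& ev, const RaceState& race, std::mt19937& rng)
{
    if (ev.triggered)
        return false;

    // Larger-scale events only fire in the festival while the player is qualifying.
    if (ev.largeScale && race.singlePlayerFestival &&
        race.playerPosition > race.qualifyingPositions)
        return false;

    // The random roll keeps the order of events from being memorised.
    std::uniform_real_distribution<float> roll(0.0f, 1.0f);
    return roll(rng) <= ev.triggerChance;
}
```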

The technical worst-case scenario during development was a single-lap race where every event triggered and was visible while the entire pack of cars was bottlenecking around it. If we could make this work, in terms of gameplay and framerate, then distributing that track's events over two, three, four or five laps would (technically) be easier to achieve and tune.

Design-wise, significant challenges came from having to maintain the balance of all vehicle classes, as events could lengthen or shorten a lap mid-race. One particular issue was opening and closing (or narrowing) routes at the right time. Open a route too early, and AI cars would plough into structures that hadn't yet moved out of the way, even with a dynamic avoidance system implemented. Close a route too early, and AI vehicles would avoid apparently safe routes, alerting players to the imminent closure. It was of utmost importance to player immersion that the AI appeared to make the decision at the same time as the player.
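One way to picture that constraint - again a hypothetical sketch, not the shipped system - is to have the AI's route selection read the same event state that drives the visible destruction, so a route only registers as closed once the debris would genuinely block it:

```cpp
// Hypothetical sketch: route availability for the AI is derived from the same
// event state that drives the visible destruction, so the AI can never react
// before the player could. All names are invented for illustration.
struct RouteEvent {
    bool  triggered = false;   // has the destruction started?
    bool  blocksRoute = false; // does this event close or narrow the route?
    float blockDelay = 1.5f;   // seconds after triggering before debris reaches the road
    float timeSinceTrigger = 0.0f;
};

struct Route {
    const RouteEvent* event = nullptr; // the event that can open/close this route
    bool openByDefault = true;
};

// Queried by both the AI racing-line selection and the player-facing visuals.
bool RouteIsOpen(const Route& route)
{
    if (!route.event || !route.event->triggered)
        return route.openByDefault;

    // The route only counts as closed once the debris would actually block it,
    // never earlier - otherwise the AI telegraphs the closure to the player.
    const bool blockedNow = route.event->blocksRoute &&
                            route.event->timeSinceTrigger >= route.event->blockDelay;
    return !blockedNow;
}
```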

Digital Foundry: What do you feel is the natural evolution of this kind of tech? Dynamically generated changes to the levels perhaps?
Andy Seymour

We've only touched the surface of what's possible. Fidelity and interaction will increase. Interaction with dynamic events will be a key evolution of the tech moving forwards.

Nick Sadler

In the short term at least, it will simply feature more in new titles if it proves popular. Ideas like these are developments of older, similar ideas, carried to conclusions that haven't been seen yet, so I think this kind of feature will be pushed until it either becomes too much or passé.

Longer-term I'd hope to see a more mature and symbiotic relationship between level/track-design and dynamic destruction, creating something that's even better than the sum of the parts.

Crashing helicopters, collapsing scenery... MotorStorm Apocalypse has some really impressive set-pieces, and Evolution employed a new Events Team to handle them.
Digital Foundry: More generally, how did the VFX pipeline evolve in other areas?
Andy Seymour

The VFX pipeline had to deal with much larger effects than we had ever attempted before, so screen fill and overdraw became major issues for us. We had tech artists and VFX artists research and develop techniques to deal with dust clouds and explosions. We combined typical 'billboard'-style particle effects with polygonal mesh-based effects, and bespoke shaders with animated parameters.

Oli Wright

The rendering pipeline had a major overhaul in order to facilitate dynamic lighting. We also switched to a new shader authoring system developed by ATG that enabled our artists to easily author shaders using a tool that is affectionately known as the noodle editor. In previous MotorStorms, artists would request a shader - but the actual task of writing the shader would fall to a coder.

We also massively beefed up our particle system this time round, improving the sorting and adding ribbons for missile trails. We've always used lower-resolution buffers for our particle rendering, but this time we added geometry-aware up-sampling to virtually eliminate the haloing of particles around solid objects that you would sometimes see.

Then there's SSAO too. It runs at quite a low resolution, again with geometry-aware up-sampling so hard edges are preserved.
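Neither answer spells out the filter, but 'geometry-aware up-sampling' is commonly implemented as a depth-aware upsample: when blending the low-resolution neighbours of a full-resolution pixel, samples whose depth differs too much from that pixel are heavily down-weighted, so particle and SSAO values don't bleed across silhouettes. A rough sketch of that idea, with invented buffer layouts and weights rather than Evolution's implementation:

```cpp
// Depth-aware upsample of a half-resolution buffer (particles or SSAO) to full
// resolution. A sketch only: the buffer layout, 2x2 tap and weighting scheme
// are assumptions.
#include <algorithm>
#include <cmath>

struct Colour { float r, g, b, a; };

Colour UpsamplePixel(int x, int y,                          // full-res pixel coordinates
                     const float*  fullResDepth, int fullW,
                     const Colour* halfResColour,
                     const float*  halfResDepth, int halfW, int halfH,
                     float depthTolerance = 0.05f)
{
    const float centreDepth = fullResDepth[y * fullW + x];

    // Footprint of this pixel in the half-res buffer (clamped 2x2 neighbourhood).
    const int hx = std::min(x / 2, halfW - 2);
    const int hy = std::min(y / 2, halfH - 2);

    Colour accum = {0.0f, 0.0f, 0.0f, 0.0f};
    float  weightSum = 0.0f;

    for (int dy = 0; dy <= 1; ++dy)
    {
        for (int dx = 0; dx <= 1; ++dx)
        {
            const int idx = (hy + dy) * halfW + (hx + dx);

            // Uniform 2x2 average here; a real filter would use bilinear weights.
            // Samples from a different depth layer are heavily down-weighted so
            // low-res particles/SSAO don't halo across hard edges.
            const float depthDelta  = std::fabs(halfResDepth[idx] - centreDepth);
            const float depthWeight = (depthDelta < depthTolerance) ? 1.0f : 0.001f;
            const float w = 0.25f * depthWeight;

            accum.r += halfResColour[idx].r * w;
            accum.g += halfResColour[idx].g * w;
            accum.b += halfResColour[idx].b * w;
            accum.a += halfResColour[idx].a * w;
            weightSum += w;
        }
    }

    const float inv = 1.0f / std::max(weightSum, 1e-5f);
    return { accum.r * inv, accum.g * inv, accum.b * inv, accum.a * inv };
}
```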

Nick Sadler

We had to factor in the idea that every animated element required an effects-pass. Regardless of the visual effect that was necessary, it could only go in after production geometry and animation had been completed and approved.

The most significant difference between MotorStorm Apocalypse and older VFX pipelines was in how larger volumetric effects were achieved. When larger structures collapsed, we needed to be able to drive into the resulting dust clouds without losing framerate. We achieved this with large, animated geometry spheres carrying a dust/smoke shader, with Z-fading at geometry intersections and a transparency fall-off based on viewing angle. Smaller dust particles augmented the leading edge of the cloud, and faded fairly early.

The dust-sphere gave way to a fullscreen fog-pocket effect, with more easily rendered wisps that visually alluded to travelling through dense, localised fog, smoke or dust. All transparencies were rendered in the half-res buffer, but the combination of the dust-spheres and small dust particles meant that the player could drive right up to, and into, the cloud. Using large animated shapes, we could better control the amount of overdraw and fill-rate.
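The two fades Sadler mentions - softening the sphere where it intersects scene geometry, and thinning it as the surface turns edge-on to the camera - amount to two scalar factors multiplied into the shader's output alpha. A hedged per-pixel sketch of that maths (our own formulation, not the game's shader):

```cpp
// Per-pixel alpha for a dust-sphere surface: Z-fading where the sphere meets
// scene geometry, plus a transparency fall-off based on viewing angle.
// Illustrative only; the parameter names and curves are assumptions.
#include <algorithm>

struct Vec3 { float x, y, z; };

static float Dot(const Vec3& a, const Vec3& b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static float Saturate(float v) { return std::clamp(v, 0.0f, 1.0f); }

float DustSphereAlpha(float sphereDepth,       // view-space depth of the sphere surface
                      float sceneDepth,        // depth of the geometry behind it (from the Z-buffer)
                      Vec3  surfaceNormal,     // normalised sphere normal at this pixel
                      Vec3  viewDir,           // normalised direction from the pixel towards the camera
                      float baseOpacity,       // artist-tuned density of the cloud
                      float zFadeRange = 2.0f) // distance over which the intersection softens
{
    // Z-fading at geometry intersections: fade out as the sphere surface nears
    // the geometry behind it, hiding the hard clipping line.
    const float zFade = Saturate((sceneDepth - sphereDepth) / zFadeRange);

    // Viewing-angle fall-off: grazing (edge-on) pixels go transparent so the
    // sphere reads as a soft volume rather than a hard shell.
    const float facing = Saturate(Dot(surfaceNormal, viewDir));
    const float angleFade = facing * facing;

    return baseOpacity * zFade * angleFade;
}
```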

Implementation of the MotorStorm Apocalypse VFX pipeline was first tested in anger for the game's 'first playable' in November and December 2009. The practice was honed during the build-up to E3 and Cologne in 2010, and the momentum was maintained through alpha, beta and on to our GM submission. In most cases, the VFX task for each event had only a very short window in which to be implemented, reviewed, adjusted and colour-corrected for the environment's lighting.

Richard Leadbetter: Rich has been a games journalist since the days of 16-bit and specialises in technical analysis. He's commonly known around Eurogamer as the Blacksmith of the Future.