In this deep-dive tech interview, Digital Foundry quizzes the developers at Evolution Studios on a wide range of topics, from the studio's first impressions of the PS3 hardware back in the MotorStorm 1 period, through to the tech enhancements made to the sequels, plus the introduction of set-piece Events in Apocalypse - and of course the studio's firm commitment to stereoscopic 3D. We also get some answers about the game's support for 1080p, a feat that hasn't been especially prominent in the game's marketing to date.
The depth of insight in the studio's responses was only made possible by the range of development talent drafted in to answer our questions, so many thanks to each of the following Evolution staff members who took part.
- Matt Southern, Game Director
- Andy Seymour, Lead Technical Artist
- Simon O'Brien, Art Director
- Nick Sadler, Lead Artist (Events)
- Oli Wright, Lead Graphics Programmer
- Neil Massam, Vehicle Lead
- Dave Kirk, Physics Programmer
- Dave Hewitt, Online Programmer
Q:Evolution has been at the forefront of PlayStation 3 technology since before the Sony acquisition and MotorStorm was one of the most technologically outstanding games of the launch period. You seemed to get more out of the console at the time - how did you prepare for development on PS3, and what were your first thoughts on the hardware?
Oli Wright:We were genuinely excited by the hardware. The SPUs in particular were almost like an entirely new class of processor, like the PlayStation 2's vector units, but on steroids - and you could program them in C++. The GPU was also a big shift for us in that it was also fully programmable. I remember we had an early PC build of MotorStorm (before we had devkits), and it ran at something like five frames a second. Just fast enough to give a sniff of gameplay, and we had a crowd of people round the PC, all wanting a go.
Andy Seymour:From an art perspective there were two major challenges. Firstly we knew that we needed to move from our in-house world editor to a DCC (digital content creation) package in order to craft the worlds more artistically. Prior to MotorStorm, Evolution Studios developed the World Rally Championship series of PlayStation 2 games with hundreds of unique stages, and we knew we'd need to trade quantity for fidelity in order to show off PlayStation 3 visuals for launch.
Secondly we were aware that the advent of next-gen techniques such as shader code and normal mapping was going to be a training challenge for the art team. So our render coders taught our key artists the basics of CgFX vertex and fragment code. This was made easier by a very early visual node-based shader authoring package called RTZen. Tech Artists could use it to prototype shaders and see the results instantly, and then analyse the shader code that was generated to understand the techniques in play.
We saw the MotorStorm concept as an ideal opportunity to set us apart from the competition by overcoming one of the genre's most difficult technical hurdles.
Neil Massam, vehicle lead, Evolution Studios
Neil Massam:The high risks associated with balancing multiple vehicle types racing head-to-head within the same game, especially on a launch title, were considerable. But we saw the MotorStorm concept as an ideal opportunity to set us apart from the competition by overcoming one of the genre's most difficult technical hurdles. The successful implementation turned out to be one of the game's most defining features and achievements.
Our vehicle rigging and suspension system had to cater for this broad range of vehicles, many with exposed and highly visible suspension. We settled on a technically complex real-time physics-based suspension system to increase the level of realism. Havok physics and driver ragdoll tech were also key ingredients for the vehicles in MotorStorm and added an additional level of believability on the new PlayStation 3 hardware. Components that fell off or hinged on the vehicles now looked convincing, and the drivers being flung from their vehicles added further spectacle to crashes.
Q:Looking back on the first MotorStorm, what was your overall assessment on what you did right and what could have been improved from a technological standpoint, and how did this filter into your work with Pacific Rift?
Oli Wright:At the time I think we thought we'd done a good job - we were pretty happy with the mud effects, for example - and we were pushing the RSX hard, although we knew we could do a lot more with the SPUs. Looking back now, there isn't much that hasn't moved on. In MotorStorm we didn't have a gamma correct rendering pipeline, for example. We fixed that in Pacific Rift very early on in development. Other significant changes for Pacific Rift from a graphics point of view included a complete re-architecting of the shaders and lighting model - we moved from an ad-hoc Phong lighting model to something that was 'nearly Cook-Torrance'.
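For readers unfamiliar with the term, a gamma-correct pipeline simply means doing lighting maths in linear light rather than on gamma-encoded values. A minimal sketch of why it matters - illustrative only, not Evolution's shader code, and using a plain 2.2 power curve rather than the true piecewise sRGB transfer function:

```python
def srgb_to_linear(c):
    # Approximate sRGB decode (gamma 2.2); real sRGB is a piecewise curve.
    return c ** 2.2

def linear_to_srgb(c):
    return c ** (1.0 / 2.2)

def blend_naive(a, b):
    # Averaging gamma-encoded values directly - what a non-gamma-correct
    # pipeline effectively does when it blends or lights.
    return (a + b) / 2.0

def blend_gamma_correct(a, b):
    # Decode to linear light, average, re-encode - the corrected approach.
    return linear_to_srgb((srgb_to_linear(a) + srgb_to_linear(b)) / 2.0)

# Averaging black and white: the gamma-correct result is visibly brighter,
# matching how light actually sums.
print(round(blend_naive(0.0, 1.0), 3))          # 0.5
print(round(blend_gamma_correct(0.0, 1.0), 3))  # ≈ 0.73
```

The naive blend produces a mid-grey that is physically too dark, which is exactly the class of error a gamma-correct pipeline removes.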
Andy Seymour:Visually we were very pleased with the results of MotorStorm. Dynamic deformation of mud was a key technology during MotorStorm's development, but one that in hindsight had possibly too much focus. It was technically very impressive, something that we couldn't have done on the PlayStation 2, but as far as adding to the gameplay experience went, it was limited. We tried to push the PlayStation 3 as far as we could during our early exposure to it, and in doing so we learnt that, visually, the old techniques were often still the best.
We attempted to minimise baked lighting in our diffuse textures, relying heavily on our lighting engine to create a realistic-looking environment. In doing so we slowly learnt that there was still a place for photos on polygons. It simply looked better and was cheaper to render. This filtered through to Pacific Rift, where we prioritised our GPU time on high-risk areas such as water.
Q:From a conceptual angle, MotorStorm Apocalypse is some way removed from both MotorStorm and Pacific Rift. What was the overall thinking behind this new approach?
Matt Southern:As kids I think a lot of us were happy for video game sequels to offer minor iteration and polish, but as the medium has grown more significant differences are rightly expected. We felt that interest in MotorStorm would wane if we didn't treat the franchise with a healthy disrespect, whilst still respecting the original audience and staying faithful to our DNA.
We felt that interest in MotorStorm would wane if we didn't treat the franchise with a healthy disrespect.
Matt Southern, game director, Evolution Studios
We also spotted a decline in interest in the entire genre of racing after Pacific Rift, which sold really well. We looked to the genres that were really gaining momentum for our inspiration: shooters, action adventures.
Andy Seymour:It's all too easy to find yourselves treading water. We needed to push ourselves beyond our comfort levels in order to keep the franchise invigorated.
Q:How much does Apocalypse have in common with your previous games from a technological standpoint? Did this whole new approach require a significant re-think of your existing engine?
Andy Seymour:Apocalypse has lots in common technologically with our previous games. Unlike our approach to the franchise, we try and make incremental improvements to our technology rather than reinventing the wheel. For example on the first MotorStorm our VFX pipeline was very hard-coded. On Pacific Rift we developed an in-house editor for the artists to define VFX, and this was taken through to near breaking-point on Apocalypse. Our animation pipeline, however, had to be completely overhauled to deal with the massive events in Apocalypse and the engine was reworked to deal with new challenges such as multiple dynamic lights.
Oli Wright:Evolution by name, evolution by nature. We were partly ready for large-scale dynamic environments already. Our object culling approach, for example, has always used occlusion objects rather than pre-computed visibility. This system just needed iterating to handle moving occlusion objects, and we also added a dynamic portal system to make our indoor sections more efficient.
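To illustrate the distinction Wright draws, here's a toy sketch of portal-based culling - the names are hypothetical, and it omits the frustum clipping a real renderer would perform through each portal opening. The point is that because portals can open and close at runtime, visibility is walked fresh each frame rather than baked into precomputed data:

```python
def visible_rooms(camera_room, portals):
    """Walk the portal graph from the camera's room, crossing only
    portals that are currently open.

    portals: dict mapping room -> list of (neighbour, is_open) pairs.
    A real implementation would also clip the view frustum through each
    portal; this sketch shows only the dynamic reachability part.
    """
    seen = {camera_room}
    stack = [camera_room]
    while stack:
        room = stack.pop()
        for neighbour, is_open in portals.get(room, []):
            if is_open and neighbour not in seen:
                seen.add(neighbour)
                stack.append(neighbour)
    return seen
```

With precomputed visibility, closing a doorway mid-race couldn't change the result; here, flipping a portal's flag immediately culls everything behind it.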
Q:Dynamic destruction and deformation are key elements in Apocalypse - what were the key challenges here and how did you overcome them?
Matt Southern:From a production standpoint the key challenge was to define and create an entirely new team - the Events Team - who could deliver the quality animation and VFX solutions we needed in high volumes. Nick Sadler joined us from Studio Liverpool (who were invaluable throughout the project), to help run things. I'll let Nick elaborate...
Nick Sadler:Continual refinement of gameplay and visuals was invaluable in achieving final results, so it was key to have a process which allowed frequent reviews and frequent iteration.
The biggest issues faced in preparing the events were pretty fundamental design challenges: how would the event affect gameplay before, during and after it had triggered? The events were designed with fairness in mind from the beginning: they should be triggered far enough away so that the fastest vehicle with a clear run at full boost would experience as much of the event as possible and not be disadvantaged.
Fidelity and interaction will increase. Interaction with dynamic events will be a key evolution of the tech moving forwards.
Andy Seymour, lead technical artist, Evolution Studios
No dynamic object could have sufficient mass to radically affect the gameplay negatively, or take out the player unfairly. We knew that the player could learn the motion of events, and how intrusive they would be on the racing line, but wouldn't know if events would trigger: there's definitely a need for twitchy reaction and skill in adapting to the order of events.
We also had to make significant alterations to the modelling/animation pipeline for interactive events. There was a good amount of 'chicken and egg' when moving between interactive pre-vis-quality events and interactive production-quality artwork and collisions: approving gameplay on an interactive blockout that might differ from the production version, versus having artists commit to investing in a production-quality asset without a blockout to prove its worth.
Another issue came about because of the sheer number of contributions necessary in preparing the artwork: from modelling, to animation, to VFX/particles, back to modelling, then to audio, etc. Getting the right person working on the right aspect of the event at the right time was initially a pain, but we were well-managed, and the team was amazing in producing 280+ events on a tight schedule.
Andy Seymour:Key Tech Art challenges focused on timing and triggering these events effectively. We developed a Timeline tool in order to direct these events outside of the art pipeline, and give control to the Designers and Events Team. In addition, with a very linear pipeline it becomes very awkward to make large changes to geometry once it's animated. We developed scripted tools to enable the animators to overcome these inherent DCC limitations.
Q:Racing games generally take place in relatively static areas. How did the design team and the production process adapt to handle levels that can change so radically?
Nick Sadler:The Events were designed in fairly small brainstorming sessions, centred on playable 'blockout' versions of the tracks. We soon developed an intuition for where events would look their best: locations were chosen based on how visible a collapse, plane-crash or road-shear might be, but also on how disruptive an event stood to be at a base level - we couldn't really make a natural bottleneck even narrower.
The timing and scheduling of events was handled exclusively by designers and had a randomness and flexibility built in. For example, the larger-scale events will only trigger (in the single-player festival experience) if the player is in a qualifying position.
The technical worst-case scenario during development was a single-lap race where every event triggered and was visible while the entire pack of cars was bottlenecking around it. If we could make this work, in terms of gameplay and in terms of framerate, distributing that track's events over two, three, four or five laps would (technically) be more easily achieved and tuned.
Design-wise, significant challenges came from having to maintain the balance of all vehicle classes, as events could elongate or truncate the length of a lap mid-race. One particular issue was opening and closing (or narrowing) routes at the right time. Open a route too early, and AI cars would plough into structures that hadn't yet moved out of the way, even with a dynamic avoidance system implemented. Conversely, close a route too early and AI vehicles would avoid apparently safe routes, alerting players to the imminent closure. It was of utmost importance to player immersion that the AI make its decision at the same time the player does.
Q:What do you feel is the natural evolution of this kind of tech? Dynamically generated changes to the levels perhaps?
Andy Seymour:We've only touched the surface of what's possible. Fidelity and interaction will increase. Interaction with dynamic events will be a key evolution of the tech moving forwards.
Nick Sadler:For at least the short-term it will simply feature more in new titles if it's popular. Ideas like these are developments of older, similar ideas, carried to conclusions that might not have been seen yet, so I think this kind of feature will be pushed until it either becomes too much or passé.
Longer-term I'd hope to see a more mature and symbiotic relationship between level/track-design and dynamic destruction, creating something that's even better than the sum of the parts.
Q:More generally, how did the VFX pipeline evolve in other areas?
Andy Seymour:The VFX pipeline had to deal with much larger effects than we had ever attempted before. Therefore screen fill and overdraw were to become major issues for us. We had tech artists and VFX artists research and develop techniques to deal with dust clouds and explosions. We combined typical 'billboard' style particle effects with polygonal mesh based effects, and bespoke shaders with animated parameters.
Oli Wright:The rendering pipeline had a major overhaul in order to facilitate dynamic lighting. We also switched to a new shader authoring system developed by ATG that enabled our artists to easily author shaders using a tool that is affectionately known as the noodle editor. In previous MotorStorms, artists would request a shader - but the actual task of writing of the shader would fall to a coder.
We also massively beefed up our particle system this time round, improving the sorting and adding ribbons for missile trails. We've always used lower resolution buffers for our particle rendering, but this time round we added geometry aware up-sampling to virtually eliminate the haloing of particles around solid objects that you would sometimes see.
Then there's SSAO too. It runs at quite a low resolution, again with geometry aware up-sampling so hard edges are preserved.
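The 'geometry aware up-sampling' Wright mentions is broadly a depth-aware (bilateral-style) filter. Here's a deliberately simplified 1D sketch of the idea - our own illustration, not Evolution's implementation - where each full-resolution pixel takes the low-resolution sample whose depth best matches its own, which is what stops particle colour bleeding across silhouette edges:

```python
def upsample_depth_aware(low_color, low_depth, high_depth):
    """Upsample a half-res 1D row of particle colour to full res.

    For each full-res pixel, choose between the two nearest half-res
    samples by comparing their stored depths against the full-res depth.
    Picking the depth-compatible neighbour prevents foreground particle
    colour leaking onto background geometry (the 'halo' artifact).
    """
    out = []
    for i, d in enumerate(high_depth):
        j = min(i // 2, len(low_color) - 1)
        k = min(j + 1, len(low_color) - 1)
        # Keep whichever low-res neighbour sits at the most similar depth.
        best = j if abs(low_depth[j] - d) <= abs(low_depth[k] - d) else k
        out.append(low_color[best])
    return out
```

A naive bilinear upsample would blend the two samples regardless of depth, smearing a bright particle across the hard edge of a nearby car.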
Nick Sadler:We had to factor in the idea that every animated element required an effects-pass. Regardless of the visual effect that was necessary, it could only go in after production geometry and animation had been completed and approved.
The more significant difference between MotorStorm Apocalypse and older VFX pipelines was in how larger volumetric effects were achieved. When larger structures collapsed, we needed to be able to drive into the resulting dust clouds without losing framerate. We achieved this by having larger, animated geometry spheres with a dust/smoke shader with Z-fading at geometry intersections and a transparency fall-off based on viewing-angle. Smaller dust particles augmented the leading edge of the cloud, and faded fairly early.
The dust-sphere gave way to a fullscreen fog-pocket effect, with more easily rendered wisps which visually alluded to travelling through dense, localised fog, smoke, or dust. All transparencies were rendered in the half-res buffer, but the combination of the dust-spheres and small dust particles meant that the player could drive right up to, and into the cloud. Using large animated shapes, we could better-control the amount of over-draw and fill-rate.
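The Z-fading and viewing-angle fall-off Sadler describes are, in general terms, the standard 'soft particle' recipe applied to large dust volumes. A hypothetical sketch of how the two alpha terms might combine - the parameter names and fade maths are our own, not the shipping shader:

```python
def dust_alpha(scene_depth, sphere_depth, fade_range, view_dot_normal):
    """Combine the two fades described above into one opacity value.

    scene_depth / sphere_depth: distances from camera to the solid
    geometry behind this pixel and to the dust-sphere surface.
    fade_range: distance over which the Z-fade ramps in.
    view_dot_normal: dot product of view direction and sphere normal
    (1.0 facing the camera, ~0.0 at the silhouette).
    """
    # Z-fade: go transparent as the sphere nears solid geometry,
    # hiding the hard intersection line.
    z_fade = min(max((scene_depth - sphere_depth) / fade_range, 0.0), 1.0)
    # View-angle fall-off: the silhouette (grazing angles) fades out,
    # so the sphere reads as a soft volume rather than a shell.
    rim_fade = abs(view_dot_normal)
    return z_fade * rim_fade
```

Where the sphere pokes through a wall the first term kills the alpha; around the sphere's edge the second term does, and together they let the camera drive straight into the cloud without visible geometry.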
Implementation of the MotorStorm Apocalypse VFX pipeline was initially tested in anger for the game's 'first playable' in November and December 2009. The practice was honed during the build-up to E3 and Cologne in 2010, and the momentum was maintained through alpha, beta and on to our GM submission. In most cases, the VFX task for each event would have a very short time to be implemented, reviewed, adjusted, and colour-corrected for the environment's lighting.
Q:MotorStorm has traditionally been quite remarkable in terms of the number of vehicles you run, along with their diversity. How do you render so many vehicles at any one time? Are dynamic LODs (levels of detail) in play?
Andy Seymour:Yes. We use LODs based on average polygon size to screen area. We give the artists bias controls to fine tune the results. We also use shader LODs to help further, rejecting shader code that isn't required at distance.
Oli Wright:Drawing lots of tiny polygons is really inefficient. Our LOD algorithm looks at the average polygon size in a mesh, and then looks at how big that average polygon would be when projected onto the screen. It then chooses an appropriate LOD based on that. It does a pretty good job overall and largely automates the process of determining when LOD switching should take place. It doesn't always work though, so the artists always have the option to override it.
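A rough sketch of the kind of heuristic Wright describes - project the mesh's average triangle area into screen space and compare it against per-LOD cutoffs. The formula and the threshold numbers here are illustrative assumptions, not Evolution's values:

```python
import math

def choose_lod(avg_tri_area, distance, fov_y, screen_height,
               thresholds=(64.0, 16.0, 4.0)):
    """Pick an LOD index from the average on-screen triangle size.

    avg_tri_area: mean triangle area of the mesh, in world units squared.
    thresholds: projected-area cutoffs in pixels squared for LODs 0, 1, 2
    (hypothetical values - tuned per-asset via artist bias in practice).
    """
    # Pixels per world unit at this distance, for a vertical field of view.
    pixels_per_unit = screen_height / (2.0 * distance * math.tan(fov_y / 2.0))
    projected_area = avg_tri_area * pixels_per_unit ** 2
    for lod, cutoff in enumerate(thresholds):
        if projected_area >= cutoff:
            return lod
    return len(thresholds)  # coarsest LOD
```

The artist bias Seymour mentions would simply scale `projected_area` up or down before the comparison, forcing earlier or later switches for parts like bodywork versus small technical components.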
We had to dig much deeper to find solutions to help improve frame rate especially on some of the animated event and VFX hungry tracks.
Neil Massam, vehicle lead, Evolution Studios
Neil Massam:Our vehicles have to adhere to very strict polygon, texture and memory budgets which are set at different levels for each vehicle class. The vehicles must also pass stringent validation checks prior to export to ensure that they are free from any technical defects and are compliant with the game and its enforced budgets.
LOD meshes are manually created for all vehicles and used in conjunction with material LODs which we are able to dynamically adjust and use to balance the frame rate in game depending on distance or vehicle usage.
LOD bias values are also assigned to various part groups of the vehicles to force them to switch LODs earlier or later depending on their relative size - i.e. technical components, bodywork etc. The vehicles also use a shared pool of textures and a library of preset materials, which help us to manage and reduce the overall memory footprint of a typical vehicle.
We had to dig much deeper on MotorStorm Apocalypse due to the performance demands of the game and were forced to find solutions to help improve frame rate especially on some of the animated event and VFX hungry tracks.
We decided to introduce occlusion shapes on the vehicles for the first time, which were used to occlude components within the vehicle that would normally be hidden from view but still rendered - i.e. engine, suspension components, etc. Vehicle shadow proxy models (lower-poly meshes) were also created to reduce the cost of rendering real-time shadows.
Finally we had in-game GUI performance statistics to help us quickly assess the GPU and CPU cost of the vehicles or worlds that were proving expensive to render, which helped us to quickly identify and fix any problems.
Q:Weather effects are a new addition to MotorStorm. What did you set out to achieve with this, how is gameplay affected by changing conditions and what were the technical challenges involved?
Matt Southern:We had a lot of requests to add weather effects, and decided to focus them on a couple of tracks to emphasise the effects. Gameplay-wise they add a force to the vehicles, albeit a very subtle one (we found it could be really frustrating), and they also mean the engine continually cools - which in gameplay terms means you almost constantly have the ability to boost. This means the tracks are driven at notable speed, which increases the sense of mania.
Andy Seymour:Our weather effects are highly visual and mood-affecting. We set out to achieve ferocity in our weather effects to truly immerse the gamer. Technically this was challenging because we were already maxing out the GPU, so we had to work on shader tricks to give the impression of storms.
Q:Let's talk for a moment about physics - we understand you worked with Havok pretty closely in moving physics across to the SPUs with Pacific Rift. Did this tech remain constant or was it improved still further for Apocalypse?
Dave Kirk:Yes, we've worked closely with Havok since the first MotorStorm. On Pacific Rift there were significant performance gains over previous versions, thanks mainly to increased SPU usage, so by the time we were working on Apocalypse the big performance gains had already been made. However, the lessons we learned from the early products enabled us to really streamline our code. This meant we could squeeze even more tech in, such as our groups of NPCs, as well as significantly ramping up the number of dynamic objects and the amount of destruction.
Q:There have been several different approaches to HDR lighting ranging from logluv to RGBM - do you operate with HDR in Apocalypse, and if so, what format did you go with?
Oli Wright:The exposure value for the frame is pushed forwards from the results of the previous frame into our material rendering pass, so the 0 to 1 range of values that we get out of that pass is 'post exposure'. Our material pass outputs two LDR render targets though. The first is the regular 0 to 1 range. The second is the same output divided by 32. So the second render target covers a much wider range of light intensities. This buffer isn't seen directly, but it's used as the key for our bloom, and it's used to calculate the exposure value for the next frame.
We know how common it is to accuse 3D of being a gimmick, and when you make games you have the golden opportunity to disprove this by affecting gameplay.
Matt Southern, game director, Evolution Studios
This approach has the same bandwidth requirements as using a single FP16 render target when we're laying down the buffer. This isn't generally a bottleneck for us, so that's not a problem. The benefits from not being FP16 come from when we use the buffers as textures. The post processing doesn't care about things outside the 0 to 1 luminosity range, so it reads from the first buffer. The bloom and exposure metering don't care about high detail dynamic range, but they want a wide dynamic range - so they read from the second buffer. So both users of the rendered image get the information they need, but at 32-bit LDR bandwidth cost.
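Wright's two-target scheme can be sketched numerically. This toy version - our own naming, and ignoring quantisation to 8-bit channels - shows how the divided-by-32 buffer preserves intensities that the clamped buffer loses, which is exactly what bloom and exposure metering need:

```python
def write_targets(linear_value):
    """Material pass output: two LDR targets from one post-exposure value.

    Target A clamps to the displayable 0-1 range; target B stores
    value / 32, covering a 32x wider intensity range at lower precision.
    """
    a = min(linear_value, 1.0)
    b = min(linear_value / 32.0, 1.0)
    return a, b

def bloom_key(b, threshold=1.0):
    # Bloom reads the wide-range target: recover the intensity and keep
    # only the energy above the (hypothetical) bloom threshold.
    intensity = b * 32.0
    return max(intensity - threshold, 0.0)

a, b = write_targets(8.0)  # a saturates at 1.0; b keeps 8.0 / 32 = 0.25
```

A light source eight times brighter than white is indistinguishable from white in target A, but target B still carries the overbrightness for the bloom pass to spread.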
Q:Not much is known about the basic set-up of your renderer - are you using a traditional forward renderer or have you adopted a deferred approach as seen in the likes of Uncharted 2 and the Killzone games?
Oli Wright:MotorStorm and Pacific Rift were traditional forward renderers. Apocalypse is a semi-deferred light pre-pass renderer. We first render the normals, then we accumulate lighting into an FP16 buffer, then we do a final 'material' pass to produce the image that then goes off for post processing.
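In outline, a light pre-pass renderer runs in three stages. This is a schematic sketch of that flow - the real passes run per-pixel on the RSX, and the object shapes here are invented purely for illustration:

```python
def light_pre_pass(geometry, lights):
    """The three stages of a light pre-pass renderer, per frame.

    geometry: objects with .normal, .depth and .albedo attributes
    (illustrative stand-ins for a real G-buffer and material data).
    """
    # 1. Geometry pass: lay down normals (and depth) only - no materials.
    gbuffer = [(surf.normal, surf.depth) for surf in geometry]
    # 2. Lighting pass: accumulate every light's contribution into a
    #    buffer, needing only normals/depth; this is where dynamic
    #    lights scale cheaply, since materials aren't touched yet.
    light_buffer = [sum(light.shade(n, d) for light in lights)
                    for n, d in gbuffer]
    # 3. Material pass: re-render geometry, combining surface colour
    #    with the accumulated lighting to produce the final image.
    return [surf.albedo * lit for surf, lit in zip(geometry, light_buffer)]
```

The appeal over pure forward rendering is in stage 2: adding a light costs one accumulation over the buffer, not a re-shade of every material.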
Q:Evolution Studios has been at the forefront of supporting stereoscopic 3D and the team has often talked about the best results being attained by building the engine up around 3D capabilities - at the most basic technical level, how did you go about integrating 3D into Motorstorm Apocalypse?
Oli Wright:Stereoscopic 3D renderers can be broadly classified as either 'reprojection' or 'draw everything twice'. In Apocalypse we 'draw everything twice'. Nearly everything anyway. We try to share as much processing as we can between eyes. For CPU that's largely the scene processing and object culling phases. For the RSX it's essentially our shadow map rendering.
One area that we try to be very careful with is dynamically adjusting the interaxial to prevent window violations. Also you won't find any frame tearing when Apocalypse is running in 3D. Those two things are incredibly important for having a 3D experience that is comfortable and easy to view.
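Interaxial adjustment of the sort Wright describes amounts to shrinking the virtual eye separation when something gets too close to the viewer - a 'window violation' being when an object popping out of the screen is cut off by the screen edge. A heavily simplified sketch, using an idealised parallax model and made-up comfort limits:

```python
def safe_interaxial(nearest_depth, convergence, max_interaxial,
                    max_negative_parallax):
    """Clamp eye separation so the nearest visible object's pop-out
    stays within a comfort budget.

    Uses the idealised relation parallax = s * (1 - convergence / depth),
    which goes negative (in front of the screen) when depth < convergence.
    All units and limits here are illustrative, not Evolution's values.
    """
    if nearest_depth >= convergence:
        return max_interaxial  # everything sits behind the screen plane
    pop_out = convergence / nearest_depth - 1.0
    return min(max_interaxial, max_negative_parallax / pop_out)
```

Run every frame against the nearest visible depth, this keeps crashes and debris from breaking the stereo window while leaving the full depth effect intact in open scenes.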
Q:Feedback to the 3D experience in Apocalypse has been universally positive - what did you want the player to get out of the game?
Matt Southern:We know how common it is to accuse 3D of being a gimmick, and when you make games you have the golden opportunity to disprove this by affecting gameplay, not just visuals which of course are all a movie or broadcast can offer. Aside from the obvious fact that our concept proposition lends itself beautifully to forward-rushing spectacle, we tried to make the 3D contribute to the sense of vertigo, and the subconscious ability to judge the player vehicle in relation to the AI, tracks and obstacles. To essentially make judging a racing line more instinctive and have 3D contribute to the vital sensation of 'flow'.
Q:There's a lot of discussion about the technological issues in incorporating 3D, but art direction has just as much of a part to play - how did you approach this aspect of MotorStorm Apocalypse?
Simon O'Brien:As we implemented the 3D, we began to maximise the elements that we discovered added most impact to the stereoscopic experience. For instance, small particles such as fire embers and airborne debris were suddenly the heroes of the scene, especially when seen in some density, allowing them to be perceived with a new volume and depth beyond the 2D version.
We also needed to consider what methods we would need to employ to avoid the pitfalls of 3D such as negative parallax and frame violations, whilst taking the 3D effect to its limit. Mostly this came down to tuning of camera parameters and how best to frame the action effectively to sidestep these issues. As a result of taking time over every vehicle camera, scene camera and even the frontend GUI, we feel that it really is one of the strongest examples of stereoscopic presentation available today.
Q:It's not been mentioned much in the marketing, but MotorStorm Apocalypse supports 1080p to the same level that Gran Turismo 5 does - native 1280x1080, upscaled to full 1080p. Bearing in mind the increased fill-rate requirements of 3D, did 1080p support come about almost like a by-product of supporting stereoscopy?
Oli Wright:Yes. If we stayed at 1280x720 for 2D, it would inevitably mean a compromised 3D experience. We saw it as an opportunity to improve our 2D experience though. This is a gross oversimplification, but 1280x1080 is exactly the same number of pixels as 960x720x2 - so that's what we targeted, although in the end it became 1024x720x2 due to some boring alignment reasons. Of course we also wanted to add dynamic lighting, improve our shadows, fix our particle halos, add MLAA and add SSAO. It goes without saying that we're not sane.
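The pixel arithmetic behind that statement checks out:

```python
# Per-frame pixel counts for the modes discussed above.
full_1080   = 1280 * 1080      # the 2D '1080p' framebuffer
stereo_pair = 960 * 720 * 2    # two 960x720 eyes - the original target
shipped     = 1024 * 720 * 2   # the final per-eye size, after alignment

print(full_1080 == stereo_pair)   # True: identical pixel counts
print(full_1080 / (1280 * 720))   # 1.5: 50% more pixels than plain 720p
```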
We'd rather have a four-player experience with some compromises than not have it at all.
Oli Wright, lead graphics programmer, Evolution Studios
Q:You've worked extensively in retrofitting 3D to MotorStorm: Pacific Rift, using techniques such as lower-detail models and dynamic resolution - how valuable was this work in the new game, and did you use any of these techniques in Apocalypse?
Oli Wright:With a game like Apocalypse, it's incredibly difficult to guarantee the RSX rendering time for any given frame up-front. The frames are just so variable, primarily because of the vehicles. We like to have lots of vehicles on screen, and often they're close to the camera, and quite often engulfed in a fireball. So we have two choices. We either make everything so cheap that we can draw our most expensive frame still within budget, or we scale back the rendering for frames that we think are going to be over budget.
We hugely prefer the latter approach. It means the RSX is close to being maxed out far more often, without losing frames. If we didn't do the dynamic detail scaling, the RSX would be idle for a lot of the time to give us the headroom for our variable frame cost, and frames that were more expensive than we'd accounted for would tear and drop frames.
Q:Do you use dynamic resolution in the 2D 1080p mode? On a more general level, resolution adjustment on the fly suggests that you would need to anticipate bottlenecks and spikes before they actually happen - what is the real-life approach?
Oli Wright:We do use dynamic resolution changing in 2D - but we've tried to budget things so it only kicks in in exceptional circumstances. When you play the game, it's really hard to notice it happening unless you have a little flashing light that pops up to say 'dynamic resolution active!' Our rule for anticipating the RSX rendering time is simply to assume that the current frame will cost the same as the previous one, unless there's a camera cut. It's a bit basic, but it works very well.
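Wright's rule of thumb - assume the next frame will cost what the last one did - makes for a very small controller. A hypothetical sketch; the budget numbers, minimum width and 16-pixel alignment are our own assumptions, not Evolution's:

```python
def next_frame_width(last_gpu_ms, budget_ms, full_width=1280, min_width=960):
    """Pick a render width for the coming frame.

    Prediction rule: the coming frame costs roughly what the last one
    did (reset on camera cuts in the real system). Cost is treated as
    proportional to width, which is a crude but serviceable model for a
    fill-rate-bound frame.
    """
    if last_gpu_ms <= budget_ms:
        return full_width               # under budget: render full-size
    scale = budget_ms / last_gpu_ms     # shrink in proportion to overrun
    return max(min_width, int(full_width * scale) & ~15)  # 16-px align
```

In the common case the predicted cost sits under budget and the resolution never moves - which is why, as Wright says, the scaling is hard to notice in play.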
Q:Split-screen support has been upped to four players for Apocalypse, including online integration. What were the major rendering challenges here and how did you overcome them?
Oli Wright:Split-screen is, obviously, a nightmare. We have lots of things that we can scale back to make four-player split screen possible. LOD biasing, shadow resolution, shadow range, SSAO and motion blur. So the visuals are compromised in four-player, but we'd rather have a four-player experience with some compromises than not have it at all.
Q:Online integration of split-screen gameplay suggests that the amount of raw data you need to transmit over the internet increases exponentially with each new player you add - are there any implications here for latency?
Dave Hewitt:If you have a peer-to-peer architecture then yes, there will be an exponential increase, and you do run the risk of saturating the upstream bandwidth on a consumer connection, and introducing latency. However, since we have a client/server architecture there is no exponential increase in network traffic - only a linear increase in upload.
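Hewitt's point is easy to quantify - though strictly speaking, total peer-to-peer traffic grows quadratically with player count rather than exponentially, while under client/server each client's upload stays flat and only the server's load grows linearly. A quick illustration with link counts:

```python
def peer_to_peer_links(n):
    # Every peer talks to every other peer: n * (n - 1) directed links,
    # so each host's upload grows with the player count.
    return n * (n - 1)

def client_server_links(n):
    # Each client holds one link to the server: server load grows
    # linearly with players, client upload stays constant.
    return n

for players in (2, 4, 8, 16):
    print(players, peer_to_peer_links(players), client_server_links(players))
```

At 16 players the peer-to-peer mesh needs 240 directed links against the star topology's 16 - which is why split-screen-plus-online is far kinder to a client/server design.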
Q:The basic principle of off-loading GPU work onto Cell seems to be a common approach in all the major Sony first party titles - does the same principle hold true for Apocalypse? What are the major tasks in the rendering pipeline you assign to the CPU and RSX?
Oli Wright:The only major pixel-pushing task that we run on SPU is the MLAA. The SPUs are mostly occupied with physics, vehicle simulation (which runs at 600Hz), animation processing, particles, scene processing and rendering preparation. They're certainly not sat idle. If the SPUs were tied up doing lots of work off-loaded from the GPU, then we'd need to have much simpler worlds with far fewer dynamic objects. That wouldn't be MotorStorm. As it stands, we have worlds with over 2000 dynamic objects in them.
Q:From what we've heard of ATG's MLAA, the technique is designed for a 1280x720 framebuffer, but you have it running while processing 50 per cent more pixels in 1080p mode. Did the MLAA tech require an upgrade? What was involved here?
Oli Wright:The MLAA processing itself wasn't upgraded, it just took 50 per cent longer to run. So we had to find the SPU time and the XDR memory to make it work.
Q:A smooth, consistent frame-rate and controller response is especially important in a racing game - what are the systems you have in place for maintaining performance? Do you use anything along similar lines to Guerrilla's Autobot?
Oli Wright:We have a couple of tools to try to make sure that the performance of world rendering in particular is as consistent as possible. The first is a world benchmarking tool that runs a camera through the world along all possible routes, timing various GPU phases and reporting on hot and cold spots. The second is a tool called The Auditor that gives a more spreadsheet-style breakdown of resource usage for a world.
Q:On a more general level, you've now shipped three major PlayStation 3 titles, each significantly different to the last. Do you feel that your tech has matured, or are there any specific areas you still feel could be better exploited?
Oli Wright:I don't think we'd ever be complacent enough to say that any particular area can't be improved upon. There are always things that can be done better. A major area for us that we know we need to improve on is that of world production - our existing process is simply too labour intensive.
In terms of exploiting the hardware better - yes - there are always things you have in your mind that could be done better. But there's always a difficult balance to strike between making something that's production ready versus blue sky experimentation.