
Lab Rats

Massive Entertainment discusses the science and testing behind game design at Ubisoft

Ubisoft's Massive Entertainment studio, the developer behind World in Conflict, houses a state-of-the-art user research laboratory. It's here that the company monitors gameplay, collects and analyses data, and uses it to inform and feed back to the game design team.

In this exclusive interview with GamesIndustry.biz, Martin Hultberg, user research manager at Massive, discusses the processes behind the data analysis, why and how it uses metrics to inform design, and the dangers of relying purely on scientific analysis.

GamesIndustry.biz Can you expand on the lab that you have at Massive, what it's used for and how it feeds back into the game creation process?
Martin Hultberg

We are developing methods to test games, and teaching people how to use rapid, iterative playtests to improve a game. It's very scientific. We invested a lot of money in a good user research lab. The lab consists of two big rooms with modular set-ups. We have one for PCs and one for consoles. Between those two rooms is an observation room with one-way glass looking in on both rooms, with microphones and cameras hooked up so we can listen in and watch as people play, and record whole play sessions. Then we have a separate entrance where we can bring people in without taking them through the two rooms, so they don't see anything they aren't supposed to see. That's the basic set-up.

GamesIndustry.biz Does being observed from a dedicated room put the playtesters at ease?
Martin Hultberg

Yes, it does. It's still an artificial environment, so there is a certain amount of pollution in the test. And they know they are at a developer's studio. A lot of our test subjects have signed up because they have an interest in testing and they like what we do at Massive, so some are obviously fanboys. We have to filter the data we get from them very carefully. They're not as critical as they should be, or as we'd like them to be. But then we also get test subjects from the local university, and those guys are more critical; they're not as interested in us as a studio.

GamesIndustry.biz So that data is fed back to the team on a regular basis and is used to improve and tweak the game. What are the main issues you're looking for there?
Martin Hultberg

For level designers it's mainly the flow of the map, and how people move through and interact with the environment on the map. There are at least two ways to play a game – the way the designers intended it to be played, and the way everyone else plays it. That's the minimum. It's good for level designers to see how people treat their map. For game design, they want to know how people use a specific feature, what weapons they use, whether they feel the game is balanced correctly, and things like that. Programmers want to know about performance – we did a frame rate test for World in Conflict where we had builds with very specific choked frame rates to see how users started to react to a bad frame rate, which told us our minimum frame rate spec, so we knew how heavy we could make the maps and how much art we could put in there. At different times we look at different things, but overall we look at usability and balancing. Those are the things that make a product good but that we might have problems spotting ourselves. We're all gamers, we know what's fun to play and what weapons are cool and we know how to make an online game, but the accessibility part is something we can become blind to after a while, and the same goes for balancing.
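To make the frame rate test concrete: a "choked" build simply refuses to run faster than a chosen target, so every tester in that group experiences the same degraded frame rate. The sketch below is a hypothetical illustration of that idea in C++, not Massive's code; the class name and cap values are invented for the example.

```cpp
// Hypothetical sketch of deliberately capping ("choking") the frame rate in a
// playtest build, e.g. one build locked to 20 fps and another to 30 fps.
#include <chrono>
#include <thread>

class FrameRateChoke {
public:
    explicit FrameRateChoke(double targetFps)
        : frameBudget_(std::chrono::duration<double>(1.0 / targetFps)) {}

    // Call once per frame after rendering; sleeps away any spare time so the
    // build never runs faster than the configured target.
    void endFrame() {
        auto now = std::chrono::steady_clock::now();
        auto elapsed = now - frameStart_;
        if (elapsed < frameBudget_) {
            std::this_thread::sleep_for(frameBudget_ - elapsed);
        }
        frameStart_ = std::chrono::steady_clock::now();
    }

private:
    std::chrono::duration<double> frameBudget_;
    std::chrono::steady_clock::time_point frameStart_ = std::chrono::steady_clock::now();
};
```

In a study like the one described, several builds with different targets would be tested, and the point at which players start to notice or complain becomes the minimum frame rate spec.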

GamesIndustry.biz Is part of the challenge managing the amount of data that you get, using the right data, and not becoming overwhelmed by that?
Martin Hultberg

We have a brilliant online programmer at our company who designed Mass Gate, which is the online platform for World in Conflict. When he designed it he built in a huge amount of stat tracking, and we used that during part of the development to assess things during the alpha, beta and other testing. Not all of that is in the retail product; some of it was done for internal testing only. We actually got so much data that most of it became useless to us because we didn't know how to handle it. We didn't have the manpower, we didn't have the analysts to dig into it. Some statistics are just so complicated, because they tie into other stuff, that if you start looking at them you go off on tangents, following statistics but not coming to any conclusions. It was more than we knew what to do with. It's easy to collect the data but it's something else to actually analyse it.
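The stat tracking he describes is essentially event logging: the game writes a record for each interesting gameplay event, and the analysis happens later, offline. The following is a minimal, hypothetical sketch of that pattern; it is not Mass Gate code, and the event types and fields are made up.

```cpp
// Hypothetical sketch of per-event stat tracking: gameplay events are tagged
// and appended to a log file for later, offline analysis.
#include <fstream>
#include <string>

struct GameplayEvent {
    std::string sessionId;   // which play session or tester
    std::string eventType;   // e.g. "weapon_fired", "unit_lost", "map_loaded"
    std::string detail;      // free-form payload, e.g. weapon or map name
    long long   timestampMs; // milliseconds since session start
};

class StatTracker {
public:
    explicit StatTracker(const std::string& outPath) : out_(outPath, std::ios::app) {}

    void record(const GameplayEvent& e) {
        // One tab-separated line per event; aggregation is done afterwards by
        // an analyst or a script, not by the game at runtime.
        out_ << e.sessionId << '\t' << e.eventType << '\t'
             << e.detail << '\t' << e.timestampMs << '\n';
    }

private:
    std::ofstream out_;
};
```

As Hultberg points out, the hard part is not capturing events like these but having the people and the time to actually analyse them.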

GamesIndustry.biz I guess you have to be aware of vanity data as well, where you have stats that tell you how well something is achieving what it set out to do, and they're basically used to justify elements of the game...
Martin Hultberg

Yes, that's the dangerous part. Selective viewing of data is not uncommon, and a lot of the time you'll see people running a test to prove their point. They don't look for problems in their design. That's the wrong way to approach a test. It's also the hard part, because you're criticising the work and design of the people you work with internally. I'm almost like the internal police, and nobody likes the internal police.

GamesIndustry.biz How do you communicate to your team that the data isn't there to criticise and pick apart their game in a negative way?
Martin Hultberg

That's a relationship you have to build, and it takes years. It's about trust, it's about realising that it's a team effort. That's something Massive has always been good at promoting. We're doing this together and it's a team effort. You're all in the same boat and you're all helping each other. It's also about the way you present things, like the way you write your reports and how you tell them about your data. Fairly early on we put a demand on the development teams that we want at least one of their guys sitting in on each test. I want to be able to have one of them back me on everything I do, and see how it's done. Then it very rarely turns into an argument. It becomes a case of “do we fix this now or let it be and fix it later on?” It's about being very diplomatic and understanding of the fact that the developers love what they do. A designer would never knowingly create something bad. They don't do it on purpose; they think it's good design most of the time. Sometimes they genuinely believe decisions they have made are a good thing, so you have to tread carefully, as it can become a minefield.

GamesIndustry.biz Is there a danger of taking that too far and not using emotional attachment and more natural feelings to influence the creation of a game?
Martin Hultberg

If you have a lead designer or producer without integrity, that can become a problem. They need to have a backbone, and I need to have the intelligence to understand that the data I see is not necessarily reality. Again, we're in an artificial environment, we're maybe using 6-12 testers, we're playing on a specific build under specific circumstances. The conclusions I draw might not be relevant and I have to realise this. That's the same for the lead designer and producer. Together we sometimes have to make educated guesses: is this actually what it's going to be like in reality? That's the tricky part of the entire process – analysing the conclusions you make.

I could see that if the wrong people are involved then it's a definite risk that people will rely too much on designing by tests. I can see it happening and I've seen products where I can guess that's happened. Because someone at some point has said “look at this report, we need to incorporate this or make changes because of it.” It's a definite risk for teams. We were at that risk at some point in the past but I don't think we're there any more.

GamesIndustry.biz Is the testing and research lab being used by other teams at Ubisoft?
Martin Hultberg

We have a number of gameplay labs within Ubisoft – in Montreal, Quebec, Shanghai – and sometimes, when projects at the respective studios are at a point where they can't really be tested, we offer our services to those projects to make sure our facilities are fully booked. I've done a couple of tests for other teams, and I know France does a lot of work for other teams. We don't want all tests on a product to take place in one location; we need to challenge each other, and try to see if there is a cultural difference between North American and European gamers, for example. We try to interact with each other and do at least a couple of tests on products we don't work on directly.

Martin Hultberg is user research manager at Ubisoft's Massive Entertainment. Interview by Matt Martin.
