
Design Research: Ignore it at Your Peril

Graham McAllister on why Molyneux's latest advice seems dangerously misinformed

Everything Peter Molyneux says is wrong. Maybe not everything ever, but certainly all the key points reported from his keynote speech at the recent G3 Futures Expo - they are simply not true. To be clear, I'm not saying I have a different opinion to Peter; rather, the best knowledge that exists today on design and creating successful products completely contradicts the advice he gave. Let's take a look at two key statements from the talk and see why they matter for designers, and in fact for everyone in the games industry.

Statement 1 - "My games would be a quantum times better if I had experienced eyes looking at them." Here Peter states that the best people to test new titles may be fellow developers, and he hopes that those already in the industry will be better at 'spotting problems' with in-development titles. There are two problems with this idea, firstly the value of experienced eyes, and secondly, who those eyes belong to.

The first problem, that of experience, is one I've discussed before: years on the job are not an accurate indicator of who is good at their job. It's not the amount of practice that matters, but whether or not someone has engaged in what is called deliberate practice - in other words, not just how much you practice, but how you practice. This means that choosing 'experienced' eyes will be very difficult indeed. Simply going for people who've been around a while may be of little value, whereas a relatively new developer who has followed the principles of deliberate practice may be much more valuable. Can you easily identify who is who?

"If Peter wants to get ideas from fellow designers then that is completely fine, however if he wants to validate design ideas with designers then that is not fine"

The second problem with this first statement is that, regardless of the quality of their experience, designers are precisely the wrong people to get feedback from. If Peter wants to get ideas from fellow designers then that is completely fine; however, if he wants to validate design ideas with designers then that is not fine. I've talked before about knowing the intended audience for your game, and one of the reasons this is so important is that you should playtest mainly with those player profiles (others are possible) - they are the players you based your design decisions on.

Statement 2 - "We're not competitors, really. There's seven billion people in the world - surely we don't have to worry about competition."

Data released by Google show that the majority of users, when searching for an app, compare it against at least one competing title, so you most certainly should be worried about competition. And yes, there may indeed be seven billion people in the world, but your game is designed for only some of them, and so is your competitors'.

There are two things I can conclude from all this: one, Peter mustn't read my column, and two, he doesn't seem particularly well informed about design. Design is not just about the idea creation stage, or the mechanics of how a game functions, but also how you manage and inform the design process. This is known as Design Research, and is taught as a basic concept on many design-related undergraduate programmes.

"Peter is not alone in making such reckless statements. I was recently at a conference where a speaker was saying that all that was needed to make a great game was to hire smart people and have good ideas"

However, Peter is not alone in making such reckless statements. I was recently at a conference where a speaker was saying that all that was needed to make a great game was to hire smart people and have good ideas. This produced plenty of nods in agreement, and I was thinking that everyone nodding along right now is probably making fundamental mistakes in their game. I'm not suggesting you shouldn't hire smart people or have good ideas, but without a way to manage the creative process, results are likely to be unpredictable - you may get a good game, but then again, you may not.

These sorts of misunderstandings around established practices in the design process seem to be widespread and I have to wonder - is it because many in the games industry don't know about Design Research, or because they choose to ignore it?

What is Design Research?

Design Research is a collection of methods which can be used throughout the development process to either inform or validate design decisions. No matter which stage of development you are at, there's a design research method that can be applied. Broadly speaking these methods can be split into two groups: investigative methods and assessment methods.

Investigative methods are those which are used to understand your target audience and would typically come at the start of a project. The aim here is to understand why your target players play the games they do, and this player insight helps inform some of your game's early decisions. It's important to note that this is not about asking players what they want, it's about understanding their behaviour.

The second group, assessment methods, typically comes into play later in the development lifecycle, once a playable build is ready. These methods, such as playtesting, are used to validate the assumptions that were made during the design phase. One type of assessment method that can happen early on, however, is a competitor analysis, which is used to inform design decisions with best practices drawn from existing games. The insights captured by these two groups of methods help you identify who your game is for, design for those players, and then assess how successful your design decisions were. It's highly iterative and allows developers to receive feedback during development from the right target audience. It's this approach that Peter should be taking in order to make his games significantly better.

I Know Better

Perhaps more worrying than not knowing about design research is knowing about it and choosing to ignore it. This is more worrying because it means you're familiar with a process that is known to produce better games, yet have instead decided to not use it. I've heard many excuses over the years as to why a design research process isn't being used, here's a small sample.

"It's too early in the process to do anything". No, it's never too early, Understanding your users is meant to happen before you begin designing as understanding informs design, this is typically the first phase in most design processes. If you've got to the design stage and haven't done any user research then you're already introduced mistakes by not being correctly informed, not used best practices, or designed for the wrong players.

"It ruins the creative process". Sadly not. Receiving objective feedback and working within constraints have been shown time and time again to produce better results across a variety of creative disciplines. Design research is intended to enhance creativity by encouraging risk to take place, the process is there to catch any mistakes and put them back on track.

"We know what's wrong and we're going to fix it". Not quite. You have identified known issues that you're going to fix, but it's very likely that are also plenty of unknown issues which you are not yet aware of. If the issue isn't known, it'll not be fixed, and it's better to know sooner in development than later.

"Part of me wonders if some developers choose to ignore design research because of the traditional software development process"

"You can't evaluate fun". Indeed, this is difficult, but you can assess if players understand how your game works and if they can achieve the tasks that are required of them by the designer. Both of these issues can have a significant impact on a game's enjoyment, so if players report that your game isn't fun, it's possible to use this indirect measurement approach to get at the root cause of the issue. By identifying and addressing the underlying cause, a subsequent assessment of the amended game should reveal more positive responses from players.

Part of me wonders if some developers choose to ignore design research because of the traditional software development process. The processes used to create software are typically focussed on delivering features, whereas a design research process is focussed on delivering great user experiences. Of course, both of these approaches can live together quite happily, and it's worth reflecting on your own game development process and how much effort is put into each. Are you focussing on features, or player experience?

I agree with Peter on the core concept, though: getting people to play your game during development will indeed dramatically improve the player experience. However, they need to be the right people. He doesn't need more designers, he needs a smarter approach to making games.


Latest comments (14)

This all sounds helpful - as any research can be - but the certainty in Graham's text is a bit alarming. There are two kinds of articles that instantly cause my eye to twitch: (a) the "I have the code to success" articles or (b) "Gaming is binary: it's this for success or that for failure". Graham seems to have skirted both canards here without breaking a sweat. No offence to his good self but "You either do X data-driven process, which I sell, or your fate is entirely down to luck" ought to be viewed with healthy suspicion coming from anybody, especially as so many amazing games quite obviously managed to get made without these processes.
I know many teams use this kind of research very well and successfully, I just thought I'd counterweight Graham's overly narrow interpretation of things in the text above. I don't think anyone disputes the accuracy of data but the value of it is mutable from team to team and game to game. And in many successful games and companies it can be of minimal use, doesn't work or is dispensed with entirely.
James Coote Independent Game Developer A year ago
All feedback on a game should be taken in context: who is giving it, where and when. Just because it's coming from another developer (down the pub after a couple of beers) doesn't make it automatically invalid - rather the testers / designers need to interpret the feedback with that context in mind.

Obviously you shouldn't ONLY get feedback from developers, but I don't think that's what Molyneux was suggesting. If you're going to spend money on testing then yes, I agree it makes sense to go mostly for the target audience. However, other developers can still play a useful role.

For example, say you've identified an issue where players are struggling with learning your game. It may be you can identify from your data where there is a big drop off in users making it to the next step in the tutorial. But suppose you conclude the issue is more fundamental and the tutorial in its current form simply isn't a good way to teach players the game.

At that point, you can show half a dozen developers your game, explain to them the problem and probably get 3 or 4 different solutions / ideas for how to teach players the game. AB test them with end-users and go from there.
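[Illustrative aside: the kind of drop-off check and variant comparison described above can be sketched in a few lines. The step names, counts and variant labels below are invented for the example; a real pipeline would pull them from your own analytics and apply a proper significance test before choosing a design.]

```python
# Hypothetical tutorial funnel: number of players reaching each step,
# for the current tutorial (variant A) and a redesigned one (variant B).
funnels = {
    "A": {"start": 1000, "move": 920, "craft": 610, "first_quest": 340},
    "B": {"start": 1000, "move": 930, "craft": 810, "first_quest": 620},
}

for variant, steps in funnels.items():
    print(f"Variant {variant}:")
    names, counts = list(steps), list(steps.values())
    # Step-to-step drop-off shows where players are getting stuck.
    for prev, curr, name in zip(counts, counts[1:], names[1:]):
        print(f"  {name}: kept {curr}/{prev} ({1 - curr / prev:.0%} drop-off)")
    # Overall completion rate is the headline figure to compare between variants.
    print(f"  overall completion: {counts[-1] / counts[0]:.0%}")
```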

---------

On the point about competition, if you're making a match-3 on mobile or AAA fps then yes, you're in direct competition with other similar products, as well as more widely for visibility in the marketplace. Outside of that, my RTS coming out on Steam is going to have zero impact on your 2D platformer releasing on Xbox on the same day. Most indies, small and medium sized studios are going for some sort of niche on huge digital storefronts. The corresponding marketing strategies don't include the sort of mass market paid user-acquisition that's going to take eyeballs away from your game and on to mine.

The issue of resources for smaller developers is a factor too. I can afford to show the game at the local anime convention and informally get feedback from watching and talking to the two-dozen people who play my game over the course of the day. But doing something more formal with say a user-research firm has a higher overhead and dollar cost. It's not that I'm unaware or don't want to do such things, I just can't afford them.

It's also worth making the point that specifically within indie circles, most indies aren't big enough to allow much in the way of specialisation. Being part of a wider community where people share their expertise is highly mutually beneficial, with very little downside.
Jason Avent VP, Studio Head, NaturalMotion A year ago
I'm not sure that Graham is talking about data-driven game development, Barry. It's a scientific approach that he's championing but this isn't about player data and analytics. It's about market research and user testing. You probably did it as a matter of course. For example, you might have thought: "Oh, no one's made a puzzle game about a box in a room before but touch interfaces and puzzle games seem to work on phones and tablets". That's sort of market research - albeit some might be more rigorous than you were. Then I bet you got friends and colleagues to play the game before it was released to see if they got it? Well that's the user testing part. Again - what G would attest to is that with more scientific rigour, you might get even better results.
Chris Payne Managing Director & Founder, Quantum Soup Studios A year ago
I think that Molyneux espouses the traditional approach of designers doing the research and playtesting phases themselves. Any good designer will be aware of the pros and cons of competing games, how players play, and will make decisions based on the desired UX. They'll playtest it themselves, looking for issues that break that UX. Obviously developers look at games differently to players, but we're all human and it's not a huge stretch to imagine you didn't know how to use feature X and check that it gets called out at a suitable point.

Formalised design research and user testing is a more recent trend which in my view is geared towards taking those important designer skills and extracting them into a process that is comprehensible and measurable to managers and investors. That's not to say it's a bad thing - I'm sure it provides a useful crutch for inexperienced designers, and even turns up surprises for experienced devs - all good. But it's not the only way by any means.

[EDIT: Yeah, what Jason said while I was typing :)]


@Jason I do see the parallels and I think we all agree on what is important i.e. player experience over features (although in the spirit of a good quibble, Graham may not be talking about data-driven development but I do think he's talking about data-driven design).
What I can't sign up to is not the letter but the spirit of the piece: the idea that rationalising your design formally, by hiring Graham, is 'responsible', while doing so informally, by relying on experience or knowledge to answer the same questions, is 'irresponsible' and some sort of road to disaster. Just like his previous article, Graham speaks about sensible stuff but wraps it up in such a way as to say "it's my way or the highway" and just like that other article, it's that part I want to quibble with.
There are many teams who, like ours, had few resources to spend on their first game, and it bothers me they might read Graham's rather scorched-earth picture of development precision and wonder how the hell all their favourite games ever got made.
I'm happy to remind them: not like this.
Jason Avent VP, Studio Head, NaturalMotion A year ago
Yeah that's fair enough. I don't think you have to hire Graham to follow these methods though. When we made the game 'Pure', we went out to Churchill Square shopping centre and recruited play testers. It didn't cost much at all. I think we paid them a tenner each for an hour. It worked pretty well. We did a lot of DIY usability testing on CSR Racing too. It doesn't take a lot of money if you know what you're doing. It's the discipline that's the main thing. There's a load more free data out there now to call upon too so market research doesn't have to cost very much either.

I think a lot of people want to make something for themselves when they begin in game development so that's totally at odds with making it for someone else. Sometimes you get lucky by trusting your gut. It's more fun doing it that way - that's for sure. :)
@jason No doubt, I hear what you're saying. I do think you're reducing Graham's more strident view to the very basics though; there's a difference between what he was spelling out and "it's good to have testers", which nobody disagrees with, not even Molyneux :P
I dunno. Sure there are huge games that use data like this very well, especially giant games-as-a-service titles where total collapse can seemingly be induced through micro-changes in air density. But many games informed by rigorous research are just a bunch of arse to play. And others that didn't use it at all are great and successful. This doesn't make me think research like this is bad, just unimportant, in the scheme of things. For instance, I'll go out on a limb and say Pure played so well first and foremost because it was your excellent team who made it and not because you "knew" it would be good thanks to research.
This is the lure and dangerous promise of data: predictability. Numbers evoke definition and precision; these concepts are like catnip to the investor class who often feel adrift in a sea of chaotic market taste and scruffy devs with a slim grip on the financial. In my view the data-driven service sector that swirls around modern game studios evangelises about itself to the point of quackery. A goodly portion of this scientific approach to gaming only impacts quality in the hands of a great team who already know how to finesse games and who are bent on making something play really well. In this context, those people are 99% there already and are much more likely to make something out of the ordinary with or without the data.
I don't doubt the accuracy or informative quality of good research - I just doubt whether it's up there with the many other strengths and insights teams need to have to make something really well, like dedication, experience, talent, knowledge & love of the medium etc, never mind a bloody good game idea and world class execution skills. Whereas Graham's tone can be easily read as "THIS IS IT, NEVER DOUBT AGAIN", even thorough research is tangential at best to making a million seller. In a small amount of cases it will matter hugely but in the huge majority of cases nothing will sink or swim because of it. But that's just, like, my opinion, man.
Jason Avent VP, Studio Head, NaturalMotion A year ago
Without doubt, the most important thing is a great team. Ideally one that's worked together before. So yes. However we definitely did a lot of things right and one of those things was usability testing to refine the handling and the tracks on Pure. That was one of the pillars that made the whole product stand up, so it wasn't just adding an extra 5-10% in my opinion. It made an enormous difference. I think generally that's what analytics tends to do - add 5-10%. It refines rather than redefines. Data can win or lose arguments for you but you need to be having the right argument to begin with, and that's where talent and qualified experience come in.

So I too believe that data-driven design is only part of the mix. A small part at that. The methods that Graham is pushing are nothing to do with data-driven design though. They're more about quantifying things that are qualitative by doing a bit of research and talking to people so that you can be more objective.

Semantics aside though, there are people who like to fly by the seat of their pants and there are those who take a scientific approach. These worlds are not mutually exclusive. One is not bad while the other is good.

Maybe Graham should temper his rhetoric a little. Maybe no one would talk about him if he did that! :)


James Berg Games User Researcher, EA Canada A year ago
I've been doing UX research at EA for over 3 years, and have worked on a wide array of AAA games. I can say with certainty that many developers substantially overestimate their players - what they'll understand, how they'll react, and most importantly, how they'll learn to play the game. Experienced devs and members of your own team just can't replicate the new-user experience. In my own work, I've caught critical issues that would have severely impacted the player experience that the developers thought were 'working as intended'. The dev team fixed the issues, and player experience improved substantially.

UX research isn't about telling you what to build, it's about helping you make sure players are playing the entire game you intend them to be playing. It's not about focus-groups telling you what they want, it's about testing the game you're creating and making sure it's providing the experience you want. If you want to try new things, which is awesome, design research is a way to make sure what you're trying is *actually* working, before your game comes out ;)
Paul Johnson Managing Director / Lead code monkey, Rubicon Development A year ago
Developers make for the worst designers. Everything they do is a new feature that they're very proud of, so you're going to see it. Lots of it. Even if everyone else thinks it's shit.

If you are both the designer and the lead programmer, getting player feedback is vital. Players simply don't see the game the way you do, so you can either please yourself or please your customers.
Paul Johnson Managing Director / Lead code monkey, Rubicon Development A year ago
I meant programmers, not "developers". Although artists are even worse. :)
Hi Barry, thanks for your comments and viewpoint. I think you've misinterpreted my 'certainty' here, however. I'm not saying that if you do this you will have a successful game and if you don't you'll have a failure, but rather that if you follow a design research approach to game development it will certainly improve your game. In other words my 'certainty' is in the ability of the process to improve the player experience of a game.
I'm certainly not saying either of the two things that make your eyes twitch, (a) that I have the keys to success or (b) that the success of a game is binary, but I definitely am saying that this process will make your game better. I have no doubt about that. I can't say how much better of course, in some cases it can be dramatic, and in others it's a degree of refinement, but better nonetheless.

When I write each column my main aim is to educate, to bring to the attention of the reader something they may not have known about before, or to show it from a new perspective. However, I always come at this from a research angle, so each month, whether I'm discussing something that's happened in the industry or dispelling a myth, I'll always research what the current best practices are, and if anyone else knows of a better approach, then I'd genuinely love to hear it. Each article basically takes an issue worth discussing, and then shows what the best current methods are for addressing that issue.

I'm not sure what you're thinking of when you say data, but design research is mostly qualitative (though it can be quantitative), and it might be helpful to think of the findings as 'evidence' rather than 'data'. Our main focus is in explaining 'why', e.g. why do players not understand aspects of your game, why can they not do what you want them to, and ultimately, why do they not experience the game the way you'd like them to. Alongside this, of course, is what you do about that.

Sorry if I caused any eye twitching ;)


Sometimes it's also just down to luck. How do you research and quantify liquid luck eh :)
James Berg Games User Researcher, EA Canada A year ago
Chee, research doesn't replace or have anything to do with luck. Research doesn't replace creativity or good game design chops (unless someone is making a soulless monetization me-too clone). Fairly basic research validates that the game you're creating is actually harnessing that 'luck', and is the difference between "My team and my friends think this is really cool" and "My end players agree that what we've created is cool". Both of those can work, and will keep working (sometimes), but which do you want to rely on over time?

In-depth research tests and validates individual features so you can figure out *why* things aren't working, or why they ARE working so you can replicate that success (or at least don't spoil it ;P). Often it's small but very important things that the dev team overlooks because of their own experience and competence - you can't unlearn how to play your game, so you can't judge whether someone else will find it hard to learn. Other devs know more about how to learn and understand games than most players, making them a poor substitute there as well. Better than nothing, certainly, but still poor. I used to sometimes do research using internal candidates, and have seen this for myself, on AAA titles with a variety of internal testers (from rookie QA to veteran producers/designers).
