Measuring Everything and Understanding Nothing

Why data without experiential context can be misleading

What do players actually do when playing a game? This question is of utmost importance to game developers; after all, the success of a game depends on players playing it as intended. Play it the way the designers expect and it's more likely to result in an enjoyable experience, but if obstacles stand between players and the intended experience, they're more likely to leave and not return. So how do you go about understanding what players really do in your game?

Broadly speaking, there are two approaches to understanding player behaviour: quantitative and qualitative. They differ in every respect, from how the information is captured to the questions they answer. Neither is 'better' than the other; they're used for different reasons. But if you're not using both in the development process, it's likely that you're not offering your players the best possible version of your game.

Measuring Behaviour

Quantifying aspects of player behaviour is an essential part of almost all games made today, especially free-to-play (F2P) games. Knowing how many players return to your game each day (daily active users, or DAU), and how much they spend (average revenue per daily active user, or ARPDAU), is a core way of assessing how your game is performing. It's worth stating the obvious, though: quantitative approaches to measuring behaviour are only useful if the particular behaviour in question can be represented by a number.
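To make these two metrics concrete, here's a minimal sketch of how DAU and ARPDAU fall out of raw event data. The event shape (player id, day, spend) and the sample values are purely illustrative, not from any particular analytics SDK:

```python
from collections import defaultdict

# Hypothetical raw events: (player_id, day, spend) tuples.
events = [
    ("p1", "2024-05-01", 0.0),
    ("p2", "2024-05-01", 1.99),
    ("p1", "2024-05-02", 0.99),
]

def daily_metrics(events):
    """Compute DAU and ARPDAU per day from raw events."""
    users = defaultdict(set)      # day -> set of active player ids
    revenue = defaultdict(float)  # day -> total spend that day
    for player, day, spend in events:
        users[day].add(player)
        revenue[day] += spend
    return {
        day: {"dau": len(ids), "arpdau": revenue[day] / len(ids)}
        for day, ids in users.items()
    }
```

Note how everything here is a count or a sum: the method works precisely because "was active" and "spent money" reduce cleanly to numbers.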

"This is why analytics are very good at measuring aspects relating to churn and average number of sessions per day, but they'll tell you little about the player experience"

This is why analytics are very good at measuring aspects relating to churn and average number of sessions per day, but they'll tell you little about the player experience. For example, say your data shows that players are retrying a certain level many times. Is this because it's too difficult? Because they're not using an item correctly? Because they missed an on-screen prompt? Or one of many other potential reasons? Chances are, you won't know. And if you don't know what the underlying cause is, how are you going to fix it? Without any grounded understanding you'll be relying on guesswork, and that's exactly what we're keen to avoid.

Another issue with analytics is the resolution of behaviour that can be tracked. It's often not possible to simply track everything: doing so would send a large number of messages to the server, making the data both potentially expensive to collect and very difficult to analyse. That leaves the question of which behaviours to track. Are you sure the behaviours you're tracking are really the ones that matter most to the player experience, or is it just that they're easy to capture?
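The trade-off described above is usually handled by deciding up front which events to send, and sampling the noisiest ones. This is a hedged sketch of that idea; the event names, sample rates, and `maybe_track` helper are all hypothetical:

```python
import random

# Events chosen up front -- everything else is never sent.
TRACKED_EVENTS = {"level_start", "level_fail", "purchase"}

# High-volume events are sampled to keep server traffic down.
SAMPLE_RATE = {"level_start": 0.1}  # keep only 10% of these

def maybe_track(queue, name, payload, rng=random.random):
    """Queue an event only if it's whitelisted, sampling noisy ones."""
    if name not in TRACKED_EVENTS:
        return False  # not a pre-defined behaviour: dropped entirely
    if rng() >= SAMPLE_RATE.get(name, 1.0):
        return False  # sampled out
    queue.append({"event": name, **payload})
    return True
```

The whitelist is exactly the problem the paragraph raises: anything not in `TRACKED_EVENTS` is invisible to your analysis, whether or not it mattered to the player.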

And even if you could capture every behaviour, some aspects of the player experience simply can't be quantified. Behaviour itself is useful to measure, but it's also useful to understand two related questions: why did the behaviour occur, and how did the player feel as a result? To answer these sorts of questions, we need to look at other methods.

Understanding Behaviour

One of the key strengths of analytics is providing evidence that an issue exists, but this is of little value if you don't also have evidence on how to fix the underlying cause. So while analytics are useful, knowing what to do to address an issue is of equal importance. Providing that understanding, and answering why certain behaviours occur, is what qualitative research is designed to do.

There are several key qualitative methods for understanding behaviour. One of the most commonly used is direct observation of player behaviour during a playtest. This involves trained researchers (usually two, so findings can be compared) observing the player interacting with your game and making detailed notes on what they do. It has an advantage over traditional analytics in that it isn't constrained to capturing pre-defined behaviours, and it also picks up other contributing factors, such as awkwardness in using the controls or any social interaction, which is especially useful if the game is co-op.

"Yes, behaviour is important, but we also want to understand what motivated that behaviour, any misunderstandings that occurred, and also how the player felt as a result"

Another key advantage of qualitative approaches to understanding players is that they can give insights into more than just behaviour. Yes, behaviour is important, but we also want to understand what motivated that behaviour, any misunderstandings that occurred, and also how the player felt as a result. After all, games are about the experience, and designers are very keen to understand which factors contribute to an experience. As an example, suppose analytics show that players keep using one particular feature in a game. Is it their favourite feature? Do they not know others are available (a possible tutorial issue)? Or do they not know how to switch to another feature (a usability issue)?

Measuring behaviour alone is not enough here: it doesn't explain why one feature is being used more often, nor why others are not being used at all. Only by interacting directly with players (interviews, questionnaires and so on) can these questions be answered. We've certainly seen cases where behavioural data suggests one thing, but interviewing players afterwards produced an alternative explanation. The space between a behaviour occurring and understanding why it occurred is full of various potential explanations, so getting evidence on which is the correct one is vital if the gameplay experience is to be improved.

Of course, qualitative methods are not without their limitations. They typically involve much smaller sample sizes than quantitative methods, so it can be difficult to answer some subjective questions reliably (for example, which art style players would prefer). It seems, then, that the failings of one method are addressed by the advantages of the other. So how can you use both during game development?

Understanding Before Measuring

Let's start with one of the questions we posed earlier: how do you know which metrics to track? Rather than guess, you can produce evidence for which behaviours matter most to the player by looking at the results of a playtest. These results pinpoint exactly where players experienced issues, and although those issues are likely to be resolved now that they've been identified, it may be useful to keep an eye on those parts of the game to make sure a wider audience doesn't experience them too. Essentially, knowing which metrics to track should come from an understanding of which behaviours matter most, and the only way you'll know that is by observing real player behaviour.

"The space between a behaviour occurring and understanding why it occurred is full of various potential explanations, so getting evidence on which is the correct one is vital if the gameplay experience is to be improved"

Furthermore, if you want analytics to be meaningful, you need to reduce the uncertainty in the numbers. For example, say you don't do any qualitative behaviour assessment during development (playtests), and you go straight to analytics as a way of understanding what players are doing in your game. Now say the resulting DAU and ARPDAU figures are awful, way lower than expected. Is this because players are confused about what to do in your game (understanding issues), frustrated because they can't do what they want (usability issues), or do they simply not like your game (game design issues)? You won't know, so how are you going to make changes to improve things?

This can lead to a reactionary approach of changing all the variables in the hope that the numbers improve. With so many variables, that is neither a smart nor a likely successful approach. By the time you get to soft launch and want to measure player behaviour at scale, you really want to be sure there are no understanding or usability issues; otherwise you've no idea what you're measuring, because each headline metric confounds several possible causes.

Measuring player behaviour and seeing that everything is on track must feel very rewarding, but you're less likely to achieve this if there's no understanding of player behaviour to begin with.

Latest comments (3)

Tim Carter Designer - Writer - Producer 3 years ago
Data does not create value. It just measures it.

If you've got a great analytics system, but you've made shit, you'll very accurately measure shit.

D'uh... Why is this most basic of things not apparent?
James Berg Games User Researcher, EA Canada 3 years ago
User research (analytics + playtesting) is a good way to find out whether you've made shit, or made gold, before you launch ;)
Tim Carter Designer - Writer - Producer 3 years ago
But the process begins with what you create. If you abandon the creative elements which we have discovered over centuries of work with artists... in favour of metrics... you're just randomly throwing spaghetti at the wall. So really you're making mediocrity to begin with and hoping that metrics will rescue you.