Although not a new feature, career scores were disabled by Metacritic this week, the site admitting that the data used to calculate the stats wasn't thorough enough, amid complaints from developers upset that their work was either being scrutinised unfairly or ignored completely.
In this exclusive interview with GamesIndustry.biz, Metacritic co-founder and games editor Marc Doyle discusses the thinking behind individual scores and the process the site adopted, Metacritic's influence on developers, publishers and journalists, and whether companies are purposefully withholding information about game credits.
Q: Were you surprised by the negative backlash from the development community, when it came to the developer scores feature?
Marc Doyle: I guess I was surprised at the attention so late in the game, because we added that feature in August when we re-launched the site. And I've gotten a few e-mails here and there - "Hey, I've done more than just these five games, how can I add more information?" - that type of thing. But those e-mails were certainly not negative.
Then I think two or three high profile people in the game development industry took notice and they put it out there on Twitter, and then that spread like wildfire. I just don't think too many people even noticed it was there, and so that kind of called attention to it, and then through that process it became clear that if we're not able to do a comprehensive job in adding credits to individuals' names then it's not right.
Like if they've worked on 30 games and we can only show four, and then we take on this score - which is really just an average of those games in our database for them - then that's not fair. We discussed it as a team and it made sense to just drop that overall number whilst still trying to build this database, which will be difficult, but we're going to give it a shot. There's no need to put that number on it though.
Q: What were the specific complaints that were coming through? Was it purely the lack of comprehensive cover or were some developers against the entire concept?
Marc Doyle: The first sort of complaint was simply, "How could you do this in such an incomplete way?" And I think a lot of people grant that it's tough, it's not as easy as IMDB getting the credits or the cast list from a studio. The industry's view of game credits... they're not as forthright about it as movie studios are, and so it's a little bit more of a challenge. Obviously other sites have taken a stab at it, with varying degrees of success.
But the other thing is the industry sees Metacritic as this gauge of quality and it's used in various forms. I know that in dealings between publishers and developers Metacritic scores come up and I'm sure that some people are more comfortable about that than others, and some are downright uncomfortable about that - and I can understand that.
Then I think it came down to the very personal nature of, "Wow, I was just a level designer on this" or "I only had a very small role in this particular game and yet I'm tagged with it" - I can see where that can be troublesome to certain people. Some people even said, "Wow, I wonder if one day companies are going to demand that I have my Metacritic career score on my résumé." That is certainly not what we contemplated with those game credits.
Q: What was the original thinking behind the feature? Presumably you always consider these things primarily with the ordinary consumer in mind, not the industry?
Marc Doyle: Let's say you love a particular movie and you want to see who's behind it. You click on the director and you see what else he worked on, or you love this actor - you've never heard of her before - let's see whatever movies she's done. By the same token, you go over to games and "this was incredible", or "I love this particular voice actor", let me go to the credits page and see who these characters were. And then if you click on this person, "Oh, this person also worked on these other 10 games." It's just a way to find other products that they might like to play.
Because that's ultimately the goal of Metacritic: "What should I watch, what should I play?" Not necessarily to fuel some larger discussion over what person is more worthy than some other person. Throughout this process we've never ranked individuals, there's a certain context where we say: "Here are the top 10 games of the year by metascore", but even with that old career score - which we've now gotten rid of - we were never going to say: "Here's the top 10 level designers of all time on Metacritic." It was merely a way to collect this data and just take an average of the individual credits.
Q: Does the difficulty in sourcing credits point to a larger problem within the industry? That it's often not clear which studio made a game or what many of the job titles actually mean?
Marc Doyle: Obviously it's a problem. There's a couple of ways you can compile the information. You can do it by hiring an employee to go out and track all this stuff down from official channels or you can use user submissions, people who've actually worked on these games and they go through GameFAQs and register on the site and fill out an account and submit their stuff. I think that some people are going to be forthcoming and certain people aren't.
Q: Is it perhaps that publishers don't want people to know who is making their games? That they want to hoard all the customer loyalty for themselves?
Marc Doyle: That could certainly be part of the problem. The issue here is individual people. You can click on Capcom, and you can also click on the development company, but it's very tough. We've found it very tough in some cases to say who has been responsible for a game. We're constantly having to clean our data to know exactly which development house actually created something.
I think that is an issue, of the industry not needing or not wanting to put that information out there. I don't know exactly what's behind that, I haven't discussed that with too many publishers.
Q: Looking at the concept of developer scores from the other perspective, is it entirely right to rate individuals when the creation of a game is a much more collaborative process than in other industries?
Marc Doyle: I think if you were going to try and do some sort of actual ranking, like "here are the best directors", you'd have to make sure that all the credits are on an even playing field. If you started doing it with actors, and a particular actor had a bit part versus another actor who had a leading role, that will always be an issue.
But again, our collection of that data was not for ranking purposes, it was more for "what has this person worked on, so I can discover more products". But there is an issue over whether you can evaluate these people, one against the other, based on varying degrees of input on a product.
Q: What are your plans now for the developer score feature, will it ever come back?
Marc Doyle: We have no plans to bring it back. Right now we just want to see if we can build the database, take a shot at it, see what we can do. And we're still encouraging our users when they see those pages to please, help us. And then we'll see one day if we can get to a point where that section is beefed up.
Q: How long do you estimate it would take to get the credits section for games to a high state of reliability? Is that months of work or years?
Marc Doyle: I have no idea. I couldn't even tell you if we could get to a point where it would be necessarily fair to bring back scores. It would take looking at it again down the road. Months, years... we'll see where it goes.
If, with this issue coming about in the last week, the publishers and developers all of a sudden said, "Yes, let's standardise this whole system", and everyone wanted to go out there and attach their credits to every game, then we could hire someone to simply input that into our system. But... I don't necessarily see that happening.
Q: Generally, how do you see your relationship with publishers and the business side of the industry? Was their high level of reliance on the site something you originally envisaged happening?
Marc Doyle: From the start our mission hasn't changed. It's a consumer-focused site. How can we make the job of picking a game easier and a more educated process for a gamer? So let's cut through all the PR stuff and the marketing and the advertisements. What do the top critics think about this game, relative to another game? So if you see one game scores 85 and another scores 62, at least based on what the critics think you should go for the 85 game. It's as simple as that. It's a tool among many tools.
If you happen to like that type of game, go for it! I've certainly bought games that have had red scores, I've loved movies that have had horrible scores. It's just a tool for me to know what the critics think.
We launched around 2001, and it was probably around 2003 - 2004 when certain publishers really started taking notice of the metascores. And you know, my thought was: "This is interesting, because before we jumped into the field, and before Gamerankings.com jumped into the field, it was always just sales, and they could put out sequel after sequel in certain big franchises without worrying about the quality, and they were going to do that forever." And I thought: "OK, we're going to jump in here and try and help people."
Did I envisage the day that the industry would take these numbers so seriously? Not at all. In a way I find it a good thing, in that at least they're interested in quality. It's not necessarily for the sales of this particular game - it seems like there's concern with franchises and their licenses. Like if they put out a really poor Lara Croft game, to use an example, it might still sell like crazy but people are going to learn about it, know about it, and then the sequel's not going to sell as well. The value of the franchise won't be as high. I think that's partly the way they view it.
People certainly contact us if I'm missing a score or they want to understand how we convert scores from a five-star system. But I just try to be as fair and consistent as possible. It's been me from the start, so the philosophy behind how we pick sites and take scores hasn't changed.
Q: What do you think Metacritic's impact has been on games journalism? Generally how do you view the quality of criticism versus, say, the movie industry?
Marc Doyle: I think it comes down to the consumer. I think a movie goer will still gladly go out and see movies that are critically panned, so they'll go see Transformers 2 en masse knowing it had a big fat red score on Metacritic. I'm the same way - it's a two-hour commitment, it's $10, they'll go for it. But that's versus a game, which is expensive and a big time commitment, and if people see a game is not a good game according to the critics they're not as likely to jump into that.
Q: What about the peculiar way in which video games are scored, where 5 or 50 is never average and somewhere around 75 per cent is considered to be the cut off point for a good review score?
Marc Doyle: There's certainly a bit more inflation on the game side of things. If you see something in the 40s on the film side of our site, which is still a yellow/mixed score, then it's still fair game. But if it's sub-50 on the game side, most publications would describe it, according to their review policies, as a very poor game.
But a lot of people will say to me, "Well, nobody ever scores really above 9 or really below 7," but it isn't true. I could point to so many games where every review on the page is red. I think part of the issue is that movie critics - the big ones, the ones that are paid and do a really good job of it - are reviewing everything that comes out that week. They'll review the crap.
Whereas the trend now with video games, in contrast to 10 years ago, is that they aren't reviewing the crap as much. They haven't developed the bottom end of their scale as much, and I think that's a problem.
I tell as many people as possible about this: "Start reviewing the shovelware, tell me what a 2/10 means for you, tell me what a 3/10 means." A lot of people will say, "here's what the top end of our scale means, and once you go below this it's all kind of the same," but to me that's not responsible. I want you to be as precise with the bad as you are with the good. I think that's where the veteran film critics are doing better than the veteran game critics: they have a tighter control and a greater understanding of the precision of the low end of the scale. Many games journalists also do this well, but I think it's an area where there can be general improvement.
Q: Will you also start providing metascores for journalists themselves, as well as their publications?
Marc Doyle: For games we don't cite the individual journalist's name - we just have the publication listed. With movies they have a more consistent staff; they don't use as many freelancers. We do have pages dedicated to those people and those entities, but it's simply: "Here is their average score, their highest score and their lowest score." We're not scoring them personally. We'll do a basic analysis, but we're not evaluating them per se - just how they score relative to other critics.
Marc Doyle is co-founder and games editor of Metacritic.com. Interview by David Jenkins.