
Metacritic debunks score weighting report

Thu 28 Mar 2013 9:17am GMT / 5:17am EDT / 2:17am PDT
Media | GDC 2013

"Their guesses are wildly, wholly inaccurate" says score aggregator site

Metacritic has been quick to dismiss a report that attempted to calculate the weights that the score aggregator site attributed to different publications, calling the article "wildly, wholly inaccurate."

"Today, the website Gamasutra 'revealed' the weights that we assign to each gaming publication (for the purpose of calculating our Metascores), based on a presentation given at the Game Developers Conference this morning," it said in a statement yesterday.

"There's just one major problem with that: neither that site, nor the person giving the presentation, got those weights from us; rather, they are simply their best guesses based on research (the Gamasutra headline is misleading in this respect)."

The article that the statement refers to was originally titled Metacritic's Weighting System Revealed, and was based on a GDC session by Adams Greenwood-Ericksen of Full Sail University called A Scientific Assessment of the Validity and Value of Metacritic. It listed publications in six tiers, from highest to lowest, according to the importance their scoring had to a game's metascore.

The Highest category included The New York Times and The Official PlayStation Magazine UK, while the Lowest was home to titles like Jolt Online Gaming and Play UK. The piece caused controversy on Twitter as media outlets checked their ratings, and holds a special interest for developers whose bonuses can be linked to a specific Metascore and publishers whose stock can rise and fall on the site's output.

Metacritic argues that it actually has far fewer weighting tiers than suggested, and that the study omitted, undervalued, and overvalued a large number of publications.

"The disparity between tiers listed in the article is far more extreme than what we actually use on Metacritic," it added.

"For example, they suggest that the highest-weighted publications have their scores counted six times as much as the lowest-weighted publications in our Metascore formula. That isn't anywhere close to reality; our publication weights are much closer together and have much less of an impact on the score calculation."

Greenwood-Ericksen has yet to respond to Metacritic's statement.
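For context, a Metascore is a weighted average, so a heavily weighted outlet pulls the final number further than a lightly weighted one. The sketch below is only an illustration: Metacritic does not publish its formula, and the 6:1 tier ratio used here is the figure the report claimed, not a confirmed one.

```python
# Illustrative only: Metacritic does not disclose its formula or weights.
# This assumes a simple weighted arithmetic mean and made-up tier weights.

def weighted_metascore(reviews):
    """reviews: list of (score_out_of_100, weight) tuples."""
    total_weight = sum(w for _, w in reviews)
    return sum(score * w for score, w in reviews) / total_weight

# Hypothetical reviews: same three scores, different tier weights.
reviews = [
    (90, 1.5),   # assumed "highest tier" outlet
    (80, 1.0),   # assumed mid-tier outlet
    (60, 0.25),  # assumed "lowest tier" outlet (the 6:1 ratio the report claimed)
]

unweighted = sum(score for score, _ in reviews) / len(reviews)
print(round(unweighted, 1))                   # 76.7
print(round(weighted_metascore(reviews), 1))  # 83.6
```

If the real weights are "much closer together", as Metacritic says, the two numbers would differ far less than in this example.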

9 Comments

Andrew Watson Programmer

Why is there even score weighting in the first place?

Posted: A year ago

#1

Jim Webb Executive Editor/Community Director, E-mpire Ltd. Co.

"Why is there even score weighting in the first place?"
Because Metacritic subscribes to a phrase Orwell wrote in Animal Farm: "All animals are equal, but some animals are more equal than others."

Posted: A year ago

#2

Dave Herod Senior Programmer, Codemasters

Because it gives an instant "general impression" without having to even read anything? I haven't even bothered to read any reviews of Aliens: Colonial Marines because I just looked on Metacritic and it seemed to be unanimously declared a stinker. If a game gets a more average Metacritic score, that's when it becomes worth investigating further.

Posted: A year ago

#3

David Radd Senior Editor, IndustryGamers

I think GameRankings and Metacritic both have their place - GameRankings doesn't bias their average based on how big the outlet is, so it gives a flat baseline for what every reviewer thinks. Certainly, score aggregates have changed reviews (and also changed development contracts).

Posted: A year ago

#4

Barrie Tingle Live Producer, Maxis

Was interesting to see Eurogamer Italy and Eurogamer Spain had more weight than Eurogamer itself in the Gamasutra listing.

Posted: A year ago

#5

Rupert Loman Founder & CEO, Gamer Network

Yep - and knowing the traffic for all those sites and reviews (plus many others on the list) means the list is either completely wrong (as Metacritic said) or completely flawed (which isn't good).

I think this whole thing is a can of worms - how is the quality or influence of the sites/reviews actually decided?

Posted: A year ago

#6

Private Industry

I was an editor many, many years ago and I don't think scores should be weighted depending on the website/magazine, no matter if the numbers are right or not. Just because people don't work for IGN and so on doesn't mean their opinion should count less.

Posted: A year ago

#7

Identity Stolen Theft

As a statistician, I can tell you that weighting the scores is the appropriate way to go.

Weighting is used in far more important statistical work on a daily basis, since it is the best way to obtain accurate results. For instance, when calculating inflation in any country, not every increase in price is weighted the same (a rise in the price of a Rolls-Royce is not as significant for the economy as a rise in the price of milk). When doing political surveys, each respondent's answers are weighted based on how many people in the voting population they represent. As a professional analyst I can tell you that there is nothing inaccurate, nor evil, about using weights.

The real complaints about Metacritic should be about how they obtain the weights. I don't think Gamasutra's estimates are even close to reality, since the weights they published are just too simplistic. If Metacritic is using those, they should get help from a statistician right away! When calculating weights subjectively, as they do, you usually use a number of numerical attributes to obtain some form of "importance" index. I, for instance, would use information such as the longevity of the publication, the number of reviews it has performed, the number of awards it has received, etc. to create a "prestige index". This index would be my basis for the weights, and almost certainly no pair of publications would get the same weight. Which attributes to use? How many? That's the tricky part, and that's why I hope they get help from a professional, for the sake of the games industry.

So please, don't blame the statistics, just blame those who can't use them right.
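(A minimal sketch of the kind of prestige index described above, purely as illustration: the attributes, coefficients, and data are invented, and this is not a claim about Metacritic's or any publication's actual numbers.)

```python
# Sketch of the idea above, not Metacritic's method: derive a "prestige index"
# per publication from a few numeric attributes, then normalise it into a
# review weight. All attribute names, coefficients, and data are hypothetical.

publications = {
    # name: (years_active, reviews_published, awards_won) -- made-up data
    "Outlet A": (20, 5000, 12),
    "Outlet B": (8, 1200, 3),
    "Outlet C": (2, 150, 0),
}

def prestige(years, reviews, awards):
    # Arbitrary coefficients; a real analyst would fit or justify these.
    return 1.0 * years + 0.01 * reviews + 5.0 * awards

scores = {name: prestige(*attrs) for name, attrs in publications.items()}
mean_prestige = sum(scores.values()) / len(scores)

# Weight = prestige relative to the average outlet, so weights cluster around 1.
weights = {name: round(p / mean_prestige, 2) for name, p in scores.items()}
print(weights)  # {'Outlet A': 2.31, 'Outlet B': 0.62, 'Outlet C': 0.06}
```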

Edited 1 time. Last edit by Identity Stolen on 30th March 2013 7:47am

Posted: A year ago

#8

Roland Austinat roland austinat media productions|consulting, IDG, Computec, Spiegel Online

@Tom, this is the core problem right there:
"whilst in a scoring system out of 10, a 6/10 is a terrible game... 8/10 is considered an OK game... whilst only a few games get 10/10. But in an out-of-100 scoring system, a 60-75ish score would be a failure... 80+ would be considered good, 90+ amazing, and no one gets 100/100 as it suggests perfection."
In a 0-10 system, 5 is an average game, everything above 5 is above average.
In a 0-100 system, 50 is an average game, everything above 50 is above average.

So 6/10 is in no way a terrible game and 75/100 is not a failure.

The PR departments of this world and the magazines that want to see their quotes on the packaging have played an arms race to ever higher ratings and shot themselves in the foot, because if good games are now just games between 85 and 100, or between 8 and 10, they have effectively reduced the rating bandwidth to 16 steps, or three.

Personally, I don't rate games anymore these days. I would use a 10-step system though, with 5 being average. EDGE still does this. Incidentally, EDGE is the only gaming magazine I still read cover to cover.

Edited 1 time. Last edit by Roland Austinat on 2nd April 2013 6:55am

Posted: A year ago

#9
