The adoption of Metacritic as one of the most important tools for game publishers has been a gradual and largely unremarked-upon process. Starting out as a site designed simply to aggregate reviews and help consumers make buying decisions, Metacritic has slowly gained in stature among publishers - and has, to a large extent, become the de facto standard for measuring the quality of a company's output.
Today, Metacritic's importance is enormous. Publishers set the improvement of their average Metacritic rating as a key strategic objective, and boast about their successes on stage at E3 and other conferences. The value of game franchises is underpinned not just by their commercial success but by their Metacritic rating, and concerns are raised over publisher and even platform line-ups based on low average ratings. We've even seen instances of publishers trying to adjust developer payments based on the rating earned by their game.
Yet for all of the importance attached to Metacritic, the system is deeply flawed. It's not the fault of the site itself, as such - it was never intended to be used in this way, and its creators would probably be horrified at the idea that their rating algorithm was determining royalty payment rates for developers. Metacritic is, quite simply, only as good as the data it harvests - and while it does its best to sort through that data, by providing weightings to different publications, for example, the deep flaws in the data itself are insurmountable.
Basic, fundamental problems exist with the Metacritic approach. How, for example, can you balance out the basic differences in how scoring is perceived on different sides of the Atlantic - from US critics who are often half-jokingly said to believe that the scale for a score out of ten runs from seven to nine, to UK publications like EDGE or Eurogamer which use the whole scale? Dedicated readers, by and large, understand that a seven from EDGE is a good score, while on a site like IGN it's a bad score. Metacritic's software, however, doesn't.
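The cross-Atlantic scoring problem described above can be illustrated with a few lines of code. This is a minimal sketch, not Metacritic's actual algorithm; the outlet names and score distributions are entirely hypothetical, chosen only to show how a flat average treats a "seven" identically everywhere, while normalising each score against its own publication's habits would not.

```python
# Hypothetical score distributions for two imaginary outlets. A strict
# publication uses the whole scale; a generous one clusters around 70-90.
OUTLET_STATS = {
    "strict_uk_mag": {"mean": 60.0, "stdev": 15.0},
    "generous_us_site": {"mean": 80.0, "stdev": 5.0},
}

def naive_aggregate(reviews):
    """A Metacritic-style flat average: a 70 is a 70, whoever wrote it."""
    return sum(score for _, score in reviews) / len(reviews)

def normalised_aggregate(reviews):
    """One possible alternative: express each score as standard deviations
    above that outlet's own average, then map back to a common scale."""
    zs = []
    for outlet, score in reviews:
        stats = OUTLET_STATS[outlet]
        zs.append((score - stats["mean"]) / stats["stdev"])
    mean_z = sum(zs) / len(zs)
    return 70.0 + 10.0 * mean_z  # arbitrary choice of common scale

reviews = [("strict_uk_mag", 70), ("generous_us_site", 70)]
print(naive_aggregate(reviews))       # 70.0 - both scores look identical
print(normalised_aggregate(reviews))  # lower: the generous outlet's 70 is
                                      # well below its own average
```

Under this (invented) data, the strict outlet's 70 sits two-thirds of a standard deviation *above* its norm, while the generous outlet's 70 sits two full deviations *below* its norm - exactly the distinction a flat average throws away.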
That's before you get into the question of platform exclusives - routinely boosted on Metacritic by reviews in single-platform publications, with few counterbalancing reviews to even out the statistics. Things get even murkier when the relationship between PR, marketing and Metacritic is taken into account. With review scores meaning more than ever before to publishers (ironic, given the widespread, but quite possibly flawed, perception that the specialist games media means less and less in terms of sales), PR is under increasing pressure to "manage" scores - and many experienced PR professionals are keenly aware of how Metacritic can be manipulated, at least sufficiently to massage a rating a few percentage points upwards.
EA Sports boss Peter Moore has another problem with Metacritic - he reckons that the site simply isn't relevant to Wii games. Speaking to Gamasutra this week, he pointed out that the disparity between Metacritic ratings and actual sales for Wii games is even more severe than it is on other platforms - and strongly suggested that this is down to a lack of interest in most Wii titles among specialist reviewers.
Moore is bang on the money, but if anything, he understates the scale of the problem. It's not just the Wii which has broken the back of the already strained Metacritic model. The DS suffers from broadly the same problem, as do casual or non-traditional titles on other platforms. Wii Sports and Brain Training, two of the most successful software products of the past five years - both of which can be credited with driving millions of console sales - have Metacritic ratings in the seventies. Wii Fit, which has sold so many balance board peripherals that it almost counts as a platform in its own right, barely scrapes 80 per cent. Many other successful titles languish even lower in the ratings.
Now, sales are no guarantee of quality, as any music or literature fan will tell you. These, however, are not just successful products - they're widely loved products. It's just that the people who love them aren't gamers, in the traditional sense. They're newcomers to the pastime, or people who played games in their younger years and have now returned to the medium looking for something different. They're a big market, perhaps the fastest-growing market in the industry - and they are largely ignored by Metacritic and the specialist sites which contribute to its ratings.
As Moore suggests, Wii games - and casual games in general - just aren't high on the priority lists of most specialist journalists. Games journalism is, for the most part, populated with people who are traditional hardcore gamers. This is largely because it's those people who are most likely to want to devote their careers to writing about games - but it doesn't help that the low pay and poor advancement opportunities endemic in specialist games media mean that most don't stay in the industry for long enough to cultivate a wider view of the market. The Wii and other casual products are frequently regarded with as much hostility and suspicion by supposedly professional games writers as they are by hardcore gamers on Internet forums.
In part, this is to be expected. As the gaming hobby becomes more and more widespread, some fragmentation in the media is a healthy thing. Specialist sites catering to fans of specific styles of gaming will become more important than ever, allowing consumers to focus on the opinions of journalists who share their basic likes, dislikes and motivations - rather than having to second-guess whether they're reading the rantings of someone who hates an entire style of game, or an honest writer who's simply warning about a bad product.
The bigger sites will have to either choose specialisation, with many of them likely to focus on the hardcore market, or significantly lift the quality and objectivity of their journalism. Good film publications manage to expertly review action blockbusters, chick flicks and arthouse explorations on the same pages without conflict - games publications are going to have to learn the same kind of objectivity in their approach, without losing the subjective nature that makes well-written reviews worth reading in the first place.
Where does Metacritic fit with this model? If the media fragments into specialist focused sites, with a handful of larger sites striving for objectivity across all genres and audiences, can Metacritic survive? The answer is yes - but only in its originally planned form. Metacritic will continue to do what Metacritic has always done, which is to provide consumers with a snapshot of critical opinion. Publishers, however, will slowly be forced to abandon it as their Holy Grail of critical success. As the market grows, the cracks in the Metacritic model widen; rather than remaining slavishly devoted to this broken system, publishers would do well to start rethinking how they evaluate critical success.