Roundtable: A Question of Quality
Metacritic says 2012 was a bad year for game quality, but is that accurate? Our staff weighs in...
A report this week from reviews aggregator Metacritic suggests that the last year in gaming was one of the weakest critically in recent memory, with just 18 titles receiving a Metacritic score of 90 or higher (14 of those on consoles). That's little more than half the 32 titles that garnered a 90+ score back in 2011.
So what the heck happened? Is the industry really producing fewer top quality games? Are developers relying too heavily on the same tropes and game mechanics? Has the long console cycle made jaded critics even harder to please? Or is there simply something fundamentally wrong with the Metacritic system?
It's hard to pinpoint what's behind this perceived drop-off in top-tier titles, or whether there's anything to worry about at all - after all, the GamesIndustry International staff certainly had no shortage of fun with numerous games throughout 2012. With the last 12 months behind us, and another exciting year ahead, we decided to gather our thoughts on the state of game quality.
I suspect that a few of my learned colleagues will use this opportunity to question the validity of reviews in an age of instant and ubiquitous communication, but I won't be joining them. I wouldn't be so naïve as to argue that professional critics matter a jot to the average smartphone or Facebook gamer, but for the sort of products that comprise the top 20 on Metacritic, reviews have a very real importance, to both the producer and consumer.
Indeed, I take issue with Metacritic's misleading and selective claim that 2012's games were "a far cry from 2011's heights." This idea is based entirely on the relative paucity of 90+ rated games, as if the top few per cent of the review scale houses the only products truly worth playing. The underlying logic is shaky to say the least, but I, for one, regard the qualitative difference between the games comprising the top 10s of 2012 and 2011 as so slight it barely merits consideration. Did The Walking Dead lose its punch because Skyrim scored a point higher? Is Dishonored relegated to the shit-list because Arkham City was liked slightly more by slightly more people? It's particularly baffling because, while 2011 had memorable high points, I believe that 2012 boasted a more consistent standard of excellence - fewer 95s, yes, but far more 89s.
Closer examination of Metacritic's figures bears this out. The average score across all 2012 releases was higher on Xbox 360, PS3, Wii and PC, and there was far more high-scoring original content on each platform: on Xbox 360 the number of franchise games and sequels in the top 20 fell from 18 to 14; on PS3 it fell from 16 to 14. In a year that many have dismissed as being defined by sequels, there were significantly more original games receiving critical praise and strong sales.
We claim to be sick and tired of sequels, and that readily shouldered malaise has no doubt coloured a few reviews of the bigger releases, but when faced with wonderful entertainment like Halo 4, Borderlands 2, Far Cry 3, Guild Wars 2 and Persona 4 I find it difficult to accentuate the negative. I can see demerits only on principle, and that is a bogus premise for any review.
There's a reason Metacritic saw fewer 90+ review averages in 2012, but "Games are getting worse" is just a glancing blow at the real problem. It's more accurate to say games are getting predictable. For the core console gaming crowd for whom Metacritic aggregates reviews, gaming has lost some excitement in recent years. All the mainstream buzz is about social and mobile. New handhelds from Sony and Nintendo have debuted largely to consumer apathy. Online passes, DRM, downloadable content, and other irritating publisher initiatives have made the business behind the hobby impossible to ignore, a problem only likely to worsen in the next generation. And perhaps worst of all, every year we're seeing the same assortment of titles on shelves, just with a new number tagged on the end.
"We've seen them iterate and polish their ideas to a shine, and now we're seeing them beat those same ideas into the ground"
Here's a list of the NPD's top 10 best-selling games of November, and the number of games each franchise has had on consoles in this generation:
- Call of Duty: Black Ops II - Eight games this generation
- Halo 4 - Six games
- Assassin's Creed III - Five games
- Just Dance 4 - Four games
- Madden NFL 13 - Nine games
- Skylanders Giants - Two games
- Need for Speed: Most Wanted - Ten games
- NBA 2K13 - Eight games
- WWE 13 - Eight games
- FIFA Soccer 13 - Ten games
It's an impressive monument to monetization, with six of November's top 10 best-sellers belonging to franchises with eight (or more!) games this generation. That's at least eight games over the span of eight holiday seasons. Now here's the same list, with each best-seller replaced by the game with which its series reached its Metacritic peak this generation.
- Call of Duty 4: Modern Warfare, 2007
- Halo 3, 2007
- Assassin's Creed II, 2009
- Just Dance 4, 2012
- Madden NFL 08, 2007
- Skylanders: Spyro's Adventure, 2011
- Need for Speed: Hot Pursuit, 2010
- NBA 2K12, 2011
- WWE SmackDown vs. Raw 2007, 2006
- FIFA Soccer 10, 2009
Note that the franchises appealing to core gaming audiences tended to peak earlier in the generation. For that crowd - the group of gamers who buy games year-round, the customers who are most engaged, who create online communities over months and years of dedication to a game or a series - there has been little to get excited about of late. We've seen what developers and publishers can do on the current generation of hardware. We've seen them iterate and polish their ideas to a shine, and now we're seeing them beat those same ideas into the ground. It's no wonder that reviewers aren't rubber-stamping 90+ scores on them any longer.
2012 was one of the best years for video games in a long time. Industry revenue is shrinking, mobile and social are alternately taking over or tanking, and new platforms are having a rough time, but developers - mainstream and independent - have stepped up to the plate big time. The Walking Dead, Far Cry 3, Hotline Miami, Dishonored, Journey, FTL, Thirty Flights of Loving, Persona 4 Golden, Mass Effect 3, Papo & Yo, Super Hexagon...
The Metascores for 2012 are slightly lower than in previous years, with the lowest score in the top 10 (Far Cry 3, PS3) sitting at 91. In contrast, 2011's number ten was Minecraft for PC, with a Metascore of 93. The surprising thing is that - outside of the grand outlier that is Mass Effect 3 - user scores are higher for 2012. Mass Effect 3 was "Metabombed" by players over its ending, which dragged its user score way, way down.
- Batman: Arkham City (PS3): 8.1
- The Elder Scrolls V: Skyrim (360): 8.5
- Portal 2 (PC): 8.4
- Portal 2 (360): 8.1
- Portal 2 (PS3): 8.0
- Batman: Arkham City (360): 8.4
- The Legend of Zelda: Ocarina of Time 3D (3DS): 8.5
- The Elder Scrolls V: Skyrim (PC): 8.3
- Mass Effect 2 (PS3): 8.2
- Minecraft (PC): 7.1
2011 Average Top 10 User Score: 8.16
- The Walking Dead: The Game (360): 8.5
- Persona 4 Golden (PS Vita): 9.2
- Mass Effect 3 (PS3): 4.5
- Mass Effect 3 (360): 5.3
- Journey (PS3): 8.6
- Xenoblade Chronicles (Wii): 8.4
- Dishonored (PC): 8.1
- Mark of the Ninja (PC): 7.8
- Borderlands 2 (PS3): 8.2
- Far Cry 3 (PS3): 8.6
- Far Cry 3 (360): 8.6
- Trials Evolution (360): 8.0
2012 Average Top 10 User Score: 7.72
2012 Average Top 10 User Score (minus ME3): 8.4
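The averaging above is plain arithmetic over the listed entries - note that the 2012 "top 10" spans 12 platform entries, since Mass Effect 3 and Far Cry 3 each appear twice. A minimal sketch in Python, using the scores as listed here (Metacritic user scores drift over time, so the full-list 2012 figure comes out near 7.8 with this snapshot):

```python
# Average user scores for the listed top-10 entries, as transcribed above.
scores_2011 = [8.1, 8.5, 8.4, 8.1, 8.0, 8.4, 8.5, 8.3, 8.2, 7.1]
scores_2012 = [8.5, 9.2, 4.5, 5.3, 8.6, 8.4, 8.1, 7.8, 8.2, 8.6, 8.6, 8.0]

def average(xs):
    """Mean of the scores, rounded to two decimal places."""
    return round(sum(xs) / len(xs), 2)

# Entries at indices 2 and 3 are Mass Effect 3 (PS3) and (360).
scores_2012_ex_me3 = scores_2012[:2] + scores_2012[4:]

print(average(scores_2011))          # → 8.16
print(average(scores_2012))          # → 7.82 with the scores as listed
print(average(scores_2012_ex_me3))   # → 8.4
```

The gap between the with- and without-ME3 averages (7.82 vs. 8.4 over just two removed entries) shows how heavily a single "Metabombed" title can drag down a small sample.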
Minus Mass Effect 3's vicious beating, users actually liked 2012 titles more than 2011's. That leaves us to wonder why the same wasn't true of critics. Fatigue, perhaps? As my colleague Brendan points out, 2012 played host to a huge number of sequels, many with annual iterations at this point. We're at the end of a console generation and critics may just be looking for the next new thing.
Add to that the ever-sliding baseline, which leads technically superior sequels to receive scores similar to, or lower than, previous iterations. Former Epic Games designer Cliff Bleszinski even called some critics "haters" in 2011 because he felt that Gears of War 3 deserved a better score than its predecessor. Prior success just makes the mountain developers have to scale much higher.
Together, it means that while the industry may be putting together some of its best work, the scores are just a little lower than before. No problem with that... unless your bonus is based on a Metascore.
Metacritic, like the review sites it aggregates, is becoming less relevant because there are fewer core games to cover. Apart from a few breakout titles, games review sites pass over a huge chunk of mobile and free-to-play games because they don't know how to cover them. I'm not saying a traditional review is the right way to cover World of Tanks, Clash of Clans or Rage of Bahamut either; it's just that the whole reviews process is out of date. It just doesn't work for a persistent game, or a game as a service, or however we're labelling online titles now. It didn't really work for massively multiplayer online RPGs either. And on top of that, consumers don't need to read a review when a game costs less than a cup of coffee.
"Thinking a bunch of scores kettled by an aggregator like Metacritic can tell you anything about the state of the industry? That's a 2/10 idea right there"
So there were plenty of excellent games released last year - it's just that a big chunk of them weren't reviewed by the media, or were only reviewed by a few sites that are struggling to fit a rigid template to a product that changes after initial release.
The problem isn't the quality of the games but the whole reviews system. As Matt says, games are a different beast now, and free-to-play and games with a heavy online component just don't fit with embargoes or deadlines. When do you put a score on Diablo? On day one, when the servers failed and the game was unplayable? Or two weeks in, when the game is working but everyone is like, so over reading reviews of it anyway? As such, scores are becoming not only difficult to give, but warped by what they're supposed to cover and why. Metacritic might help you compare opinions on a particular game, but it's no sort of barometer for the quality of the industry.
Pity the poor games journalist. Younger, less secure writers often live in fear of giving the "wrong" score, and aggregators like Metacritic don't help. You gave it a 10 when everyone else gave it a 7? Get the lynching rope and pick a pretty tree. Other more cunning writers might throw out a stunt score now and again, a savage beating of the game that everyone else is losing their Skittles over, just to attract a few extra hits.
Scores aren't measured by science, they're not checked out by the games police before they're posted; they just come from opinions, tweaked by the publication's scoring policy, and massively affected by whatever else is around that month. The clever reader will find writers they know, and trust, and use that as their touchstone when buying new games. But thinking a bunch of scores kettled by an aggregator like Metacritic can tell you anything about the state of the industry? That's a 2/10 idea right there.