The games industry moves pretty fast, and there's a tendency for all involved to look constantly to what's next without worrying so much about what came before. That said, even an industry so entrenched in the now can learn from its past. So to refresh our collective memory and perhaps offer some perspective on our field's history, GamesIndustry.biz runs this monthly feature highlighting happenings in gaming from exactly a decade ago.
Smedley calls the future of free-to-play
There's a danger with a column like this of not giving people enough credit for being right. Looking back on things with the benefit of hindsight can make history feel as predetermined as an on-rails shooter.
We know how things played out, and that was the way they had to play out because that's what has led us to the present. There's a sense of inevitability to it that I think can make us give correct predictions less credit than they deserve.
For example, take this guest editorial we ran in September of 2011 from Sony Online Entertainment president John Smedley about how free-to-play would not only largely replace the subscription model for MMOs but become "the predominant model in online gaming" of all sorts.
It all sounds kind of "common sense" now, or at least it did to me when I first read it. Free-to-play was well-established in many markets by then, and major publishers were exploring the idea with their big brands, including Call of Duty and Ghost Recon Online. And it had been a couple years since Turbine's Dungeons & Dragons Online successfully converted from subscription to free-to-play, showing not just that the model was viable, but that it could actually make games viable even when they had flopped under other models. And of course, Smedley wasn't even the first person to make such a prediction. At this point, it had been five years since Acclaim Games was resurrected on the premise that free-to-play games were the inevitable future.
So no, Smedley wasn't the first to say these things. But downplaying the impressiveness of Smedley's foresight ignores how accurate his predictions were.
We can look up the exact definition of "predominant" and pick nits about whether the situation we have now is exactly what Smedley was predicting, but given how heavily skewed the mobile industry is to free-to-play and how titles like Fortnite and Call of Duty: Warzone have fared in the console and PC space, it's tough to say he was in any way wrong about that, particularly since he didn't give a timeframe for his prediction.
He even offered some tips for developing free-to-play games that remain conventional wisdom in the space today: Let players get just about everything for free if they're willing to grind, let people pay for shortcuts or convenience, and lean into player customization items.
It may also be tempting to look at Smedley's editorial and diminish it for pointing out the logical extension of various industry trends that people inside the industry would have been well aware of by that point: the success of free-to-play in other markets, the benefits for console makers of supporting the model on their platforms, the growing quality and production values of free-to-play games, and the trends around what people will happily pay money for and what they won't.
"Free-to-play is here to stay. I really think we're going to see a lot more AAA games going with this fantastic business model"
John Smedley in 2011
But clearly these things weren't obvious to the rest of the industry, which had plenty of counter-examples to bring up if so desired. Acclaim never gained much traction and was acquired by Facebook free-to-play publisher Playdom in 2010, then shut down by its new parent less than three months later. And while the social gaming scene Playdom sprang from had been commanding plenty of buzz not too long prior -- Playdom was acquired by Disney in a $763 million deal in between the acquisition and execution of Acclaim -- Facebook's attempts to curb overly aggressive viral pitches in games had slammed the brakes on social gaming's growth, and 2011 saw the sector cool off considerably.
Smedley's predictions of the future also weren't obvious to Electronic Arts since it was gearing up to launch Star Wars: The Old Republic with a traditional subscription-based model later in 2011. Neither were they obvious to Funcom when it released The Secret World in 2012, Square Enix when it released Final Fantasy 14 for the second time in 2013, or Bethesda when it launched The Elder Scrolls Online in 2014.
With the exception of Final Fantasy 14, all of those either went free-to-play up to a level cap or abandoned their traditional subscription models within a year of release. Final Fantasy 14 waited until 2017 to follow with a free-to-play trial for new players, but impressively still retains its subscription plan.
In fact, this is about the only thing Smedley was wrong about, as he predicted The Old Republic would be the last major MMO to launch with a traditional subscription model. He was actually giving the industry too much credit for seeing where things were headed.
Perhaps the best thing about Smedley's editorial? He prefaced it with a warning that his thinking on the subject was evolving alongside the industry itself, so he apologized to anyone who might be reading his words after they were no longer relevant, "so please keep this in mind if you're reading this a year from now."
No apologies necessary, John.
Harrison's domains for disruption
More commonly, the predictions I come across in digging up old stories for this column are unsatisfyingly mixed, neither wrong enough to merit a playful jab nor accurate enough to deserve applause.
Like when London Venture Partners general partner Phil Harrison gave attendees at the Games Invest event in London a list of five disruptive trends in gaming to look for: Flash 11/Molehill, HTML5, location-based mobile games, tablets, and Smart TVs hooked up to the cloud.
Harrison cautioned that Flash 11/Molehill was a short-term opportunity, but that may have been understating it
Harrison cautioned that Flash 11/Molehill was a short-term opportunity, but that may have been understating it. Two months later Adobe said it would no longer work on Flash for mobile devices, opting instead to focus on HTML5 for mobile applications. Adobe's handling of Flash was burning bridges with partners like Unity by 2013, and by the time Adobe officially ushered in its end-of-life phase in 2017, the only disruption to speak of was the sizable chunk of gaming history lost to the death of the platform.
As for HTML5, it has perhaps been more of a longer-term opportunity in games than Harrison anticipated. Flash game portal Kongregate started hosting HTML5 games in 2013, but it was only in 2017 that the company thought the tech was finally good enough to compete with browser plug-ins like Flash. Even now, it's not difficult to find game developers still talking about HTML5 as "the future."
Tablets? They've done great. But this was already a year and a half after the original iPad launched and quickly silenced the skeptics who thought a bigger iPhone that couldn't make calls would flop. I also don't know if tablets have been any more disruptive than smartphones in general, but they found their portion of the market and continue to be a relevant and viable place for game developers to do business.
Location-based mobile games would appear to be the strongest of Harrison's disruptive predictions, though that's largely down to the phenomenal success of Pokémon Go. Attempts to reproduce that success with hugely popular franchises like Minecraft and Harry Potter have fallen far short of the bar Pokémon Go set, raising questions as to whether Niantic's first hit was simply Thunder Shock in a bottle.
There's still some hope for augmented reality devices to unlock greater potential of location-based mobile games beyond Pokémon, but I don't think it's much of an argument in Harrison's favor if one of his disruptions remains dependent on a separate disruption happening more than a decade later.
Finally, there was Harrison's prediction of Smart TVs and cloud gaming.
"Smart TV and the cloud, hundreds of millions of televisions being sold every year with internet connections in them with, increasingly, processors which are akin to an Atom notebook, notepad type functionality," Harrison reasoned. "This is going to disintermediate a lot of consoles from the space, which is not necessarily a good thing but it creates new channels for consumers to enjoy certain types of games directly on their television."
He was right in that Smart TVs took off. According to a 2020 survey by Hub Entertainment Research, 70% of US households now own a Smart TV, half of them with streaming services like Roku or Fire TV built in, both of which can play a bunch of games.
But have they disrupted gaming in the way Harrison expected? Probably not. (I don't know about you, but my consoles are still firmly intermediated in my TV spaces.) Of course, cloud gaming hasn't really taken off the way he hoped either, and that was a significant aspect of his prediction for disruption.
Much like location-based gaming still has apparently untapped potential, maybe smart TVs will yet play a larger role in the gaming landscape if some of the companies with current cloud streaming ambitions can make those pay off.
From respected icons to known clowns, it's awfully common for anyone guessing at the future of this industry to be right about some aspects and wrong about a bunch of others
At the very least Harrison put his money where his mouth was on this one. At the time he shared these predictions, he was on the advisory board of cloud gaming service Gaikai -- soon to be acquired by Sony as the basis of the PlayStation Now service -- and has spent the past four years at Google heading up the Stadia effort.
So how are we to grade Harrison's predictions? Tablets, location-based gaming, and Smart TVs are certainly viable businesses, and lots of investors in those spaces have probably done quite well. I'm not entirely sure any of them has upended the table individually, although each has certainly advanced trends like mobile gaming and constant connectivity that have collectively been disruptive and helped shape the current industry.
As for HTML5 and Flash, that's a bit further out of my wheelhouse but I think it generally doesn't help that ten years later, one is still spoken of as a "future" tech for games and the other can only be spoken of in the past tense.
I find that more often than not, this is typical of the industry predictions captured in the gaming media. From respected icons to known clowns, it's awfully common for anyone guessing at the future of this industry to be right about some aspects and wrong about a bunch of others, with more uniform instances like Smedley's far more the exception than the rule.
Fortunately, we have a recurring section that lets us focus on just one or two predictions in the middle of much longer pieces, completely ignoring the rest of the stuff that isn't so notably right (or wrong).
Good Call, Bad Call
GOOD CALL: Spicy Horse founder American McGee tells an interviewer, "In the (not so?) distant future there wouldn't be much need for a bricks-and-mortar retailer."
For consumers at least, that's perfectly true. You can buy your hardware online now, directly from the hardware makers themselves if you choose (when they have stock, anyway). And virtually all of the software is downloadable these days, barring some glitches in The Matrix like the dedicated retro handheld Evercade.
The industry is still somewhat dependent on brick-and-mortar retailers, but that's been fading over time as well. Activision Blizzard reported 6% of its net revenues coming from retail channels last quarter. For Take-Two, retail was 9% of revenues, and just 4% of bookings. Nintendo is likely as dependent on brick-and-mortar outfits as anyone, and it reported barely more than half (53%) of its software sales last quarter came from physical sales. For the same quarter the year before, physical games were just 44% of Nintendo's software sales.
BAD CALL: McGee, backing up his point by asking, "Why waste resources on a physical location and unreliable employees when the entire experience can be made sharper, cleaner and more entertaining in the virtual representation?"
I'm as guilty as anyone of complaining about the GameStop in-store experience with hard-selling employees forced to recite pitch after pitch, but "sharp," "clean," and "entertaining" are not the words I would use to describe modern digital storefronts, either. They do have some significant advantages over brick-and-mortar when it comes to selection, price, and the fact that you can easily see gameplay videos embedded in store pages, but it turns out endless shelf space does not necessarily make for great user experiences and exacerbates discoverability issues. And that's to say nothing of stores that incorporate forums and user reviews but don't moderate them effectively.
BAD CALL: Bethesda VP of PR and marketing Pete Hines addresses the company's reputation for launching buggy games, saying, "I think we have and continue to get better at it. When you look at Fallout: New Vegas, it was not a Bethesda Game Studios title, it was [a] different experience for those guys even though we worked with them on it, but I think Todd [Howard] and his team have continued, over the 12 years I've been here working with them, to make improvements, and I think they're in a good place with this."
It's bad enough that reads like Hines throwing previous partner and future sister studio Obsidian Entertainment under the bus after it delivered one of the most cherished games in the Fallout IP. It's a little worse when you consider Bethesda Softworks went on to have decidedly buggy launches of The Elder Scrolls Online and Fallout 4, followed by a Fallout 76 launch so disastrous the authorities got involved.
GOOD CALL: In a keynote speech at the Cloud Gaming USA conference, THQ CEO Brian Farrell predicts that consoles in the future wouldn't have disc drives and that development was shifting toward a games-as-a-service model.
Yes, the latter bit was basically conventional wisdom by that point, and the disc drive thing wasn't exactly a huge stretch, but the Xbox One S All-Digital Edition, the Xbox Series S, and the PS5 Digital Edition weren't foregone conclusions at the time. The disc-less PSP Go had flopped two years prior, and even two years in the future, anxiety over the connected and digitally distributed future would still be pronounced enough to help kneecap Microsoft's original Xbox One plans and help deliver a generational victory to Sony. (There were other factors, of course, but this was still an issue.)
BAD CALL: In a separate conference, THQ chief financial officer Paul Pucino talks about the company's decision to shutter a number of its studios, saying, "We think the best position we can be in with respect to studio structure is fewer is better."
SHADY CALL: Ten MMO publishers using gacha/loot box mechanisms in their games refused to give the South Korean government's game rating agency information about the payout percentages on items in their games. Just as Western publishers would crib many Korean developers' design and monetization practices, they would also steal their strategy of being uncooperative with the government, as we saw Epic Games being evasive with UK legislators in 2019 in a Parliamentary committee that ended with a recommendation to ban loot box sales to children.
Gotta say it's deeply concerning that these companies have decided snubbing governments and risking their wrath hurts less than whatever would happen if people knew the actual details of their business.
GOOD CALL: Netflix announces it is getting into games as part of its ill-fated Qwikster strategy!
BAD CALL: Did you see the part about Qwikster being ill-fated? (Fortunately, Netflix would get into games again -- but not really -- in 2018. And once more earlier this year. Maybe this one will stick.)
GOOD CALL: Jason Citron leaves OpenFeint after selling the company to Gree "to pursue new opportunities." Gree shut down OpenFeint a year later, while Citron went on to found Discord and turn it into the kind of business that can walk away from a $10 billion acquisition offer.
BAD CALL: Atari founder Nolan Bushnell talks about the educational potential of cloud computing, saying, "We are currently teaching subjects 10 times faster. We believe that when we roll this up to full curriculum, we'll be able to teach a full career of high school in less than a year. And we think we'll be able to do that by the end of next year. That's a lot of time to chase girls and have fun."
That part about chasing girls wasn't great to start with and certainly doesn't look any better in hindsight, but this one would probably net a Bad Call regardless.
Last I checked, high school in the US is generally still four years. But as unbelievable as his "end of next year" goal for condensing high school to one year was, Bushnell's suggestion as to who would reap the benefits of such increased efficiency was somehow even more far-fetched.
"If we can have a kid learn twice as fast," he explained, "we can pay teachers twice as much."
Because if there's one thing we know for sure, anytime Silicon Valley disrupts an established field, the resulting windfall always goes straight to the people at the bottom of the org chart, right? (Oh, and lest it go unsaid, the premise of being able to pay teachers twice as much in that scenario is predicated on needing half as many teachers, i.e. laying off half of all high school educators.)
WEIRD CALL: Ten years ago, Quantic Dream co-CEO Guillaume de Fondaumiere was upset with the games industry's rating system, and I'm still not clear as to why.
"My problem is not with how unified the system should be, my problem is that we're not on par with cinema and TV. Most people play 18+ games. Have you ever watched an NC-17 movie? You don't even know what an NC-17 movie is! But you know what an R rated movie is? Alright, so that's 17 and under must be accompanied by their parents to watch a movie. An NC-17 movie is basically a porn movie, and that's basically our 18 rating. And so all our games are porn movies to a certain degree."
Quantic Dream's penchant for gratuitous shower scenes aside, I suspect he's complaining about the stigma M- and PEGI 18-rated games might carry. The comparison to NC-17 films doesn't really fit for me for a few reasons, among them the key fact that parental consent won't get kids into the theater for an NC-17 film, but it has put countless copies of Grand Theft Auto 5 in kids' Xboxes. Restrictive game ratings leave the ultimate choice up to parents and guardians in a way theatrical film ratings don't.
There's also already a much clearer analog for NC-17 in games (in North America, at least) with the rarely assigned commercial death knell that is the AO for Adults Only rating. But I don't get the suggestion the M rating or the PEGI 18 designation is anything comparable when they still enjoy effectively unfettered distribution and marketing. Unlike NC-17 movies, I've never heard of major chains refusing to carry Call of Duty, and Activision can buy ads for it essentially anywhere that would accept ads for R-rated movies.
An M-rating clearly hasn't done much to harm video games' commercial possibilities, as the title of best-selling game in the US has been claimed by an M-rated game every year since Wii Play (with Wii Remote) took the crown in 2008. (The other years are almost all led by Call of Duty games with Grand Theft Auto 5 and Red Dead Redemption 2 as the outliers.)
Also, some of these widely marketed, multimillion-selling M-rated games are already far beyond the limits of what filmmakers can get away with in R-rated movies, so maybe this is one case where the double standard is actually working to game developers' advantage.