
R.I.P. single-player games and dedicated handhelds | 10 Years Ago This Month

Sony's big names offer flawed visions of the future while everyone else declares portables a victim of tablets

The games industry moves pretty fast, and there's a tendency for all involved to look constantly to what's next without worrying so much about what came before. That said, even an industry so entrenched in the now can learn from its past. So to refresh our collective memory and perhaps offer some perspective on our field's history, GamesIndustry.biz runs this monthly feature highlighting happenings in gaming from exactly a decade ago.

Single-player gets a terminal prognosis

One thing writing this column has made clear to me is just how often people use 10 years as a sort of default time frame for talking about the future in this industry.

It's really just a perfect time frame because it accommodates such a large spectrum of possibilities, from the banal to the outlandish.

If you ask for predictions five years out, the horizon is so close that you won't get real-world holodecks, neural implants, or any of the other goofy sci-fi stuff talked about by people who can call themselves "futurists" with a straight face. But if you ask for predictions 20 or more years out, you're far enough away that saying, "it'll probably be a lot like today, but contemporary trends X, Y, and Z will progress a bit" feels silly and unimaginative. After all, this is video games, and there's an expectation for the business to reinvent itself on the regular.

A decade down the road is close enough that we can expect the industry won't be entirely unrecognizable, but it's also enough time for an empire to be built, destroyed, or sometimes both.

10 years gets Blizzard from unremarkable licensed titles like 1994's The Death and Return of Superman to a super studio with massive franchises like Warcraft, Diablo, StarCraft, and World of Warcraft. 11 years gets Apple from the Pippin to the iPhone. A dozen years gets Sega from the US launch of the Genesis to the Sega CD to the 32X to the Saturn to the Dreamcast to quitting the hardware game entirely.

Those examples are all from the '90s to the mid-'00s, but does it really seem to anyone like the industry has slowed down since then? Mobile and free-to-play games, Minecraft, Fortnite, Steam... A whole lot can happen in about a decade, which is one of the things that makes predictions on that time scale so much fun.

Anyway, the reason I bring this up is that in August of 2011, 10 years was the time frame of speculation for a private session at GDC Europe featuring Sony hardware designer Mark Cerny, Sony Worldwide Studios president Shuhei Yoshida, and Sony Liverpool studio director Mick Hocking.

"I believe the traditional single-player game experience will be gone in three years"

Mark Cerny, in 2011

Correctly calling anything in games 10 years out is tricky enough, but Cerny decided to up the degree of difficulty by not only picking an almost unthinkable turn of events, but also cutting the time frame in which it would happen by more than two-thirds.

"I believe the traditional single-player game experience will be gone in three years," Cerny said. "Right now, you sit in your living room and you're playing a game by yourself -- we call it the 'SP mission' or the 'single-player campaign.' In a world with Facebook, I just don't think that's going to last."

Rather than disappearing outright, Cerny suggested, the usual single-player experiences the industry had been releasing would adapt, and he named the Sony-published Demon's Souls as one possible example of what would replace them. If developers didn't have such mechanics in their games within three years, Cerny said, they should expect to be dinged by reviewers.

"The funny thing here is, we don't even know what to call this," Cerny said. "Is it single-player or is it multiplayer? We don't even have the words. It's kind of Orwellian. If you don't have any word for freedom you can't have a revolution. How can you be talking about design when we don't have the words to describe it? Yet, that will be the standard, I believe, in 2014."

"It's kind of Orwellian. If you don't have any word for freedom you can't have a revolution"

Mark Cerny, talking about... single-player games?

Cerny was by no means alone in his dim view of single-player games' prospects. The same month, Epic European territory manager Mike Gamble was also on the multiplayer bandwagon, saying, "Clearly a console product or a PC product now almost has to have multiplayer otherwise it's just not considered to be a complete product. In fact, it's almost gone the other way where the single player element of it is actually the throwaway and it's the multiplayer which is the critical element."

Obviously multiplayer is still a huge part of the market, but purely single-player games haven't exactly gone extinct. Sony knows that better than anyone, considering the leadership position it currently enjoys in the console space has been greatly boosted by a lineup of single-player system-sellers like God of War, Marvel's Spider-Man, Horizon Zero Dawn, The Last of Us Part II, and Ratchet & Clank: Rift Apart.

Yoshida and Hocking didn't do too much better with their predictions, as both bought into the idea of artificial intelligence advancing to the point of passing a Turing test.

"I think what people want in games in 10 years is the perfect human being in digital form, where you can't tell the difference if it's real or digital," Yoshida said, saying digital characters would be believable in behavior as well as appearance. "In your reality it's a human."

Hocking added that AI and interfaces like the Kinect depth-sensing camera would make it so people wouldn't play games so much as they would fill the role of a digital actor in them.

"If they're feeling sad, we can make them feel happy again"

Mick Hocking, on the potential for AI to assess and manipulate players' emotional state

"Perhaps you're playing a detective game and you're playing a witness," Hocking suggested. "The game has got to decide whether you're lying, rather than you deciding whether the character's lying in the game, because we can look at your expression on your face."

Let's put aside for a minute that the player would always be lying in a sense -- they'd be acting in a role, so you'd need them to perform with just the right level of deception. And put aside the non-trivial challenge of making that "fun," or even providing proper feedback to let the user know why they failed to hit that level and how they can improve.

The bigger concern for me is how excited Hocking was to boil down the complexities of human emotion to numbers, making "a map" of tastes and moods.

"The more accurate that map can become the more accurate we can be about delivering an experience to change that emotional state," Hocking said. "If they're feeling sad, we can make them feel happy again."

This is the alarming part for me. From content ratings to loot boxes to harassment, all the way to hypothetical futuristic detective games, the games industry does a routinely terrible job of thinking things through before diving in. The priority in gaming has always been to create a successful thing first and then deal with the entirely predictable negative effects when compelled to by outside forces.

Even if using computers to reliably and accurately assess and manipulate people's emotions is possible -- and I'm deeply skeptical -- it's an ethical can of worms I have no faith in this industry to even consider before gleefully opening.

My personal gaming diet is almost exclusively offline single-player games, but I would still take Cerny's prediction being true over Hocking's every day of the week.

Dedicated handhelds are doomed

With Nintendo having just instituted a massive price cut for the 3DS in a desperate bid to turn the fortunes of the recently launched system around, August of 2011 was home to a wave of pessimism about portables.

Early in the month, our own Rob Fahey declared it the end of the handheld, saying, "One thing is certain: dedicated handheld gaming devices are now in rapid decline, and barring an extraordinary technological advance, they're not going to come back. Birthed with the Game & Watch, this sector is going to end with the 3DS and Vita. All that remains to be seen is whether it ends with a whimper, or a bang."

He was by no means alone. Crytek managing director Avni Yerli was telling us the studio was passing on the Vita, explaining, "I can see that the multifunctions that devices like tablets offer is a big advantage of them. And I think people want one mobile device that can do multiple things."

Gaikai founder David Perry thought handhelds would go the way of wristwatches because younger generations just didn't want to bother with a device that only did one thing.

"With all due respects to Sony and Vita, it's a car wreck."

Matthew Seymour

"I know they're adding Netflix and stuff, but they really need to be that sort of multifunction device to survive," Perry said. "And if you think about it, that ultimately turns them into cellphones."

And then there was Heavy Iron Studios' Matthew Seymour, who had the same multifunction reservations when he said, "With all due respects to Sony and Vita, it's a car wreck."

I know I give Sony a lot of flak for giving up on the Vita so soon after launch, but in its defense, a significant chunk of the industry had given up on the Vita -- and handhelds in general -- even earlier. When these interviews were published, we were still half a year away from the Vita's international launch in February of 2012.

On the one hand, the Switch and interest in the Steam Deck make it tempting to chalk these all up as Bad Calls, but the Switch obviously has an asterisk by virtue of being a hybrid console-handheld.

On the other hand, the Switch Lite has no such asterisk, and is as single-purpose as a piece of electronics comes these days. (It doesn't even have a Netflix app, although Hulu, YouTube, and a number of other services have made it on there.)

But on the other other hand, Switch Lite also only accounts for 15.8 million of the 89 million Switches in the wild as of Nintendo's last earnings report. That probably isn't too much higher than the Vita's lifetime sales -- Sony doesn't like to talk about those numbers for some reason -- and it doesn't strike me as evidence that the Lite could have succeeded as its own product rather than as an extension of a proven hit.

Super Mario Land was amazing for its time

I think the conventional wisdom on dedicated handhelds a decade ago was more or less right, but I also believe the Switch has shown a path forward for portable gaming that works as an alternative option for enjoying the same experiences people get on a TV or PC monitor.

Granted, that was always the selling point behind gaming portables; it's just that once upon a time, Super Mario Land was an acceptable and even impressive approximation of Super Mario Bros. But in a post-Breath of the Wild world, people take the idea of "the same experience" much more literally.

Good Call, Bad Call

BAD CALL: Nintendo scheduled a pre-Tokyo Game Show event, so investors got hyped about a bombshell announcement and drove up the company's share price.

Just in case people were expecting some big news about the upcoming Wii U, Nintendo specified the next day that the event would focus on the 3DS.

The big news of the conference ended up being Monster Hunter 4 for 3DS and a new "misty pink" model of the handheld. Investors were disappointed that Nintendo didn't reveal something it never even hinted at, and dinged the share price by selling their stock.

GOOD CALL: Nintendo, presumably thinking it would be fun to relive those events again and again forever, launched the very first Nintendo Direct a month later, and we've been trapped in the cycle ever since.

BAD CALL: Diablo 3 lead designer Jay Wilson attempting to justify the upcoming game's real-money auction house as more than a money grab by saying, "Trading is not very good in Diablo, and yet it's a game about trading."

BETTER CALL: Blizzard, having a revelation about what the franchise is about as it burned the real-money auction house to the ground two years later, saying, "It ultimately undermines Diablo's core game play: kill monsters to get cool loot."

BAD CALL: A year into Sony's big push for stereoscopic 3D gaming, Sony Computer Entertainment Europe 3D evangelist Simon Benson spoke with us about how "we're so early in the 3D life cycle" even though we were already quite comfortable running an editorial under the headline "3D: Failure to launch."

But the actual bad call here was Benson balking when asked about Sony 3D TV sales, saying, "I mean obviously we're Sony Computer Entertainment, so obviously we work closely with the electronics division but it would be rude of us to ask, and even ruder for us to say if we did have those numbers."

Because when you're deciding what major multi-year marketing push you're using to give your console a differentiating factor against a rival whose hot new thing, Kinect, is still in the honeymoon period, the absolute last thing you're willing to do is get key sales data from another division of your own company so you can make an informed decision. Because that would be "rude."

GOOD CALL: Our writer Matthew Handrahan, who had only been hired two months prior, deciding it was perfectly appropriate to abbreviate Assassin's Creed as "Ass Creed" in a headline.

Naturally, he would go on to be the editor-in-chief of the site.

BAD CALL: Handrahan, who left shortly before his 10th anniversary with the site earlier this year under the sadly mistaken impression that it would save him from being mentioned in this column.

Author

Brendan Sinclair

Managing Editor

Brendan joined GamesIndustry.biz in 2012. Based in Toronto, Ontario, he was previously senior news editor at GameSpot in the US.
