
Is it time to retire virtual currency? | Opinion

The industry doesn't want to curb loot boxes, but it could address some concerns by giving up another consumer-unfriendly tactic

There's been a lot of focus on loot boxes of late, with the US Federal Trade Commission scrutinizing the issue in a day-long workshop last month and the UK Parliament's Digital, Culture, Media and Sport (DCMS) Committee recommending that loot boxes be regulated under gambling law and banned from being sold to children.

And when the press talks about this focus, we often just refer to it as "the loot box debate," but it's actually a much broader issue than that for the industry. A less catchy but more accurate description might be "the exploitative monetization debate." Because while loot boxes draw plenty of criticism from players, press, and politicians, the pushback against them has spread to a larger collection of tactics broadly viewed as unsavory.


For example, Senator Josh Hawley introduced "loot box" legislation in May that would also ban pay-to-win mechanics in any game played by children, including charging people to bypass cooldown timers, selling items that could ease progression, or selling competitive advantages in a multiplayer game. (As an update, that bill was referred to committee and has not progressed since.) And at the FTC workshop, there was a fair amount of discussion about virtual currency, because it is in many ways intertwined with what gets people riled up about loot boxes.

So perhaps we can curtail this legislative interest in loot boxes not by cracking down on the boxes themselves (a difficult task and one the industry has repeatedly signalled it has no interest in), but by changing how we handle some of those ancillary tactics that combine with loot boxes to form the Disreputable Industry Avengers some politicians are so keen on disassembling.

I suggest publishers look at virtual currency -- perhaps the Hawkeye to loot boxes' Hulk -- and seriously consider just how essential it is to their business. Because as bad a reputation as loot boxes have, virtual currency helps enable many of their most objectionable excesses. Loot boxes draw people's ire for obfuscating what players are buying, but virtual currency similarly hides what they're paying.

For one, there's no set exchange rate. Even if a game sells its virtual currency at an easily graspable conversion -- let's say $1 equals 1 FunBuck -- it's common for games to give players bonus amounts or discounts on larger purchases that muddy the conversion. It's also common for them to let players accrue some amount of virtual currency through gameplay. Both of those are presented as perks that, on the surface, let players purchase things for less real-world money, but they make the actual cost of any given item difficult to pin down, and highly variable from one user to the next depending on how much each values their time.

As National Consumers League VP John Breyault told the FTC during the loot box workshop, figuring out how much anything in virtual currency games actually costs is "a lot of cognitive load on the user."

The math on virtual currency can get messy in a hurry
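To illustrate just how quickly, here's a minimal sketch with entirely made-up numbers (the bundle prices, bonus amounts, and item price below don't come from any particular game; the FunBuck is the hypothetical currency from earlier). It shows how the real-dollar cost of one in-game item shifts depending on which currency bundle happened to fund it:

```python
# Hypothetical currency bundles: (price in USD, FunBucks received including bonus).
# Larger packs get bonus currency, which is what muddies the conversion.
bundles = [
    (4.99, 500),
    (9.99, 1100),    # small bonus
    (49.99, 6500),   # big bonus on the large pack
]

item_cost_funbucks = 950  # hypothetical price tag on an in-game item

for usd, funbucks in bundles:
    rate = usd / funbucks                  # real dollars per FunBuck for this bundle
    real_cost = item_cost_funbucks * rate  # what the item actually cost in real money
    print(f"${usd:.2f} bundle -> item costs ${real_cost:.2f}")

# Roughly $9.48, $8.63, or $7.31 -- three different "prices" for the same item,
# before you even factor in currency earned through play.
```

Three purchases of the same item, three different real-world prices, and that's the simple case where the player remembers which bundle they bought.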

In the same session, ESA general counsel Michael Warnecke explained the industry's use of virtual currency, saying publishers have a few reasons for it. First, it helps avoid credit card transaction fees on a plethora of low-value purchases. Second, it would be annoying for players to have to repeatedly confirm individual in-game purchases. Finally, it helps preserve a game's narrative integrity. These points merit a bit of examination, as they represent the industry's clearest on-the-record reasons to embrace virtual currency as it has. And as arguments to justify undermining a consumer's ability to make an informed purchase decision, none of them is compelling.

On credit card transaction fees, I don't see any reason publishers and storefronts couldn't simply bundle purchases made within a certain time frame, much the way Apple does with iTunes and App Store purchases. And if they have to swallow the fees on stand-alone $0.99 purchases every now and then, oh well. Businesses have expenses, and it's up to them to price their products to account for that.

As for Warnecke's suggestion that it would be annoying to repeatedly go through the purchase process, that's true. But avoiding annoyance is not an acceptable justification for what politicians and parents are viewing as unethical business practices. However, that point does speak to how much less friction there is when paying with virtual currency. You don't need to enter your credit card information. You don't need to go to a standardized storefront UI. Depending on the game, you don't even need a confirmation dialogue to affirm that you intend to spend this ersatz currency that cost you real-world money.

This is one of the more concerning aspects of loot boxes and microtransactions to me, because the usual friction points in a transaction are opportunities for a customer to consider whether they really want to spend this money, and whether the amount they are spending is a fair exchange for what they're receiving. Purchasing decisions like that can be hard enough when you're waiting in the checkout line of an actual store, holding a tangible thing in your hand and looking at the clearly labelled price tag. When impulse buys are completely non-refundable and constantly just two taps away -- when buying things is so trivial it enters into "honest mistake" territory -- it may be time to consider that perhaps a bit more friction is called for.

One thing we constantly hear from publishers defending their monetization schemes -- right after "Some games go too far, but we do our monetization the right way" and "We want to be respectful of our players" -- is "We want our users to be happy they spent money on our game." And there's some truth to that. No publisher wants their players to feel burned by the transaction. But it doesn't take an in-depth examination of the gaming populace to find evidence of buyer's remorse and people who resent the monetization schemes embedded in their games. If you put some friction back into transactions -- make them feel more like a discrete purchase and less like a continuation of gameplay -- you might cut down on that negative sentiment surrounding your game. And considering the disproportionately loud noise a vocal minority can kick up online these days, even a modest decrease in the actual number of upset consumers could have a significant muffling effect.

Warnecke's final explanation for the industry's use of virtual currency -- to preserve narrative integrity -- is as absurd as it is damning.

"Say you had a game set in Ancient Egypt and you wanted to buy a chariot for a big combat that was going to come up, and you went to the marketplace in Thebes," Warnecke said. "You would not want to be buying a chariot for $2.50 US. It would be a little bit jolting and a little bit odd, so instead a publisher will make it with a historically appropriate monetary currency, such as a deben of copper, which would fit in more with the game."

In short, game makers don't want to break a game's immersion in order to have players make a purchasing decision because it would be "a little bit odd." And that's an absolutely valid creative position. But if you honestly want to hold your narrative integrity as sacrosanct, the simple way to do that is to not ask the player to make a real-world purchasing decision within your story in the first place.

Do you want to spend the day converting actual money to Helix Credits to Drachma to figure out what this costs, or do you want to save the world?

Using virtual currency to keep from breaking immersion is an indefensible compromise. It says the industry wants players to keep making purchasing decisions, but not as paying customers soberly assessing the value of what they're buying against their discretionary income for the month. The industry would prefer they make the exact same real-world purchasing decision while engrossed in the fantasy of a glorious hero a mere deben of copper away from saving the day. (And while I'm certain people aren't so simple as to entirely confuse the game for reality, I would also suggest people aren't so simple as to see something selling for $99.99 and think it's significantly different from something selling for $100, or to buy a brand of soda because the ad had an attractive person in it. And yet…)

"Many of the games employing these tactics are popular with children, who are already ill-equipped to understand the basics of money"

To recap, it's hard for players to know how much they're spending thanks to virtual currency, they often don't know what they're spending it on thanks to loot boxes, and companies are designing their games to have players make real-world purchasing decisions without thinking about their real-world situation. Throw in the way these games are designed as treadmills people can run on for years and throw unlimited amounts of money at, and it's very easy to lose track of not just how much any individual purchase is costing you, but how much you're spending on a game in total. (Recall the FIFA player who was "gobsmacked" to discover that he spent $10,000 on Ultimate Team over two years.)

Keep in mind that many of the games employing these tactics are popular with children, who are already ill-equipped to understand the basics of money. A survey of parents published last year showed that, on average, their children didn't grasp the fundamentals of money -- that it is not infinite, and that it has to be earned, saved, and budgeted -- until the age of 10. Now consider that with loot boxes and virtual currency, games are also requiring children to have a functional knowledge of algebra and statistics in order to comprehend the value proposition, and it becomes easy to see why there might be some legitimate grounding to parents' and politicians' concerns.
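To make that concrete, here's another sketch with entirely hypothetical numbers -- the drop rate, box price, and exchange rate below are assumptions for illustration, not figures from any real game -- showing the expected-value math a buyer would implicitly need to do when a desired item only drops from loot boxes:

```python
# Hypothetical figures: what it takes, on average, to pull one specific item from loot boxes.
drop_rate = 0.01               # assumed 1% chance per box of getting the item you want
box_cost_funbucks = 100        # hypothetical loot box price in virtual currency
usd_per_funbuck = 9.99 / 1100  # exchange rate from the made-up mid-tier bundle above

expected_boxes = 1 / drop_rate                                    # ~100 boxes on average
expected_usd = expected_boxes * box_cost_funbucks * usd_per_funbuck

# Probability of STILL not having the item after buying 100 boxes:
miss_chance = (1 - drop_rate) ** 100                              # roughly 36.6%

print(f"Expected spend to get the item: ${expected_usd:.2f}")
print(f"Chance of coming up empty after 100 boxes: {miss_chance:.1%}")
```

That's an expected spend of around $90 with better than a one-in-three chance of having nothing to show for it -- the kind of reasoning adults routinely get wrong, let alone a 10-year-old.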

"But how many kids that young are really playing these games," you might ask. And may I say, astute reader, that is a very good and fair question. But as the DCMS hearings showed, it's a question publishers refuse to answer.

Like so much of what "the loot box debate" is about, virtual currency is not inherently unacceptable. But the ways this seemingly innocuous practice interacts with a host of other industry trends right now -- and the way trade groups, platforms, and publishers have had to be dragged kicking and screaming into taking the slightest action on any of them -- make the industry look, in a word, scummy.

Games are increasingly leaning into monetization schemes that are a black box to players, intentionally depriving them of all ability to make an informed purchase decision. At the same time, analytics and tracking of player behavior mean the exact ways people play games and what makes them spend money have never been more transparent (or profitable) to developers. There's an information asymmetry here that needs to be adjusted, if not eliminated entirely.

The industry has made it clear that it has no intention of giving up loot boxes, so any push for self-regulation to stave off government interference will need to come from another part of the equation. Taking away the industry's insights into what their players are doing would make loot box comparisons to baseball cards and Kinder Eggs considerably more valid, but player tracking has too many legitimate uses and benefits for players and developers alike to consider broadly curbing it. But the mystery around what players are paying? That absolutely can and should change.


Brendan Sinclair

Managing Editor

Brendan joined GamesIndustry.biz in 2012. Based in Toronto, Ontario, he was previously senior news editor at GameSpot in the US.
