
Bears vs. Art vs. Analytics

Former Halfbrick CCO explains how dev team overcame metric-induced "analysis paralysis" for soft-launched puzzle game

It's not often that a mobile game spends almost a year in soft launch, but such is the case with Halfbrick Studios' Bears vs. Art. The developer behind Fruit Ninja and Jetpack Joyride has had its "first true puzzle game" available in Canada, the United Kingdom, and a handful of other markets since last March, but has yet to follow through on a planned worldwide release. Speaking at the Game Developers Conference today, the studio's now-former chief creative officer Luke Muscat (he parted ways with Halfbrick just prior to GDC) described the game as a victim of "analysis paralysis," frozen in the face of advanced metrics.

To start, Muscat emphasized that analytics do not replace creativity. In fact, they can supplement it very effectively. The problem, however, is that with the amount of metrics developers have access to right now, the flood of data can be overwhelming. As Bears vs. Art was venturing into a new genre for Halfbrick, Muscat said the developers wired it to track a slew of analytics, from what time of day people played to how many moves they took, how much art they destroyed, and so on.
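As a rough illustration (not Halfbrick's actual instrumentation), per-level telemetry like this is often recorded as a flat event with a handful of fields. The event and field names below are hypothetical:

    import json
    import time

    def log_level_event(player_id, level_id, moves_taken, art_destroyed, completed):
        # Hypothetical per-level telemetry event; names are illustrative only.
        event = {
            "event": "level_finished",
            "player_id": player_id,
            "level_id": level_id,
            "moves_taken": moves_taken,
            "art_destroyed": art_destroyed,
            "completed": completed,
            "timestamp": int(time.time()),  # lets analysts bucket plays by time of day
        }
        # In practice this would be sent to an analytics backend; here it's just printed.
        print(json.dumps(event))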

"Analytics give you an incredible focus, but it's also giving you blinders."

At soft launch, the team focused on retention. This was their first (and possibly biggest) mistake, Muscat said. The initial goal was to have 7 percent player retention after day seven. However, when the first build was released, the actual retention level after a week of play was less than half the goal.
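For context, day-seven retention is typically defined as the share of a cohort of new players who come back and play on the seventh day after installing. A minimal sketch of that calculation, assuming simple install and session logs (not Halfbrick's actual pipeline):

    from datetime import timedelta

    def day_n_retention(installs, sessions, n=7):
        # installs: {player_id: install date}; sessions: {player_id: set of dates played}
        cohort = list(installs)
        if not cohort:
            return 0.0
        retained = sum(
            1 for p in cohort
            if installs[p] + timedelta(days=n) in sessions.get(p, set())
        )
        return retained / len(cohort)

    # A 7 percent target means 7 of every 100 new installs come back on day seven.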

"And we were all, 'Thank god we have analytics,'" Muscat said. "The data will save us!"

Unfortunately, it didn't. After numerous rounds of trying to tweak the game's retention using the variables they had anticipated, Muscat said the results consistently came back below expectations.

Analytics are great at comparing things and exposing some problems, Muscat said. But they're really only good at exposing problems you've anticipated, because if you're not expecting a problem, you're not looking for it in the analytics.

"Analytics give you an incredible focus, but it's also giving you blinders," Muscat said.

So the team stepped back and challenged all their assumptions, framing the new approach as "flipping tables." They'd tweaked things before without moving the needle; they needed to make big changes to the game to see big changes in retention. They targeted two big table-flipping changes a week, focusing on risky changes instead of safe bets. Every Monday they'd plan the changes for the week, and every Friday afternoon they'd upload the new build. It necessitated a "hacky," get-it-out-there approach, with polish restricted to just the details that would contribute to the actual test they were making.

Muscat said the new approach was liberating in a way.

"Even bad results are lessons," Muscat explained. "If we make a change and it goes way down, that's awesome because it affirms our previous assumptions."

"Even bad results are lessons. If we make a change and it goes way down, that's awesome because it affirms our previous assumptions."

One of the first changes the developers made was to give players a free daily skip-a-level option, letting them progress simply by re-engaging with the game each day. This change had a huge impact on retention, leading to a nearly 30 percent improvement in day seven retention figures.
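The mechanic itself is simple. A minimal sketch of a once-per-day free skip, with names and the one-per-calendar-day rule assumed for illustration rather than taken from Halfbrick's code:

    from datetime import date

    class FreeDailySkip:
        def __init__(self):
            self.last_claimed = None  # date the player last used their free skip

        def available(self, today=None):
            today = today or date.today()
            return self.last_claimed != today  # one free skip per calendar day

        def use(self, today=None):
            today = today or date.today()
            if not self.available(today):
                return False
            self.last_claimed = today
            return True  # caller advances the player past the current level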

"Instead of getting completely burned out and frustrated on these levels, people were able to skip and move along," Muscat said. "That was what we thought would happen, but the impact was way bigger than expected."

Another change they made was ditching the level-based progression system. Instead of forcing players to complete each level before getting access to the next, the developers restructured the game to unlock clusters of levels at a time. By beating a handful of the levels in each cluster, players could gain access to a new group of levels.
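In other words, progression switched from a strict linear gate to a per-group threshold. A sketch of that unlock rule, with the cluster layout and the "handful" threshold assumed for illustration:

    def unlocked_clusters(clusters, completed_levels, required_per_cluster=3):
        # clusters: ordered list of lists of level ids; completed_levels: set of level ids
        unlocked = clusters[:1]  # the first cluster is always open
        for prev, nxt in zip(clusters, clusters[1:]):
            cleared = sum(1 for level in prev if level in completed_levels)
            if cleared < required_per_cluster:
                break  # later clusters stay locked until this threshold is met
            unlocked.append(nxt)
        return unlocked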

Muscat said the entire team was certain this change would prove to be a winner, which in hindsight probably should have been the clearest sign that it would tank. Even though the change had some minor improvements to retention in the first few days, the numbers plummeted for day seven and everything after. As near as Muscat could tell, the restructuring effectively let players skip hard levels, but eventually they would wind up with nothing available to them but frustrating levels they hadn't built up the necessary skill set to complete.

The team also tried to push things further, with one change inspired by games like Diablo. The Bears vs. Art levels get much harder as players go, but unlike in a role-playing game, the main character doesn't get any stronger. So the team made the game into an RPG of sorts, letting players upgrade the bear's "stats" to improve the number of moves they had, the amount of time available, and other factors. People loved it, Muscat said, and retention went up across the board.
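A rough sketch of what such upgradeable "stats" might look like; the stat names, costs, and bonus values here are hypothetical, chosen only to mirror the factors Muscat mentioned:

    from dataclasses import dataclass

    @dataclass
    class BearStats:
        bonus_moves: int = 0     # extra moves granted at the start of each level
        bonus_seconds: int = 0   # extra time on timed levels
        upgrade_points: int = 0  # earned through play, spent on permanent upgrades

        def upgrade(self, stat, cost=1):
            # Spend earned points to permanently improve a stat, RPG-style.
            if self.upgrade_points < cost:
                return False
            self.upgrade_points -= cost
            if stat == "moves":
                self.bonus_moves += 1
            elif stat == "time":
                self.bonus_seconds += 5
            return True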

Halfbrick also tried eliminating the game's energy system, removing the limitation on the number of times each player could play each day. It's a tired free-to-play mechanic, but it might be a well-worn one for a reason. When Halfbrick took out the energy system, retention got a boost for the first few days, but it absolutely cratered starting at day four.
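Removing an energy system like this usually amounts to disabling the gate that checks and spends energy before a level starts. A purely illustrative sketch, with the cap and refill behavior assumed:

    class EnergyGate:
        def __init__(self, max_energy=5, enabled=True):
            self.energy = max_energy
            self.enabled = enabled  # flip to False to lift the daily play limit entirely

        def try_start_level(self):
            if not self.enabled:
                return True  # unlimited plays once the energy system is removed
            if self.energy > 0:
                self.energy -= 1
                return True
            return False  # out of energy: wait for a refill or pay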

Finally, the developers tried to "actionify" the game, cutting the unskippable animations that would play after each player move and allowing players to pinball around art galleries nearly non-stop. It resulted in big retention spikes, and Muscat brought it up as a perfect example of the kind of solution that analytics will never give you. None of the numbers they looked at suggested that the game was playing too slowly; it still required the creativity and intuition of a developer to think, "What if we made this a little more action-oriented like Halfbrick's other successful games?"

"Numbers are hard, and humans are stupid."

After six weeks of these sorts of tests, day seven retention surged to 20 percent, Muscat said. In general, there were two big findings:

"Numbers are hard, and humans are stupid," Muscat said.

Humans aren't well-equipped to sift through the numbers, Muscat said, and there are too many cognitive biases at play to account for instinctively. To help the team better understand and sift through those issues, Muscat had each of the team members read the book Freakonomics.

One problematic assumption was that players not engaging with a feature meant they didn't like it. Muscat said that's not necessarily the issue; it could be that the button leading to that feature is hidden away, or they don't understand what the feature is about.

There is also developers' natural desire to play it safe and stick only with features and changes that get a positive response, but Muscat said the entire point of a soft launch is to make those mistakes. Furthermore, the idea that success or failure is determined purely by analytics results is a surefire way to get the development team to hate their jobs.

"No data is ever actually conclusive," Muscat said. "We're just dealing with various levels of uncertainty. We're making something creative. In the end, there's always some element of creative risk."