Affordances of Abuse: What Overwatch Creators Could Learn from Death Stranding
Two games, both with online collaboration tools. But only Overwatch featured prominently in a TED Talk titled “The Internet is a Trash Fire.” Maybe Kaplan should study what Kojima did and avoid that whole trash fire situation.
Author's Note: This post was originally published Jan 31, 2021 on (the currently defunct) Byline.
Dr. Lisa Nakamura gave a TED talk, “The Internet is a Trash Fire”, based on a course she teaches at the University of Michigan. She explains right in her opening that the phrase needs no explanation. Everyone intuitively understands it, even as people from marginalized communities face a disproportionate share of the abuse.
Dr. Nakamura dedicates a big chunk of time to the proliferation of esports as a commercial and social entity on the Web, using Overwatch as a specific example. During the pandemic, I've been gaming a lot, so I got to thinking about Dr. Nakamura's observations and how they relate to our work at DashKite.
What are the salient social dynamics of distributed collaboration in gaming? Can we identify properties that shape them? Can they be generalized and applied in broader contexts? Can we see the first principles that would underlie the sustainable and psychologically beneficial Web communities DashKite seeks to foster?
With that goal in mind, let's compare Overwatch and Death Stranding.
The Toxicity Trash Fire
While Overwatch and Death Stranding both support Web-enabled human collaboration, their means and affect are starkly different. I'm following Dr. Nakamura's lead in focusing on Overwatch, but FPS titles have such a strong association with abusive behaviors that the behaviors appear intrinsic to the genre. That's a pretty clear problem, one that Death Stranding does not share. In fact, some Death Stranding reviews described its collaboration mechanics in almost spiritual terms.
The word we use for this abuse is “toxic.” It's in Dr. Nakamura's TED talk, in online discussions of this issue, even in Blizzard's own public communications on the problem. While “toxic” captures the way abuse has corrosive, cascading effects that make a space inhospitable, I don't like it. It assumes the harm is intrinsic. It lacks precision. It diffuses blame. “Toxicity” once described an individual's actions, but people now use it to refer to a cultural conceit, an ambient glow of poorly concealed rage. Something we must come to expect, and our vulnerable must endure, if we wish to exist in certain Web spaces.
The “toxicity” framing posits that without rules or society to restrain us, we are all, at our core, monsters: screaming agents of abuse and chaos. Only with the right mix of rewards (carrots) and punishments (sticks) can we hope to contain our monsters and live peacefully. In addition to being pessimistic, it's flatly wrong. Humans do have an immense capacity for harm and a susceptibility to malicious persuasion. Those are huge liabilities. But developmental psychology tells us that our baseline is constructive.
So, what we need is a framework, some organizing principle to get us closer to the absolute moral truth on the matter (pretzels). That brings us to the work of Kim Crayton and her leadership of the #causeascene movement. She teaches that we must always engage in systemic thinking to test that the most vulnerable are prioritized.
Focus on that test while we look at these games.
Carrots Plus Sticks Does Not Equal Pretzels
The collaboration system in Overwatch assumes good faith. It assumes a context of professional-level, competitive play. Everyone on a team knows each other. There is support from coaches. There's a lot of money on the line. This is all under the auspices of the Overwatch League. We're all in this together, etc. Even under these conditions, you can see racism and homophobia oozing out at the edges. Still, the plan was to take these tools and hand them to the general public.
Overwatch is a hard game. Under ideal matchmaking, you will lose 50% of the time, even as your skill improves. It can be tempting to blame other people. A team is divided into three roles, and a serious mistake by any member sharply degrades the team's combat capacity. And if the team fails, you lose points. Teams have coaches for a reason. Without training, it is easy for failure to escalate into emotional dysregulation. Avoiding that is a skill you can build, but in the meantime, there's a person on the other end of that live mic.
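To see why that 50% figure isn't hyperbole, consider the Elo-style rating math that skill-based matchmakers are commonly built on (a sketch of the general technique; I'm not claiming this is Overwatch's exact system):

```typescript
// Expected win probability under an Elo-style rating model.
// A matchmaker aiming for fair games pairs teams of equal rating,
// which drives this value toward 0.5 for everyone, at every skill level.
const expectedWinProbability = (ratingA: number, ratingB: number): number =>
  1 / (1 + Math.pow(10, (ratingB - ratingA) / 400));

console.log(expectedWinProbability(2500, 2500)); // 0.5: an "ideal" match
console.log(expectedWinProbability(2600, 2500)); // ~0.64: the matchmaker avoids this pairing
```

As you improve, the matchmaker simply finds you better opponents, so the losses keep coming at the same rate.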
Overwatch already fails the systemic thinking test by implementing a collaboration system that relies on good faith to prevent abuse. Overwatch then attached that broken tool to a game that is very effective at stressing emotional regulation. So, of course, people used that collaboration tool to broadcast abuse.
What did Blizzard do about it? Well, early on, Jeff Kaplan, Lead Designer of Overwatch and Vice President at Blizzard Entertainment, had the gall to complain that it was time-consuming to develop moderation tools. No. I'm serious. He recorded and published a video of himself complaining that it's a distraction.
People's safety is not an ancillary feature. It is a core feature, by definition, because it allows people to play the game. Think of it this way: Overwatch is popular, but how popular would it be if huge swaths of the population weren't actively discouraged from playing? We don't know the answer because Kaplan is bad at his job. He told us. That video is breathtaking.
He even signs off by saying:
> Remember to pat your teammates on the back, and if you have that negative comment, maybe just hold it back. Thanks, guys.
🤯 🤯 🤯
Fortunately, Blizzard didn't rely entirely on the power of positive thinking. They introduced an endorsement system to let players build up a reputation. They applied some machine learning to automate abuse detection. A few weeks ago, Overwatch released Priority Passes to address a years-long issue with role balancing. But, if you think about Crayton's model, these are nudges: carrots to bribe players into being good, and, wow, they need so many sticks to hold the line.
The Priority Passes are an even better example. There was a relative deficit of people willing to play the Tank and Support roles compared to the Damage player population. That slows matchmaking, degrading the quality and value of the game. Sounds serious, right? But instead of locating and addressing the imbalance's fundamental cause, they just bribed players into playing the disfavored roles.
In his video, Kaplan uses the word “community,” and people use that word a lot on the Web. But, a real community requires a systemic approach. All the sticks and carrots in the world aren't going to fix a hope-based design strategy.
No Need to Moderate What You Can Eliminate
Death Stranding also has Web-powered collaboration. But unlike Overwatch, there is a layer of indirection. Death Stranding's consensus algorithm modulates and attenuates player interaction, so players cannot directly communicate.
That slows the transfer of information, which limits what kinds of collaboration players can do. But, it also limits their ability to be abusive. In fact, players cannot express abuse in Death Stranding. I'm going to say that again because it bears repeating.
Players cannot express abuse in Death Stranding.
That's an amazing constraint! It solves so many problems, and passes the systemic thinking test, by simply reducing the bandwidth of information transfer. There's nothing to moderate because there's no abuse. From here, a designer is free to work on building value from the collaboration features. I'm sure Kaplan would be so jealous to find out how much time that left Kojima's team to work on “core” features.
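To illustrate the constraint, here's a minimal sketch of a closed interaction vocabulary (hypothetical types; not Death Stranding's actual data model). The point is structural: there is no free-form field anywhere, so abusive speech is unrepresentable rather than merely moderated.

```typescript
// Everything a player can transmit, as a closed set of typed signals.
// No strings, no voice, no direct messages: abuse has no encoding here.

type StructureKind = "ladder" | "bridge" | "rope" | "sign";

interface PlacedStructure {
  kind: StructureKind;
  location: { x: number; y: number };
}

type Interaction =
  | { type: "place"; structure: PlacedStructure }
  | { type: "like"; structureId: string };
```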
While the bandwidth is limited, the collaboration is not trivial. Because the allocations are emergent, they can sometimes be surprising, like a bridge to nowhere that accumulates hundreds of thousands of votes. While that bridge is not linked to an existential objective, the collective has nevertheless deemed it valuable and signaled for its propagation. These kinds of unconventional solutions are a hallmark of constraint-based agents. And here, we can be safely delighted by surprises instead of horrified by the harms they unleash.
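The propagation mechanism can be simple and still produce that emergence. Here's a hedged sketch of like-weighted sampling (my guess at the shape of the technique, not the game's actual algorithm):

```typescript
// Decide which shared structure materializes in a player's world.
// More Likes means higher odds of propagation, so the collective's
// favorites (even a beloved bridge to nowhere) spread the furthest.

interface SharedStructure {
  id: string;
  likes: number;
}

// Weighted random choice; assumes a non-empty pool.
const pickStructure = (pool: SharedStructure[]): SharedStructure => {
  // The +1 gives brand-new, unliked structures a chance to appear.
  const total = pool.reduce((sum, s) => sum + s.likes + 1, 0);
  let roll = Math.random() * total;
  for (const s of pool) {
    roll -= s.likes + 1;
    if (roll <= 0) return s;
  }
  return pool[pool.length - 1]; // floating-point edge case fallback
};
```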
Death Stranding leans into the collaboration system to enrich it with narrative. From this Polygon review:
> Forming a Strand Contract is like favoriting another player. A Strand Contract makes it more likely that you'll see that player's structures and abandoned cargo in your game....
>
> This is a good way to make a connection to your friends, but there's also a benefit to forming strand contracts with people with a ton of Likes, which is an indirect indication that their structures are useful and in good positions. Creating a Strand Contract is to both parties' advantages: They get more Likes, and their stuff — ladders, bridges, and so on — will appear more often in your game.
>
> As your porter grade increases... you'll be able to form more Strand Contracts.
In the narrative, there is a social contract about humans pulling together to face down cataclysmic forces. The game offers these reciprocal perks (carrots), but it also appeals to your self-motivation with a helpfulness score, running a rough emulation of something like a community.
Even reviewers who found the game odd noted the potency of the collaboration mechanic. Based on interviews with Hideo Kojima, Director of Death Stranding, this appears to be intentional. His themes hammer on connectedness, interdependency, and avoiding harm. While I admit it's an odd game, it's an amazing feat of distributed collaboration.
Affordances of Abuse
When people discuss toxicity on the Web at large, the common wisdom is to do something about anonymity. That wisdom is both widespread and persistent, and the impulse makes sense: we see a symptom and seek recourse within our power. But it's an individualistic approach that cannot solve a systemic problem.
We can draw on the key principles Crayton identifies when engaging in systemic thinking:

> Tech is not neutral, nor is it apolitical.
>
> Intention without strategy is chaos.
>
> Lack of inclusion is a risk/crisis management issue.
>
> Prioritize the most vulnerable.
Technologists have failed us so spectacularly because they do not want to create safe platform contexts. They do not prioritize the most vulnerable. It's a choice. It's also an anti-pattern: instead of offering inclusionary affordances, the platform offers affordances that explicitly favor abuse.
Under these conditions, what recourse do the abused have? They reach for ending anonymity because they cannot imagine platforms providing trust contexts. But in so doing, they pit privacy against safety. Those properties are not intrinsically at odds. In fact, they are part of an interlocking whole that facilitates human rights in distributed systems. But when a platform is not aligned with that goal, it creates these affordances of abuse and leaves individuals to pick up the pieces.
If I built a line of toasters that regularly burned people's houses down, I'd share culpability. Kaplan built a platform where people are regularly abused. That's a similar design failure, and he bears culpability. We can restate that:
We bear responsibility for the systems we build.
This case study demonstrates that we should consider carefully whom we prioritize. By default, systems should not allow abuse. Riskier configurations require a trust threshold and, therefore, a context to establish that trust.
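As a concrete sketch of "safe by default" (hypothetical names and thresholds, not DashKite's actual design), every capability can carry a trust requirement, with only the abuse-proof vocabulary available to strangers:

```typescript
// Capability gating: risky channels require an established trust context.

type TrustLevel = "none" | "acquainted" | "trusted";
const rank: Record<TrustLevel, number> = { none: 0, acquainted: 1, trusted: 2 };

interface Capability {
  name: string;
  requires: TrustLevel;
}

// The defaults cannot express abuse: structured, positive-only signals.
const capabilities: Capability[] = [
  { name: "like", requires: "none" },
  { name: "share-structure", requires: "none" },
  { name: "direct-message", requires: "trusted" },
];

const allowed = (c: Capability, trust: TrustLevel): boolean =>
  rank[trust] >= rank[c.requires];

// A stranger gets only the abuse-proof subset.
console.log(capabilities.filter((c) => allowed(c, "none")).map((c) => c.name));
// => [ "like", "share-structure" ]
```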
DashKite uses this design principle in our product development process. We're starting with our RSS reader because, exactly like Death Stranding, you can't express abuse to others on that platform. And while we're excited about Civic, its more powerful feature set requires a trust context worthy of the task. What does that context look like? That's the right question, and we're working on it!