Here’s what it’ll take to clean up esports’ toxic culture

College videogame team members practice League of Legends. AP Photo/M. Spencer Green

In day-to-day life, you probably haven’t had someone yell at you, “Get back in the kitchen and make me a sandwich!” If you’re a woman who plays online video games, though, statements like this, and worse, are all too common.

As COVID-19 has driven much of life online and fueled a boom in online gaming, harassment in these and other internet spaces has increased. Women now make up 41% of computer and videogame players, down from 46% in 2019.

Despite its digital nature, online harassment can have real-world consequences for victims, including emotional and physical distress. This has left online gaming companies and players scrambling for better community management techniques to prevent harassment. As a researcher who studies gaming, I’ve found that the right cultural norms can result in healthy online communities, even in the highly competitive world of esports.

The stakes are high. Competitive video gaming, or esports, now exceeds US$1 billion in yearly revenue. Professional, collegiate and high school leagues are expanding, especially as COVID-19 has decreased opportunities for traditional sports.

History of harassment

Recent stories from The New York Times, Wired, Insider and others have highlighted how pervasive sexism, racism, homophobia and other forms of discrimination are in online spaces. However, these issues are hardly new. Similar problems arose in 2014’s GamerGate Twitter-based campaign of harassment of female gamers, designers and journalists.

Sexism was also common before GamerGate. For example, professional gamer Miranda Pakozdi quit her team following sexual harassment from her coach in 2012; the coach, Aris Bakhtanians, famously stated that “sexual harassment is part of [the fighting game] culture” and that it could not be removed.

Others have suggested that the anonymity of online game spaces, combined with gamers’ competitive natures, increases the likelihood of toxic behavior. Survey data from the Anti-Defamation League suggests that at least 37% of female gamers have faced gender-based harassment.

However, positive online communities exist, and a study by lawyer and former Microsoft user experience designer Rebecca Chui found that anonymous online communities are not inherently toxic. Rather, a culture of harassment requires community norms that allow for it. This suggests that online bad behavior can be addressed effectively. The question is how.

Online video gaming, or esports, has grown to have professional, collegiate and scholastic leagues, and international tournaments like this one in Paris in 2019. AP Photo/Thibault Camus

Players’ coping strategies

In my interview-based research with female gamers, I’ve found that players have many strategies for avoiding or managing online harassment. For instance, some play only with friends or avoid using voice chat to hide their gender. Other gamers get really good at their favorite games, to shut down harassment through skill. Research by other media scholars, such as Kishonna Gray and Stephanie Ortiz, has found similar results across race and sexuality.

These strategies have significant downsides, however. For example, ignoring toxicity or brushing it off allows it to persist. Pushing back against harassers often results in further harassment.

They can also put the burden of challenging harassment on the victim, rather than on the perpetrator or community. This can drive victims out of online spaces. As my interviewees gained responsibilities in their jobs or families, for instance, they no longer had the time or energy to manage harassment and stopped gaming. My research suggests that game companies need to intervene in their communities to keep players from having to go it alone.

How companies can intervene

Game companies are becoming increasingly invested in community management strategies. Large publisher Electronic Arts held a community management summit in 2019, and companies like Microsoft and Intel are developing new tools for managing online spaces. A group of game development companies even recently formed the Fair Play Alliance, a coalition working to address harassment and discrimination in gaming.

It’s important that interventions be rooted in the experiences of players, however. Right now, many companies intervene through practices like banning or blocking harassers. For instance, the live-streaming platform Twitch recently banned several prominent streamers following allegations that they had committed sexual harassment.

This is a start, but harassers who are blocked or banned often create new accounts and return to their previous behaviors. Blocking also manages harassment after it occurs, rather than stopping it at the source. Thus blocking should be combined with other potential approaches.

First, companies should expand the tools they provide players to manage their online identities. Many of the players I interviewed avoided voice chat to limit gender-based harassment, but that at times made it harder to compete. Games like Fortnite, League of Legends and Apex Legends, however, have instituted “ping” systems, where players can rapidly communicate essential game information without voice. Similar tools could be built into many other online games.
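To make the idea concrete, here is a minimal sketch, in Python, of what a ping system boils down to: a small set of predefined signals, attached to a map location, that teammates receive as fixed messages instead of free-form voice or text. The signal names and data fields are illustrative assumptions, not any particular game’s actual design.

```python
# Minimal, hypothetical sketch of a "ping" system: players send one of a few
# predefined signals tied to a map location, and teammates see a structured
# message instead of free-form voice or text chat. Names are illustrative only.
from dataclasses import dataclass
from enum import Enum, auto


class PingType(Enum):
    ENEMY_SPOTTED = auto()
    NEED_HELP = auto()
    ON_MY_WAY = auto()
    DANGER = auto()


@dataclass
class Ping:
    sender: str          # an internal player ID, not an identity-revealing handle
    ping_type: PingType
    x: float             # map coordinates the ping is attached to
    y: float


def broadcast_ping(ping: Ping, teammates: list) -> list:
    """Render a ping as a fixed, non-free-form message for each teammate."""
    label = ping.ping_type.name.replace("_", " ").title()
    text = f"[{label}] at ({ping.x:.0f}, {ping.y:.0f})"
    return [f"to {mate}: {text}" for mate in teammates]


if __name__ == "__main__":
    ping = Ping(sender="player42", ping_type=PingType.ENEMY_SPOTTED, x=120, y=340)
    for message in broadcast_ping(ping, ["ally1", "ally2"]):
        print(message)
```

Because the vocabulary is fixed, a ping carries the tactical information a team needs while leaving no room for slurs or personal attacks.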

Another option my interviewees suggested is to make it easy for players to group with friends, so they have someone on their side to guard against harassment. Grouping mechanisms work particularly well when matched to the needs of their specific game. For instance, in games like Overwatch and League of Legends, players need to take on different roles to balance their team. Abuse can occur when randomly assigned teammates all want to play the same character.

Overwatch recently introduced a role-based grouping system that lets players choose the role they want to play, then matches them with players who have chosen different roles. This appears to reduce abusive in-game chat.
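The mechanism behind such a grouping system is simple to sketch. The toy Python below, which assumes a tank/damage/support composition, queues players by the role they declare and only forms a team once every role slot is covered, so no two randomly matched teammates are competing for the same pick. It illustrates the idea, not Overwatch’s actual matchmaker.

```python
# Toy sketch of role-based grouping: players queue with a declared role, and a
# team forms only once every required role is covered. The composition below
# is an assumption for illustration, not any game's real matchmaking logic.
from collections import deque

REQUIRED_ROLES = {"tank": 1, "damage": 2, "support": 2}


def form_team(queues):
    """Return a list of (player, role) pairs if every slot can be filled, else None."""
    if any(len(queues[role]) < count for role, count in REQUIRED_ROLES.items()):
        return None  # not enough players waiting in some role queue yet
    team = []
    for role, count in REQUIRED_ROLES.items():
        for _ in range(count):
            team.append((queues[role].popleft(), role))
    return team


if __name__ == "__main__":
    queues = {
        "tank": deque(["ana_b"]),
        "damage": deque(["kai", "lee", "mora"]),
        "support": deque(["rin", "sol"]),
    }
    print(form_team(queues))  # [('ana_b', 'tank'), ('kai', 'damage'), ...]
```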

An example of in-game commendations for positive behavior in League of Legends. Daniel Garrido/Flickr, CC BY

Finally, companies should work to change their basic cultural norms. For example, League of Legends publisher Riot Games once instituted a “Tribunal” system where players could view incident reports and vote on whether the behavior was acceptable in the League community.

Although Riot Games unfortunately closed the Tribunal shortly after its release, including community members in any solution is a good idea. Companies should also develop clear community guidelines, encourage positive behavior through tools like in-game accolades, and respond to ongoing issues rapidly and decisively.
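For a sense of how community review can work in principle, the sketch below shows a reported incident collecting votes from player reviewers, with a penalty recommended only when a clear majority finds the behavior unacceptable. The vote threshold and minimum panel size are assumptions for illustration and are not drawn from Riot’s actual Tribunal rules.

```python
# Toy sketch of community review: a reported incident is shown to a panel of
# players, and action is recommended only when a clear majority votes that the
# behavior breaks community norms. Threshold and panel size are assumptions.
from dataclasses import dataclass, field


@dataclass
class IncidentReport:
    reported_player: str
    chat_log: list
    votes: list = field(default_factory=list)  # True = "unacceptable"

    def cast_vote(self, unacceptable: bool) -> None:
        self.votes.append(unacceptable)

    def verdict(self, threshold: float = 0.7, min_votes: int = 5) -> str:
        if len(self.votes) < min_votes:
            return "pending"  # wait for more community reviewers
        share = sum(self.votes) / len(self.votes)
        return "penalize" if share >= threshold else "no action"


if __name__ == "__main__":
    report = IncidentReport("player42", ["(example chat log lines would go here)"])
    for vote in [True, True, True, False, True]:
        report.cast_vote(vote)
    print(report.verdict())  # "penalize" once enough reviewers agree
```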

If esports continue to expand without game companies addressing the toxic environments in their games, abusive and exclusionary behaviors are likely to become entrenched. To avoid this, players, coaches, teams, leagues, game companies and live-streaming services should invest in better community management efforts.

The Conversation

Amanda Cote does not work for, consult, own shares in or receive funding from any organization that would benefit from this article, and has disclosed no affiliations other than her research institution.