Shooter video game Call of Duty has embraced AI technology to tackle hate speech during online matches.
Publisher Activision has unveiled a new moderation tool that uses machine learning to detect discriminatory language and harassment in real time. Machine learning enables software to identify patterns without direct human input, learning from the data it has been trained on rather than from hand-written rules.
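As a rough illustration of that idea (a minimal sketch, not Activision's or Modulate's actual system), the following Python example trains a simple classifier to flag toxic text from a handful of labelled messages; the example phrases and the scikit-learn pipeline are assumptions chosen for brevity:

```python
# Minimal sketch of pattern learning from labelled data, using scikit-learn.
# Illustrative only; it bears no relation to Activision's real tooling.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled training messages: 1 = toxic, 0 = acceptable.
messages = [
    "you are trash, uninstall the game",
    "nice shot, well played",
    "get out of my lobby, idiot",
    "good game everyone",
]
labels = [1, 0, 1, 0]

# The model learns which word patterns correlate with toxicity from the
# data, rather than relying on a hand-written list of banned phrases.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(messages, labels)

print(model.predict(["well played, good game"]))  # likely [0]
print(model.predict(["uninstall, you idiot"]))    # likely [1]
```

A production system for live voice chat would also need speech recognition and vastly larger training sets, but the underlying learning principle is the same.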
The tool, named ToxMod, is being introduced into Call of Duty by a company called Modulate.
Michael Vance, Chief Technology Officer at Activision, expressed the tool’s potential to create “a fun, fair, and welcoming experience for all players.”
Toxic voice chat has long been a pressing issue in online video games, disproportionately affecting women and minorities. A September 2020 survey by the Pew Research Center found that 41% of Americans have personally experienced some form of online harassment.
While that overall figure is unchanged from 2017, evidence suggests that online harassment has intensified. A study on abuse in online games found that harassment during play is common among female gamers, particularly those aged 18 to 24: around 75% of younger respondents reported online abuse, and roughly half of women across all age groups said they had faced harassment.
The severe impact of abuse
The study also examined the psychological impact of this abuse: 25% of victims reported feelings of depression, and 27% worried that the abuse might spill over into real life. Shockingly, one in ten reported feeling suicidal as a result of gaming-related abuse, underscoring the toll that bigotry and toxicity in the gaming community take on players' mental health. The problem is exacerbated in popular multiplayer games by the sheer number of players: approximately 90 million people play Call of Duty each month.
Activision noted that its existing tools, such as the ability for gamers to report others and automatic monitoring of text chat and offensive usernames, have already resulted in one million accounts facing communication restrictions.
Call of Duty’s code of conduct strictly prohibits bullying and harassment, including insults based on race, sexual orientation, gender identity, age, culture, faith, and country of origin.
Vance explained that ToxMod amplifies the company’s moderation efforts by categorising toxic behaviour based on its severity, allowing a human moderator to determine the appropriate course of action.
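Activision has not published ToxMod's internals, but a severity-based triage step might look something like the hypothetical sketch below. The names, scores, and thresholds are all invented for illustration; the key point is that the system only routes flags into review queues, while a human moderator makes the final decision:

```python
# Hypothetical severity triage: a model scores a flagged clip and routing
# rules pick a review queue; a human moderator still decides the outcome.
# None of these names or thresholds come from Modulate's actual ToxMod.
from dataclasses import dataclass

@dataclass
class Flag:
    clip_id: str
    severity: float  # 0.0 (mild) to 1.0 (severe), assigned by the model

def triage(flag: Flag) -> str:
    """Route a flagged voice clip into a review queue based on severity."""
    if flag.severity >= 0.8:
        return "urgent human review"
    if flag.severity >= 0.4:
        return "standard human review"
    return "log only, no action"

for f in (Flag("clip-001", 0.92), Flag("clip-002", 0.55), Flag("clip-003", 0.12)):
    print(f"{f.clip_id}: {triage(f)}")
```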
Players will not have the option to opt out of AI monitoring unless they entirely disable in-game voice chat.
ToxMod has been integrated into Call of Duty’s Modern Warfare II and Warzone games, initially in the United States only. A broader rollout is scheduled to commence with the launch of the next instalment, Modern Warfare III, on November 10th.