In preparation for Modern Warfare 3, Activision has announced that it will bring new real-time voice chat moderation tools to the Call of Duty online experience.
Activision plans to provide “global real-time voice chat moderation, at scale,” according to a blog post on the Call of Duty website, with the goal of being able to “enforce against toxic speech, including hate speech.”
An “initial beta rollout” of the new voice chat moderation system has already started in North America, where it was added to Call of Duty: Modern Warfare 2 and Call of Duty: Warzone on August 30. The full rollout will follow on November 10 to coincide with the release of Modern Warfare 3, covering every region worldwide except Asia.
BREAKING: Activision will add real time in-game voice chat moderation in Call of Duty. AI powered detection will work in-game to listen and report rule breaking issues automatically.
It’s live today in MWII and Warzone in NA and expands worldwide with MW3. Here’s the details: pic.twitter.com/rp6buXWkfK
— CharlieIntel (@charlieINTEL) August 30, 2023
The new approach makes use of Modulate’s ToxMod, an AI-powered voice chat moderation tool. Activision’s blog post states that the system is made to recognise toxic behaviour, such as “hate speech, discriminatory language, harassment, and more.” These new capabilities sit alongside the existing text-based moderation systems, which support 14 different languages.
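Activision has not published the technical details of how ToxMod operates inside Call of Duty, but the general shape of a pipeline like this, in which an utterance is analysed, classified against the categories listed above, and escalated for enforcement if it is flagged, can be sketched. The snippet below is purely illustrative: the toy keyword matcher, the category names, and the moderate function are stand-ins of my own, not Modulate’s or Activision’s actual system.

```python
from dataclasses import dataclass

# Hypothetical category keywords standing in for a real AI detection model;
# a production system would use a trained classifier, not a word list.
CATEGORY_KEYWORDS = {
    "harassment": {"loser", "trash"},
    "hate_speech": {"slur_placeholder"},
}

@dataclass
class Flag:
    speaker_id: str
    category: str
    transcript: str

def classify_utterance(speaker_id: str, transcript: str) -> list[Flag]:
    """Toy matcher used only to illustrate the flag-and-escalate flow."""
    words = set(transcript.lower().split())
    return [
        Flag(speaker_id, category, transcript)
        for category, keywords in CATEGORY_KEYWORDS.items()
        if words & keywords
    ]

def moderate(speaker_id: str, transcript: str) -> None:
    for flag in classify_utterance(speaker_id, transcript):
        # A real system would queue the clip for human review or automated
        # enforcement; printing stands in for that step here.
        print(f"Escalating {flag.category} flag for {flag.speaker_id}: {flag.transcript!r}")

if __name__ == "__main__":
    moderate("player-123", "you are trash")
```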
According to the same blog post, “Call of Duty’s existing anti-toxicity moderation has restricted voice and/or text chat to over 1 million accounts.” Activision also said that, among the accounts that received warnings, “20% of players did not re-offend after receiving a first warning.”
The introduction of ToxMod comes after Microsoft made a comparable decision and recently started rolling out a voice chat reporting feature that lets users send voice chat clips to Microsoft’s moderation staff for review. The Microsoft tool uses a rolling buffer, so that pressing the report button captures the most recent 60 seconds of voice chat. Microsoft’s new voice reporting system, like the one used by Call of Duty, makes extensive use of “AI advances” to “optimise the flow of content to moderators.”
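The rolling-buffer idea behind that clip capture can be illustrated with a short sketch. Nothing below reflects Microsoft’s actual implementation; the frame size, the 60-second window, and the class and method names are assumptions made for illustration only.

```python
from collections import deque

FRAME_SECONDS = 0.02   # assume 20 ms audio frames for illustration
BUFFER_SECONDS = 60    # keep only the most recent 60 seconds of voice chat
MAX_FRAMES = int(BUFFER_SECONDS / FRAME_SECONDS)

class VoiceClipBuffer:
    """Keeps a fixed window of recent audio frames; older frames fall off automatically."""

    def __init__(self) -> None:
        self._frames: deque[bytes] = deque(maxlen=MAX_FRAMES)

    def push(self, frame: bytes) -> None:
        # Called for every incoming voice chat frame.
        self._frames.append(frame)

    def capture_clip(self) -> bytes:
        # Called when the player presses the report button: snapshot the last
        # 60 seconds so the clip can be sent to moderators for review.
        return b"".join(self._frames)

if __name__ == "__main__":
    buffer = VoiceClipBuffer()
    for i in range(MAX_FRAMES + 100):  # simulate a stream longer than the window
        buffer.push(bytes([i % 256]) * 10)
    clip = buffer.capture_clip()
    print(f"Captured {len(clip)} bytes covering roughly the last {BUFFER_SECONDS} seconds")
```

The point of the buffer is that audio is only retained briefly on the player’s side; a clip leaves the device only when someone actively files a report.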
If properly implemented, these new AI-driven solutions introduced by Activision and Microsoft might significantly contribute to making online gaming a safer environment.