Call of Duty will have "real-time voice chat moderation" to address toxic speech

The new approach uses Modulate’s ToxMod, an AI-driven voice moderation tool. According to Activision’s blog post, the system is designed to recognise toxic behaviour such as “hate speech, discriminatory language, harassment, and more.” These new capabilities complement the existing text-based moderation systems, which support 14 languages.

The blog post notes that “Call of Duty’s existing anti-toxicity moderation has restricted voice and/or text chat to over 1 million accounts.” Of the accounts that received warnings, Activision said, “20% of players did not re-offend after receiving a first warning.”

The introduction of ToxMod follows a comparable move by Microsoft, which recently began rolling out a voice chat recording feature that lets users submit voice chat clips to Microsoft’s moderation staff for review. The Microsoft tool uses a rolling buffer, so pressing a button captures the preceding 60 seconds of audio. Like the system used by Call of Duty, Microsoft’s new voice reporting feature relies heavily on “AI advances” to “optimise the flow of content to moderators.”