Games

In-game Voice Chat Is Now Fully Monitored In Call Of Duty

Activision is placing voice chats in its Call of Duty series under full monitoring, hoping to finally get the toxic behavior of parts of the community under better control.

Eavesdropping with ToxMod

On the one hand, the shooter series has been a great success for years. On the other, surveys among users repeatedly show that the community behind the game is regarded as one of the most unpleasant in the entire industry. Activision has therefore tried a variety of measures to get the problem under control, with only moderate success.

A new approach now relies on the capabilities of modern AI algorithms. According to a report by US magazine The Verge, the publisher has teamed up with a company called Modulate to introduce “in-game voice chat moderation” for Call of Duty and possibly other titles.

The new moderation system, which uses an AI technology called ToxMod, is designed to detect unwanted behavior such as hate speech, discrimination, and harassment in real time. This allows countermeasures to be taken directly at manageable staffing cost, since it is simply not economically feasible to have human moderators constantly listening in on all voice chats.

Beta testing starts now

ToxMod’s first beta phase begins today in North America. It is active in “Call of Duty: Modern Warfare II” and “Call of Duty: Warzone”. With the release of “Call of Duty: Modern Warfare III” on November 10th, it will be expanded to many other regions.

According to its developers, ToxMod goes beyond merely transcribing and analyzing the content of voice chats. The AI can also detect finer nuances of a conversation that arise from the emotions and volume of the speakers. However, the AI does not impose sanctions itself; instead, it makes its findings available in real time to a human moderator, who can then intervene quickly.
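ToxMod's internals are proprietary, but the human-in-the-loop architecture described above can be sketched in a few lines. The scoring logic below (keyword matching plus a loudness cue as a stand-in for emotion detection) and all names, thresholds, and fields are illustrative assumptions, not Modulate's actual implementation; the point is only the flow: score in real time, flag suspicious snippets, and leave the final decision to a human.

```python
from dataclasses import dataclass
from queue import Queue

# Hypothetical stand-in for a real hate-speech model.
TOXIC_TERMS = {"idiot", "trash"}

@dataclass
class Snippet:
    player: str
    transcript: str       # transcribed voice-chat audio
    loudness_db: float    # volume cue: shouting raises the score

def toxicity_score(s: Snippet) -> float:
    """Toy scorer: keyword hits plus a volume bonus, capped at 1.0."""
    score = sum(0.5 for term in TOXIC_TERMS if term in s.transcript.lower())
    if s.loudness_db > 80:  # assumed "shouting" threshold
        score += 0.2
    return min(score, 1.0)

def moderate(snippets, review_queue: Queue, threshold: float = 0.4) -> None:
    """Flag suspicious snippets for human review; never auto-sanction."""
    for s in snippets:
        if toxicity_score(s) >= threshold:
            review_queue.put(s)  # a human moderator makes the final call

review = Queue()
moderate(
    [Snippet("p1", "nice shot", 60.0),
     Snippet("p2", "you absolute trash", 85.0)],
    review,
)
```

Here only the second snippet lands in the review queue; keeping sanctions out of the automated path mirrors the design Modulate describes, where the AI surfaces findings and a person intervenes.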