Call of Duty Using AI to Listen Out for Hate Speech During Online Matches
upstart writes:
Shooter video game Call Of Duty has started using AI to listen out for hate speech during online matches.
Publisher Activision said the moderation tool, which uses machine learning technology, would be able to identify discriminatory language and harassment in real time.
Machine learning is what allows AI systems to learn and adapt without explicit human instruction, instead using algorithms and the data they are trained on to recognise patterns.
The tool being rolled out in Call Of Duty, ToxMod, is made by a company called Modulate.
Activision's chief technology officer Michael Vance said it would help make the game "a fun, fair and welcoming experience for all players".
[...] Activision said its existing tools, including the ability for gamers to report others and the automatic monitoring of text chat and offensive usernames, had already resulted in communications restrictions on one million accounts.
Call Of Duty's code of conduct bans bullying and harassment, including insults based on race, sexual orientation, gender identity, age, culture, faith, and country of origin.
Mr Vance said ToxMod allows the company's moderation efforts to be scaled up significantly by categorising toxic behaviour based on its severity, before a human decides whether action should be taken.
Players will not be able to opt out of having the AI listen in, unless they completely disable in-game voice chat.
Read more of this story at SoylentNews.