Call of Duty to Begin Using AI to Assess Voice Chat for Toxicity
This is just going to lead to creative language and new slang developing rapidly. People trying to avoid YouTube demonetization already say all kinds of weird shit. Like the Republicans back when they weren't screaming the quiet part out loud: they said it with a lot of indirect words. "We can't have the wrong people receiving government assistance." The Xbox Live version will be something like, "Did you hear that? Sounds like there's a huge football player getting some hard reps in your mom's room."
“Did you hear that? Sounds like there’s a huge football player getting some hard reps in your mom’s room.”
Actually encouraging creative use of language and a broad vocabulary to own someone seems like a great outcome?
ToxMod assesses the “tone, timbre, emotion, and context” of a phrase or conversation to determine whether a player is behaving in a harmful way.
Given that neural networks are considered black boxes, are there any specific countermeasures against false positives? And what about the running costs of the model? Is it sustainable for the game company?
Flagged clips all get reviewed through the same process as player reports.
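That review step is the usual countermeasure against false positives. As a rough sketch of the idea (purely illustrative; Modulate hasn't published ToxMod's internals, and every name and threshold below is a made-up assumption): the model only flags above a confidence threshold, and its flags land in the same human-review queue as player reports, so the model never enforces anything on its own.

```python
from collections import deque
from dataclasses import dataclass, field
from typing import Deque, Optional, Tuple

# All names here (Flag, ReviewQueue, the 0.85 threshold) are hypothetical,
# chosen only to illustrate the "flag, then human review" pattern.

@dataclass
class Flag:
    player_id: str
    clip_id: str
    source: str   # "toxmod" (automated flag) or "report" (player-filed)
    score: float  # model confidence; treated as 1.0 for player reports

@dataclass
class ReviewQueue:
    """One queue: AI flags and player reports go through the same review."""
    threshold: float = 0.85  # only high-confidence AI flags are enqueued
    queue: Deque[Flag] = field(default_factory=deque)

    def submit_ai_flag(self, player_id: str, clip_id: str, score: float) -> bool:
        # Thresholding is one false-positive countermeasure: low-confidence
        # model output is dropped before it can reach enforcement.
        if score < self.threshold:
            return False
        self.queue.append(Flag(player_id, clip_id, "toxmod", score))
        return True

    def submit_report(self, player_id: str, clip_id: str) -> None:
        # Player reports skip the threshold; a human filed them directly.
        self.queue.append(Flag(player_id, clip_id, "report", 1.0))

    def review_next(self, uphold: bool) -> Optional[Tuple[Flag, bool]]:
        # A human moderator makes the final call, regardless of source.
        if not self.queue:
            return None
        return (self.queue.popleft(), uphold)

q = ReviewQueue()
q.submit_ai_flag("p1", "clip-001", 0.42)  # below threshold: dropped silently
q.submit_ai_flag("p2", "clip-002", 0.93)  # enqueued for human review
q.submit_report("p3", "clip-003")         # player report, same queue
print(len(q.queue))  # → 2
```

The point of the single queue is that a model flag carries no more weight than any other report; whether that holds up depends entirely on how much human review capacity the publisher actually staffs.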