Intel's AI Cop Will Tackle Toxic Gamers

Most of us have probably experienced toxicity in one form or another while gaming online, and it's a problem that doesn't seem to be getting any better. In-game toxicity is bad enough in games where you have to pay to play, but in free-to-play games, the number of unsavory folks you run into can quickly spiral out of control. Most developers, of course, offer in-game reporting systems and employ teams of people to dole out punishments for toxicity, but now Intel is looking at ways AI can help.

During Intel's GDC 2019 press conference, the company announced that it has partnered with Spirit AI to explore how machine learning can be used to combat in-game toxicity. Spirit AI already has quite a bit of experience in this area: it offers a tool called Ally that uses machine learning to identify and curb abuse in text chat.

Ally potentially makes it easier for small teams of moderators to find and snuff out toxicity among large user bases communicating over text chat, but Intel apparently sees room for more. It wants to take the work Spirit AI has done with text chat and expand it to cover voice chat as well.
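
For the curious, here's a rough, purely hypothetical sketch of what that kind of automated triage could look like in code. It has nothing to do with Ally's actual internals; the threshold, the toy scoring function, and every name in it are invented for illustration.

```python
# Purely illustrative: a generic triage layer, not Spirit AI's Ally.
# A model assigns each chat message a toxicity score; anything above a
# cutoff is queued for a human moderator instead of being punished outright.
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

@dataclass
class ModerationQueue:
    threshold: float = 0.8                       # invented confidence cutoff
    flagged: List[Tuple[str, str, float]] = field(default_factory=list)

    def review(self, user: str, text: str,
               score_fn: Callable[[str], float]) -> None:
        score = score_fn(text)                   # model's toxicity estimate, 0..1
        if score >= self.threshold:
            # The model only triages; humans make the final call.
            self.flagged.append((user, text, score))

def toy_score(text: str) -> float:
    """Stand-in scorer; a real system would use a trained language model."""
    insults = {"idiot", "trash", "loser"}
    return 1.0 if set(text.lower().split()) & insults else 0.1

queue = ModerationQueue()
queue.review("player42", "nice shot!", toy_score)
queue.review("player99", "you absolute idiot", toy_score)
print(queue.flagged)   # only the second message is escalated for review
```

The point of a setup like this is that the model never hands out bans on its own; it just narrows a firehose of chat down to the handful of messages a small moderation team can realistically look at.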

That's a pretty big task, but with machine learning, Intel seems to think it's at least plausible. PCWorld points out that Intel is well aware of the difficulties both companies would face in building AI that can scour voice chat for toxicity, and notes that such a tool is probably years away, assuming Intel and Spirit AI can actually get it off the ground at all.

While we probably won't see functionality like this in our games anytime soon, there may come a point where AI picks out questionable voice chat and flags it for human moderators to review. The question isn't just one of feasibility but also of accuracy: there are far more variables to consider when analyzing voice chat than when monitoring plain text chat. We'll see where this leads, but given the difficulty Intel and Spirit AI face in building a machine learning voice chat moderator, we aren't expecting this idea to materialize into an actual product quickly.
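
To get a sense of why voice is the harder problem, here's an equally hypothetical sketch: the audio has to be transcribed before any text-based scoring can happen, so transcription quality becomes one more place for things to go wrong. The transcribe placeholder and both thresholds are made up; nothing here describes an announced Intel or Spirit AI design.

```python
# Hypothetical only: voice moderation as transcription followed by text scoring.
# Nothing here reflects an actual Intel or Spirit AI implementation.
from typing import Callable, Tuple

def transcribe(audio_chunk: bytes) -> Tuple[str, float]:
    """Placeholder speech-to-text; returns (transcript, confidence)."""
    # A real pipeline would run an automatic speech recognition model here.
    return ("you absolute idiot", 0.72)

def flag_voice_clip(audio_chunk: bytes,
                    score_fn: Callable[[str], float],
                    asr_floor: float = 0.6,      # invented thresholds
                    tox_floor: float = 0.8) -> bool:
    transcript, asr_conf = transcribe(audio_chunk)
    if asr_conf < asr_floor:
        return False   # transcript too uncertain to judge; don't flag on noise
    return score_fn(transcript) >= tox_floor
```

Even in this toy version you can see the extra failure mode: a garbled transcript, a sarcastic tone, or trash talk between friends could all trip a scorer that only ever sees text, which is exactly the accuracy problem mentioned above.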
