Toxicity is not a new thing in the world of video games, and Riot’s tactical shooter Valorant is no stranger to toxic behavior in-game.
Communication abuse is one of the most common forms of toxicity in multiplayer games, and Riot is now planning to tackle the issue with a major update to its privacy policy.
Riot Games has confirmed that it will record all in-game voice communications for moderation purposes. The comms won't be actively monitored unless a player has been reported for communication abuse, Riot explained in a blog post.
The voice comm data won't be stored permanently on Riot's servers and will be deleted once it's no longer needed.
“In order for us to take action against players who use voice comms to harass others, use hate speech, or otherwise disrupt your experience, we need to know what those players are saying,” the devs said. “Which is why, moving forward we’ll need the ability to analyze voice data.”
While the details of how the new voice moderation will work are still hazy, it's a significant step toward curbing toxicity in Valorant.
This change is also likely to hit other Riot titles like League of Legends in the near future.