Modulate, the company making voice chat safe for everyone, today announced that its cutting-edge ToxMod anti-toxicity voice moderation tools are now available for the Nintendo Switch system. Through the Nintendo Developer Program, studios and developers of all sizes can better protect their communities from toxicity by implementing ToxMod, the only voice-native proactive moderation solution available today.
Toxicity is one of the biggest problems facing the games industry today. According to a recent report by the Anti-Defamation League, more than five out of six adult gamers experience toxic behavior in online games. Players between the ages of 13 and 17 experienced a 6% increase in online harassment from 2021 to 2022, and 72% of these younger players now hide their identity at least some of the time they play online. Legacy reactive moderation tools, which rely on ineffective text transcription and onerous player reports, can neither adequately protect players nor scale to accommodate large communities.
ToxMod is gaming’s only proactive, voice-native moderation solution. Built on advanced machine learning technology and designed with player safety and privacy in mind, ToxMod triages voice chat to flag bad behavior, analyzes the nuances of each conversation to determine toxicity, and supplies relevant, accurate context so moderators can respond quickly to each incident. In contrast to reactive reporting systems, which depend on players making the effort to report bad behavior, ToxMod is the only voice moderation solution in games today that enables studios to respond proactively to toxic behavior and prevent harm from escalating.
Developers and game studios of all sizes use ToxMod to reduce churn, delight players, and build healthier communities.