Perpetually under the spotlight for its child safety measures (or alleged lack thereof), game megaplatform Roblox has added new safety tools and parental controls, including a “voice safety classifier,” and has joined ROOST, an online safety coalition.
This time, per the announcement, the parental tools focus on how kids spend their time on the platform. Parents can block their kids’ accounts from accessing specific experiences, see how long their kids spend in each one, and now block individual friends. However, past interviews and investigations have shown that these measures don’t stop kids from simply opening new accounts to get around them.
For voice chat, Roblox has updated its “open-source voice safety classifier,” as they call it, which continues to “moderate millions of minutes of voice chat per day, across eight languages, more accurately than human moderators.”
Roblox’s Safety Center page has also been updated “to serve as a hub for safety resources, tools, and policies for users and parents alike.”
In addition, Roblox has joined ROOST, an online safety initiative that describes itself as “a community effort to build scalable and resilient safety infrastructure for the AI era.” Launched a month ago, it touts Roblox itself, OpenAI, Discord, and Google as founding partners.

