Roblox Boosts Child-Safety Settings by Auto-Blocking DMs to Users Under 13, Increases Parental Controls on Violent, Sexually Themed Content

Following recent reports and criticisms of Roblox's child-safety protections, the younger-skewing online gaming platform is implementing new measures, including the automatic blocking of direct messaging for users under the age of 13 and the ability for parents to regulate the level of “mature” content available to their kids, regardless of age. These changes to Roblox — which partners with brands including Disney, Netflix, Nickelodeon, Warner Bros. Discovery, Mattel and more on kid-focused games — will allow parents to see their child’s friends, set time limits for their child on the platform and manage how much their child spends on Roblox every month, all from a parent’s account separate from their child’s.
Additionally, Roblox will automatically apply new communication settings to users under 13 unless a parent chooses to override them: those users will no longer be able to use “platform chat,” the ability to directly message others on Roblox outside of games or experiences, and will be limited to public broadcast messages within a game or experience. Outside of messaging, the “maturity” level of content — including what would be considered violent or sexually suggestive content — will now be rated as “minimal,” “mild” or “moderate,” rather than having a specific age assigned to a game or experience. Per Roblox, “Labeling experiences based purely on age does not respect the diverse expectations different families have, so we’re launching simplified descriptions of the types of content available.
Experience Guidelines will be renamed Content Labels, and we will no longer label experiences by age. Instead, we’ll label experiences based on the type of content users can expect in an experience. These updates should provide parents greater clarity to make informed decisions about what is appropriate for their child.”
Users under nine years old will only be able to access “minimal” or “mild” content by default and “moderate” only with parental consent, though parents can further adjust those settings. Roblox says it worked with the Family Online Safety Institute (FOSI) and the National Association for Media Literacy Education (NAMLE) when developing these new restrictions. “As parents look for ways to stay informed and engaged in their children’s digital lives, Roblox’s enhanced parental controls are a considerable leap forward,” Family Online Safety Institute CEO Stephen Balkam said.
“By offering robust tools for non-intrusive monitoring and privacy, Roblox is providing families with the confidence they need to foster a secure and enriching online environment.” “With the development of more parental controls, Roblox is addressing what all companies should be addressing: the need to improve safety features for kids and empower parents with the tools they need to help their kids navigate the complex media ecosystem,” National Association for Media Literacy Education executive director Michelle Ciulla Lipkin said.