Roblox Limits Messaging for Under-13 Users Amid Safety Concerns

Users under the age of 13 will be restricted from socialising with users outside their friend lists and will be blocked from features involving free-form writing and drawing.


Gaming platform Roblox announced plans to block users under the age of 13 from directly messaging other players outside a game or "experience." It has also mandated parental permission for such users to broadcast messages within a game. The move follows criticism and concerns over the platform's failure to adequately protect children from exploitation.

Parental control and changes to content labels

The gaming platform is ramping up its parental control features by enabling parents and caregivers to remotely monitor their child's activity, including spending limits. Within this feature, parents can also adjust controls on their child's account even when they aren't physically together. To enable this, parents must first verify their Roblox account via either an ID or a credit card, after which they can link their account to their child's account.



Parents can also set daily screen time limits, after which access will be blocked until the next day. These changes will be implemented by the first quarter of 2025. The platform is also redefining the parameters for its 'Content Maturity Limits', which describe the type of content an experience contains, enabling users to make informed decisions.

The game displays four content labels: 'Minimal', 'Mild', 'Moderate', and 'Restricted', each indicating a different level of content such as violence, unrealistic blood, and crude humour, among others. With the new update, users under nine years of age can only access the 'Minimal' and 'Mild' levels by default and require parental consent to access the 'Moderate' level. Besides this, users under the age of 13 will be restricted from socialising with users outside their friend lists and will be blocked from experiences with free-form writing and drawing.

Recent safety updates on Roblox

Earlier this month, Roblox announced a feature restricting users under 13 from finding games that may be classified as inappropriate for their age. Highlighting flaws in this update, MediaNama reported that creators could easily mislead users under 13 about the content of their games, and that detecting such mislabelling would be challenging without human moderation. Additionally, because the features did not apply to users over 13, those users remained at risk of exploitation.

Accusations against Roblox

The increase in child safety features on the platform follows a Bloomberg report detailing the presence of pedophiles on Roblox who lured children into sharing photographs or developing online, and later offline, relationships with predators. In July, a Hindenburg Research report also termed the platform a "pedophile hellscape." The report described how gaps in the screening process of Roblox's social media features allow pedophiles to target children and, in turn, expose them to grooming, pornography, violent content, and extremely abusive speech.

Recent efforts to protect underage gamers

In April 2023, the Ministry of Electronics and Information Technology notified new rules to regulate the online gaming industry, which also included provisions to protect minor users. Under these rules, self-regulatory bodies responsible for such games must adopt "measures to safeguard children", including parental controls and the classification of online games through age-rating mechanisms under the framework for the verification of games. Additionally, platforms were prohibited from hosting games relating to money laundering or gambling that may be detrimental to users, particularly children.

In August this year, the All India Game Developers' Forum advocated for the development of an age-rating framework for the gaming industry. In a report, the forum stated that despite India hosting a large number of gamers under the age of 18, the government lacks any framework to curb the risks inappropriate gaming poses to such users. Later, in September, the Tamil Nadu Online Gaming Authority suggested implementing Know Your Customer (KYC) verification to reduce underage gaming.

The move followed talks by the state government in June on passing legislation to establish time and usage limits on online and real-money gaming.