Meta will now put under-18 Insta users into new ‘teen’ accounts, give parents better control

Under the new settings, parents will have several ways to manage their teens' Instagram usage. They will be able to set daily time limits, block access to the app at night, and view the categories of content their children are engaging with.


Meta has announced that it will introduce new “teen accounts” for Instagram users under 18. The goal is to give parents more control over their children’s activities on the platform, including the ability to restrict app usage during certain hours and monitor what their children view. This change will apply to new users but will also gradually be extended to existing teen accounts in the coming months.

New features for parental supervision

Under the new settings, parents will have several ways to manage their teens' Instagram usage. They will be able to set daily time limits, block access to the app at night, and view the categories of content their children are engaging with. Parents will also have the option to see the accounts with which their teens are exchanging messages.

Teenagers signing up for Instagram already have strict privacy settings applied by default, such as blocking messages from adults who do not follow them and muting notifications at night. The new changes go further in empowering parents, particularly for users under 16, who will now need parental permission to alter any privacy settings. For users aged 16 and 17, the features will also be applied by default, though these users will be free to adjust the settings themselves.

The teen account changes will be implemented in the US, UK, Canada, and Australia.

Meta responds to concerns over online safety

The announcement of these new teen account settings is seen as a response to growing concerns about online safety, especially for young people. There have been frequent complaints from parents that they are not equipped to adequately monitor their children's use of social media.

Meta’s leadership has acknowledged that while parental controls have been available, they were often underused. The aim of this update is to simplify those controls and give parents a stronger role in managing their children’s online experiences. Despite these efforts, some online safety advocates have expressed caution.

Previous updates to enhance child safety on platforms like Instagram have not always produced the intended results. Concerns remain over the availability of harmful content, such as material related to self-harm and mental health issues, which have been linked to tragedies in the past. The father of Molly Russell, a British teenager who died after exposure to harmful content on Instagram, hopes that this latest change will have more meaningful outcomes than previous attempts by Meta to improve safety.

Global implications and legislation

This move comes amid broader discussions on social media safety, with countries like Australia planning to raise the minimum age for social media access. There is interest in how these changes might influence legislation in other countries, including the UK. Meta has stated that these new teen accounts are being introduced independently of any government regulations, driven primarily by parental concerns.

However, the possibility of future changes to Facebook or other Meta-owned platforms is still being explored. As discussions about online safety continue to evolve, it remains to be seen how these new features will affect young users and their families.