Meta takes a step to protect kids on Instagram — but needs to go a lot further

Instagram this week put in place new measures meant to protect minors, introducing "Teen Accounts" that give parents more control and limit what content kids see and who can contact them.

On Tuesday, Meta announced new measures — shamefully years overdue — meant to protect minors on Instagram. It introduced “Teen Accounts” that give parents more control and limit what content kids see and who can contact them. It’s a big step forward for kids’ safety online, sure — but not close to a total fix.

CEO Mark Zuckerberg isn't taking this step out of the goodness of his heart. Meta has been forced to bend after years of obfuscation in the face of clear evidence (evidence the firm itself buried) of the harm its ruthless harvesting of children's eyeballs has done to young minds. Never forget: The company deliberately set out to turn as many children as possible into addicted, doomscrolling zombies, making what it calls "engagement metrics" (levels of consumption) its only north star.

Ask any parent of a teenager: Screen time is a depressingly constant flashpoint of familial discord. Three in four Gen Zers say social media has had a negative effect on their mental health, and Instagram and TikTok get the lion's share of the blame. Now Zuckerberg is clearly trying to head off a crackdown in the form of the Kids Online Safety Act.

That bill, gaining momentum in Congress, would impose a "duty of care" on social-media platforms, requiring them to avoid features that contribute to harassment, bullying, sexual exploitation or mental-health disorders like anxiety and depression in minors. Meta's new safeguards are practically copy-and-pasted from measures demanded in the bill, including allowing parents to control privacy and account settings and to set time restrictions on their child's account. The new Instagram policies also block notifications between 10 p.m. and 7 a.m. (though it'd still be up to parents to outright limit minors' overnight access to the app), restrict who can view minors' profiles to their followers (effectively making the accounts "private" by default), limit who can contact minors and send reminders nudging minors off the app once they've scrolled for more than 60 minutes in a day.

All good. But this won't solve the problem that social-media algorithms are intentionally addictive, or that social media causes depression and anxiety in kids and teens at chilling rates. Nor is it clear how Instagram will prevent teens from dodging the restrictions altogether by simply lying about their age.

But there's a reason Surgeon General Vivek Murthy is advocating for a warning label on social media sites: Kids who spend more than three hours a day on social media double their risk of depression and anxiety. Social-media companies don't deserve the public's trust, given their past behavior. If they'd cared one iota about the societal consequences of their actions, they would have taken these baby steps years ago.

Congress should pass KOSA anyway.