Duty of care for social media giants to protect users

Tech giants will have to proactively make their platforms safer for users under a proposed digital duty of care flagged by the federal government.


Australians are being promised a safer online experience, with tech giants to be subject to a digital duty of care.

Online platforms would be required to take reasonable steps to prevent foreseeable harm to users under the proposed reform announced by Communications Minister Michelle Rowland. This would be complemented by legislated harm categories, which could include harms to young people, harms to mental wellbeing, illegal content, and the instruction or promotion of harmful practices.

Sites such as Facebook, X and Instagram would have proactive obligations to keep users safe. Stronger penalties would apply to non-compliant tech giants, with current fines of under $1 million out of step with other consumer protection laws, Ms Rowland said. "We will have a penalty regime that is effective," she told Sydney radio 2GB on Thursday.

The proposal for a digital duty of care would bring Australia into line with approaches to online platforms in the UK and Europe. International Justice Mission, an NGO focused on human rights, said a digital duty of care for tech companies was vital to protecting Australian children from online sexual abuse and other forms of exploitation. It welcomed shifting the onus from the eSafety commissioner issuing takedown notices to tech platforms being legally required to take proactive steps.

"This is a necessary step to protecting Australian children from online sexual abuse, and children around the world from exploitation by Australian offenders," CEO David Braga said. DIGI, an industry not-for-profit that includes companies such as eBay, Meta, TikTok and X, said its members represented some of the safest sections of the internet. Members would "'continue to deliver safety-by-design on their services and work constructively with the government to keep Australians safe online" while awaiting more details, managing director Sunita Bose said.

It's the second major social media reform the government has announced in recent weeks and will work in tandem with a legislated age limit also aimed at protecting children. Children under 16 will be banned from social media under legislation, which has bipartisan support, set to be introduced to parliament in November. A definition of age-restricted services would be included in the legislation and would likely capture YouTube, while Snapchat could also fall within the criteria, Ms Rowland said.

"Some of these platforms do present themselves in different ways. They will argue, for example, that they are messaging services and not social media services, but we need to assess that objectively," she said. Platforms would be encouraged to develop low-risk services with exemptions for education and health purposes, Ms Rowland said.

"For example, if you take YouTube kids, that could be one that could be a candidate for being within those exemptions," she said. The technology used to ensure age verification will be up to the tech platforms. Lifeline 13 11 14 Kids Helpline 1800 55 1800 (for people aged 5 to 25) Advertisement Sign up for our newsletter to stay up to date.
