Australia To Implement Digital Duty Of Care On Platforms to Prevent Online Harm

The Australian government will impose a ‘Digital Duty of Care’ on digital platforms to keep their users safe from online harm, according to a press release dated November 14. The legislation would place a legal responsibility on platforms to keep their users safe. The announcement comes soon after the government revealed its intention to prohibit social media for teens under the age of 16 in order to prevent excessive usage.

Minister for Communications Michelle Rowland explained in a speech that the Digital Duty of Care will complement the recent age restrictions and fill gaps in the existing Online Safety Act. She stated that the new initiative will help address new forms of harm brought about by changes in the tech industry, heralding a new phase of consumer protection regulation. Unlike in the past, when actors were onshore and the harms to consumers were mostly economic, the digital economy involves offshore actors, and the harms are mental or psychological.

“To my mind, what’s required is a shift away from reacting to harms by relying on content regulation alone, and moving towards systems-based prevention, accompanied by a broadening of our regulatory and social policy perspective of what online harms are experienced by children,” she said. While the Online Safety Act helped ensure that platforms removed illegal content, it did not “incentivise the design of a safer, healthier, digital platforms ecosystem,” as Rowland put it. The government intends the Digital Duty of Care to fill that gap.

How Does It Work?

The concept was first discussed in a statutory review of the Online Safety Act in July this year, with many stakeholders supporting the idea. As Rowland described it, the Digital Duty of Care will require platforms to carry out regular risk assessments of the harms facing Australians.

Rowland also stated that the new age restrictions will rest on a broader definition of social media that captures more services. However, the restrictions themselves would apply only to “common social media services” such as Facebook, Instagram, TikTok, and X.

Messaging and gaming services will not fall within the scope of this definition. The legislation will also contain “positive incentives” that encourage safe innovation, and it will make allowances for social media-like services that help young people with health or education. “Social media platforms that can demonstrate they meet set criteria and do not employ harmful features, or provide positive benefits for children, may apply to the regulator for approval,” she added.