The Federal Government wants to legislate a duty of care for social media platforms.
Under the policy, tech companies would have a legal responsibility to stop preventable harms to users, and would risk heavy penalties if they failed to act.
Communications Minister Michelle Rowland said a digital duty of care would help make “online services safer and healthier”.
Duty of care
A duty of care is a legal responsibility to protect someone from “foreseeable harm” — negative impacts that could be reasonably expected.
For example, a school has a duty of care to keep its students safe during an excursion on a sunny day, so it would need to provide sunscreen and ensure students wear hats.
Developing a digital duty of care was a key recommendation of a review of the Online Safety Act handed down earlier this year. It was suggested as a way to put the “onus” on social media companies to protect user safety.
Rowland
In a speech on Wednesday night, Rowland said the Government would commit to a duty of care model.
“What’s required is a shift away from reacting to harms by relying on content regulation alone, and moving towards systems-based prevention,” Rowland said.
Social media companies would need to regularly identify and prevent harms on their platforms. Failure to do so would allow the regulator, the Australian Communications and Media Authority, to impose large fines.
What now?
The Government is developing a legislated duty of care model as part of a broader suite of reforms to the Online Safety Act.
The Act currently covers all social media platforms, meaning a duty of care would apply to all of them.
The Government is now consulting further on how the duty of care model would work.
Legislation isn’t expected until at least 2025. It will need the support of the Opposition, or the Greens and independents, to pass the Senate.
Overseas
A similar duty of care has recently been legislated in the UK.
The UK law imposes separate duties of care for content that is harmful to children and content that is harmful to adults.
That means social media companies are required to protect children from content considered harmful to them, including pornography.
They are also liable for criminal content and fraudulent activity on their platforms, such as fake ads.