Children should be shielded from online harms such as pro-terror content, child sexual exploitation material, and unethical marketing. Yet new draft Online Safety Codes, if formally registered by the eSafety Commissioner, threaten to water down protections and put Australian children at risk.
Online Safety Codes provide people in Australia with industry-wide protections against things like child sexual exploitation material, pro-terror content, and extreme crime and violence. These Codes are particularly vital for protecting Australia’s children and their rights.
Today, Australia has an enviable online safety pedigree. We appointed the world’s first independent online safety commissioner and passed ground-breaking online safety legislation in 2015. The UK still struggles to match this feat; it remains a dream in countries like the US and Canada.
But this reputation is at risk. As part of the 2020 update to Australia’s online safety laws, the binding Online Safety Codes for industry are set to be introduced early this year. The Online Safety Codes were intended to improve standards by setting the rules for industry — social media companies, online games, and internet service providers — to ensure children’s safety.
Concerningly, the Codes don’t align with existing international best practice. They don’t ensure that monitoring and regulation will support and protect children’s rights in the digital world. There is minimal evidence of a consistent focus on identifying risk, addressing harms, enabling prevention of harm, and creating child-safe environments online.
The draft Online Safety Codes lower safety standards
The Online Safety Codes are supposed to specify how children’s safety will be ensured online: from how companies handle children’s privacy settings, to how they manage live location data, to when they must report sexual abuse material. But the draft Codes are insufficient.
Many of Australia’s leading child rights and child safety organisations (including the Australian Child Rights Taskforce) have clearly stated the revised draft Codes are not in children’s best interests.
In a letter to the eSafety Commissioner, a coalition of children’s safety organisations outlined the failures. Far from ensuring children’s basic online safety standards, these codes are poised to reduce safety standards. The letter lists two criteria for evaluating the Codes’ effects:
“Firstly, do these [Codes] improve safety standards for Australian children from the current position, and secondly, do they match the standards enjoyed by children elsewhere in the world…?”
According to these organisations, neither of these basic criteria has been met. They cite examples where the draft Online Safety Codes entrench weaker standards of online safety than the status quo. They note, for example, that many online services already do a better job of detecting child sexual abuse and exploitation material than the Codes would require.
The letter’s authors also describe how the draft Codes entrench weaker standards than emerging global norms. For example, 16- and 17-year-olds’ live location data is routinely protected in the UK and California, among other places, but won’t be under these Codes. Many Australian parents and carers would be alarmed to know that their children’s live location data is not adequately protected by domestic codes.
The Codes’ narrow conception of ‘safety’ fails to address issues and experiences that can alter the health and wellbeing of children and young people. The proposed social media codes focus only on ‘child sexual exploitation material and pro-terror content’, yet there is an obvious need to protect children from the online advertising of harmful products, such as e-cigarettes (vapes), tobacco, junk food, alcohol, and gambling.
Codes that don’t improve the current situation, and don’t meet basic international norms, will not improve online safety.
How did we get here?
How did we end up with draft Online Safety Codes that appear to approve lower standards than those that already exist in Australia and internationally?
In line with the Online Safety Act, these industry Codes have been developed via co-regulation. Simply put, co-regulation means industries write their own codes. In this instance, an industry representative group drafted the Online Safety Code for social media providers. Given all we’ve seen about how social media companies manage the safety of children and other groups, a set of dangerously weak Codes feels almost inevitable.
Previous research by Reset Australia highlighted how co-regulation does not meet community standards. Only 21% of adults and 14% of young people surveyed said they trusted social media companies to write these sorts of Codes; most said they would rather independent regulators draft privacy and safety codes. As Facebook whistleblower Frances Haugen put it to Australian parliamentarians 18 months ago, it’s time they, too, stopped trusting Facebook.
What could be done?
There is some hope. The draft Codes are currently with the eSafety Commissioner, Australia’s independent regulator for online safety, who decides whether to formally register them.
Ultimately, the Codes fail to adequately protect children and should not be formally registered in their current form. It would be extremely perverse if one of the key pillars of Australia’s Online Safety Act update led to the adoption of industry Codes that leave children less safe.
If these Codes aren’t registered, the eSafety Commissioner could instead draft stronger Codes. This would provide an opportunity for the Codes to better reflect community standards for protecting and supporting children. A more comprehensive understanding of risk, safety, and harm would recognise the consequences of online advertising and sales of harmful products. Ultimately, there should be proper research into the most effective regulation for the industry.
Dr Fiona Robards is a lecturer at the University of Sydney and is co-convenor of PHAA’s Child and Youth Health SIG.
Follow Dr Robards on Twitter: @fionarobards
Image: Bruno Gomiero/Unsplash