Reddit, Snapchat, TikTok, Facebook, Instagram and X will “at minimum” be the digital platforms affected by the Australian Government’s plan to ban Australians under 16 from social media accounts, according to Federal Communications Minister Michelle Rowland.
The bill, announced earlier this month and first tabled in parliament last week, would allow exemptions for some messaging and online gaming services, as well as health and education platforms.
While Bluesky, Meta’s Threads and streaming and communications services such as Discord and Amazon’s Twitch were not mentioned, Rowland said the legislation would include some carve-outs for certain apps.
Google’s YouTube, as well as Meta’s Facebook Messenger Kids and WhatsApp platforms, were expected to be deemed “out of scope” of the bill, she said.
The regulation, which would set a minimum age of 16 for holding an account on affected platforms, would come into effect 12 months after its passage through parliament and would require all Australians to verify their age when accessing social media.
The government said “serious and repeated breaches” of the bill could result in penalties of up to $50 million for social media companies.
The federal opposition is expected to support the legislation, while the Greens and some crossbenchers have said they will not support it.
The proposed ban has also been criticised by some mental health organisations and digital media experts, who have argued it could isolate younger people.
A parliamentary inquiry into social media also stopped short of recommending a ban for under-16s in a final report handed down last week.
The Digital Industry Group Inc. (DIGI), which counts Meta, Google, TikTok, X, Snap, Discord and Twitch among its members, said it was concerned the world-first legislation would be “rushed through parliament without scrutiny or consultation” before the end of the year.
DIGI managing director Sunita Bose said the group encouraged the federal government to create a parliamentary committee and consult more widely.
“Neither experts nor the community have been consulted on the details of the legislation being released today, and we need to hear from them before this becomes law,” Bose said.
“Mainstream digital platforms have strict measures in place to keep young people safe, and a ban could push young people onto darker, less safe online spaces that don’t have safety guardrails.
“A blunt ban doesn’t encourage companies to continually improve safety because the focus is on keeping teenagers off the service, rather than keeping them safe when they’re on it.”
Rowland said the government’s plan to simultaneously impose a duty of care on social media companies would help protect all Australians by putting the onus on tech companies to keep users safe.
How will a social media platform be defined?
A definition of an age-restricted social media platform would be introduced into the Online Safety Act, Rowland said.
“Its definition includes that a significant purpose of the service is to enable online social interactions between two or more users,” she said.
“While the definition casts a wide net, the bill allows for flexibility to reduce the scope or further target the definition through legislative rules.”
This meant the government could be “responsive to changes and evolutions in the dynamic social media ecosystem,” Rowland argued.
Messaging and gaming services, as well as health and education apps, did not feature the same “psychological manipulation to encourage near-endless engagement” that social media apps did, she said.
The government was also hoping to “strike a balance between protecting young people from harm while limiting the regulatory burden on the broader population” by regulating the act of having a social media account, instead of the act of accessing social media, Rowland said.
“Importantly, this obligation would help to mitigate the risks arising from the harmful features that are largely associated with user accounts or the logged-in state — persistent notifications and alerts, which have been found to have a negative impact on sleep, stress levels and attention,” she said.
Privacy concerns as age assurance trial gets underway
DIGI’s Sunita Bose said her organisation was worried about “anything with comments enabled” potentially being caught up in the ban, as well as the impacts on Australians’ privacy.
“For a range of websites to verifiably know whether someone is 14 or 40, young people and adults alike will need to take regular actions like providing sensitive personal ID documents, biometric facial scans, or link to myGovID [now called myID],” she said.
“We haven’t seen this implemented anywhere else in the world and don’t yet know the unintended safety and mental health consequences, nor the privacy and data security implications.”
The Albanese government announced that a UK body, the Age Check Certification Scheme (ACCS), had won the tender to trial technologies that could be used to check the ages of Australian social media users.
Technologies to be tested could include facial biometrics and digital use patterns, the government said.
Australians will be invited to participate in the testing, with the ACCS expected to deliver a final report in the middle of next year.
Under the government’s bill, social media companies and related third parties would be expected to destroy any data they received during age assurance processes.
Meta’s Instagram attempted to get ahead of the proposed regulations by introducing teen accounts earlier this year.
The platform last week announced it would also begin testing a tool to allow all users, including teenagers, to reset the types of content its algorithms suggested to them, billing it as a “fresh start” for their feeds.
This article was first published on ACS InformationAge