There has been massive global interest in the new social media legislation introduced in Australia, aimed at protecting children from the dangers of doom‑scrolling and the mental‑health risks these platforms may pose during their developmental years.
The platforms’ methods so far for verifying young people’s ages have shown mixed effectiveness.
The Australian Christmas period may be interrupted by cries of “I’m so bored without Insta”, but the Australian government is not done yet. New measures are scheduled to come into force before the new year, including further restrictions on content deemed age-inappropriate across a range of internet services.
What are the new restrictions?
While families grapple with the social media ban, Australia is about to dial up its regulation of the internet through the impending industry codes. These will eventually be implemented across services including search engines, social media messaging services, online games, app distributors, equipment manufacturers and suppliers (smartphones, tablets and so on), and AI chatbots and companions.
Over the Christmas break, hosting services (along with ISPs and search engines) that deliver sexual content including pornography, as well as material categorised as promoting eating disorders and self-harm, will begin to impose various restrictions, including increased age checks.
However, there are concerns the codes may result in overreach, affecting marginalised communities and limiting young people’s access to educational material. After all, big tech doesn’t have a great track record, particularly in terms of sexual health material and associated educational content.
How will it work?
From December 27 (with some measures coming in later), sites delivering content that falls under the new industry codes will be required to implement “appropriate age assurance”. How they will do this is largely left to the providers to decide.
Age checks will likely be administered across the internet through various age-assurance and age-verification processes to limit young people’s access.
While much of the media coverage has focused on the social media ban, coverage of the industry codes has been much quieter, and the codes themselves are arguably more difficult to understand. Discussion has centred on the impact and extent of the codes, with little focus on the very people the changes are designed to affect: young people.
The quiet voices
Our new research explores the views of Australia’s teens on various age-verification and age-assurance measures – views that don’t appear to have been fully taken into consideration by policymakers.
Teens believe governments and industry should be “doing more” to make online spaces safer, but are sceptical about age verification measures. Unsurprisingly, consistent with other research, teens confess they will find ways around the ban, such as the use of VPNs, borrowed ID or using images of adults to overcome age verification and assurance measures. Biometric measures such as facial identification have also shown concerning racial, gender and age bias.
Miles, 16, told us:
There are nifty little ways around it. […] I think that’s one thing that all kids have, [a] knack to kind of — there’s a little thing, “oh I can get ‘round it, it’s a bit of fun”[…] There will be loopholes that people will find, there’ll be younger generations finding little knickknacks [VPNs] there’ll be ways around.
Much like adults, teens held concerns around the privacy and security implications of age verification.
Some measures require personal data to be either validated or processed by third-party companies, potentially outside Australia. Users are expected to trust such companies despite data being a highly valuable commodity in the modern age.
Previous research has indicated scepticism around the safety of allowing third parties to host such personal data. This raises justified security and privacy concerns for all Australian users – especially following the recent Discord data leak that disclosed photos used for age verification of Australian account holders.
Even research by the office of the eSafety commissioner itself indicates teens are tech-savvy and likely to bypass restrictions.
In the United Kingdom (where, on the day of implementation, one VPN platform saw a 1,400% surge in uptake), minors are now using unstable free VPNs to overcome Ofcom’s age-assurance measures and access blocked pornographic content. While functional for the end user, these services leave minors susceptible to sensitive personal data leaks and phishing, further compromising their safety.
Such concerns are exacerbated by uncertainty over the kind of data being captured by third parties and government bodies (particularly if digital ID or temporary digital tokens are to be used as a measure in future). For teens, this possibility was of particular concern when considering access to online sexual content as the new rules come into force. As Miles told us:
What you’re consuming I think is a little bit too far. I think there are certain limits and prying into people’s personal sexual lives is a little bit too far [capturing] personal sexual interests and viewings.
Teens note that by restricting access to content, the government may actually be making that content more enticing. Some may even see it as a challenge to find ways around the restrictions. Tiffany, 16, told us:
[I] don’t know if they [restrictions] actually work that much ‘cause I feel like where people lock something or disallow something it makes [them] want to look at it more, and see it more, so I feel it’s more incentive.
More relevant measures than age
Interestingly, some teens suggest that maturity would be a better measure of emotional and cognitive readiness for content than age. Tiffany put it this way:
[because] some people, they could be 13 or 14, and they could act much older than they are, and have an intellectual level much higher than their age, and then some people could be that same age, but their intellectual level is much younger. So, there’s a big variation in people’s personalities and their lives and how they think.
However, they conceded this would be very difficult to measure.
Teens were supportive of protections for younger children, consistent with New Zealand research. Levi (pre-teen) said:
There’s probably a certain age that’s too young to see certain things like violence or sexually explicit content like pornography.
However, they also argue that for older teens there may be benefit in accessing both sexual content and social media for educational purposes, particularly for sexual health information.
Teens argue that independence and autonomy are key in these crucial years of development as emerging adults. Tiffany said:
[Teens] can’t really be their own person if somebody doesn’t have trust in them and let them have their own independence. It’s a necessity for somebody to be able to grow into their own person.
Many participants stressed they are able to self-regulate. Arguably, teens will inevitably access content, whether it be social media or sexual content online, and benefit from chances to build these skills.
What lessons need to be learned?
Such measures often overlook young people’s fundamental rights, including their sexual rights, and policymakers need to consider the views of young people themselves. Until recently, these views have been strikingly absent from these debates but represent valuable contributions that should be appropriately considered and integrated into future plans.
Findings indicate there is a growing need to separate older teens from children in policy. Teens also overwhelmingly recognised education (including digital literacy and lessons relating to sexual health and behaviours) in offline and online spaces as a powerful tool – one that should not be withheld or restricted unnecessarily.
