Part 4: Implementation (Australia v Social Media Mini-Series)

Pod Notes

This is episode 4 of TPDi’s 5-part Tech Mirror mini-series, Australia vs Social Media: Inside the world-first online safety experiment. 

In this episode, we do a deep dive into the practicalities of implementing the Social Media Minimum Age legislation. What is likely to happen on 10th December when the law comes into effect? We answer some of the main questions that Australian young people and their parents and carers might have.

We hear from the eSafety Commissioner Julie Inman Grant, Privacy Commissioner Carly Kind, deputy program director of the Age Assurance Technology Trial Andrew Hammond, clinical psychologist Dr Danielle Einstein, Professor Amanda Third, co-director of the Young and Resilient Research Centre at Western Sydney University, and Minh Hoang, a member of the eSafety Commissioner’s Youth Advisory Council.

Credits

Written and narrated by Johanna Weaver, Executive Director, Tech Policy Design Institute.

Produced by Olivia O’Flynn & Kate Montague, Audiocraft.

Research by Amy Denmeade.

Original music by Thalia Skopellos.

Created on the lands of the Ngunnawal and Ngambri people and the Gadigal people of the Eora Nation.

Special thanks to all the team at the Tech Policy Design Institute, without whom the pod would not be possible, especially Zoe Hawkins, Meredith Hodgman, and Dorina Wittmann.

Links

eSafety Commissioner’s Social Media age restrictions hub https://www.esafety.gov.au/about-us/industry-regulation/social-media-age-restrictions-hub

eSafety appoints Stanford University-led academic advisory group to assess the impacts of the Social Media Minimum Age obligation (September 2025) https://www.esafety.gov.au/newsroom/media-releases/esafety-appoints-stanford-university-led-academic-advisory-group-to-assess-the-impacts-of-the-social-media-minimum-age-obligation

Office of the Australian Information Commissioner (OAIC) resources on the social media minimum age https://www.oaic.gov.au/privacy/your-privacy-rights/social-media-minimum-age

The Dip, founded by Dr Danielle Einstein https://www.thedip.com/

Young Men Online https://www.esafety.gov.au/research/young-men-online

Cyberbullying https://www.esafety.gov.au/key-topics/cyberbullying

Sextortion https://www.esafety.gov.au/key-topics/image-based-abuse/deal-with-sextortion

Parental Controls https://www.esafety.gov.au/parents/issues-and-advice/parental-controls

Press Conference: Social Media Minimum Age Platform Assessments, Minister for Communications media release (November 2025) https://www.youtube.com/watch?v=b9CIZK_12Zc

Meta announces it will begin implementing required changes from 4 December https://www.abc.net.au/news/2025-11-19/meta-to-block-teens-from-instagram-facebook-week-early/106028014

Family Tech Agreement Template (eSafety, good for younger children): https://www.esafety.gov.au/parents/resources/family-tech-agreement

Family Tech Contract (ThinkUKnow, good for teenagers): https://www.thinkuknow.org.au/find-advice/building-safe-online-habits

Headspace guide to the social media ban https://headspace.org.au/our-impact/campaigns/social-media-ban/

ReachOut https://about.au.reachout.com/home

Kids Helpline https://kidshelpline.com.au/

Lodge a complaint with the Privacy Commissioner https://www.oaic.gov.au/privacy/your-privacy-rights/social-media-minimum-age

Transcript

Johanna: The Tech Policy Design Institute acknowledges and pays our respects to all First Nations people. We recognize and celebrate that Indigenous people were this continent’s first tech innovators.

Minister Wells clip:  We know Australians naturally have a lot of questions about this world leading law from 10 December. Social media platforms will have a responsibility to remove young Australians under the age of 16 from their platforms. We know the platforms have the capability to do this. These are some of the biggest and best resourced companies in the world.

We cannot control the ocean. But we can police the sharks. And today we are making clear to the rest of the world how we intend to do this.

Opening clips: The minister and I have an important announcement… Social media is harming children… social media… It’s not a ban on content… age restrictions… the age assurance… ban… social media… age restrictions…

Johanna: Welcome to Tech Mirror, brought to you by the Tech Policy Design Institute. In this five-part series, we’re exploring Australia’s social media minimum age restrictions that will come into effect on the 10th of December.

I’m Johanna Weaver, co-founder and executive director at the Tech Policy Design Institute. We’re an independent, non-partisan think tank dedicated to technology policy. In this series so far, we’ve covered the harms that the social media minimum age restrictions are trying to prevent. We’ve examined the political story behind how this law came to be, and we’ve looked at what the law actually requires.

Including how the age assurance technologies that underpin it will work. And in this episode we are doing a deep dive on the practicalities of the law’s implementation.

Amanda: So in the lead up to the implementation of the new legislation in December, I think a lot of teenagers and their parents and caregivers are gonna be a bit nervous about what lies ahead and that’s totally understandable.

Johanna: Professorial Fellow and Child Rights Advocate Amanda Third, who we heard from in previous episodes, explains:

Amanda: I think there’s a lot of unknowns and the reality is that the government is having to work very hard to develop this legislation in ways that can then be clearly communicated to children and parents and caregivers.

So I think one of the best things that people can do to prepare. Is to begin to have conversations at home about what it will mean to give up social media for those children and young people who are already on social media, but will need to come off and to talk about how you’re gonna manage that. I think the more we can talk these things through and support children and young people to comply with the new legislation, the better prepared we’ll be to manage some of the perhaps unintended consequences of the legislation, so definitely talking about it, but of course it’s really hard to talk about it in a bit of an information vacuum.

Johanna: I asked Julie Inman Grant, Australia’s eSafety Commissioner to help paint us a picture. What will it look like when we wake up on the 10th of December? The law is in force and we go online. What will change?

Julie: I think the realistic answer to that is it depends on what platforms the children are on and how effectively they deactivate or remove the accounts on December 10th. So I don’t expect that every single under-16 social media account will magically disappear.

So that’s part of the communications and expectations setting that we need to do. We already have a social media minimum age microsite, where we’re putting information up there on a regular basis, but we will be releasing a whole new slew of materials for parents: checklists, conversation starters, actually getting them to wean their kids off some of these.

Maybe download their archives, sign up for parental controls.

Johanna: This brings us to our first piece of practical advice for this episode. If you are a parent or if you have young people in your life who will be affected by this new law, I encourage you to visit the eSafety Commissioner’s social media age restrictions hub online.

It’s got lots of great resources. This includes things like fact sheets, but there’s also a really handy action plan that you can work through in advance of the 10th of December. Another thing that I really like on this site is that there’s links to practical guides on how to, for example, download your young person’s Instagram, TikTok, or Snapchat archives.

This is gonna be particularly useful if you have young people in your life who are in that age bracket where they have been using social media, they now need to get off social media, but they don’t wanna lose those precious memories.

The most pressing question on many people’s minds is which social media companies are actually going to be covered by the ban.

Here’s Minister Wells on the 5th of November providing some clarity for us.

Minister Wells clip: eSafety has assessed Facebook, Instagram, Snapchat, TikTok, YouTube, X, Threads, Reddit and Kick as age restricted platforms. This means from 10 December, these services must take reasonable steps to prevent under-16s from holding accounts, and failure to do so could warrant fines of up to $49.5 million.

eSafety platform assessments will be ongoing and respond to technological change, but we understand that families need certainty now. And they now have it regarding the major platforms captured under the social media minimum age laws.

Johanna: Importantly, the platforms that you’ve just heard the minister read out are not the only social media services captured by this ban. That list is a list of the major platforms that have been assessed by eSafety to be included in the law. But as the minister said, these assessments are ongoing, and they’re being prioritized based on the platforms that have the highest number of young people using them with the biggest potential for harm.

But this absolutely does not mean that other social media platforms don’t have an obligation to implement age restrictions. These public announcements were designed to help provide clarity to parents and young people about the major platforms. But as we discussed in episode three, the obligation remains on all social media companies to self-assess if they are covered by the law and, if they are, to implement age restrictions on their services from the 10th of December.

Notwithstanding these announcements, we know there’s a lot of uncertainty and speculation about what will actually happen on the 10th of December. So next up, I asked the eSafety Commissioner to help set the record straight with five rapid fire questions. On the 10th of December, will everyone need to prove their age to use the internet?

Julie: No, they will not. And I said in my regulatory guidance that if social media sites took that approach, it would be unreasonable.

Johanna: Will young people that have existing accounts be kicked off if they are under 16?

Julie: They may very well. That is the expectation: that deactivation or removal of under-16 accounts will happen on December 10th.

Johanna: Will this stop young people from being able to access important information online?

Julie: I do not think so, and that is why we have a children’s digital rights statement and the way that we’re implementing this. Indeed, they will be able to use messaging sites and gaming sites and certainly search the internet.

Johanna: Will I get in trouble if my 12-year-old is using Instagram and the government finds out?

Julie: No. The onus is purely on the platform, not on the parents or the children. And by the way, a feature of the legislation is that children will be able to access content in a logged out state.

Johanna: Is this all a waste of time? Because the kids are gonna find a way to get around the ban in inverted commas.

Julie: I mean, I think we know there is going to be some circumvention, and we’ve put some more responsibility on the platforms to deal with that, and I think they can. I think it’ll create some positive friction. It’ll make some normative changes that will be helpful to parents and to educators.

I’m positive. And I also note that we have a group of 12 incredible independent academics that will be evaluating the impacts of this legislation. Are kids sleeping better? Are they interacting interpersonally more with their friends? Are they doing more sport?

And what are the unintended consequences? Are they going to darker areas of the internet? I think these are really important questions that we need some of the best minds in the world to put their academic rigor to.

Johanna: One of the academics chosen by the eSafety team for this independent evaluation panel is Amanda Third. We heard from her earlier, and also in past episodes, where she talked about coordinating the expert letter opposing the ban.

I really have to give credit here to the eSafety Commissioner for including critics on the evaluation panel for this law, and also to Amanda for the thoughtful way she continues to engage in this process. We know that when this ban kicks in, young people under 16 will be restricted from using social media, but of course, their online lives won’t just suddenly disappear. As Amanda points out, this means we need to think carefully about what online safety means as young people find new ways to connect and interact online.

Amanda: Families are also going to need to talk about how they’re going to keep each other safe as children engage with social media outside that social media account structure.


So if children and young people don’t have an account, that doesn’t mean they can’t access social media. They can, and I guess families will need to be having conversations around what sorts of safety mechanisms they put in place in their home to make sure that children are engaging safely with the broader kind of open access version of social media.

Right? The kind of non-account-holding version of social media. It’ll be about families just recognizing that children and young people can still go online and access social media, but it will just be outside that social media account. Some families are beginning to talk about setting up family social media accounts that are auspiced by the adult, but the children can have access to that through their devices.

That might be a really nice way for parents and caregivers to keep an eye on what’s happening on a social media profile, if you like. But I think, let’s not get too complacent about that, because actually, one of the things we know about teenagers is that they do need their privacy and some autonomy.

And so even if you do have those spaces, it’s likely that your teenagers will still do other things around that. So I think one of the key things here to remember is that the aim of the legislation is to put the onus on technology platforms to ensure children’s and young people’s safety online.

Johanna: As Amanda has articulated, the onus is on the platforms to keep young people off social media. There are big fines for tech companies if they don’t take reasonable steps to prevent young people from using social media after the 10th of December, but there are no penalties or fines for parents or young people who find ways to continue using the technology after the law comes into force.

One of the biggest fears that those of us who are following this closely have is that this social media age restriction will protect some young people, those kids that get off social media because of this law, but it may also expose other kids, the kids who find workarounds, to a less safe version of social media. The fear is that it might push them to darker places of the internet that have absolutely no protections for young people.

If these young people perceive that what they are doing is illegal, they may be even less likely to seek help if they find trouble online. And the kids that will be most at risk are likely those that are already some of our most vulnerable. The outgoing Australian Children’s Commissioner Anne Hollonds told the news site Crikey that she’s worried that the ban will adversely affect children who already struggle to find connection and belonging.


This leads us to our next practical tip. If you have young people in your life, don’t just have a conversation with them about how they can prepare for the ban. Also, have a conversation with them about what they should do if they find a workaround, go on social media and then need help. This is a call to action, not just for parents, but to the entire Australian community.

We need to be having these conversations with the young people in our lives, especially if you happen to be the one trusted adult in a young person’s life. Here’s Amanda Third.

Amanda: Clearly it’s not a good idea to encourage children to do something illegal, but also opening that door to say, as we navigate this together, things might go wrong, and when you come to me and say, I’ve done this bad thing or whatever, I’m not gonna be pleased with you. But I do promise to listen and take your concerns or your predicaments seriously, and we’ll work together to find the solutions. But of course, you also need to be really educating them about the risks of harm they might encounter in less regulated spaces.

It’s always a tricky thing trying to talk about those things in an age-appropriate way. Obviously, the way you would talk about some of the things that children might encounter in those less regulated spaces with a 12-year-old is quite different to the ways that you would talk to a 15-year-old about them.

Johanna: If you’re not sure how to have these conversations, both the eSafety Commissioner and the Privacy Commissioner have great resources online, including how to talk about things like sextortion or cyberbullying, or the online experiences of young men. Now, of course, every conversation with every young person will be different, but these resources are great starting points for your conversations.

Psychologist Danielle Einstein agrees that keeping the lines of communication open is really important. And while she doesn’t encourage it, she has some practical advice for how to handle it if a young person tells you that they’re going to find a workaround after the ban and keep using social media.

So here’s our next practical tip.

Danielle: If a teenager says, look, I’m gonna work around this, then I think a parent’s gotta say, okay, I am pleased that you’re telling me and I understand, but let’s just look at the addictive pull of social media for you and let’s help you just contain that. So why don’t you, whatever way you are gonna use to work around this, put it on a single device, be that an iPad or a computer, just one device.


And use it in one place in the house. Get a chair that is appointed; that is where you’re gonna use your social media, so that you are using it intentionally. You are not being pulled to it when you’re doing your homework. You are not being pulled to it everywhere, no matter where you go.

Johanna: But what about the kids that the children’s commissioner is concerned about, the ones that don’t have supportive home environments who may depend on social media for the connections that they can’t otherwise find at home or at school?

Well, there are safe and supportive online communities that aren’t social media that young people will still be able to access after the 10th of December.

Julie: We’re working closely now with, uh, Headspace and ReachOut and Beyond Blue and Kids Helpline to make sure that young people know where to go for help, but also for other engaging sites that they can continue to engage in.

Johanna: This brings us to our next practical tip. So headspace.org.au is a website for young Australians, and it has something that they call online communities. What I really like about headspace.org.au is that it is a community. It is a group of peers with whom you can share and have ideas, but all of those chats are moderated by trained peer workers who bring their own lived experience and make sure that that chat environment remains supportive.

So for any young Australian who’s concerned or anxious about losing your online community, I encourage you to head over to headspace.org.au. There are a lot of other resources out there as well, things like ReachOut or Kids Helpline, and we’ve linked those in the show notes. But here’s another tip from the eSafety Commissioner.

Julie: One of the pieces of advice we’ll be providing to young people is: before you go on school holidays on December 10th, set up a group messaging chat with your friends. If you’ve got influencers you like that your parents are happy for you to look at, bookmark their websites. There are lots of things that we can do to keep kids engaged but not addicted.

Johanna: As young people transition off social media, it’s a great time to implement good digital discipline more broadly. Here’s psychologist Danielle Einstein.

Danielle: If we start to think about the individual, and we think about both a teenager and a parent, in fact, any adult really, thinking about how they use technology, I think everybody can put in place some of what I call digital discipline.

So healthy device discipline. So if we can recognize the addictive pull of devices, we can do what I was talking about, which is put in healthy device discipline. We’re not taking devices away, we’re just constraining our use of devices and making intentional decisions about how we’re gonna use them in our home.

So the small things are, for instance, parents recognizing their modeling and deciding we’re agreeing to have phones, you know, on a shelf in the house during dinner. 100% no devices in the bedroom. Station your computer in one space in your house. Don’t have it carried around with you all the time. Put your phone down when you’ve gotta do other things and park it out of sight, where it is then also out of mind.

Johanna: Another simple practical tip, and this one is one of my favorites, is to make a family or household tech agreement. We’ve linked a few templates in the show notes, but you really don’t need to overthink this. It’s just about sitting down and agreeing the ground rules for how you will use your device in the home, and importantly, how you will hold each other accountable.

If you are a parent or a caregiver, make sure your own devices are a part of this deal. This is not just about holding young people to account. They also get to remind us to put our phones away too, which sometimes, let’s be honest, is probably good for everybody. But as Amanda Third reminds us, it’s not just about device discipline, but also how we interact more broadly online and off.

Amanda: What we know is that children go online primarily in order to nurture their connections with others. And contrary to what we often hear in the mainstream media, actually children are mostly connecting with peers that they know face-to-face. So it’s really about an additional dimension to their social relationships.

They can be an incitement to take risks, but most of the time actually kids will regulate one another. And so what it’s about is like setting the tone of those engagements. It’s about reminding your child that you know what your family’s values are and what you expect from them. And that doesn’t have to be just about digital media, right?

It’s just about reinforcing your family’s values in general, because children don’t go into a moral vacuum when they go online. They translate those moral and ethical frameworks across and between online and offline spaces. And so actually just really reinforcing what it means to be a good person in the world and online, right, is one of the best things that parents can do.

Also, role modeling great social media practices yourself for your children goes a very long way. Up until the age of 25, believe it or not, a parent is the most significant role model for a child, whether you are a good parent or a bad parent, right? You are the most significant role model.

Johanna: Even if you practice the best possible role modeling, you’ve got a family tech agreement in place, you’ve got really good intentions.

We know that some young people will continue to push the boundaries, particularly around how much time they spend online and the level of trust that they’re afforded by adults when they do. So we have another practical tip. It might not be appropriate for everybody, but it’s something that parents can explore.

And we know that a lot of schools are already implementing this, and this is device level controls. It’s a tool that helps you to monitor young people’s use and reinforce the guardrails that you’ve pre-agreed with them about how they’ll spend their time online. And I know that many parents that I talk to are daunted by this.

They say things like, but my kids know so much more about the tech than I do. If you would like to learn more about parental or on device controls, there are some step-by-step guides from eSafety in the episode show notes that take you through some of the options. These technical tools are powerful.

They may be part of the solution for your household, but they should also be part of a broader agreement and conversations that you’re having with young people about their online use. It’s important that we respect young people’s privacy and autonomy and acknowledge that the age at which this is appropriate will differ from young person to young person.

Another thing that Danielle Einstein urges is that we start to do something that she calls being comfortably uncomfortable.

Danielle: The more people don’t like uncertainty, the greater the number of psychological problems they have. And what we’ve seen is having a smartphone on your person all the time means that you can block out that feeling of uncertainty all the time.

You can try and get organized, right? Anticipate every problem, try and come up with a solution in advance. And we’ve actually seen technology companies almost milking that. I call it now the anxiety economy, where people don’t realize that they’re paying, either with money or with their time and attention, for methods to take away uncertainty. But being able to handle uncertainty is really good for us.

It’s like a really essential muscle for mental health, and when you are overcoming any anxiety, you have to be able to be comfortably uncomfortable. So we have to encourage people to be okay with uncertainty. It’s part of building resilience, that everyone understands that they can cope without being connected 24/7, and that it’s healthier, in fact, to first develop your social skills, your emotional skills, your self-regulation, before we go to the skill of broadcasting.

Johanna: In a nutshell, this is something that the proponents of the social media age restriction hope happens in this period between 13 and 16, when kids are not engaging on social media, that they’re developing their social skills, their emotional skills, their self-regulation skills, so that when they do come into contact with social media, later down the track, they’re able to engage in a healthier and a more resilient way.

Of course this works for young people, but building this muscle to be comfortably uncomfortable is just as relevant to building your resilience regardless of your age. Here’s a little challenge for you. Next time you’re in a conversation with friends and someone doesn’t know the answer, don’t reach for your phone.

Stay in the conversation with your friends, and if it really bugs you, you can look it up later. Practice this art of being comfortably uncomfortable.

As the eSafety Commissioner has said, she doesn’t have any expectation that on the 10th of December everyone in Australia will need to prove their age to use social media. The obligation is on the platforms to take reasonable steps to ensure that they don’t have users who are under 16 on their sites.

And the eSafety Commissioner has said that if those sites suddenly started requiring everyone in Australia to upload ID documents, she would consider that to be unreasonable and not in compliance with the law. So what can you expect from the 10th of December? We talked in detail about the different types of age assurance technologies in episode three, but here’s a quick recap.

For many of us, we won’t even notice that age assurance is happening in the background of our social media profiles. And that’s because the platforms will be using a technique called age inference. This is where they use the information that they already know about you to infer your age. Then, if the platform doesn’t have enough information about you, it may refer you on to something called age estimation, where they might ask you to take a selfie and then use that to estimate your age. Or they might ask you to do age verification, where they ask you to upload an ID document. The law expressly prohibits platforms from only giving you the option to upload a government ID document.

They have to provide you with a choice, and unless you unambiguously consent otherwise, they must delete the information as soon as that age assurance process is complete. And it’s the job of Carly Kind, the Privacy Commissioner, to make sure that companies are actually deleting your information. I asked her what she’ll be watching out for.

Carly: So if a tool, um, requires you to take a selfie or to perhaps speak into a tool, so using some kind of voice analysis, we’d wanna see the immediate deletion of that information straight away. And then the entity can retain essentially a token that says yes or no, 16 and above or under 16, for example.

So we’re really encouraging platforms or other entities involved in implementing this obligation to retain the minimum amount of information that’s required to be able to satisfy themselves that they’ve done the age check. The second is around age verification; again, we’re talking about the provision of very sensitive and important government documents.

Again, we’d wanna see deletion of those materials the second they’re no longer required for verifying age, and then the retention just of that kind of over/under token. And then the third, around age inference, is a little trickier, because we’re looking at potentially the use of information that’s been collected over a long period of time.

So an entity might say, we’re going to collect location data in order to understand if children are logging in, in and around schools, for example. Or they may also look at pictures or profile posts that you’ve put up over a long period of time and try to do age inference on the basis of those images.

Under the Privacy Act, if you take information you already have and infer something from it, it becomes a new collection of personal information, so it gets captured by the regime in that way. But we also are conscious that entities won’t want to delete all of that information that they already hold, as is required with the collection of new information.

So it is a bit of a tricky jigsaw puzzle. We have spelled out in great detail in our guidance how platforms are to navigate that puzzle, but it depends on whether it’s information you already held or new information that you’re collecting.

Johanna: Now the assumption is that if companies need to assure themselves that a user is over a particular age, they will give users a choice of methods.


And if you are faced with a choice, where you have options of which methods to use to assure your age, how do you decide which is the best to use? Andrew Hammond works for the Age Check Certification Scheme, the company that was commissioned by the government to test the different types of age assurance technologies.

Andrew: A lot of the companies that we worked with through the trial have quite detailed terms and conditions and acknowledgements as you go through the process, so that you can really see what they’re doing and what they’re using your data for. And most of them actually had, as the final step before they give you the result, to delete whatever information they had. And they actually said that was an overt step to say, yeah, we’re deleting your facial scan so that it’s gone.

Johanna: So here’s another practical tip. If you are asked to prove your age online, and you are given a choice between multiple different methods, choose the method that makes it clear upfront that they are going to delete your information when the process is complete.

Responsible providers of age assurance technology, whether that’s the social media companies themselves or a third party they outsource to, will expressly and prominently tell you that they will delete your information when the process is complete.

And if you are unsure, don’t use that method. If you are worried that a company has collected your information and not deleted it, you can make a complaint to the Privacy Commissioner. There are links to the complaints page in the show notes, as well as links to some excellent resources that the Privacy Commissioner has co-developed with young people on how to protect your privacy online.

We wanted to round out this episode with a firm reminder that this isn’t all about what happens online. And indeed, for many young people, the distinction between offline and online is entirely false.

Minh: I don’t really agree or disagree per se with the ban, because I can see both the pros and the cons.

Johanna: This is Minh Hoang, Community Manager for Bloom, a youth innovation centre based in WA, who we spoke to in episode one when they talked about the youth-led consultations that they ran in the lead-up to the law passing.

Minh: What I’m seeing from my day job working in the Youth Innovation Centre is that young people want more community, like physical community. But that doesn’t mean that they completely turn away from social media or digital platforms, because there’s always a layer of them being digital natives.

It is complicated because we can talk all day about algorithmic radicalization and polarization and this and that. But there’s also research and studies looking into how third places have been in decline for decades now. And so not only do you have all of these devices, but the third places, the actual physical places, a construction, a building, a place where young people can just come together and learn and connect and play, are being decimated.

And so again, you can’t just simply take away the devices from young people without investing in the infrastructure for them to have a community offline.

Johanna: With the law about to come into force, a pressing question is whether or not we have the necessary offline infrastructure to support young people.

This could be, as outgoing Children’s Commissioner Anne Hollonds has called for, addressing the systemic failures leading to escalating mental health disorders. Or it might be providing funding so that kids, including those who might not be able to pay the fees or buy the uniforms, have the opportunity to play sport.

But it’s also about investing in public libraries and spaces for young people to gather and form community, including forums for political expression. Amanda Third weighs in on why this is so important.

Amanda: Children in that teenage bracket use social media to learn about, to organize, and to take action on the key issues they care about.

And we’ve seen this in the Fridays for Future climate change movement, right? Where social media has been a really key tool in young people’s capacity to share information about what’s happening and also to organize collective action. And this is giving them a sense of purchase on political decision-making, right?

It gives them a sense that they can influence political agendas, and they can take action, and they can call for change, and we are taking that away from them. So the question is: what do we do to replace that?

Johanna: Indeed, if there are legal challenges, we expect that one of the grounds that might be used to challenge the legislation in the courts is that this law deprives young people of their implied freedom of political expression.

There are many who also expect that some companies will challenge their inclusion on the list of social media companies covered by the laws. We at the Tech Policy Design Institute, and many globally, will be watching this very closely. But for now, a final piece of advice from Amanda Third.

Amanda: I think it’s really important to validate children’s own perspectives on these issues. What’s really clear from the children in the debate is that many of them are dreading this moment that the legislation comes into play. They do see it as unjust. They call on social media platforms to make those spaces safer for them to engage in, and they really want their government to listen to their concerns about what happens next.

Johanna: As Australia moves towards the 10th of December, the tone that we set in our classrooms, in our homes, in our communities, and in the media and the public conversation is going to be really important. We hope this episode has given you some practical tips and a lot of information to help you navigate the implementation of the social media minimum age laws.

The next episode, the final episode in this mini-series, will turn the lens forward and explore what more we need to be doing to capture the momentum and to continue to build an environment, online and off, in which all Australians can thrive.

Cam Wilson: It’s a work in progress. I don’t think that come December 10, we can just put our feet up on the desk and say, job’s done.

Julie: I think the next big step, uh, that the government is committed to is the digital duty of care.

Carly: I think it’s great that we’ve prioritized children in strengthening privacy protections. Next stop. Everyone else in Australia.

Johanna: This has been Tech Mirror, a podcast brought to you by the Tech Policy Design Institute. We are based in Canberra on the lands of the Ngunnawal, Ngambri people and the Gadigal people of the Eora Nation. You can find information about the research mentioned in the episode in the show notes. A big thank you to our guests, Amanda Third, Danielle Einstein, Julie Inman Grant, Carly Kind, Andrew Hammond, and Minh Hoang.

This podcast is made possible with thanks to the generous contributions from government, industry and philanthropy to the Tech Policy Design Fund, the full details of which are transparently disclosed on our website. For information about the archival audio we’ve used in this episode, please check the show notes.

The soundtrack is by Thalia Skopellos, a Sydney-based artist and entrepreneur with Aboriginal and Greek heritage. This podcast was produced with the support of Audiocraft on the lands of the Gadigal people of the Eora Nation. Amy Denmeade provided invaluable research support. A big thank you also to all the team at the Tech Policy Design Institute, without whom this pod wouldn’t be possible.

For more information about our work, visit us at techpolicy.au or follow us on LinkedIn.