Episode 2

Part 2: The Politics (Australia v Social Media Mini-Series)

Pod Notes

This is episode 2 of a special 5-part Tech Mirror mini-series, Australia vs Social Media: Inside the world-first online safety experiment.  

In this episode, we discuss how the issue of social media harms and the idea of a minimum age restriction became such a political hot topic in the lead-up to the 2025 Federal election. We explore the political, social and media forces that led to the law passing Parliament, notwithstanding the reservations of experts.  

We speak to Cam Wilson, a technology reporter from Crikey, Lizzie O’Shea (founder and chair of Digital Rights Watch), Professor Amanda Third (co-director of the Young and Resilient Research Centre at Western Sydney University), Australia’s eSafety Commissioner Julie Inman Grant, and Australia’s Privacy Commissioner, Carly Kind. 

Credits

Written and narrated by Johanna Weaver, Executive Director, Tech Policy Design Institute. 

Produced by Olivia O’Flynn & Kate Montague, Audiocraft. 

Research by Amy Denmeade. 

Original music by Thalia Skopellos. 

Created on the lands of the Ngunnawal and Ngambri people and the Gadigal people of the Eora Nation 

Special thanks to all the team at the Tech Policy Design Institute, without whom the pod would not be possible, especially Zoe Hawkins, Meredith Hodgman, and Dorina Wittmann. 

Links

Cam Wilson, Crikey https://www.crikey.com.au/author/cam-wilson/ 

Lizzie O’Shea https://lizzieoshea.com/  

Digital Rights Watch https://digitalrightswatch.org.au/  

Amanda Third https://www.westernsydney.edu.au/young-and-resilient/people/directors/amanda_third 

Julie Inman Grant https://www.esafety.gov.au/about-us/about-the-commissioner 

Carly Kind https://www.oaic.gov.au/  

Minister Wells Speaking during Parliament House Question Time (31 July 2025) https://www.youtube.com/watch?v=kcLpm9SbOrk 

ABC News Breakfast (29 November 2024) https://www.youtube.com/watch?v=niaeYxdlvkw 

The Project, 10X Media Group/Network Ten (19 May 2024) https://www.youtube.com/watch?v=525CiA19WPI  

36 Months campaign https://www.36months.com/ 

Let Them Be Kids campaign https://www.dailytelegraph.com.au/topics/let-them-be-kids 

Social Media Summit, NSW & South Australia, October 2024  

https://www.nsw.gov.au/nsw-government/social-media-summit  

https://www.dpc.sa.gov.au/responsibilities/social-media-summit    

Report by Chief Justice Robert French, Legal Examination into Social Media Access for Children https://www.premier.sa.gov.au/media-releases/news-archive/banning-social-media-for-children 

Government response to the Privacy Act Review Report (September 2023) https://www.ag.gov.au/rights-and-protections/publications/government-response-privacy-act-review-report 

eSafety Commissioner Julie Inman Grant’s speech at the Royal Society of NSW, W x 3 — The World Wide Web (we weaved)! (July 2024) https://www.youtube.com/watch?v=nSFVrIugy3E 

Laws not bans can make kids safer online, Carly Kind, Privacy Commissioner (November 2024) https://www.oaic.gov.au/news/blog/laws-not-bans-can-make-kids-safer-online 

Prime Minister and Minister for Communications media conference (November 2024) https://minister.infrastructure.gov.au/rowland/speech/press-conference-parliament-house 

Social Media Age Limit, Office of Impact Analysis (November 2024) https://oia.pmc.gov.au/published-impact-analyses-and-reports/social-media-age-limit  

Social media: the good, the bad, and the ugly – Final report, from the Joint Select Committee on Social Media and Australian Society (November 2024) https://www.aph.gov.au/Parliamentary_Business/Committees/Joint/Social_Media_and_Australian_Society/SocialMedia 

Statutory Review of the Online Safety Act 2021, led by Delia Rickard, released February 2025 https://minister.infrastructure.gov.au/rowland/media-release/report-online-safety-act-review-released  

Environment and Communications Legislation Committee inquiry into the Online Safety Amendment (Social Media Minimum Age) Bill 2024 [Provisions] (November 2024) https://www.aph.gov.au/Parliamentary_Business/Committees/Senate/Environment_and_Communications/SocialMediaMinimumAge  

Online Safety Amendment (Social Media Minimum Age) Bill 2024, including the explanatory memorandum and transcripts of all second reading speeches https://www.aph.gov.au/Parliamentary_Business/Bills_Legislation/Bills_Search_Results/Result?bId=r7284  

Transcript 

Johanna: The Tech Policy Design Institute acknowledges and pays our respects to all First Nations people. We recognize and celebrate that Indigenous people were this continent’s first tech innovators.  

News clip: There has been a flurry of attention around the globe to Australia’s move here. All the big news sites like the New York Times, BBC, CNN, uh, news sites throughout Asia covering this. 

Prime Minister: This one’s for the mums and dads. Social media is doing harm to our kids, and I’m calling time on it. I’ve spoken to thousands of parents, grandparents, aunties and uncles; they, like me, are worried sick about the safety of our kids online, and I want Australian parents and families to know that the government has your back. The minister and I have an important announcement.  

News grabs: Social media is harming children. Social media. It’s not a ban on content. Age restrictions, the age assurance ban. It’s a ban. Social restrictions ban on…. 

Johanna: Welcome to Tech Mirror, brought to you by the Tech Policy Design Institute. In this five-part series, we’re exploring Australia’s social media minimum age restrictions that will come into effect on the 10th of December. 

I’m Johanna Weaver, co-founder and executive director of the Tech Policy Design Institute, an independent, non-partisan think tank dedicated to technology policy. In episode one, we acknowledged that social media use does bring some benefit to our lives, but we really focused in on the harms that the social media minimum age law is trying to prevent. 

We established that experts generally agree that social media can and does cause harm, especially for young people, but they disagree on what we should be doing about it. When the social media minimum age bill was introduced into Parliament, for the most part, parents welcomed it, but many experts didn’t. 

Most experts recognize and share the concerns about children’s safety, but they worry that this ban won’t actually make kids safer online, or worse, that it will encourage them into darker and more unregulated spaces. In upcoming episodes, we’ll discuss what is actually in the law and ask the question, will it work? 

But before we can do that, we need to understand how it came to be.  

In this episode, we’re asking: if the experts were so sceptical about the ban, why did this become a political priority? To understand how this became such a hot-button issue, I spoke to tech journalist Cam Wilson.  

Cam: I am an associate editor at Crikey, and I write the Daily Australian Tech newsletter, the Sizzle, and I think the environment over the last five to seven years has been a feeling that social media companies are not doing enough to protect children. 

You know, you could look at the Facebook Files and what Meta, then Facebook, knew about how Instagram was affecting teens. Then you had COVID-19; suddenly everyone was thrown inside for a while. We all had to go remote and spend a lot of time on our devices, and out of that there was this real emerging kind of community concern about children’s wellbeing, and that was really linked to their use of devices. 

Out of that, we’ve seen this kind of global push to try and do something about how children are using technology. It’s been around smartphones, it’s been around social media. Obviously, they’re both very intertwined. And so the immediate history of this bill is that in 2024, while there’s this big kind of, I would say, global campaign going on in countries around the world, popular books about it, there was a campaign in Australia to raise Australia’s minimum age for social media platforms. 

News clip: Well, there’s not a lot of things that we trust the developing brains of a 13-year-old with. They can’t drive, they can’t vote. They can’t even go on a school excursion without mum and dad’s consent, but they can have full access to social media. Now, a growing chorus of voices is pushing to change all of that by raising the age. 

Johanna: One person who was influential in this campaign was Australian radio host and comedian Michael Wipfli AKA Wippa.  

News clip: Well, radio host Wippa has co-founded 36 Months. It’s a campaign calling on the social media age to be raised to 16. Would you please welcome Wippa? Thank you guys.  

Johanna: The media loved this topic in the lead-up to the 2025 federal election. This is Wippa on The Project, a national news panel show, looking down the barrel of the camera and issuing this challenge to the leaders of the two major political parties, during prime time on commercial television. 

Wippa clip: If I’ve got this camera here to Mr. Albanese or Mr. Dutton there, if you were to make this an election promise, then you’ll win the vote of every parent. So I’ll leave it with you guys. We’re gonna do something.  

Cam: Initially, Peter Dutton, then opposition leader, committed to raising the age, and then shortly after Anthony Albanese was asked on a radio show and he too committed to doing something about it. That was in the middle of 2024.  

Johanna: It’s worth noting that at the same time as the 36 Months campaign was running, News Corp, the Murdoch media empire, was also running a campaign: ‘Let Them Be Kids’. These campaigns amplified the voices of parents, many of whom had tragically lost their children to suicide and were rightly demanding action. Against the backdrop of these national conversations, individual states were independently exploring what they could do to protect kids. South Australia even went so far as to commission the former Chief Justice of the High Court, Robert French, to conduct a review into possible options. This culminated in October 2024 with the Social Media Summit.  

Cam: The New South Wales and South Australian premiers held this joint Social Media Summit where they said, we’re bringing people together to talk about all the social media harms. And it was really framed as this idea of, we know that there’s harm happening to kids on social media; what are we gonna do about it?  

And they invited all kinds of people, and it was this idea of let’s hash this out, let’s figure out what we can do about it. Then, from what I heard from attendees, on the first day the South Australian premier gets up at the start and says, the science is settled. We need to do this. We need to ban social media for kids. Completely revealing the fact that, while I’m sure there was a lot of interest in canvassing various options about how to help kids, this was something that was almost, in a way, preordained. And I was able to, via freedom of information requests, get internal correspondence showing that in the lead-up to the event staffers were talking about how these events would actually create almost the permission structure for this policy. 

We also saw Jonathan Haidt, a global phenomenon who’s really been one of the leading figureheads in this idea of banning kids from having smartphones, banning kids from being on social media. He was in contact with the Australian government saying, don’t pay attention to academics who are cautioning against this; you should really push for this. His book was actually cited by the South Australian Premier, and I think it’s been mentioned by the Prime Minister as well.  

Johanna: We spoke to Jonathan Haidt in episode one. Here is Premier Malinauskas speaking at the South Australian Social Media Summit, referring to the impact of Jonathan’s book.  

Premier Malinauskas clip: My wife, Annabelle, who’s here, um, today. Um, Belle, uh, read the book, uh, The Anxious Generation and, um, she read the book and then turned to me one night and said, you better ** do something about this. Um, and that sincerely was the genesis of actually putting our minds to, well, what can we do as a state government, uh, in conjunction with the state parliament, to make a difference, which then in turn led to the commissioning, uh, of the review, or report, handed down by Chief Justice French.  

Cam: So you’ve got all this happening, and amid all that Anthony Albanese and the Labor federal government committed to this, not just as a policy in itself, but also, I think, you know, because there is a value in harmonizing this.  

Johanna: This is then Communications Minister Michelle Rowland, now the Attorney-General, talking at a press conference on the 7th of November 2024. You can hear her reaffirming Cam’s theory that the need to unify the states and territories was part of what drove the federal government to step in, along with the concerns of parents.  

Michelle Rowland clip: The welfare of children is a collective responsibility and it is heartening, as the Prime Minister said through national cabinet to see the Commonwealth and states and territories working together for a common outcome. We know that social media offers many benefits to Australians, including to young people as a way of keeping connected, of finding their tribe, of making sure that young people who may otherwise be isolated by geography or other factors have that connection. But we also know that it brings many harms. 

The fact is that social media has a social responsibility, but the platforms are falling short.  

Lizzie: I don’t mean to sound cynical, but I do think there was a political imperative at play here. My name is Lizzie O’Shea. I am a founder and the Chair of Digital Rights Watch. I am also a lawyer, and I sue technology companies that have done the wrong thing. 

So I would start by saying that we at Digital Rights Watch, like many people in this space, do acknowledge that online spaces can be really harmful for children, and social media in particular can be a place where there are known harms that occur. And that’s true for young people. It’s also true for adults. 

Johanna: Lizzie and Digital Rights Watch have a longstanding position calling for the government to take action to hold tech companies to account. Here she is explaining why she thinks the ban gained traction when it did, and also why she’s worried that it will harm rather than help. 

Lizzie: We were going into an election. There was a question as to whether the government of the day had done enough on this front. There was also a question around whether the opposition, the conservative opposition, would push for a proposal like this, and whether opposing it might cost the government electorally. In some respects, it is a good-sounding reform. 

It’s something simple that you can announce that people instinctively understand. You don’t have to explain what a duty of care is, for example, and a lot of people support it. However, a lot of that support is quite brittle: when you press polling participants about concerns around privacy and security, those concerns are very high. 

So I think a lot of Australians instinctively understand this as something that the government probably should do. It may not work. It may be, uh, that it creates further problems, but it does look like you’re doing something about this problem.  

I think there are a lot of well-intended people who support this proposal as well. I don’t mean to be patronizing or dismissive about the genuine policy concerns that underlie this particular proposal, but it is very disappointing to me, as an observer of this field of policymaking, that lots of other opportunities to significantly improve these online spaces have gone wanting, and this is the one that has proceeded, especially in circumstances where the government had considered something like this and found the technology wasn’t quite up to scratch, but then did an about-turn in an election year. 

That seems to me a very disappointing set of circumstances, and that doesn’t even touch on the incredibly truncated process, that accompanied the passage of this legislation, including very minimal engagement with the public in a meaningful way, and instead pushing this through and it becoming a centerpiece of a claim that the government could make, that they were taking action against these large technology companies who were harming children. 

Johanna: To be clear, it’s not just Digital Rights Watch that shares these concerns. We heard from Amanda Third in episode one. She’s a professorial research fellow and co-director of the Young and Resilient Research Centre at Western Sydney University, who led and coordinated the campaign of more than 140 experts against the ban. 

Amanda: It’s really clear to me that young people face unwarranted levels of risk of harm online, and we really do need stricter regulation. Regulation that can genuinely prompt the sort of systemic changes that would see young people not just engage safely, but engage in an optimal way with all digital technologies. And I think for those of us who were nervous about this legislation, and indeed many of whom spoke out, the key thing was not that we don’t need regulation; we do need regulation. The key thing for us was that, as we develop regulation, what we need to do is put children’s and young people’s best interests and their positive experiences at the heart of the regulatory systems that we design for them. 

You know, arguably, by evicting under-sixteens from social media, you are taking away the impetus for social media companies to design for children and young people’s needs, rights and aspirations, right? You are actually weakening the accountability mechanisms that we might put in place. 

Lizzie: I do understand that Australia is proactive in legislating, and there’s something I really like about that, but just making laws, just creating regulators, just developing other forms of regulation, I don’t think that’s good enough. These problems that we’ve been talking about are some of the most important and profound, and we’re talking about the experience of some of the most vulnerable people in society, particularly young people. 

We owe an obligation to them to respect their rights, to allow them to participate in policy that is made about them, and also to preserve their right to participate in public life that often occurs in online spaces, to build their own sense of community and identity. We at Digital Rights Watch join many child safety experts in raising concerns about restricting the access of young people: whether that’s a legitimate purpose in circumstances where there are less intrusive ways, less privacy-infringing ways, ways that infringe less upon young people’s human rights and their rights under relevant conventions. There may be many other ways in which you could achieve the legitimate purpose of minimizing the harm of being on social media platforms without necessarily imposing an age restriction on access to those platforms. 

We need stronger protections in the Privacy Act. We need to shift the dial on how these companies make money, and we need to protect people, give them the right to have involvement in these spaces without constantly being surveilled and having their information taken from them. And then I think we do need to think about implementing a digital duty of care, imposing a flexible, broad-based obligation on companies to do better, to proactively identify harm and potential harm and take steps to address it, not find convenient ways to ignore it. 

Johanna: We will talk a lot about this concept of a digital duty of care that Lizzie has just referred to, and a number of other urgent policy reforms that many are calling for, in the final episode of this mini-series. But for now, I want us to stay with this specific missed opportunity: that we could have used the social media minimum age law to incentivize companies to create safer online spaces. Here’s tech journalist Cam Wilson again.  

Cam: One of the big things that got dropped from the law at the last minute was this exemption framework, which is a name only a public servant could come up with for the least sexy thing ever. But it was this idea that we want to get these tech companies, you know, reel them in, make them treat kids better. 

But what if, instead of just banning kids aged 13 to 16 from being on their platforms, what if we also tried to add a carrot to the stick? Which is this idea that if you can do certain things, if you remove certain features that we think contribute to the harm of social media platforms, then maybe you don’t have to come under the scope. 

Like maybe you can come up with variations on your apps, like child-friendly versions, whatever it is, as a way of saying that we will encourage certain, maybe using the eSafety Commissioner’s favorite term, ‘Safety by Design’ versions of these applications that strip out the things that we are worried about, as a way of incentivizing, not just like social media bad, but encouraging them to produce something good.  

Julie: We did recommend then that an additional test be used around risks and harms. For whatever reason, it was deemed that there just wouldn’t really be time to implement that, and that will be considered in the next iteration of the legislation. But there are clear situations of those platforms that we’re analyzing now who seem to be doing the right things with safety protections for under-sixteens. 

Johanna: That was the eSafety Commissioner Julie Inman Grant, explaining that as the law was being drafted, she was advocating to include a harm threshold. Now, one way to do this could have been by adding a framework to exclude companies that were doing the right thing. I want to be really clear here: this would not have been a free pass, but it’s about creating an incentive for companies to build better technology. 

The legislation as it passed through Parliament does allow the Communications Minister to make rules to specify that particular services will or won’t be captured by the law. But as it stands, that legislation does not include any criteria against which an individual company should be assessed for exclusion. 

It’s essentially in the Minister’s gift to make that decision. In contrast, when this bill was originally drafted, it included a specific exclusion framework that transparently set out the criteria on which a social media service that would otherwise have fallen within the definition of the law could have been awarded an exclusion: things like having strong safety protections in place that went above and beyond industry standards. If they could prove that they had robust safety measures in place, they could have applied to be assessed and excluded from the law, and young people would have been legally able to continue using those services, which had been uplifted because they had put these extra safety measures in place. 

This could have created a race, and a market, to create safe online spaces, and not just for kids. One of the things that I am always interested in is how safety and other features that are rolled out for smaller populations often then get rolled out for broader populations as well. 

Cam: Features that platforms like Meta have rolled out for kids initially have then ended up being available to adults, and I mean, that’s great to have those options. 

Johanna: However, as Cam reported at the time, political negotiations complicated matters, leading to the exclusion framework’s unexpected disappearance once the bill was introduced to Parliament. 

Cam: The reason it had been taken out at the last minute was because of a deal between the Labor Party and the Liberal Party, where there was kind of pressure from the Liberal Party, who said at the time, we don’t want to have almost like a get-outta-jail-free card for these platforms. 

We don’t want to give them a way to kind of weasel out. And that clearly was something that the government agreed with. So between the time that the government initially announced the kind of final law, which was at the start of November, and started briefing it out to reporters, saying, this is what you’ll expect to see in the legislation, and then about a week and a half later, that portion had been removed. In my opinion, that probably damaged the bill in terms of its ability to get better outcomes, not just for kids, but for everyone.  

Johanna: With the exemption framework removed, the law excludes young people from social media without offering social media platforms any incentive to transform their services for children’s needs. Talk about a missed opportunity. It would be a significant improvement if this exemption framework were added back into the law when it comes under review. 

Johanna: The social media minimum age restrictions have two independent regulators charged with overseeing them: the eSafety Commissioner, Julie Inman Grant, and the Privacy Commissioner, Carly Kind. Because they enforce the law, these regulators are often seen as its public face, but regulators can only enforce the laws that Parliament passes, and that doesn’t mean they always love the laws they are handed to oversee. 

In the case of the social media age restrictions, both the eSafety Commissioner and the Privacy Commissioner expressed reservations about the legislation before it passed Parliament. The eSafety Commissioner Julie Inman Grant addressed the Royal Society of New South Wales in July last year. This was before the government had made its announcement, but it was a very live national conversation. 

Julie: Some argue the tech industry is already acting like big tobacco and should therefore be treated as such, as the industry is accused of ignoring compelling research that shows the damage their platforms pose to children so that they can protect their bottom line. The debate may even see Australia swinging the pendulum more towards an interventionist camp of online safety regulation, with a media-fueled push for banning children under the age of 16 from joining social media. 

While this will ultimately be a policy question for government, I think a much deeper debate needs to be had around what we mean by social media. We also need to think long and hard about what the unintended consequences might be of pushing kids into darker recesses of the web. I’m also concerned about the pursuit of forbidden online fruit, and that it will actually deter help-seeking by young people. 

It prevents them from confiding in parents when things do go wrong online. And I can tell you that the evidence base is thin and the research very mixed on all of these questions. And to be honest, as Australia’s regulator in this area, I’m struggling to get my head around the how in terms of successfully implementing such a policy. 

You see, I’ve been working on age assurance in one shape or form since 2008, but until these fundamental age assurance systems and technologies are in place, implementation and enforcement of such a ban will be virtually impossible. 

Johanna: Another independent regulator who expressed reservations about the law when it was first introduced was Australia’s Privacy Commissioner, Carly Kind. 

Carly: The risks that I foresaw at the time were probably threefold. The first is that children wouldn’t be able to use social media platforms that they had come to rely on for a range of different reasons: educative, connecting with each other, and seeking access to information that might be important to them. 

The second was that those platforms would lose the incentive or obligation they had to create safe and trustworthy spaces, and I think I pointed at the time to what we’ve seen happen when Elon Musk took over ownership of X and essentially fired the entire trust and safety team. We have seen X become a demonstrably less safe and, as a user, worse place because of that lack of investment in trust and safety, which comes as a moral obligation when children use your products. 

And then the third concern related to the privacy of Australians more broadly, because this regime will require not only children to assure or verify or establish their age, but all users of social media platforms. Those are probably the three concerns I expressed at the time.  

Johanna: Are they still concerns? 

Carly: I think it’s clear that we have seen over the last year a change in the political and geopolitical space in which social media platforms exist, and a lessening, I think, more broadly, of regulatory requirements on their shoulders. And it may be that this kind of absolutist approach is the best way to protect children, and that preceding and alternative approaches, which would have been about leveraging reputational concerns, appeals to norms and moral arguments, would no longer be effective because of the changing political environment. So I do think that perhaps this is what is necessary to protect children going forward. I still retain the concerns around privacy, and that is of course front of mind as we at the Office of the Australian Information Commissioner prepare to regulate the privacy aspects of the regime. 

Um, we want to make sure that Australians’ privacy is protected, and I’m confident that can happen within the context of the regime, but I think it is still a risk that has to be managed.  

Johanna: Now, there are those who object to the law in principle, and it’s quite possible that we will see legal challenges. We did invite Google, which owns YouTube, and Meta to come on the pod and explain their positions, but they declined or didn’t reply. I do hope that it’s now clear that most of the experts, children’s rights advocates, and even the independent regulators who expressed reservations about this law did so not because they don’t think that we should be regulating, but because they think we should be doing more to hold tech companies accountable, or because they were worried that this particular law would have inadvertent negative consequences, particularly for vulnerable young people, and by placing broad age assurance obligations on all Australians.  

In another twist for the policy nerds, it wasn’t just the experts, but also the politicians themselves. The Joint Select Committee on Social Media and Australian Society tabled its final report, Social media: the good, the bad and the ugly, in the Senate and the House on the 18th of November 2024. The report was very critical of social media, but it didn’t recommend minimum age restrictions, and neither did the independent review of the Online Safety Act led by Delia Rickard. But regardless of what the experts thought, we live in a democracy. 

And the legislation was introduced into Parliament on the 22nd of November and passed a little over a week later, truncating a process that would normally take months, with the theater of a half-day committee hearing and a 24-hour turnaround for public submissions. If nothing else, this proves that laws can be passed quickly if politicians are motivated enough. 

Cam: The context of this is that this policy, throughout the process, has been enormously popular. Depending on where you look, at the time it was passing, around 80 to 90% of people supported banning kids under the age of 16 from using social media. Obviously, people ask whether people who are saying that they wanna do this know how that’s actually gonna be enforced. 

But regardless of that, it was a policy that the government could say we’re committing to. Funnily enough, it doesn’t cost anything, so it doesn’t affect the budget bottom line. It’s cracking down on big tech, something that is very popular, and with the context of the election, it was a policy that allowed the government, and also the opposition at the time, to say, we are going to do something. 

We’ve actually passed the law. We’ve done it. And kicked the details down the road. And so from there we had this commitment to do something, and the following 18 months was this almost reverse policy process where they came up with the idea first and then had to figure out all the details.

That went from figuring out what the minimum age was gonna be, and then the subsequent 12 months, which we’re almost through, was this question of how do we actually enforce this law? It’s been a policy process that’s involved consultation, that’s involved setting regulation, that’s involved getting advice from various regulators about how to set guidelines and expectations for what social media platforms need to do to enforce this age.

’Cause that’s a key part of it: this law, while it says we’re setting a minimum age and you’ve gotta do something, leaves the actual nitty gritty that matters, how you actually implement it, to the social media companies themselves. Because as the government says, you are the experts on this, you can figure out how to do it. We’ll tell you if you’re doing a good job.

Johanna: And so, listeners, this is the story of how Australia passed its world-leading social media age restrictions. Are they perfect? No. Will they work? Well, we’re gonna cover that in the coming episodes, but for now, I’d like us to pause and take a moment to applaud the intent of this law and the courage of the advocates who made it happen.

Because I think we can all agree that things needed to change. This is Prime Minister Albanese speaking on the floor of Parliament on the day the legislation was introduced.  

Prime Minister clip: I also wanna express my gratitude for the work of civil society. The Let Kids Be Kids campaign and the 36 Months campaign have all been really important, and this morning I met with Matt and Kelly O’Brien, parents of Charlotte, who tragically lost her life.

And this is an unspeakable tragedy, to lose one’s child. I met with Rob Evans, the father of Liv Evans. No parent should ever go through what they have gone through. No parent should ever have to call on the sort of courage that they have shown in speaking out on behalf of others, and I’m very grateful for their advocacy, and I pay tribute to them today.

We know that this isn’t the only solution. We know there’s more to do. We know that there’s a ban on young people buying alcohol, and from time to time under-eighteens do get access, but that’s not a reason to walk away from sending that message and putting in place the support that parents need and are calling for in order to support the children…

Johanna: In the remaining episodes in this miniseries, we’ll shift our focus from how did we get here to how do we make the best of what we have? How do we build on this momentum to drive more and better outcomes, not just for young people but for all Australians? Next up, we’ll look into what the legislation does and doesn’t do.

We’ll hear directly from the eSafety Commissioner and the Privacy Commissioner on how they’re going to enforce the law, and we’re going to wade into the vexed world of online age assurance.

Julie: I guess this is the beauty and also the curse of an independent regulator. You’re given a piece of legislation and it gives you parameters and then you have to actually figure out how it works. 

Andrew: In practice, what we have in our current-day society is, if you go to buy some alcohol, the signs at the shop or at the bar say that if you look younger than 25, you’ll be asked for ID. So that’s a seven-year difference, and we have this tension where, as humans, we’re actually not that accurate in guessing age, but when we look at tech doing it for us, we want it to be near a hundred percent.

Johanna: This has been Tech Mirror, a podcast brought to you by the Tech Policy Design Institute. We are based in Canberra on the lands of the Ngunnawal, Ngambri people. You can find information about the research mentioned in the episode in the show notes. A big thank you to our guests, Cam Wilson, Lizzie O’Shea, Amanda Third, Julie Inman Grant, and Carly Kind.

This podcast was made possible thanks to generous contributions from government, industry and philanthropy to the Tech Policy Design Fund, the full details of which are transparently disclosed on our website. For information about the archival audio we’ve used in this episode, please check the show notes. 

The soundtrack is by Thalia Skopellos, a Sydney-based artist and entrepreneur with Aboriginal and Greek heritage. This podcast was produced with the support of Audiocraft on the lands of the Gadigal people of the Eora Nation.

Amy Denmeade provided invaluable research support. A big thank you also to all the team at the Tech Policy Design Institute, without whom this pod wouldn’t be possible. For more information about our work, visit us at www.techpolicy.au or follow us on LinkedIn.