Pod Notes
This is the final episode of TPDi’s 5-part Tech Mirror mini-series, Australia vs Social Media: inside the world-first online safety experiment.
In this episode, we turn our attention to the future, looking at policy priorities for the Government going forward, including privacy law reform, a prohibition on unfair trading practices, competition codes, and the introduction of a digital duty of care.
We also call on the Tech Mirror community to get involved and help shape Australian tech policy to make sure that it works well for everyone.
Featured experts in this episode include Privacy Commissioner Carly Kind, Lizzie O’Shea from Digital Rights Watch, ACCC Chair Gina Cass-Gottlieb, eSafety Commissioner Julie Inman Grant, and clinical psychologist Dr Danielle Einstein.
Credits
Written and narrated by Johanna Weaver, Executive Director, Tech Policy Design Institute.
Produced by Olivia O’Flynn & Kate Montague, Audiocraft.
Research by Amy Denmeade.
Original music by Thalia Skopellos.
Created on the lands of the Ngunnawal and Ngambri people and the Gadigal people of the Eora Nation.
Special thanks to all the team at the Tech Policy Design Institute, without whom the pod would not be possible, especially Zoe Hawkins, Meredith Hodgman, and Dorina Wittmann.
Links
PM Press Conference at Parliament House (30 July 2025) https://www.youtube.com/watch?v=Zv-ZQ6jyLxw
Children’s Online Privacy Code, via the Office of the Australian Information Commissioner https://www.oaic.gov.au/privacy/privacy-registers/privacy-codes/childrens-online-privacy-code
Age appropriate design: a code of practice for online services, Information Commissioner’s Office (UK) https://ico.org.uk/for-organisations/uk-gdpr-guidance-and-resources/childrens-information/childrens-code-guidance-and-resources/age-appropriate-design-a-code-of-practice-for-online-services/
Regulatory reform in digital platform markets is needed to improve competition and consumer outcomes, via the ACCC (June 2025) https://www.accc.gov.au/media-release/regulatory-reform-in-digital-platform-markets-is-needed-to-improve-competition-and-consumer-outcomes
Minister for Communications announcement of duty of care obligations (November 2024) https://minister.infrastructure.gov.au/rowland/media-release/new-duty-care-obligations-platforms-will-keep-australians-safer-online
Statutory Review of the Online Safety Act 2021, led by Delia Rickard, released February 2025 https://minister.infrastructure.gov.au/rowland/media-release/report-online-safety-act-review-released
Minister for Communications Press Conference regarding digital duty of care survey (November 2025): https://minister.infrastructure.gov.au/wells/transcript/press-conference-canberra-0
Minister for Communications media release on commitment to digital duty of care (November 2025) https://minister.infrastructure.gov.au/wells/media-release/government-continues-commitment-online-safety
Government survey on digital duty of care online (November 2025) https://minister.infrastructure.gov.au/wells/media-release/government-continues-commitment-online-safety
Transcript
Johanna: The Tech Policy Design Institute acknowledges and pays our respects to all First Nations people. We recognize and celebrate that Indigenous people were this continent’s first tech innovators.
Mia Bannister clip: I’m here to represent my fellow parents, Emma Mason and Rob Evans, Ollie, Tilly, and Liv. Their lives mattered.
Johanna: This is the voice of Mia Bannister. Mia was just one of many parents who were instrumental in the passage of the social media minimum age restriction legislation. She was speaking at a press conference at Parliament House in Canberra on the 30th of July, 2025.
Mia Bannister clip: Thank you to the collective of individuals, charities, and organizations who came together and sang from the same song sheet to make this legislation a reality. It wasn’t the result of one voice, but the power of many, united in purpose, driven by hope, and committed to protecting our kids. This restriction, while specific to account creation, is a good starting point.
We won’t stop pushing for real, meaningful reform. Together, we made change happen, and together we will keep going. Thank you.
Opening clip montage: The Minister and I have an important announcement… Social media is harming children… It’s not a ban on content… age restriction… the age assurance… and the ban… it’s restrictions…
Cam Wilson: It’s a work in progress. I don’t think that come December 10, we can just put our feet up on the desk and say Job’s done.
Johanna: This is the final episode of Tech Mirror’s five-part series, exploring Australia’s new social media minimum age restrictions. I’m Johanna Weaver, the co-founder and executive director of the Tech Policy Design Institute.
In the last four episodes, we’ve explored how we got here. The complex legislation, the harms that it’s trying to prevent, and the practicalities of what will happen after the 10th of December. So what do we need to do next? In this episode, we are going to turn our attention to the future and cover four priority reforms that many think are the most urgent.
But our first call to action is to you, and it’s a call to you to get involved.
Lizzie: I think it’s absolutely fundamental that lawmakers, policymakers, and regulators listen to people. So one of the jobs I’ve given myself, and our organization has given ourselves, is to ensure that people’s voices are involved in this kind of policymaking.
Johanna: Lizzie O’Shea is the co-founder and chair of Digital Rights Watch, an organization that exists to ensure fairness, freedoms, and fundamental rights for all people who engage in the digital world. We heard from her in earlier episodes.
Lizzie: There’s often this huge incentive to attract foreign investment to buy into the hype of the tech industry as being the future of growth, of productivity, of innovation that justifies ignoring a lot of the problems that come with it.
And I think many Australians now recognize, with the experience of social media, that a lot of the harms we see now could have been prevented with more active policymaking. And we’ve lost a lot of time while some of that industry hype was acceded to by government.
Johanna: Active tech policy goes well beyond social media. It includes, but certainly isn’t limited to, artificial intelligence. We need to make sure that we’re learning from, and not repeating, the mistakes that we’ve made with social media.
Lizzie: These problems that we’ve been talking about are some of the most important and profound, and we’re talking about the experience of some of the most vulnerable people in society, particularly young people.
We owe an obligation to them to respect their rights, to allow them to participate in policy that is made about them. To also preserve their right to participate in public life, to build their own sense of community and identity, which often occurs in online spaces.
Johanna: And Lizzie’s call to action.
Lizzie: I think it’s critically important that we think about making policy that is rights respecting, that’s about protecting people, but ensuring that they can still exercise their rights as people with inherent dignity rather than making decisions on their behalf without considering that experience.
One of my experiences in looking at this policy has been to look at what young people have been saying about it, and I’m always impressed by the sophistication with which they approach discussing these topics. It’s a huge resource that we have the benefit of, and we should be trying to build it into our policymaking process.
I would argue that looks like a few different things. We should be working really hard to get the Children’s Online Privacy Code, which was introduced recently, up and running, and it should be as expansive as possible in terms of protecting children’s rights and imposing obligations on companies whose services might be accessed by children.
We need stronger protections in the Privacy Act. We need to shift the dial on how these companies make money, and we need to protect people: give them the right to be involved in these spaces without constantly being surveilled and having their information taken from them. And then I think we do need to think about implementing a digital duty of care, imposing a flexible, broad-based obligation on companies to do better, to proactively identify harm and potential harm and take steps to address it, not find convenient ways to ignore it.
Johanna: To the three reforms that Lizzie has just mentioned, I would add competition and consumer reforms. Let’s dive into each of these proposals in turn.
Carly: So the social media age restriction regime is designed to keep children out of certain parts of the internet. The Children’s Online Privacy Code is there to ensure they’re safe when they go to other places online.
Johanna: This is Carly Kind, Australia’s Privacy Commissioner, explaining the Children’s Online Privacy Code, which will come into force at the end of next year, 2026.
Carly: The social media restriction only applies to a very small number of entities, fewer than two dozen, I think it’s fair to say. Whereas the Children’s Online Privacy Code will apply to all online services likely to be accessed by children. So the way I’m thinking about it currently is that there are essentially a couple of buckets of entities.
The first is those which are directly targeted at children: educational games, tools, apps for kids, streaming services that are particularly for children. Then there are those that aren’t targeted only at children but are clearly likely to be accessed by children: other streaming services, games, social media platforms, educational tools, et cetera.
And then there’s a third category of entities we’re also considering for inclusion in the code, which is those that process a large amount of children’s data, but may not necessarily be directly accessed by children themselves. So we’re thinking about apps that schools use to manage children’s information, baby tracking apps, that kind of thing.
Johanna: Back in late 2024, Parliament passed something called the Privacy and Other Legislation Amendment Act. Sounds great, doesn’t it? This act is important because it gave Carly and her team the mandate to go out and develop the Children’s Online Privacy Code, which must be completed before December 2026.
Carly: The Children’s Online Privacy Code will cover a range of privacy-related protections that those services need to ensure are provided to children on their platforms. That will range from child-friendly privacy policies and complaint processes right through to higher thresholds and restrictions around things like targeted advertising or the use of nudge techniques and dark patterns, those kinds of things.
So we’re still considering a range of different issues for inclusion in the code, but the code will be mapped closely to its international counterparts, like the UK’s Age Appropriate Design Code, which covers all of the things that I just mentioned. And our obligation is to develop the code in a way that’s not inconsistent with the Privacy Act as it stands.
And we see that as meaning there can be higher protections afforded to children than are currently in the Privacy Act. This year we’ve already done a range of consultation with children, parents, and educators, as well as industry, and we will do that again in 2026, because early next year we’ll publish a draft of the code for consultation. Our plan is to publish a child-friendly version of that draft so that kids can also give their feedback on the draft code.
We’ll be publishing a report of the consultation we’ve done this year so we can tell back to children and their parents what we heard from them. And, sneak peek: children and their parents don’t necessarily agree. So we’ll be publishing a report on that, and we really wanna be very transparent:
Here’s what we’ve heard, and here’s how we’re putting it into our draft code. Now tell us what you think of the draft code: children, parents, educators, but also industry, who will have to comply with this, and government agencies, who will also have to comply with it. And then we’ll do another round of report-back next year before we register the code in December.
Johanna: The Children’s Online Privacy Code will be an important complement to the social media age restrictions and the industry codes that we spoke about in episode three. The social media age restrictions are designed to do what it says on the tin: restrict young people’s access to social media. The industry codes will limit young people’s access to lawful-but-awful material,
things like pornography, self-harm, and gambling. And the Children’s Online Privacy Code will make sure that when kids are engaging more broadly with the rest of the online environment, companies still have an obligation to ensure that they aren’t exploiting or extracting children’s data or monetizing their attention.
Carly: I think it’s great that we’ve prioritized children in strengthening privacy protections. Next stop: everyone else in Australia.
Johanna: Australia’s Privacy Act hasn’t been meaningfully updated since 1988. There have been several long review processes and many different recommendations. The reforms to the Privacy Act are complex, and the government decided last year to split them into several tranches. The first tranche passed late last year and included the mandate for the Children’s Online Privacy Code. The government has committed to a number of additional reforms, and we’re watching closely to see what they introduce in this term of Parliament.
Carly: I would like to see wholesale reform of the Privacy Act. The introduction of the fair and reasonable test is a really key reform that’s on the table, which would require entities to give much more consideration to the rights and interests of individuals when they’re thinking about how to collect and process their personal information.
So I think children’s privacy is a great first step, and the government has clearly prioritized it for good reason. But it remains the case that the entire Australian community is concerned that their privacy is disregarded or interfered with too often, and so wholesale reform of the Privacy Act should be the next stop.
Johanna: Lizzie O’Shea agrees.
Lizzie: One of the primary reforms I’ve been advocating for is privacy reform. That’s because the business model of these social media platforms involves taking as much personal information as possible from users in order to profile them better for advertising, which is what they make money from. And that incentivizes engagement on the platform, which means that more polarizing, extremist content is favored by the algorithms they use to create your content feed.
This is not good for people who go online, and it’s not good for our democracy. So let’s look at how we could reform access to that personal information as a way of incentivizing companies to move away from that business model. Privacy reform is a rights-respecting way to improve the experience of being on social media, and it goes to the heart of the business model that is giving rise to so many downstream harms.
Johanna: One specific privacy reform that many are calling for is something called the fair and reasonable test. I asked Carly to explain what practical difference it would make to our lives online.
Carly: So the current bar for collecting personal information is whether it’s reasonably necessary for a company to collect it, according to their own interests, their own functions, what they’re out there to do. So, in the view of some companies, it’s completely “reasonably necessary” to collect your information and use it for marketing, for example. The fair and reasonable test requires them to foreground you: what’s fair to you and what’s reasonable for you as a consumer.
Is it fair that in order to buy this product I have to hand over my date of birth and my gender, for example, when you already have my bank details and my email address? Do you actually need that additional information? It’s really about looking through the lens of what is fair to the consumer, and connecting with consumer rights as well as human rights, of course.
So I think it’s a really important turning of the tables in terms of whose interests and rights come first. It also gets us away from this notion that if you consent to something, then it’s free rein; that once you’ve consented to lengthy terms and conditions, a company can do whatever it wants. It actually requires them to have an ongoing consideration of: okay, yes, the person may have said yes, but is it still fair and reasonable in all the circumstances?
So I think it’s actually quite an important step forward in saying: we know that consent is broken as a tool in many situations, so how do we make sure that the rights and interests of individuals are foregrounded?
Johanna: Put simply, under Australia’s current privacy law it’s the company that sits at the center of the assessment of reasonableness, whereas the fair and reasonable test would put the individual at the center.
This doesn’t mean that businesses can’t be profitable, but it would be a considerable shift from the current regime. And Carly doesn’t stop at privacy reform.
Carly: I’d make my pitch not for a duty of care, which I’m hearing many people advocate for, but for an additional and complementary duty of fair, which I think could encapsulate many of the changes already on the table with the Privacy Act reform, as well as the changes proposed by the ACCC in terms of unfair trading and conduct across the economy.
I really think the idea of a duty of fair could encapsulate empowering consumers, particularly in the digital economy, and particularly when it comes to their interaction with emerging technologies like AI. It could ensure that their rights and interests are foregrounded and that they have power over the decisions made about them.
I suppose one of the things we see very clearly in the privacy domain is just what a power asymmetry individual consumers face when they’re engaging with products and services online all day, every day. The North Star would be a duty of fair on all entities, organizations, and agencies operating in the digital realm, to ensure that the fairness of their actions and the impact on consumers is really the first consideration in all circumstances.
Johanna: You heard Carly there talking about a duty of fair. So much of the focus around shaping online spaces has fallen within the safety domain. Privacy is another important domain; it gets far less attention than safety, but at least it’s something that people think about. A third area that I’d encourage us to explore is competition and consumer law.
We dug into Tech Mirror’s archives for a conversation that I had in 2023 with the Chair of the Australian Competition and Consumer Commission, or the ACCC, Gina Cass-Gottlieb, to highlight two further reforms that we think are important.
Gina: So the first recommendation is an economy-wide unfair trading practices prohibition, which the ACCC has been advocating for a number of years.
We reference it particularly in the digital platforms and tech context, because there are so many examples in the online environment.
Johanna: I asked Gina to give us a practical example of what would be considered unfair trading under this new proposed law.
Gina: One good example that many people will have faced is what is frequently called a subscription trap. We will have been offered either a free period to come onto a new online service, or a discounted period, and at the end of that period it will return to a full subscription-paying basis. In order to first open the account, we will have given a credit card as a means of payment. You have an absolute right in the terms, so it would not qualify as an unfair term: we have a right to terminate the subscription. However, the way the subscription process is established makes it virtually impossible to exercise that right to terminate.
Johanna: I’m laughing because my husband would be saying in the background, “Yes, Johanna does not terminate.” Sorry, Gina, I didn’t mean to interrupt you.
Gina: My family has had this experience too. So here you are: a judge would look at the terms and say, oh, there’s a right to terminate. But it’s impossible; nobody will ever pick up the phone and then terminate. There isn’t an easy click-and-terminate.
You can click and sign on, but you can’t click and terminate. So it comes down to the actual practice and process of the business. And as over 75% of our transactions are now occurring in an online environment, that ease of becoming committed and handing over the payment mechanism, combined with the difficulty of terminating the subscription, is not just frustrating.
It is causing financial loss for people who no longer have an interest in the service, or who, because of their financial circumstances, need to reduce their expenditure.
Johanna: The second area of reform that the ACCC is calling for, and that the government has agreed to in principle, is a new digital competition regime. This would include new, specific rules for large digital platforms.
Gina: In terms of innovation, many in the tech sector, particularly large global incumbents, will say that regulation impedes innovation. What the ACCC is drawing out is what actually happens when you have a concentration of ownership, particularly in the way we are seeing large global digital platforms expanding out across the ecosystem: not only holding power in the specific segments of the original services they offered, but using self-preferencing and other means to expand into adjacent services. They also engage in acquisition of new entrants or nascent competitors in particular adjacent sections.
You end up with a situation that removes the capacity for innovation across very diverse and new entrants, and all of the benefits that can arise from that, from competition producing innovation and greater choice across the tech sector, because of that ability to constrain and impede the entry, the expansion, and the access to followers and users on the platforms, even at the scale that they have now reached.
Johanna: Whereas the prohibition on unfair trading practices would be broad, applying to all businesses across the economy, the digital competition reforms are intended to be much narrower.
Gina: It is in relation to the competition areas of protections and measures that we are looking at a much more targeted approach, so that there would be a legislative framework allowing service-by-service binding codes of conduct that apply only to designated digital platforms. So the largest, most influential digital platforms would be designated, and then the code would only apply to their offering of services. That is informed, Johanna, by that question about innovation: we don’t want to throw a blanket over the whole industry and the whole sector.
We want these competitive-measure obligations targeted only at those that are the largest, most influential, and capable of exercising market power. That would then address issues like anti-competitive self-preferencing and the sort of exclusive pre-installation agreements
that are had with the OEMs, the device manufacturers, so that many users may not be aware that what they’re using is a pre-installed web browser or search engine rather than a choice they have made.
Johanna: Let’s try and make this real with an example. Here’s Tim Levy, CEO of Australian safety tech company Qoria, speaking at the Senate Environment and Communications References Committee in October this year.
Qoria and many other companies want kids and parents to be able to access specific types of safety technology on their devices, but Tim claims that the large digital platforms are using their dominant market power to prevent this from happening.
Tim clip: Installed by schools, these tools ensure age-appropriate access to all online platforms, all platforms and the entirety of the web. They stop kids going to the dark web. They stop them using VPNs. And, very relevant to this inquiry, they can also direct kids to the safer versions of Google Search, Edge search, and YouTube, essentially kids’ mode. Businesses, governments, and big US schools use these tools because they work, and they’ve been working for decades.
Enterprise-sector technology provides almost all of the security, privacy, and safety measures parents are begging for. But that technology is being withheld by Google, Apple, and Microsoft, who only provide that access to enterprise app developers. Parental control apps that you might wanna download from the app store can’t get access to that technology.
Johanna: Now, I think it’s very important to say here that Google, Apple, and Microsoft have a very different view on this to Qoria. So I’m gonna leave it up to the regulators and the courts to decide whether this is a specific example of anti-competitive conduct. But I wanted to use it to make tangible the idea that competition reforms, if they are implemented well, can really help foster Australian innovation by creating an environment for Australian businesses like Qoria to compete with large companies.
The government has accepted the ACCC’s recommendation that we need to reform our existing competition regime, and in December last year the Department of Treasury began consultations on what this new digital competition regime might look like.
We are all eagerly awaiting the outcome of that consultation and a decision from government on next steps. This has been considerably complicated by Trump’s presidency in the United States. Trump has argued that similar competition laws in the EU are taxes against US companies and has threatened tariffs in retaliation.
However, there are ways that Australia can hedge against Trump’s threats and move forward with these reforms. This is something that my co-founder at the Tech Policy Design Institute, Zoe Hawkins, and I have written about in Australian Foreign Affairs. If you are interested, there’s a link in the show notes.
For now, the important thing to take away is that if we get digital platform competition reform right, it will help foster the Australian tech sector by providing an environment where it can compete with the big players.
So this brings us to our final area of reform: a digital duty of care. Julie Inman Grant, Australia’s eSafety Commissioner, will get us started.
Julie: The next big step that the government is committed to is the digital duty of care, and that’s absolutely predicated on safety by design and requiring the platforms to assess the risks and harms and build the safety protections in upfront. And I think this will be powerful. It’s where other governments are moving as well.
Johanna: An independent review of the Online Safety Act, led by Delia Rickard, was handed to government in October 2024, and it recommended that Australia establish a digital duty of care. The government has publicly committed to this. Here’s Minister Wells acknowledging a new digital duty of care as a complement to the social media minimum age restrictions.
Minister Wells: These platforms, in or out, will become subject to the digital duty of care, which is where the Albanese Labor government is going next in the work of online safety. We’re opening consultation on that soon. And the digital duty of care is about what these platforms owe their users, by way of a social and moral obligation, to try and keep them safe online.
Johanna: So the government agrees that a digital duty of care is a good idea. They want to do it, but the details of how and where remain very scarce. The idea of a digital duty of care is that it creates a positive requirement, not just for social media companies but for all online service providers, to take reasonable steps to prevent foreseeable harms on their services.
In addition to this positive requirement, the Rickard Review recommended a due diligence approach that requires service providers to have robust processes in place to manage the risks of harms on their services. This includes transparency requirements and reflects the approach taken by the European Union.
As the independent review recognized, these reforms would shift much of the burden of remaining safe online away from individual users and onto those most capable of identifying and addressing the harm: the service providers themselves. It would also move us away from a whack-a-mole approach of creating laws for social media, then for messaging apps,
and then for AI chatbots, and for whatever it is that comes next, which leaves lawmakers constantly battling to keep up with the technology. With a digital duty of care, we can say: we don’t care what the tech is; all tech companies have a responsibility to protect against harm.
Let’s hear more from Lizzie about what the creation of a digital duty of care would mean.
Lizzie: So creating that duty in legislative terms gives the people who hold this power the capacity to define it and make clear to industry what the expectations are. In legal terms, a duty of care generally means that you need to take reasonable steps to fulfill that duty, to protect people from harm, and what is considered reasonable can be a very flexible concept.
It is often taken to mean that you have to comply with best practice. I think that creates a level of flexibility in the concept, which can be very useful, because it means there’s an incentive for companies to ensure they’re up to date with best practice and that they’re constantly looking at this question of how harm might be arising and what they can do to remedy it.
It also means that new harms that perhaps weren’t envisaged at the time the rule was passed into law can be read into both company decision-making and, ultimately, court scrutiny of these issues. The downside is that what is reasonable may be difficult to determine, and you may get companies not engaging with what is best practice.
And it may be that the courts don’t fully get the opportunity to ventilate what that is, or they determine that some lower standard than what most people would accept is reasonable. So there are questions around that. But I think it offers some potential to ensure that the law is not always dragging behind the state of the technology,
and also creates clear expectations for industry as to what we expect as a society, what our lawmakers expect, and when they might be held to account if they fall short.
Johanna: A digital duty of care is how we future-proof. It allows the law to keep pace with the development of technology, and it creates an incentive for tech companies to build better online spaces. It moves us away from excluding people or restricting content, and instead helps us get the best out of technology.
If you think a digital duty of care, a digital duty of fair, perhaps some competition reform, privacy reform, or any of the other reforms that we’ve spoken about in this episode or the miniseries need to happen, we encourage you to get in touch with your local member of Parliament and let them know you care.
We do have the power to shape technology for the better. And one of the most impactful things that you can do is to let your local member of Parliament or your local senator know that these are issues that you care about. As we’ve seen with the social media minimum age restrictions, politicians really do listen to their electorates, and when they’re motivated, parliamentarians can pass laws pretty quickly, and those laws change technology.
Right now, social media companies are changing the technology they're building to implement age assurance measures that restrict children under the age of 16 from accessing social media. That is a law changing technology. Going forward, one of our biggest challenges will be making sure not just that Parliament is passing laws, but that it is passing laws that make technology better.
That's why working with politicians is a core part of what my team and I do at the Tech Policy Design Institute. But we need your help. We want to harness the momentum that has been built through the social media minimum age restrictions to help develop better technology policy.
Jonathan: What Australian legislators at the state and federal level have done to enact age limits, and to require the tech companies to enforce them, is by far the largest step to protect children on the internet that has ever been taken on this planet.
Johanna: Jonathan Haidt, academic and author of The Anxious Generation, has this message for Australian politicians.
Jonathan: I applaud you for your boldness. I applaud you for the care with which you’ve drafted this legislation.
I've spoken with some people involved in the implementation to understand what's going on, and I am encouraging you to take the long view. There will surely be problems in the rollout; there will surely be criticism. My message is: you are doing something world-changing. You are doing something that must be done.
Australia, the country, and the leaders of Australia are heroes to parents around the world. So take the long view, no matter what the pushback is. Keep going. Let's get this right, and if you can get it to work well in Australia, it will spread around the world.
Johanna: Clinical psychologist Danielle Einstein agrees.

Danielle: What Australia has done, and where I'm quite proud of having contributed, is putting together the evidence and explaining to the public and to parents why social media is not a good way to support mental health. We've cleared away that argument, and that is going to help the rest of the world, because the rest of the world is looking at what we're doing.
Johanna: Australia's approach has drawn global attention, some criticism, and also quite a lot of admiration. Many of us hope that it sets a new level of ambition for the way we develop technology policy in Australia. In my view, one of the biggest positives to have come out of the social media minimum age restrictions is that it has awoken politicians, parents, and people to the fact that we don't just have to accept the technology that is given to us.
That we as a society have the power to demand that tech companies make technology differently. There is, of course, much work ahead of us, including passing and then implementing many of the systemic reforms we've talked about in this episode. But the road ahead is as much about what we want from society as it is about what we want from technology.
And that’s why we are calling on you to get involved because the decisions that we make about technology today will shape our future tomorrow.
If you’ve enjoyed this series, please leave us a review or share it with your friends and colleagues. We’d love to keep making series like this and your support and encouragement makes that possible.
This has been Tech Mirror, a podcast brought to you by the Tech Policy Design Institute. We're based in Canberra on the lands of the Ngunnawal and Ngambri people. You can find information about the research mentioned in this episode in the show notes, and a big thank you to all of our guests across the miniseries.
Amanda Third, Andrew Hammond, Cam Wilson, Danielle Einstein, Lizzie O'Shea, Carly Kind, Jonathan Haidt, Min Wang, and Julie Inman Grant. This podcast is made possible thanks to generous contributions from government, industry, and philanthropy to the Tech Policy Design Fund, the full details of which are transparently disclosed on our website.
The soundtrack is by Thalia Skopellos. For information about the archival audio we've used in this episode, please check the show notes. This podcast was produced with the wonderful support of the team at Audiocraft on the lands of the Gadigal people of the Eora Nation. A special thank you to our producer, Olivia O'Flynn, and executive producer, Kate Montague.
Amy Denmeade provided invaluable research support throughout the miniseries. And finally, a big thank you to our team at the Tech Policy Design Institute, without whom this podcast wouldn't be possible, especially Zoe Hawkins, Meredith Hodgman, and Dorina Wittmann. Last but not least, thank you for listening to this episode.
Please do stay in touch and get involved.