To start off our new season of Tech Mirror, Johanna Weaver is joined by her co-founder at the Tech Policy Design Institute, Zoe Hawkins, and the pair look forward into 2026 and take a punt on predicting the hot tech policy topics for the year. Will Australia use its position as a middle power leader to shape technology, rather than just react to it? What domestic tech policy issues will take flight this year? What steps will Australia take to harness the potential of AI? We’ll have to wait till December to see just how well we do with our predictions. Until then, enjoy the episode!
Credits
Produced by Audiocraft.
Original music by Thalia Skopellos.
Created on the lands of the Ngunnawal, Ngambri people and the Gadigal people of the Eora Nation.
Special thanks to all the team at the Tech Policy Design Institute, without whom the pod would not be possible, especially Zoe Hawkins, Meredith Hodgman, and Dorina Whittmann.
Links
Tech Policy Design Institute: https://techpolicy.au/podcast
AI Agency Discussion Paper: https://techpolicy.au/ai-agency
Blog: Middle Power Tech Strategy in Carney’s Honest World Order: https://techpolicy.au/honestworld
Call for Coordination: Tetris for Australia’s Future – Aligning Australia’s AI Priorities: https://techpolicy.au/ai-tetris
Follow us on LinkedIn: Tech Policy Design Institute
Transcript
Tech Mirror S03EP01
2026 Preview
Johanna Weaver & Zoe Jay Hawkins
Johanna: The Tech Policy Design Institute acknowledges and pays our respects to all First Nations people. We recognize and celebrate that Indigenous people were Australia’s first tech innovators, and they have enduring knowledge systems and continued connection to land, water, and sky.
music
Johanna: Welcome to Tech Mirror, a podcast about how humans shape technology and how technology shapes our world. I’m Johanna Weaver. I’m one of the co-founders here at the Tech Policy Design Institute, and on this podcast I speak with the big thinkers, the experts, the policy makers and politicians about the ideas, the debates, and the decisions behind the technology and the policy that is shaping our lives. I’m especially curious about what becomes possible when we remember that technology is made by people and it can be made differently. Thanks for tuning in. Let’s dive into this episode.
So listeners, we are back in 2026 with series three of Tech Mirror. This series we’ll be reverting back to our interview style, so we’ll have a number of leaders coming on to talk about all of the hot topics of tech policy in 2026. And we wanted to kick off today by formally introducing co-founder Zoe Hawkins of the Tech Policy Design Institute, who’s here with us today.
Hi Zoe.
Zoe: Hi Johanna. Thanks for having me.
Johanna: We thought it would be great to start this particular series, given we’re at the start of 2026, by examining and talking about the issues that we think are gonna be hot topics in 2026.
And then at the end of the year, we’re gonna look back on this and look at all of the things we got wrong. So it’s always a dangerous exercise to do this, but there’s just so much on the agenda this year. Really excited to jump in on that with you, Zoe.
Zoe: Hey, I can’t wait to dig in. There’s so much to talk about, and I’m sure future us at the end of ’26 will have lots of notes, but I’m sure we’ll get a few things right.
Johanna: Exactly. Alright, so let’s delve in. First up, I mean, over the holidays I did actually take some time to do a digital detox, which, you know, I’m particularly passionate about. But I have to say, turning back on after the end of the holidays, it was pretty overwhelming, just the speed and the pace. Um, the sense that the world is really changing, but also how intertwined technology is with all of these changes.
So, you know, how are you reflecting on that as we start out in 2026?
Zoe: I think that’s right. That’s exactly my reflection over the summer as well, your point about technology shaping the world at that world-stage level. These developments on the international stage feel like a bit of a tidal wave, but also, on that personal level, I certainly agree about how to process and interact with a lot of this information, and reflecting as citizens, I guess, on the health of democracy around the world. That’s certainly been something that’s stuck with me.
But, you know, even though you did go off grid over summer, which I always admire you for – I’m not as disciplined as you are – I suspect that these big shifts that we’re seeing in the world order probably aren’t coming as a huge surprise to you of all people. I know that it’s something that we talk about a lot, and actually wrote about together last year in the Australian Foreign Affairs journal. And one of the things that we’ve been talking about, I guess, is the Canadian Prime Minister describing this shift in the world order. Only a few weeks ago, he termed it a sort of rupture in that US-led world order.
But, you know, it’s something that we’ve been thinking about for several months as well, and I’d really love to unpack it. I mean, how do you see this so-called rupture of the world order playing out for tech policy in 2026?
Johanna: Mm. Well, I think the thing that stands out for me – so over summer I read Tim Berners-Lee’s book, This Is for Everyone. He’s the inventor of the World Wide Web, and his book is kind of a retrospective, looking back over this period of time as the internet sort of unfolded and, you know, took over the world basically, and reflecting on how different the world is right now in comparison to back in the nineties when the internet became a thing. So when you think about it in those terms: the 1990s, so the World Wide Web was launched in 1991 and became more popularly available around 1993. That was just after the Berlin Wall had fallen. China was engaging in the world, joining the World Trade Organization – there was optimism and excitement. Democracy was expanding. And looking back on that period, it has such a different feel to the feel that we have now, reflecting on this dramatic step change that we are going through with the development, but also the proliferation, of artificial intelligence through almost every dimension of our lives.
And I think that’s coming at the same time as we are having increasingly transactional politics, but we’ve also got the backsliding of democracy and increasing polarization. And so it’s really looking at, well, what does that mean in 2026 for countries like Australia? It’s often easy for us to think, oh, you know, tech policy in that environment is gonna be too hard.
But actually, I think there are some really pragmatic things that we can do as middle-power countries. And you’ve just written a blog about this, so maybe you can take us through some of the things in that blog that you think we should be watching out for in 2026.
Zoe: Oh yeah, absolutely. Well, I think the first thing I’d say is, you know, they say the first step to healing is admitting that you have a problem.
And I feel like that’s where we’re at as a world. In everything talking about international security and world orders, we’re at that recognition stage. And so it was powerful to see, you know, one of the Five Eyes leaders stepping out and naming that. I think that’s something that we felt – the conversations were being had, you know, over the last year and maybe the time prior, but it wasn’t really being put on the record in that sort of formal way.
And so, okay, admitting that there’s a change in dynamics. Well, you know, we really believe at TPDI that it’s about taking the world as it is and doing something with it. Don’t just sit and observe. And so I guess the blog looks at what are three practical things that middle powers should be thinking about.
What does this mean for their tech policy? And look, it recommends three things. Firstly, you mentioned that transactional dynamic that we’re seeing on the world stage, with great powers throwing their weight around. I’m happy to name Trump in that dynamic. And so really it puts the onus on middle powers to find strength in numbers.
So that first practical step that we’re really calling on, you know, policy makers to think about is: how can we be a little bit more open-minded about the coalitions of middle powers that can come together to create that sort of balancing, I guess, to some of the pressure that countries on their own may not be able to withstand from some of these great powers.
And we’re talking coercion around, you know, whether it’s trade – we’ve seen a lot of tariffs in the conversations – but that also applies, I guess, to digital infrastructure reliance: countries around the world rely on either the US or China. So it’s something that’s very front of mind, and we’ve proposed something called the Interoperable Tech Regulation Initiative, ITRI to its friends, which really is about saying countries like Australia and Canada need to think a bit beyond what we would call the like-minded countries that we would normally look to for this sort of coalition, and move towards more middle-ground countries. That might also involve, you know, countries like Indonesia, Malaysia, Mexico, Brazil. What are the common-ground areas that we have with countries like that, where we can form some strength in numbers to negotiate more powerfully on the world stage now that things are looking a little bit different?
And that links to the second point, which is recognizing, you know, that the world – but AI ecosystems and digital technologies particularly – is so globally interconnected. I think it’s about being really conscious of what those supply chains and trade partnerships look like for middle powers. And, you know, that’s where you hear a lot of people talking about moves to self-reliance. But something that’s probably more practical, feasible, and possibly more strategic is also looking at how you combine self-reliance in the necessary areas with some sensible redundancy and choice at the same time, so you can maintain that accessibility to key technologies that run our societies and economies.
Which I guess leads into the third point, where we are proposing this concept: when we’re talking about pursuing technology sovereignty, we can also think about expanding that to this idea of AI agency. That takes the point about control and ownership, and how we need to have our hands on some of those core technologies as a nation, but supplements it with understanding our control, choice, and even leverage. What is it that Australia and other countries in that sort of middle-power group actually have a competitive advantage in? And how do we use that at the bargaining table as a group to level the playing field a little bit more in some of those conversations, so that countries can keep that ability to steer outcomes and, I guess, protect their national interests?
So it’s a pragmatic way of thinking about it. The blog really unpacks that, and I think for us it’s exciting to think about. You know, it actually is an opportunity. You can get quite overwhelmed by the changes that we’re seeing right now, but it does present an opportunity for countries like Australia and other middle powers.
Now, I know that sounds like a complicated environment, but, you know, you used to be Australia’s chief cyber negotiator at the UN, so I feel like you’ve tackled some complicated international dynamics before. And I wanna know, when you are looking at all of this, what are some lessons from those multilateral environments that you think would apply here, that we can take forward?
Johanna: Yeah, look, I mean, I think the first thing that I really observe, looking at this dynamic at the moment, is how important the international is going to be to our ability to deliver on tech policies at home. Because we will need to be acting in concert, otherwise we are going to face coercion, whether it’s from the US or whether it’s from China, or whether it’s from others.
And so just that direct link between the international and the domestic, and seeing so many other people coming on board with that link, I think is really a big shift in the conversation, both at home here in Australia but also internationally. And then the other thing I’d say is, often when I talk about middle-ground countries, or, you know, even in response to that piece that we wrote for Australian Foreign Affairs, people often act kind of skeptical about Australia’s weight to engage internationally. They’re kind of like, oh no, no one listens to Australia; we’re very, very modest. And actually I can tell you, as someone who’s been in the room holding the negotiating pen, that countries do look to Australia for leadership.
And we have had a long track record in this space, particularly in international cyber and digital agreements. So whether you’re looking at the way that many Australians have led, for example, AI standards discussions way before AI standards were cool, right?
Zoe: Mm-hmm.
Johanna: Or countries looking to Australia for our online safety regulations.
You know, for better or for worse, we were the first country to be out and doing that. Lots of countries look to Australia and go, oh, it is possible to regulate in this space. Or the types of negotiations that I was involved in, which were more looking at kind of cyber warfare and the laws of war and peace. Australia’s been a leading broker in those negotiations since, like, the late 1990s. So it’s really awakening people to the fact of that connection, which I think people are really seeing and grasping now, but also encouraging Australia to see ourselves that way. And this is almost the biggest thing: for Australians to recognize that people are looking to us for leadership, and we can step up and fill that void, not just for the international stuff, but because that helps us deliver at home.
And so I guess that, for me, makes me think about what the key areas at home will be that we think will be on the record in 2026. And one of those – I mean, we’ve done a lot around online safety and particularly social media, and I know this is an area of your particular expertise.
So what are you expecting to see in this space in 2026?
Zoe: Yeah, absolutely. I mean, I think your point there about, you know, how we think about that connection between the international dynamics and the confidence and appetite to regulate at home – they are connected, right? We see a lot of countries doubting whether they are going to face repercussions for regulating, particularly on tech, and particularly when we’re talking about US tech.
And so, I guess, talking about how that’s playing out for Australia and online safety, you know, it’s pretty clear that Australia hasn’t lost its confidence in its willingness to be out in front in regulating. If we’re talking about the social media age restrictions, obviously that got a lot of headlines around the world, and, you know, it’s been a big shift over summer, that change from 10 December. We’ve seen 4.7 million accounts apparently deactivated, removed, and restricted. So that’s a big change over the summer holidays for young Australians. I mean, as you mentioned, there’s been a deep dive into that content that you’ve done, taking the Tech Mirror listeners through it with Australia versus Social Media, which has got all the information in there that they need.
But I guess I’m curious to think about, like, since doing that miniseries and talking to all those experts, and reflecting on how it’s rolling out, I mean, how do you think it’s playing out? How do you think it’s landing on the ground now that we’re, I guess, two months in?
Johanna: Hmm.
Well, look, I think there’s the data on the implementation that you’ve spoken about there. I still hold concerns about the negative impacts of that particular piece of legislation, particularly for kids that are cut off. You know, as a young kid who was a little bit different, who grew up in a small country town, I can really understand the lifeline that it offers people.
I am, though, pretty optimistic that one success that has come out of this is that we have actually seen the setting of a new social norm: that it is not acceptable, or it’s not a good thing, for children to be on social media unsupervised. And I think that is an important step for us, but it absolutely is not enough, and that’s why I’m really passionate about people seeing the social media age restriction legislation as an awakening and saying, look, we actually have the power to shape technology. Little old Australia decided that we were gonna do this. Social media companies have implemented it. They have changed their platforms and technologies, and now, you know, that technology is different.
We’ve shaped that technology. And so my challenge to everyone listening to this is to think about, well, how do we want technology to be shaped? Because we have the ability and the agency to do that. So I think that’s my optimistic take. And then I think there’s a lot of work that the government has committed to around, you know, the online safety review, et cetera.
And so I know you’ve been following that really closely. What are the key things you think people should be watching for this year?
Zoe: Oh yeah, I love that, viewing it as sort of a case in point: policy can shape technology. I feel like that’s good evidence for us. Yeah, 2026 I think is gonna be a big year for Australia on online safety, because the Online Safety Act, which was passed in 2021, has a lot of complex elements to it, including a large number of codes, which I won’t go into all the details of. Industry and the eSafety Commissioner have been working away hard on those for years, and, you know, I think the final six of the 16 codes are gonna be in force by next month. There’s been a lot of movement and a lot of engagement on both the regulator and the industry side. But the big thing as well is that five years ago, in 2021, that act was world leading.
I feel I have to disclose some involvement in that, if I’m gonna be so bold as to say it. But I think at least the element that was world leading was that it introduced the first adult cyber abuse scheme. Right.
Johanna: Mm.
Zoe: The first scheme that acknowledged that an individual might be targeted by content that needs to be removed, content that meets a very high bar of abuse. But five years is a really long time in tech, right? And so there’s a lot of things that need to be reviewed, modernized, and updated in the legislation. And that review – an independent review – has been done, recommending, I think, 67 recommendations of things that need to be changed.
And that was handed in in October 2024. Right? There’s been a lot going on. The government’s been really busy, including with the social media age restrictions that we were just talking about, but I think 2026 is the year that we’ve gotta dig in and really deliver on, you know, the review, and consider all of those.
Exactly. All of those.
Johanna: That made it look like I was throwing it over my shoulder. What I meant to be doing was a digging-in symbol. For those of you listening, you’re not gonna be able to see that. But anyway,
Zoe: dig into the Online Safety Act, and also, particularly, you know, the government has committed to implementing a digital duty of care.
So I think the big thing we’re gonna see this year is, one, the review of the Online Safety Act in detail: how does that move forward, and how do we really modernize that legislation? And a big part of that is the duty of care. And I mean, I think this is something that you and I are both really interested in, because, as you’ve said before, the social media age restrictions are an important element, but they can’t be the only solution, because ultimately they don’t make social media itself safer. And so I guess, can you talk a bit about why the digital duty of care is different to that, and why it’s a significant and essential complement that we need to see come through and see activity on this year?
Johanna: Yeah. So I think, I mean, I think Australians should be proud of our leadership in the online safety space. I think we do need to be cautious about it too. So I think we need to make sure we’ve got proper regulatory oversight of, for example, the eSafety Commissioner and others, who are doing an incredible job and facing huge amounts of pressure.
But, you know, for me it’s always about making sure we’ve got the right protections in place for whoever the next eSafety Commissioner is and whoever their successor is. But I think the biggest challenge we have is exactly what you’ve touched on, which is, well, it’s been five years. Technology has changed so dramatically in that period of time.
And if we continue to have technology-specific or even harm-specific provisions – so what I mean by that is social media as a particular form of technology, or a harm like cyberbullying – as long as we keep being that specific, the lawyers, the politicians, the policy makers are gonna struggle to keep up with the changes in the technology.
What I see as the huge potential opportunity of getting the digital duty of care right is that we implement a regime that is harm neutral and technology neutral, that says: actually, if you are operating in the digital sphere, you have an obligation and a responsibility to ensure that you’re not causing harm to your users or to citizens.
And that is a game changer, because it’s future-proofing the obligation. Now, we can’t just do that by ourselves. There has to be strong cooperation with international partners on that, so that comes back to the relationship that we’ve already talked about, in terms of the geopolitics, but also coalition building and the challenges around that.
But it’s also really important that we look at and examine how a digital duty of care will operate with the existing regime, and whether or not we need to change the existing regime rather than just layer something else on top. So, lots of big questions. I think things for people to watch out for: is this going to be a digital duty of care only for social media?
Will it be framed more broadly? We don’t know that yet. Is it gonna be framed around particular harms or particular categories of harm? We don’t know. The government hasn’t really given any detail about whether they’re looking at reviews of other things, like the Basic Online Safety Expectations, or the BOSE, which is, you know, a piece of legislation that tried to do a lot of what the digital duty of care will do, but in a much more limited way.
So I think this is, for me, one of the most consequential pieces of policy reform that Australia can lead this year. And, you know, I think there’s a lot of interest and engagement across the sphere. We’ve had so much outreach already on this topic, so I’m really looking forward to digging in, and we’ll be doing quite a lot of work in this space over the next few months.
Zoe: Yeah, I’m excited to do that work. And I think you’re right, it is a big opportunity. And I feel like I need to balance my sort of “it’s been five years, we need to hurry up and do something”. The reason this is so challenging is because this is a complicated piece of work as well. Like, you just started naming some of the complexities in this seemingly simple “just implement a duty of care”.
You and I are in the business of best-practice policy design, and that means two things. It means, you know, moving quickly to keep pace with technology and meet the needs of society, but it also means doing the process in a way that actually gets the outcomes that you want. And sometimes that takes a little bit of time.
So I think the big challenge for government and industry this year will be striking the right balance: keeping that momentum up – you need to deliver on the reform – but doing it in a consultative, considered, meaningful way, so that we end up with a properly modernized, streamlined online safety regulatory framework that’s not a Frankenstein of everything having been layered on top of each other, as you said.
And I think the other reason that that consideration-and-momentum balance is so important is because online safety, like a lot of tech policy issues, is so connected across other policy areas. You know, you can’t move in online safety without touching issues that are related to privacy or other areas.
So, I mean, I think that’s something that we also think very broadly about: you know, how do we move out of those silos and consider these tech policy challenges from a more holistic point of view? So I guess, what are some of those intersections that you see for 2026, these different policy areas that I think maybe traditionally people may have thought of as separate, but that are really becoming closer and closer?
What are you seeing there?
Johanna: That is a dangerous question to ask me, Zoe Hawkins. Because this answer could go on for the entirety of the episode. When I was thinking about preparing for this, I was thinking, well, let’s start with what the government has actually committed to do.
So late last year, the government released its National AI Plan, and it did actually articulate in that plan – heaven forbid – a plan within a plan for regulatory reform and the areas that they were gonna prioritize. So, just a quick run-through of those. I’ve got a list here. They said they were looking at minor amendments to consumer law to protect against AI harms, and I’ll come back to that in a minute.
And they did also flag the review of the Online Safety Act, of which the digital duty of care that we’ve just been discussing will be a part, but it’s not the whole picture. As you say, there’s over 60 recommendations in that review, and the government hasn’t actually responded to the review yet. So we’re really waiting with bated breath, as much as we can in the policy world, for that.
All the –
Zoe: other nerds.
Johanna: Exactly. Exactly. But, you know, they’ve flagged – and I thought this was interesting in the AI plan – that they’re looking at further restrictions around nudify apps, which of course Kate Chaney, the independent member, has introduced a private member’s bill around as well, but also things to tackle algorithmic bias, and they flag that these are things that are being considered.
So really, we’ll have a close watching brief on that. The next thing that they had in the AI plan was a reference to copyright reform. And I’m sure all of our listeners will remember, you know, sort of drinking from the fire hose of that for a few months towards the second half of last year.
That has quietened down because the government has said, well, we are not going to grant an exemption for text and data mining; that is a policy position that they’ve put out. There’s still an ongoing reform process for the Copyright Act, and so I’m really interested to see how that evolves and develops. It could, I think, potentially become another flashpoint, just because there are quite differing views and hugely different interests there, in terms of the perspectives of industry, creatives, and Australians who want to see Australian content. So we’ll be watching that one as well.
I think, perhaps for those of us with more of a security background, the reforms of the SOCI Act – the Security of Critical Infrastructure Act – are also something to watch; that review is ongoing as well. So there’s, you know, quite big consequences of that for critical infrastructure providers, particularly around cybersecurity but also infrastructure protection. So that’s another area to watch. And of course privacy reform, which you’ve already mentioned. You know, this was flagged as something that the government was working on.
They’ve articulated this many times. We had tranche one sort of at the end of the first term of the Albanese government, and they’ve flagged that they’ll continue to prioritize this in the second term with the tranche two privacy reforms. And I really think Lizzie O’Shea from Digital Rights Watch says this best when she says this is one of the most consequential pieces of reform for our digital and our online environment.
And I think people who work in this space get that, but I’m not sure that when you talk to Australians on the street, they always make that connection between privacy reform and shaping their online experience. So that’s something that a lot of us – not just the policy nerds, not just the tech policy nerds, but also the privacy nerds – are really focused on and hoping for, but I also think there are strong expectations on the Albanese government that there is actually gonna be progress in that space in 2026.
So, to touch on a couple of things that were not called out in the AI plan, which I think is interesting as well: one of those is around competition law reform.
So the ACCC digital platforms inquiry, which has been going on for a long time, had its final report last year. There’s recommendations in there around the designation of digital platforms and the introduction of competition-specific codes for designated services. Again, this is, you know, quite complex; we could do whole episodes on these particular issues. But what’s interesting to me is that the government accepted those recommendations in principle. There were consultations over a year ago now, back in December 2024. The submissions from those consultations haven’t yet been published online, and we haven’t seen a lot of progress, or even public commentary, around what is happening on the government’s agenda.
Now, of course, that’s very complicated and directly linked back again to the geopolitics. Trump has made it really clear that he sees the Digital Services Act and the Digital Markets Act in the EU as things that are anti-competitive, as taxes on US companies – hence these threatened tariffs. And arguably this is sort of sitting in the same bailiwick, so I can understand why there might be hesitation. But it’s really interesting to me, given how much productivity was the key word last year.
This isn’t something that has gained anywhere near as much attention in tech policy debates as we might expect. I guess maybe the final thing I’ll say is, I think another really heartening thing to see (and you and I met with Jess Hunter, the incoming cyber ambassador) was a commitment in the National AI Plan to develop an international AI engagement plan.
Now, I think that is so important. I think Jess Hunter is excellent, and she’s got a great team over at DFAT. I’m really keen to see where they take that and to deliver on this opportunity for Australian leadership, not just internationally, but to help reinforce the delivery of this huge platform of work that we have the opportunity to deliver on in 2026.
So, you know, there’s so much there. Uh, this is what happens –
Zoe: When I get you going.
Johanna: I know, I know. I'm really sorry, I have tried to be contained. But I do think that when we're talking about policy agendas, everything I've just referred to there is very much policy or legislation.
Some people, maybe people who are not employees of the Tech Policy Design Institute, would call that red tape. And I think it's really important that we also emphasize that when we're talking about tech policy, we're not just talking about the legislation, we're not just talking about the red tape.
So could you talk about some of the areas where you think we'll see a focus in 2026 on capturing the opportunities, in areas where Australia does have that competitive advantage? And perhaps a little bit on the project you were the lead author on, the AI agency and sovereignty work.
I mean, that is such a huge and important piece of work going into this year to set Australia up, not just for 2026, but well into the future.
Zoe: Absolutely. I think a word that came through regularly in what you were just saying is that 2026 has to be about delivery. There are so many policy issues where I feel the government has taken a really important step.
And there are so many things we're looking to see this year: the AI Safety Institute announced in the plan, the engagement strategy, the digital duty of care, privacy. That list is long, so the government also has my sympathy; there's a lot of work to be done. And getting all of that right matters, whether you call it red tape or, if you're like us, enabling policy and regulation. Those guardrails, the policies that keep people safe and confident in their online lives, are part of what is required for people to engage with a lot of these opportunities. You're talking about how we seize that opportunity, whether it's Australia or other countries: you have to create a scenario where your citizens feel like they can adopt with confidence, with trust in the governance and regulation that sits behind it.
But part of that, and how we've been measuring it in this big project you were just referring to, our AI agency project, is that we've produced this enormous beast of an AI agency tool, which is designed to recognize the full breadth of what makes up a thriving AI ecosystem.
So to pick up that thread about opportunity: how confident Australians feel about their ability to adopt AI safely would be one element, one ingredient, in what makes a thriving AI ecosystem. But the beauty of the AI agency tool is that it situates those issues all the way through, from whether you have the data centers and the electricity at the infrastructure layer, through to the skills you need, and governance.
So it categorizes 101 different ingredients of AI success, for want of a better term, and then it empowers the leaders using it to actually break down the question: okay, do I have that capability in my ecosystem? What level of maturity do I have? And then the concept we're introducing around AI agency says that, in addition to looking at those more traditional sovereignty questions of control and ownership, we also want to talk about the opportunities for countries like Australia to really lean into their leverage.
Like, where are we strong, so that we can bolster our position as a country because we have a natural advantage? And it's only by seeing them all in this very large picture, I think, that you can actually strategize and say: okay, it doesn't make sense to try to be the best at absolutely everything on this very long list.
There are some things we'll determine are non-negotiables that we need. But how can we be a bit savvy about where we partner for certain things and where we want to lead? The tool will help people do that. So we're really excited to get it out there, partly because it empowers, it gives more options, to go back to the conversation about middle powers and what you do in this situation.
But I think it's also holistic, in that it recognizes everything from the hardcore national security, very established way of thinking about AI sovereignty, through to some of the other ingredients that people often wouldn't think about, which came up through our national consultation as things experts and stakeholders say are important to grabbing that opportunity as well. There's a discussion draft available on our website for people who want to dig into the detail. And we're excited because we're in the final rounds of our consultation at the moment, finalizing the product of Australia's 2025 stocktake against that tool.
So there's a really detailed take on where Australia sits right now, and it's the evidence base that I think should help inform the delivery of some of those things you were just talking about, because you need to know where you are to know where you're going. We're excited to release that in the next few months.
So stand by, listeners of Tech Mirror. You heard it here first. It's coming to an inbox near you.
Johanna: Yeah. For me, what is so powerful about that tool is obviously the stocktake of the capabilities, the articulation of the capabilities. And you're right that I don't think every policy maker would be thinking about things like what it means to live with AI. We talk about digital literacy: how do we avoid repeating the mistakes we made around digital literacy with AI literacy? If we're going to actually intervene and not repeat those mistakes, now is the time to do it.
So I think it's that full articulation which is so incredibly powerful, but also the transparency it allows: the ability to engage with it and say, this is the basis on which we're making these decisions, and we're conscious that there are trade-offs around some of these things. Data centers, for example; there's so much conversation around those.
We're actually creating a tool where we can say, okay, this is how we're prioritizing and this is why. So I really do encourage people to delve into that body of work. We'll pop a link in the pod notes. It is a very in-depth piece of work, but I guarantee everyone listening to this: you will find whatever it is you work on in there, and if you don't, please reach out and let us know, because we'll add it in.
But it is, as you say, the product of a lot of consultation.
Zoe: I was just going to jump in and add, to your point about its utility from a transparency perspective, that one of the other things I think is going to be big for 2026 is that the government needs to deliver on a big body of work across lots of different fronts, and some of that comes down to maintaining public trust, right? Delivery, following through on commitments, but also trust and public engagement on these technology issues. I think it's a good thing that we're seeing increasing public engagement on technology issues. AI has brought that to the front of public consciousness, and that's maybe a good thing, because people do feel more engaged with the issues.
But it also raises a question. We talk about AI as the opportunity, yet only about 30% of Australians think the benefits of AI outweigh the risks. So we're having these conversations, and one of our upcoming projects is around what would increase Australians' trust in this technology.
We're hearing a lot of talk about how this technology is essential to transforming different parts of society, and about the benefits it's going to bring. And while people should retain the choice about whether they engage with it, whether they want to adopt, I think we want them to feel like they genuinely have that choice. And you don't really have a choice if you don't feel like you can trust it. It's only when you trust it that opting in, rather than opting out, becomes a viable choice. So we're excited to bring that work forward, looking at the relationship between well-designed regulation and Australians' trust in AI, so we can boost that trust.
Johanna: Yeah. And that body of work is underpinned by some really interesting survey data as well, so I'm keen for people to see that. Watch this space. Oh my god, it's like an infomercial for TPDi products. But I really do feel very excited to be working in a space that is evolving so quickly and where we can actually have real impact on what is happening.
So, reflecting on the breadth of the conversation we've just covered: if you think about this in terms of the government ministers in charge of leading these things, it is so broad. And that breadth really matters if the government is going to deliver on the agenda it has set itself, and if it is going to meet this moment and this opportunity.
I think it's really important that we have improved coordination across ministers. That's something I called for in one of the first pieces of work I did coming out of government, on cultivating coordination, and we've repeated that call in recent work, Tetris for Australia's Future. I really would like to see improved coordination across government on all of these issues, because that is how we get good tech policy, and also meaningful consultation on all of these issues, into 2026. So we'll be watching that very closely.
Zoe: I think that consultation piece is also a thread that's going to be a big theme for TPDi, and for what we're expecting in 2026: that element of participation. We're talking a lot about how we make sure good policy is designed by getting all the different perspectives involved.
Johanna: Yeah. That's one of the things I'm really proud of in the work that we do: making sure we've got diverse voices at the table. And we're not always going to agree, right? The challenge is creating that environment of productive discomfort where we actually understand each other's different perspectives. That's how you design good tech policy: by understanding, not necessarily always agreeing with, those perspectives.
And I see that as a key role for us here at TPDi. Now, I just want to flag a couple of other issues that I think are potential hot topics in 2026 that we haven't really touched on. One of these is government use of AI. I think there's going to be a lot coming out in this space, and it's really important when we talk about public trust.
This is something I think it is absolutely mandatory for the government to be on the front foot about, and I'm excited for some of the work coming out of the DTA and places like Services Australia in that space. I'm also looking at things like digital ID and verifiable credentials; these are topics that are changing and evolving so quickly that I can see them coming up in the discussions we have.
So, look, there is just so much going on in this field of technology policy, or frankly, public policy. Because, as you know, Zoe, one of my favorite mantras is that there's no such thing as just tech policy, it's public policy. Or, as Nick Davies likes to say, technology policy is overly obsessed with technology.
And, you know, we're also keen to hear from our listeners: what is it that we've missed in this list? What are the things you'll be watching, or that you think we should be prioritizing? We're super excited to bring you this next series of Tech Mirror, and we'll have a number of interviews with leaders in this space.
So please do let us know who you want to hear from or the topics you want us to cover. We're keen to shape this and bring you along on the journey. But for now, Zoe, I've really enjoyed this conversation. Thank you so much. We love talking
Zoe: about this stuff! And thanks for having me on. And look, it's a great lineup of interviews you've got ahead; I'm looking forward to listening, and to being part of some of them. But it's a big year. 2026 is looking busy for tech policy.
Johanna: Thanks, everyone. Looking forward to embarking on this journey with you. Thanks for listening and, as always, get in touch and get involved.
Well, that’s it for this episode of Tech Mirror, which is brought to you by the Tech Policy Design Institute.
We are based here in Canberra on the lands of the Ngunnawal and Ngambri people. If you found today's conversation useful or thought-provoking, please do share it with a friend or a colleague, and you can subscribe wherever you get your podcasts. Or, if you're watching, don't forget to like and follow. For show notes and more resources, visit techpolicy.au.
This podcast was made possible thanks to generous contributions from government, industry, and philanthropy to the Tech Policy Design Fund. Full details, including our sponsors, are available on the website. The team at Audiocraft produced this pod on the lands of the Gadigal people of the Eora Nation, and Amy Demi provided invaluable research support.
A big thank you also to all of the team at the Tech Policy Design Institute, without whom this pod just wouldn’t be possible. For more information about our work, visit us on our website or follow us on LinkedIn. Thanks for joining us. Get in touch and get involved.