In this episode Johanna Weaver catches up with Jess Wilson, CEO of Good Things; David Masters, Head of Global Public Policy at Atlassian; Lizzie O’Shea, Chair of Digital Rights Watch; Nick Davis, Co-Director at the UTS Human Technology Institute; and Meredith Hodgman, Head of Strategy from the Tech Policy Design Institute.
Each of them attended the recent AI Impact Summit in Delhi, the latest in a series of summits that began at Bletchley Park in 2023 and is now one of the largest gatherings of AI experts in the world. The summits started with a firm focus on safety and existential risk, but have evolved progressively – the focus in India was opportunity and impact. Our guests share what it was like on the ground in Delhi and what their main takeaways have been. Sovereignty, social impact, and human capital are just some of the themes traversed in this episode.
Links
AI Impact Summit Communique: https://d19ob9sqegt2wc.cloudfront.net/stage/uploads/AI_Impact_Summit_Declaration_f208574dfc.pdf
A/Minister Charlton’s Speech: https://www.minister.industry.gov.au/ministers/charlton/speeches/speech-australian-business-economists-conference
Collective Intelligence Project Global Dialogues: https://www.cip.org/globaldialogues
TPDi’s AI Sovereignty to AI Agency Discussion Paper: https://techpolicy.au/ai-agency
For transcript and full show notes visit techpolicy.au/podcast
Credits
Produced by Audiocraft.
Original music by Thalia Skopellos.
Created on the lands of the Ngunnawal, Ngambri people and the Gadigal people of the Eora Nation.
Special thanks to all the team at the Tech Policy Design Institute, without whom the pod would not be possible, especially Zoe Hawkins, Meredith Hodgman, and Dorina Whittmann.
Transcript
Johanna: The Tech Policy Design Institute acknowledges and pays our respects to all First Nations people. We recognise and celebrate that among many things, Indigenous people were Australia’s first tech innovators.
Welcome to Tech Mirror, I’m Johanna Weaver, one of the co-founders here at the Tech Policy Design Institute. I am fascinated by a truth that is easily lost in the noise of our daily news feeds, and that is that technology is made by people, which means that it can be made differently. And on this podcast, I get to sit down with big thinkers to explore the ideas and the debates that are shaping technology, that is shaping our lives.
And I’m interested in how we can use policy as a tool to shape that technology. So let’s dive into today’s discussions.
In today’s episode, we are going to talk to a number of Australians who were on the ground at the AI Impact Summit in Delhi last week.
Nicholas: I will say, from an AI safety perspective, it literally has failed in terms of creating broad-based buy-in in the summit for discussing safety.
Lizzie: This summit has some shifting or slightly unclear ambitions and audiences. None of us really are part of that main stage conversation.
David: It was a big conversation around what does this mean for human capital? What does this mean for the future of work? What’s the role for humans? And all of this.
Jess: I’m really interested in social impact, and so what are the benefits of AI, and what are the tools that are being used for positive social impact in multiple different areas?
Meredith: In the conversations that I was participating in that were mostly government focused, it was really ambitious, but also very sovereignty focused.
Johanna: Now, this conference was one of the world’s largest gatherings of AI experts, and it’s the fourth summit in a series that started back in 2023. The first iteration of the summit, some of us may remember, was at Bletchley Park in the UK, and it was first convened back then when there was growing concern about the existential risk that frontier artificial intelligence poses to humanity.
Since then, the conference has been held in a few different locations around the world. In Seoul, it largely retained the safety focus. Then we went to Paris, where there was a shift from safety to action, and then to Delhi, where action has moved to impact – or at least the title has. And the objective of this conversation is to unpack a little bit of what it was like on the ground at a conference this big.
So organisers are saying there were 85 different countries represented and over a hundred thousand delegates. And so, reflecting that, we have an unreasonably large number of delegates here on this call. And we are gonna start by asking each of you to just help us to create a picture of what the mood was like on the ground.
So I’ll go one by one and get each of you to introduce yourselves as well – but it doesn’t have to be tech related, this one – just, what’s a story that you have that kind of helps capture the mood, for those of us who weren’t there, of what it was actually like to be in Delhi last week? So who am I gonna start with?
Jess, let’s go with you first. So Jess, CEO of Good Things, which is a not-for-profit foundation. Jess, what’s your mood setting story for us?
Jess: I mean, it just felt so enormous to be there. I think the biggest thing was arriving the first day at the Impact Summit, having, you know, traveled through all of the Delhi traffic, but actually turning up and the number of people just waiting to get in was significant.
It was huge. But, you know, interestingly, the ladies-only line – ’cause there is a ladies-only line, um, to get through the security – was a lot shorter than the men’s lines. So that was one of the things that is always interesting as a woman in the tech space, to realise that there are a lot less of you. But I got to meet some great people.
And that’s one of the things that I think also happened: everybody wanted to talk to you. Everybody wanted to say, this is what I’m doing, this is what I’m here for, and this is why I’m here. So I met an amazing woman in the line who’s doing great social impact work in AI. So, you know, that’s the kind of mood – it was a lot of energy, a bit of chaos, but real positivity.
Johanna: David, perhaps if you can introduce yourself, although you, uh, have been a guest on the pod before. What’s your reflections?
David: Thanks, Johanna. Yeah, David Masters. I’m the Head of Global Public Policy at Atlassian, and also a TPDi board director as well. I think there was just this tremendous entrepreneurial spirit from all the people that attended.
So Lizzie and I were having a chat, which was sort of trending towards a deep and meaningful about our shared Catholicism, but I’ll leave that aside. But we were approached by this gentleman who just started launching into this, like, speech about what he was doing, and I think in five minutes he didn’t take a breath and basically used every acronym and industry standard known to man.
And basically I, I still don’t know exactly what it is that he did, but it was quite the sales pitch. Just incredible activity and interest from the Indian population, you know, wanting to know about AI and wanting to know how they could benefit from it.
Johanna: Let’s go next to Lizzie O’Shea, Chair of Digital Rights Watch, also a repeat guest on the pod.
Lizzie: That’s me. And I’m grateful for the opportunity to defend myself against an allegation that I might be, uh, a practising Catholic, which I’m not – it was more a conversation about lapsed Catholicism. Thanks, David. Yeah, I mean, Alondra Nelson, who many of you may know – she was a key drafter of the AI Bill of Rights under the Biden administration.
I think it was the last session of the whole summit. She made the observation that she’s been to many AI conferences in her time – many, many, many – and this was the first one that had actually invited the community in. So a lot of the chaos was framed as this quite negative thing – like, it’s hard to get around, it’s hard to get things started on time – but I thought that openness was a nice reset on understanding how these things usually occur.
So whether that’s a meaningful engagement from the public, we have to ask questions about that. Like, what is the purpose in getting lots of people to come? What kind of vibe was Modi going for in specifically framing it in that way? But I do think Alondra’s observation in that respect was a valuable one worth thinking about.
Johanna: Mm-hmm. And Nick Davis, I’ll let you introduce yourself.
Nicholas: Yeah, thanks Johanna. So I’m Nick Davis, Co-Director at the UTS Human Technology Institute. Really building on Jess, David and Lizzie – that vibe of interest and engagement from the local population, the people who were there, particularly from Monday through to Wednesday. And I had my big role as part of the research symposium on the Wednesday.
But one of the things that I really took away was that I was expecting more of a safety discussion, even despite your intro and the shift away from that wording in the title. You know, I spend a lot of time with those folks at the international level. But actually I got much more of a social discussion, and a lot of questions that revealed the anxiety that people have in India, and by extension in the rest of the global south, around the impact of AI on their jobs and their livelihoods. Two quick examples.
A lot of students, engineering students came up to me and said, I don’t think I’m gonna get a job because I’m a software developer and I’m really worried about the rate of progress in coding.
I had someone who ran a business process offshoring business – like a kind of outsourcing business – come up to me and say: I’ve lost half my work. What do I do? AI has taken away this overseas stream of income for my business. How do I adapt? And so that social impact, the economic impact as well as the opportunity – you could really see it in people’s eyes.
And that was the benefit, Lizzie, to your point. It wasn’t just experts and folk talking to one another. It was that population who were being affected, who were in the room for the first time for one of these summits.
Johanna: Hmm. Meredith – so TPDi’s very own representation on the ground – what’s your sort of scene setter for us?
Meredith: Hi, I’m Meredith Hodgman, head of Strategy from the Tech Policy Design Institute. I suppose not unlike the others, the conference was so massive, it was a little bit of a choose your own adventure, and at times felt like it was perhaps two or three different conferences happening at once in parallel. But certainly in the conversations that I was participating in that were mostly government focused, it was really ambitious, but also very sovereignty focused.
So much of the conversation still really circulated around possession – whether that’s about domestic models or national infrastructure or asset ownership, et cetera. It was always really framed as binary at the beginning: something that you have or something that you don’t. However, of course, we were there talking about our recent publication, the AI Agency tool, which really situates that conversation more around control and puts it in a broader strategic framework. So with that in mind, it was actually a lot easier to have more honest conversations about the harder structural questions, and you really could see people leaning towards those nuances, particularly around the more difficult questions like persistent rent channels or contradictory copyright settings and the risk of hollowing out state capability, et cetera. And so I really was quite pleased about the mood being so welcoming towards a greater depth of conversation about how middle powers preserve leverage, particularly through current policy settings and collaborative positioning.
Johanna: Hmm. So I think that’s some good grounding for us. Can you just give us a sense – perhaps, Jess, I’ll throw this one to you – you wake up in the morning, what happens next? Because it’s also the dispersion of it: how were you choosing where you were going? How many different venues were you going to? I want the listeners to really have a sense of the scale of this event – it’s not just a conference, it took over the whole city.
Jess: Yeah, it absolutely took over the whole city. I mean, number one, you had to work out how you were gonna get to where you needed to be. So making sure you’d chosen whether you were gonna use an Uber or you’d booked a car or any of those things was really, really important.
So how you were gonna get places. And I also was really like, I’d offer to take people places if I had a car – like, come with me, it’s okay, we can get there together. But I think, you know, I mean, I’m really interested in social impact. And so what are the benefits of AI, and what are the tools that are being used for positive social impact in multiple different areas?
So that was really the stream that I was most interested in. And I think, you know, there was a significant number, and I was really positively surprised, I suppose, not only about people just talking about the potential of AI to benefit social impact, but actually seeing some of those examples of how people have used large language models alongside community intelligence.
So bringing those two things together to kind of think about how do you do this – and predict, you know, when flooding is gonna hit a particular area, and how do you support people to respond to that. So I think for me the focus on social impact was really important. And there were a lot of different side sessions that were happening every day.
And so I suppose you had to kind of work out, well, where am I gonna be today? I was gonna go back to the summit on Thursday, but of course couldn’t get there because it was all shut down for all of the keynote speakers – Macron and Modi and everyone. So I think you kind of had to be a bit flexible.
So I think, you know, being flexible and just knowing there was a lot to, to be part of anyway.
Johanna: Yeah, I think when this conference first started in the Bletchley Park context, right, it was pretty much just government, it was a relatively small number of people – 28 countries were represented, but by sort of three or four people from each country.
You know, the explosion of it is quite extraordinary. Let’s focus in a little bit more on the “so what” – your key takeouts. And part of the reason why we have so many wonderful guests on this episode is ’cause we really wanted to have that diversity of perspectives, also so that we’re showing the diversity of the conversations that were happening.
So David, why don’t we start with you? What’s your key takeout in terms of the tech policy substance of the conversations that you were having?
David: Yeah, look, it’s clear that India was trying to sell its vision of digital public infrastructure, and that sort of flavoured a lot of the conversation. And I think that’s a very different conversation than you would have outside of the global south – this being the first of these summits held in the global south.
And I think if you look at that sort of dynamic in terms of technology rollouts, you have probably seen governments in the global south try to control, or have more of a role in, rolling out technologies than you have, say, in sort of Western economies. And so I think that certainly flavoured it.
I think, related to that – to Meredith’s point – there is an interesting conversation around sovereignty, but everyone is talking about slightly different things when they talk about sovereignty. For some, it’s about owning the full stack of AI infrastructure and software. For others, it was about maintaining control of the data within the jurisdiction and, by extension, having some control over the AI outputs. On the digital public infrastructure point, it’s about a key role for governments in kind of managing that rollout, ensuring that you don’t have digital divide issues, ensuring that you have democratised access.
And another perspective is really about sort of domestic capability: do you have people that can build and develop AI technologies and exploit them? And so it just reaffirmed to me that sovereignty’s become a really problematic term. And I actually commend, you know, TPDi’s use of AI agency, because I think we need to find new language around that – everyone is using that term to mean very different things.
Johanna: Mm. And Lizzie, what about for you?
Lizzie: What I would say is that this summit has some shifting or slightly unclear ambitions and audiences. So that has changed, as you pointed out, since the first one that took place at Bletchley Park. But even now, the communications that came out of this summit from delegates really related to investment in the industry.
So, what it looked like to me is a dialogue between states, and also a chance for them to talk to industry and then stitch up deals, whatever it may be. It can be framed less cynically than that, but certainly that’s what comes out of the stated communications. Then there’s civil society – and if we use that term broadly to include academic organisations and think tanks, as well as advocacy organisations like mine, of which there were very few in the latter category there at all.
None of us really are part of that main stage conversation. None of us were on that stage when Modi’s standing there with a bunch of tech CEOs putting their hands in the air – even if Sam and Dario couldn’t manage to hold hands. So what I would say is there are ways in which this whole summit is designed to avoid taking account of the harms, the concerns that people have, and if you want to make them part of this summit, you really have to force your way on.
So you have to find ways to create events to do that, to have dialogues that you convene, but you are going to be swimming against the tide if you’re doing that work. And, you know, people in this group that we’re talking to now were part of doing that. But it’s also fair to say that that is not where the main conversation is happening.
That worries me. So the conversations I witnessed were incredibly different to the conversations I witnessed in my day-to-day work as an advocate. I mean, I can give you examples of that if it’s useful.
Johanna: Please. Yeah.
Lizzie: The one that’s most obvious that comes to mind is I went along to one on AI safety and children, partly because there was a rep from OpenAI there to speak about this. And I’m interested in that ’cause Australia’s obviously got a very sophisticated approach to safety in young people. And the conversation – particularly the contributions of the rep from OpenAI – did not discuss harms. In fact, they were applauded, right? Applauded for offering the opportunity to create bespoke educational resources for people.
And OpenAI was also applauded for having differential presentations or user experiences based on age – so essentially a form of age management – none of which was transparent or explained or clear, right? So no one talked about the suicides that have been attributed to OpenAI, or the growing movement to hold OpenAI responsible for this product, whether it will succeed or not. There’s a lot of consternation among parents associated with the use of these kinds of technology, right? So even if you don’t agree with it, that conversation is not even canvassed or part of it, right? So that is important to note.
The purpose of these summits is still shifting and shaping. I think there’s a role for civil society to play in trying to do work to reshape it. But at the moment, I think a lot of the questions around harms, around returning impact to people in local communities, giving local communities a say in how this technology unfolds – they’re not part of the conversation.
Johanna: Yeah. I mean, I have to say from my perspective as well, if you just map the communiques: the Bletchley Park communique was very much about, okay, we need to get the science, we need the international science reports – it was the establishment of that committee – and then Seoul was the establishment of the Safety Institutes.
Paris became less substantive, and the communique that has come out of the Indian summit really has very, very little in it in terms of how we are actually going to be shaping this technology for the benefit of humanity – which is obviously TPDi’s mission – but very much aligned in terms of a lot of what you were saying there, Lizzie.
And just to reinforce: certainly what I’ve been hearing is that most of the really interesting conversations – as is often the case, but I think especially so at this conference – weren’t happening on the main stages. It was actually the plethora of side events, which you were all speaking at, where the interesting conversations were happening.
And Nick, you were involved in several parts of that, both in terms of the research summit, but also some work with the Pacific. So what was your sort of key takeout from the discussions?
Nicholas: I wanted to plus-one this point: there was a noticeable absence from the advocacy end of civil society, the more rights-protecting and supporting end of civil society.
In terms of the main program, I will say from an AI safety perspective, you know, it literally has failed in terms of creating broad-based buy-in in the summit for discussing safety. The most intense safety discussions were, as you said, Johanna, outside the main venue. But there were still the standard advocates and holdouts for AI safety.
Yoshua Bengio, you know, was there, Jaan Tallinn was there, and Stuart Russell was going hard as well. So there was still a little bit of space. You know, people like Alondra and others who also slide into those kinds of daily harms – they were there, but they were kind of noticeable by their rareness, in a way that was quite interesting.
But also, from what Meredith was saying before, that really contrasted with how strong and well organised those groups were outside the main venue. So the first two days – where, uh, Meredith, you brought a whole lot of work and influence from TPDi’s work – was the MAP AI, organised by the Global Network Initiative: a fantastic set of really coordinated and deep discussions about power and influence from global civil society into the space, which in itself is super valuable, right?
So I think the commitment, moving towards Switzerland, from everyone that I’ve heard and spoken to since on this slightly more rights side of things, is: gosh, we have to, you know, twist the arm of the Swiss early to make sure that, despite the cost and distance, we’ll actually get something here and something there.
I was surprised by the fact that it was meant to be a south-oriented event, and I didn’t feel that the buy-in from, or the engagement of, the Pacific was at the right level. I felt that, you know, there was a little nod here and there, and obviously 89 signatories, as you said, to the declaration.
A lot of economies signing up to that. But in terms of actually seeing anyone from smaller and southern nations really on stage, feted, asked for their perspectives – you know, the one exception was Philip Thigo from Kenya, the president’s special envoy there, who, Meredith, came out with all the talking points that you guys have been saying: sovereignty is not about ownership, it’s about agency. And it was really fantastic to see Philip flying the flag. But I’m really hoping that, as we move into next year, that outside organisation continues to, you know, have influence and power, and then perhaps in Switzerland also gets recognised and legitimised in a way that will push back a little bit against that.
It’s just about the opportunities – the Kratsios speech from the US side of things, the head of the Office of Science and Technology Policy, who was very clear in saying any discussion of risk is bureaucratic nonsense; what we actually need to be doing is focusing on concrete opportunities, and we totally reject any form of global governance.
I feel like we’ve gone quite far in that direction in the official program.
Johanna: Yeah, it’s disappointing. And you think we would’ve learned some lessons, don’t you? So just flagging there for people. The next conference will be held in Switzerland. What I find interesting about that is that it’s being held almost back to back with Davos, which does not fill me with inspiration.
But if anyone can bring together a multi-stakeholder community, it is the Swiss. So I’m not giving up hope and it’s incumbent on all of us here and all of our listeners to get engaged with that. Meredith, one of the conversations that we had as feedback was at the event that Nick was just talking about there, where there was a standing ovation for one of the civil society advocates.
Maybe you can just briefly touch on what that was and what, what she got a standing ovation for, because I think that’s an interesting reflection.
Meredith: Um, yeah. Thanks Johanna. I’m actually really pleased to hear, Nick – well, not pleased necessarily, but it’s good to know – that I wasn’t the only one that picked up on that, because I do think that perhaps we could be framing this as an opportunity for the next years.
There was definitely, um, a lack of Asia-Pacific representation and voice. But I think, to pick up on your point there, Johanna, recently there’s been not just a binary notion of sovereignty, but a fairly black-and-white mood, I would say, across politics in general, but particularly relating to tech policy.
We’ve witnessed some of the global mechanics unfold, and I think in Australia sometimes we in particular absorb and consume a lot of very US-specific and perhaps EU-specific news. And so to sit there in that room at the MAP AI conference – it was a very refreshing experience to see some very impassioned speakers on the stage.
Johanna: In case you’re wondering, this event was held under Chatham House rule and that’s why Meredith is being careful not to name names here, but we are gonna try and have this particular speaker come onto the pod as a guest in the near future. So watch this space.
Meredith: But she spoke just so passionately, basically in terms of, you know, India and the south – the south is not just a market for the West.
There’s an opportunity there for them to actually really leverage all of the incredible scale and depth of expertise that’s already in play in these places, and to really stand up and work together. But there was an anger – it was an anger that was motivating. It wasn’t an anger that was, uh, necessarily fatalistic.
It was a real call to action, and the room was just going wild, honestly – even up the back of the room. I ended up messaging her after the session and just saying, I don’t know if you could see up the back, but it was quite something, how that was received.
Johanna: Jess, what about for you from a key takeout perspective?
What was something that you learned that you didn’t know before you went to Delhi that you’ve come away with?
Jess: I think that there is a real passion for looking at the different intelligences and, and how we support the inclusion of community intelligence into the broader AI intelligence and how we bring those together to make positive social impact.
I think that was definitely one of the really key things. And at the same time – because our focus as an organisation is on digital inclusion – still, like, 60% of the Indian population is not meaningfully connected. So there’s all of this amazing passion and students everywhere and real interest in how this matters to India.
But actually, you know, there is still a significant issue with people just not even being properly connected, particularly in rural areas and particularly women. So I think that real need to continue that focus on just the basics is so key to all of the conversations that we’re having, and to make sure we reach those people that, you know, are the last mile.
There’s a lot of talk about the last mile, and reaching the last mile of people that are not connected and that are not currently benefiting from some of this technology. I think there’s a real need to focus on that. So it’s not necessarily new, but I suppose it’s a confirmation of what really needs to happen across all of these conversations.
It’s not just in India, but actually across the world.
Johanna: Yeah. And picking up on that reflection, Jess, of, you know, the pace and the sophistication of the conversation around artificial intelligence, the different types of artificial intelligence, how it’s transforming things, but also that disparity in how it’s being received.
Nick, for you, were there any big surprises out of the conference, in terms of takeouts, where you’re like: huh, I did not expect that; or, that is a new perspective that my life is enriched by having been exposed to? I think it’s difficult to go to a conference like this and not have those moments.
So which one stands out for you?
Nicholas: I think the biggest one that stands out does relate back to our Pacific neighbours, because, as you mentioned briefly earlier, we had the opportunity to bring together nine digital leaders from across six Pacific nations at the Australian High Commission for a round table and a dinner.
And that group actually, you know, had been working together for a few days with the team at DFAT, with Australia’s amazing new cyber ambassador, Jess Hunter, who you know well. And so they’ve been working on this question of what is and what should be digital public infrastructure, and how does that relate to AI?
And I guess what was really gratifying was the commitment and the willingness to not get sucked into the hype and the marketing aspect of this, but to take it from the perspective of: okay, we can see India doing this, we see Kenya doing that, we see other Southeast Asian countries investing in different ways.
But what do we need, and how does it play across the region, so that we can share? Some of the countries are 20,000 people, right – some of the very, very tiny nations – and, you know, the likes of PNG and others are relative giants in the Pacific. But there are shared, overlapping cultures of isolation and really challenging access to telecommunications.
You know, some of the Pacific Islands only got internet landed in the country in 2017. So these are big connectivity and accessibility issues. It doesn’t make any sense to ask how you’re gonna use large language models to, uh, reinvent public services. What it says is: well, you know, what can we do together?
How can we be flexible? And yeah, it was really fantastic to just be able to listen and have that conversation – to deepen our understanding of where Australia can be a good neighbour in those engagements, but more importantly, to really just understand the conditions for investment in parts of the world where wifi can’t be taken for granted, uh, let alone access to Claude or ChatGPT.
Johanna: Hmm. Yeah, and I think that is what I was hoping for more of out of the India summit, because there are so many incredible stories of how India’s taken a technology but built it at speed and scale, at a cost that is affordable to these parts of the world – because, you know, the solutions that are coming outta the US, or frankly sometimes even outta China, aren’t going to meet the needs of these particular countries.
Meredith, what about you? What was your surprising takeout?
Meredith: I think what was surprising for me was trying to marry up these multiple conversations that I felt were happening. You know, on the one side we had the, you know, large platforms selling many amazing products, whether they be entrepreneurs or, you know, big tech.
And on the other side, we had these Global South, South-South conversations happening, and I was looking for the parallels across them. And I felt very struck by the reality that sovereignty alone, without institutional depth, isn’t really gonna hold, and there’s this notion of the state sort of being hollowed out that I think a lot of people are grappling to understand.
So while there was significant momentum behind different layers of the AI stack, whether that be, you know, compute and infrastructure or the foundation models, et cetera, sovereignty, or agency as we’ve expanded that term, really needs to be exercised across all six layers, particularly across governance and standards.
And there was far less attention being paid to whether public institutions have the depth to evaluate those layers in particular. So that probably ties to Nick’s comments around safety, and certainly Lizzie’s as well. But really, uh, it was quite surprising that we can talk in such a focused way about sustained investment into certain areas of that stack, but not necessarily the capability across the full stack itself.
So governments can go out there and fund their own infrastructure or their own models, but if they lack the internal expertise, they can’t negotiate from a position of strength within that system. I was a bit surprised and sad to see that there wasn’t a stronger understanding around the need to actually invest in that internal capability and to address the structural asymmetries internally.
Johanna: Absolutely. And then with government and their citizens, right? And citizens’ expectations of what government is, is doing. David, for you, what was the surprising takeout?
David: I think there was a real disparity between the public messaging and the private conversations. And so I think as –
Johanna: There often is, let’s be honest, like –
David: But I think it was really noticeable here. I mean, you know, there were even people that would say one thing publicly and then quietly they would say, you know, almost the opposite. So I think that was interesting. I mean, it was also very gossipy. I mean, you know, Bill Gates being pulled from the speakers list, uh, Jensen Huang not turning up. I mean, there was a lot of sort of political intrigue about kind of what was going on, and why one of those people was pulled and the other one didn’t show up. So I think, yeah, the private conversations were far more fascinating than the actual public and summit events.
Johanna: Lizzie.
Lizzie: So I suppose there’s a couple of things, if I can use my position as a guest here. The first one is that it was held over Chinese New Year, so there wasn’t this huge contingent of Chinese reps, whether industry or government, compared, as I understand it, to previous summits. So people might remember Paris last year, there was this big confrontation where JD Vance comes along and basically tells everyone he doesn’t like them and they’re gonna have to do what he says. And Europe, you know, gets itself very upset and starts doing stuff in its own way as a result. That’s my take on international affairs.
Anyway, the point is, this particular summit didn’t, didn’t quite reproduce that kind of international dialogue between superpowers.
Maybe because of when it was timed, maybe there’s other reasons, maybe some of that is now accepted and part of the furniture of discussing these topics, in a way that, at the time of the last summit at least, the Trump administration was quite new. The other thing that I think is worth mentioning that is surprising, I mean, it’s not really that surprising to me, but might be surprising to your listeners, is that basically everyone’s talking about a social media ban for young people. So if you’re Australian, you do have something to offer in these places, because people want to talk to you about that. Politicians were falling over themselves to announce that they’re pursuing it.
Some of the nuance that we’ve argued for, and that you’ve covered in your podcast, Johanna, to, uh, a painstaking degree, was not in those conversations. So tech policy is often made to suit the agendas of politicians. Like, I don’t mean to sound cynical, but what I mean is, if we wanna have nuanced, uh, policy that does reflect evidence and harms and the interests of people, there’s a lot of work to do.
You know, there are a lot of ways in which that can be circumvented. So that policy: some people love it, some people don’t. I think there’s a lot to it, but politicians clearly love making the announcement that they’re pursuing it. That tells you a bit about how policy is made and what you’ve gotta do if you’ve got a particular view on it that is different, perhaps, to the mainstream.
Johanna: And I think my mission in 2026 is to try and avoid duty of care going the same way that the social media minimum age ban did. But let’s see if I’m successful in that. I know there’s a lot of other people who have also given themselves that mission, so it’s not single-handed. Jess, what was your surprising takeout?
And then folks, I’m gonna do a round of, what did you walk away with from this summit? And a resource that you found at the summit that you wouldn’t otherwise have had. So just giving you the heads up that that’s coming next. Uh, so Jess.
Jess: It’s not so surprising, but it was really different to conferences I’ve been to in Australia or in Europe: just the passion that so many of the community had. You know, I had exactly the same experience as others, of people coming up to me and wanting to pitch something to me even though they didn’t even know who I was. Um, so I do think that kind of real interest in how does this benefit not just the whole of government or industry, but actually how does this benefit me, and how does this benefit my community, and how can I focus on supporting people to be able to participate. So I think, you know, that energy of having students and civil society and community organizations from all over the world, but mostly from India, that real sense of, we can do something and we’re all here and we’re gonna try and make it happen, I think was probably the thing that was most surprising to me. At the same time, alongside that, was the real need for connectivity as a key challenge, um, for so much of the global south. And I keep kind of having this conversation about where Australia fits.
In the global south, because technically we’re in the south, but we’re not in the global south, and technically we’re in the Asia Pacific, but nobody was talking about Australia. And so I think it’s kind of really interesting thinking about how do we take a role and what do we need to do in this space too.
Johanna: Yeah. And I do think that we can do something about this. It’s not a fatalistic perspective, but also we must. Like, there is an obligation for us to actually take the stewardship role. And that’s actually one of my favorite things about traveling to India: I always leave inspired, because they actually do approach this in such a uniquely different way.
And part of the reason that I was so fascinated with them hosting the safety summit is that the last time I had been there, the conversation was really, can we stop talking about safety and harms and talk about the opportunity, because we are trying to educate a billion people. And so the conversation is just a different conversation and a different way of engaging with artificial intelligence.
Not saying the risks aren’t important, but that, actually, the starting challenges are different challenges, and I think that is a really important dimension in terms of the dynamism that they bring to the conversations.
So Nick, for you key outcome, like what was something that you achieved or that you feel was a deliverable out of the conference for you and key resource?
Nicholas: I love the fact that Australia was overrepresented relative to other nations in that research symposium. It was fantastic to have Leming J and me both on a key top-level panel. That was really gratifying, um, and of course to highlight where Australia’s at in terms of its AI regulation approach, uh, and some of the practical tools. I also really came away re-energized about the importance of really structured stakeholder engagement. I spent a lot of time thinking and talking about even the mathematical ways to kind of pull that out of research so that it’s, you know, quantifiable views. But the best resource was the Monday–Tuesday group of civil society leaders: El and I from GNI, Ja Luck from the National Law University, Delhi, Isabella from Chatham House. The strength of that community, uh, I think is what’s gonna see us through, particularly in the work that all of us care about, which is that human impact.
Johanna: David, for you, what was your key outcome?
David: I think the conversation shifted from sort of this hypothetical existential risk to really more tangible impacts. And I think there was a big conversation around what does this mean for human capital? What does this mean for, uh, the future of work? Like, you know, what’s the role for humans in all of this? And I think particularly in the global south, where, you know, the focus is very much on education, and on using technology to, you know, elevate.
I think there’s a question about, like, is AI going to help do that? Or is it gonna be a problem, you know, going forward? So I think that’s a live conversation. The one resource, it wasn’t actually at the summit, but post the summit, I’d recommend Andrew Charlton’s speech, uh, to the Australian Business Economists conference yesterday, so the 24th of February, which I’m sure was influenced by what he experienced in Delhi.
And so I think it’s a pretty sensible and balanced discussion of the global economics at play. And where Australia’s strategic advantage potentially lies, particularly in the nexus between the energy and infrastructure and the software and services layer and the need to ensure that we, and most importantly, Australian workers capture the benefits.
And I think it’s imperative that we move quickly to assert our agency over those areas. So I think it was a good contribution.
Johanna: Yeah. We’ll pop a link to that speech in the pod notes. It was really heartening to see that. Meredith, on that note, what was your key takeout or resource?
Meredith: Perhaps just to end on a fun one, I’m gonna out myself as being a ginormous nerd, with the full caveat that at TPDi, I feel like –
Johanna: you’re in good company there, like –
Meredith: True. Um, the full disclosure is that TPDi is actually just about to launch a new project with Lego called the Tech Policy Youth Ambassadors Program. But this particular observation is entirely my own and entirely due to my own love of Lego.
But they’ve designed a new program that builds skills progressively for kids across computer science concepts. So the kids start out by imagining and building their very own Lego robot, as you would imagine, but then they get to work in groups to actually code and experiment with it through curriculum-aligned lessons that help introduce and socialize concepts for children around probability, bias, and machine representation. So they get to train with simple models, they get to unpack notions around misclassifications, and they begin to therefore understand how AI works, through hands-on, teacher-friendly, privacy-first, and classroom-contained experiences. So this is not a product plug, I really wanna say, but it is a healthy reminder,
I think, for me, that AI literacy doesn’t have to mean embedding children into commercial platforms. Not only can we equip kids with capability, but actually they’re more than capable of grasping what many adults, myself included, might treat as magic. And I know what my nephews are getting for Christmas.
Johanna: Awesome. Who haven’t I asked this question of, I’ve lost track, Lizzie and Jess, so let’s go Lizzie, then Jess.
Lizzie: Yeah. I have to reserve my judgment a bit, right? Because I went along and I felt like there’s, uh, not enough of a role that civil society is playing in some ways, but also like, is this a good use of my time?
Like, I’ve got a lot to do back in my own country, right? And maybe I shouldn’t be coming along to something that isn’t really designed for me, right? I don’t mean that in a judgmental way, I just mean as a way you allocate your time and energy as an advocate. And I had thought maybe that these summits would eventually, over time, become more like a COP, a conference of parties on climate change, where you have this kind of side festival going on, where civil society advocates get together and put forward proposals while states are talking and stuff.
And I’m not sure it’s really there yet, right? In that way. Maybe it becomes like that. It’s gonna be interesting to see how Switzerland runs it. I think it was Boris Johnson who said the best advertisement for London is a night in Geneva. So we can all make jokes about the Swiss, but the point I’m making is it is tricky to know whether to engage with these in some ways.
Uh, sometimes the worst thing is not being in the room, right? But other times it is a huge investment of resources in circumstances where there might be other places to put that. So I’m trying to reflect on that and work out what I might do next time, whether we wanna do something else instead. I would just say, I had a day when all the big heads of state were in town, so no one was allowed into the conference center. And I went and saw my friend, who’s a civil rights lawyer in India. She was making submissions in the Supreme Court of India on that day. And she took me along to watch, and she’s on her feet arguing, uh, in defense of civil rights in India.
And I thought to myself, oh, maybe litigation, which is my other life, right? Is one of the key ways you make change. She’s advocating for equality and, and freedom from discrimination based on religion in real time. And that brought home to me that there are lots of ways in which you can influence these processes politically.
This is one, but there might be more direct ones, and we’ve all gotta work out how to use our skills to their greatest strengths, I suppose is how I’d describe it, to make sure that that process is working. And so I wanna be involved in the process. Maybe this is where it’s at, but I’m not necessarily sure that it is, so I might reserve my judgment for a little bit longer.
Johanna: Yeah. And I do wonder if it’s a structural thing as well. So I definitely think Australia needs to be at this conference, it’s really important that we are there and present, but maybe it’s actually about encouraging the government to have more consultation with civil society before it goes, so reframing it in that way.
So creating opportunities. Nick, was that you wanting to jump in with something there?
Nicholas: I just wanted to say, to the point about what happens at home domestically for all the countries that were present: while the AI summit was going on, pretty much in the same week, the head of the, you know, the Office of Science and Technology Policy in the White House was saying, we reject kind of global governance, and, you know, any regulation is a drag on opportunity.
Four separate US states passed, either through the house or out of committee, really critical policy movements on AI companions and proposed laws around addiction, you know, laws that, frankly, China’s had on the books for a long time. But these states are moving, and so, yeah, to kind of take a signal about the cross-national scene from this one conference when, as Lizzie said, there are people fighting every day through courts, litigation and committees and parliaments...
It’s important to look down and see where that real impact is happening as well.
Johanna: Yeah, absolutely. And Jess, with the closing observations in terms of key output and any resource.
Jess: I mean, I just would say again that I think there were really good examples in the sessions I was at about the use of AI to benefit real issues.
So, you know, Digital Green is looking at how they support farmers to better prepare for disasters, by using their information as well as predictions for climate around where flooding might happen. Like, that kind of thing is really helpful to people and the lives that they’re living. And so it’s actually ultimately about the outcomes that we’re looking for.
What are we trying to create here? Not just the use of the tools. And so I think there are really, really interesting examples of how that’s actually happening now, and it is really benefiting people and their lives. So I think that definitely was one of them. Also, again, I’m really interested in this: how do we make sure that there is a broader collective intelligence being used, not just out of all of the large language models? So how do we kind of connect those two together? And there was a really interesting organization, the Collective Intelligence Project, that is doing global dialogues, bringing together people from all over the world, and I think they’ve done like 70 countries, lots and lots of different people.
The only issue is you have to be online, so that’s one of the challenges for it. But it definitely is looking at how do we make sure that there are broader conversations happening, broader groups of people investing in the information in the large language models and in how we use the AI. So it’s not just about the views of the people that are already in the space.
So that was the collective intelligence model, and you can download their data. It’s really awesome.
Johanna: So awesome. Well, we’ll pop a link to that in the pod notes. And I love ending on a positive note. By the sounds of it, a fantastic experience for people to be on the ground, perhaps some questions in terms of how this needs to be shaped to be impactful, and, um, optimism for what the Swiss might be able to do with this.
They’ve got the muscle memory for it, so we’ll be watching very closely. Thank you so much to all of you for generously giving your time in this somewhat chaotic, but I think wonderful, insight into what it was like in Delhi on the ground. And we hope to see you all on the podcast again sometime soon, and keep doing the extraordinary work that you are all doing, because it is making a difference.
Thank you.
Well, that’s it for this episode of Tech Mirror, which is brought to you by the Tech Policy Design Institute. We are based here in Canberra on the lands of the Ngunnawal Ngambri people.
If you found today’s conversation useful or thought provoking, please do share it with a friend or a colleague, or leave a review and subscribe wherever you get your podcast.
If you’re watching, please don’t forget to like and follow. For show notes, you can visit techpolicy.au/podcast. And this podcast was made possible thanks to the generous contributions from government, industry and philanthropy to the Tech Policy Design Fund, the full details of which are available on our website.
The team at Audiocraft produced this pod on the lands of the Gadigal people of the Eora Nation and Amy Deme provided invaluable research support.
Music is by Thalia Skopellos.
A big thank you also to the team at the Tech Policy Design Institute, without whom this pod would not be possible.
Thank you for joining us and as always, get in touch and get involved.