Artificial intelligence has disrupted the way many of us work, study and even play. The people building it are convinced it will cure cancer, turbocharge productivity and solve climate change.
Fifteen years ago, there was a similar wave of optimism around social media, with promises of connecting the world, catalyzing social movements and spurring innovation. While it may have delivered on some of these promises, it also made us lonelier, angrier and occasionally detached from reality.
Few people understand this trajectory better than Maria Ressa, the CEO of news organization Rappler. In 2016, she reported on how Rodrigo Duterte, then president of the Philippines, had weaponized Facebook in the election he’d just won – and she was awarded the Nobel Peace Prize in 2021 for using “freedom of expression to expose abuse of power, use of violence, and growing authoritarianism in her native country.”
In the first episode of a new Globe and Mail podcast, Machines Like Us, Ms. Ressa joins host Taylor Owen to discuss how AI’s promising future sounds all too familiar, echoing the challenges we faced during the rise of social media.
You have spread awareness about how social media can polarize us and make us more radical. What do you think of artificial intelligence?
I’ve become far more cynical, because I think what I used to dismiss as naiveté is now quite dangerous for the world. And generative AI, it makes me crazy. First, let’s say that in How to Stand Up to a Dictator, the book that I wrote, there were two men that I actually focused on: the dictator, like Duterte, but the bigger one was Mark Zuckerberg, because his power is global in scope. And it makes me crazy that there was absolutely no responsibility for it.
If you look at the new round, these are similar companies. They’re the same companies in some cases. If you look at large language models and generative AI, the idea is speculative in nature. No one has actually proven yet that this will work. What we have seen with generative AI is very focused applications that work, largely in the medical field, which creates the promise. But they jump into it. They make large assumptions without any evidence.
When you talk about the information ecosystem, it’s a whole different ballgame from doing generative AI for radiology, which is very specific.
Should the people developing AI make the process more accountable?
Every day that democratic governments do not exercise their power to regulate and continue to cede to big tech, even now in the age of generative AI, they fall for the lobbying, and for this idea that they don’t understand it. They don’t have to understand it; what they have to prevent are the harms. They need to step in and do this now, before it gets worse.
There’s an AI startup called Replika, which is offering you a constant companion. If the first generation of AI and social media weaponized our fear, anger and hate, this one is going to weaponize our loneliness. This is where governments, again, cannot abdicate responsibility. They cannot cede their power to regulate to the big tech companies.
It took a decade to figure out regulatory tools for social media platforms. Should governments expedite bringing in legislation to regulate artificial intelligence?
In my Nobel lecture in 2021, I called social media a toxic sludge. Now, what’s going to happen is that you are going to have a virtual world full of not just toxic sludge, but people will not know what’s real and what isn’t. And that will destroy trust even more.
I’ve said this so many times, without facts you can’t have truth, without truth you can’t have trust, and without trust you can’t have democracy.
What implications could AI have for trust in journalism?
Journalism is going to die in this age. If a majority of the internet is low-quality content, what happens when people tune out, when they distrust everything? That was actually what the Russians wanted to do, right? 2024 is an election year. How are we going to get people to care and understand that, despite the crap they are wading through, this is the moment when we must organize ourselves and our own communities to stand up for the values and the principles that are critical?
I know there’s a lot of doom and gloom, but what does a world without manipulative tech look like? Can we not make it better?
What are your thoughts on AI’s effects on search engines, which are one of the biggest tools people use to get information and news?
What happened with AI when ChatGPT walked in, in November 2022? When they rolled it out, they began an arms race, and so now you have 10 different large language models.
When you look at all 10 of those, none of them are transparent about what data they fed the machine. Stanford did a study that showed this: they all failed in terms of transparency.
Having said that, once Search Generative Experience really kicks in, that will kill search traffic to news sites. So what do we do? At Rappler, we started building our own tech, building tech for the public information ecosystem that ensures the integrity of information. That should be the government’s job, frankly. But they’re not doing it, right? They outsource this to private companies driven by profit.
How should governments be thinking of their role in democratically developing technologies, such as AI?
It’s kind of like building roads. Private companies can build roads, you have public-private co-operation that can do it, but this is not just roads. This literally touches the heart and mind of every person. This is the reason why information warfare can hit at the cellular level of a democracy, because we’ve allowed private companies to do this.
I think the first step for democratic governments is that they cannot abdicate their responsibility to protect their citizens. You have to own it; you cannot outsource it to these big tech companies. And you must limit their powers. Otherwise, every day that governments do not act, they lose more and more of their power.
Governments need to realize they do not want an Elon Musk, a person who has no accountability, determining whether Ukraine will be able to fight back against Russia.
They need to protect and empower the people who are building in the public interest, in meaningful ways. At least limit the private players, so they’re not experimenting in public, and make them responsible for the harms that they have created.
You have stated many times that we are in the last moments of democracy, potentially, this year. What do you mean by that, and why is it such a perilous moment right now?
In 2021, I said 2024 was going to be a tipping point year, because in half of the world, more than 60 elections are being held in 50 countries, and our information ecosystem is corrupted. We are being manipulated. How do we make our choices? Why did violence happen on Jan. 6 in the United States, on Jan. 8 in Brazil? This is not a coincidence, it is by design.
So this is the year we need to take our agency back. And I continue to appeal to those who have the power to change things right now, which are the big tech companies. Do you really want short-term profit over the death of democracy?
We’re already a quarter of the way through the year and we’ve had a few elections already. What gives you hope that we can make these sorts of big changes this year?
Poland! In the short term, you have to appeal to the people themselves. Civic engagement is what will take us through this, and Poland is a perfect example. There was a right-wing government that should have won. They were all set to win, but then the government in Poland passed an abortion law that was so brutal that it brought women and youth out on the street, and they voted! We are democratically electing authoritarian-style leaders, except in countries where citizens feel their backs are up against the wall.
AI has been heralded for its ability to provide custom-built answers for people’s niche and specific queries. Could that be problematic?
Every time you personalize, you tear that person away from the public sphere.
Let’s say we’re in a room full of 25 people, and you give every one of those 25 people their own reality, what they want, right? That’s not a room where we’re all together. That’s an insane asylum.
This is the world we’re building. You need to please get up, get off your chair and talk to your family and friends. This is the year that matters.
You have been on the front lines of tech disruption in news before, when news on the internet and social media were still new. How did that shape your understanding of journalism and its relationship to the medium?
Up until 2016, I loved technology. One of the best parts of working for CNN at that time was that we were one of the 12 test bureaus, so any new tech, we would test it, and we were live from everywhere. But I lived through that transition, from when a reporter on my team would have two weeks to do a story. You talk to people, you understand them, and after those two weeks you come out and you have a lot of stories.
Now, you don’t know who you’re getting the information from. Especially in an age when you can create video or create audio, you can make it up! This is the reason why I know gatekeepers are necessary. The gatekeepers are legally responsible for the public information ecosystem. News organizations are funny things, because we have a set of standards and ethics.
So, it’s funny to hear tech companies now say they can self-regulate. Well, such standards and ethics are only the first part; we are also regulated by law. And that is the biggest mistake democratic governments have made, because big tech has moved to a place where there are far more negatives than positives built into the design of these platforms.
Looking at the big picture, what can be the larger cultural impacts of this quickly developing tech?
With social media, I wish there were more academic studies about emergent human behaviour. Biologically, what happens when you are constantly on a dopamine high, when the synapses of your brain, which are supposed to go straight, constantly turn right? So there’s an evolutionary effect. But beyond that, what does it mean for our species?
Because I think this tech we carry, the cellphone we carry around with us everywhere, it’s transforming not just our systems of governance and the way we deal with each other, but us as a species.
This interview has been edited for length and clarity.