Speaker 1:
CLIP
You’re listening to a frequency podcast network production in association with CityNews.
Jordan Heath-Rawlings
The other day I saw something awful out of Gaza. It turned out not to be true. A few days before that, though, I saw something awful out of Gaza and everyone told me it was fake, but it was true. I read reports on what’s happening in the region, on the civilians, including children and journalists, killed in Israel’s effort to eradicate Hamas. I sometimes listen to reputable experts on podcasts who try to give me context to better understand the situation, and then I open up Twitter or X or TikTok and I just have no idea anymore what’s real. I’m pretty media savvy. I navigated this stuff when Russia invaded Ukraine, and I wasn’t perfect at it, but I never found myself entirely lost. Something’s changed here. I don’t know whether it’s the platforms, the bitterly polarized sources of information, the lack of actual reporters on the ground because they keep being killed, or all of those things, but I’m lost. I don’t know what to believe when I see it. And I bet, unless you are choosing what to believe because it fits your own feelings, that you’re lost too. So what do we do about it? Is there anything we can do?
I am Jordan Heath-Rawlings. This is The Big Story. Dr. Valerie Wirtschafter is a fellow at the Brookings Institution in Foreign Policy and the Artificial Intelligence and Emerging Technology Initiative. She’s very busy talking about misinformation and disinformation these days. Valerie?
Valerie Wirtschafter:
Yes, quite busy, unfortunately.
Jordan:
Well, I guess the first thing that I’ll ask, this is something I felt, I don’t know if it’s true or if you’ve noticed it as well, but what’s different about parsing the misinformation or disinformation out of Gaza compared to past conflicts? And even just when Russia invaded Ukraine? There was a lot of disinformation around that. I didn’t find it as difficult as I do now.
Valerie Wirtschafter:
Yeah, I mean, I think that that’s completely fair. It’s hard to say, especially at scale, if it’s quantifiably worse, but it certainly feels worse; it feels more impactful. I think some of that is also because we see the real, tangible impacts that some of these types of claims might have on diplomacy. So backing up, I guess, to the Russian invasion of Ukraine: my colleague and I had actually looked at the spread of Russian propaganda in Latin America after that invasion, and we saw a massive spike, a flood of conspiratorial narratives to the region. Very state-driven, a huge boost in engagement. We saw that same huge spike after the initial attack on Israel, followed by the invasion of Gaza. And so it’s kind of natural that there’s this giant spike in the amount of content circulating online. I think in both cases there’s a supply and demand problem.
The supply of credible information is low, the demand for it is extremely high, and so in that space misinformation, and potentially disinformation, can circulate quite a bit. I think what’s different is that it’s hard to get a gauge, I think, on what is more manufactured and what is more organic in the case of the Israel-Hamas conflict, both because of a lack of data access and because of the many actors in this space. The conflict is also, I think, particularly messy because it really evokes strong reactions on both sides, which means evidence pinging back and forth, images pinging back and forth. People might be particularly prone to falling for recycled images, recycled video footage, misleading claims, et cetera. And so I think that that’s one big difference. Whereas in the Ukraine case, it was everyone versus the Russian propaganda apparatus for the most part, in this context there’s bias in information sharing across both sides. There are strong emotional reactions on both sides. And so it’s harder, I think, for a variety of reasons, whether it’s the nature of the information purveyor or the type of the conflict. And then of course, I think there are real platform-level changes, particularly at a place like X, that make it much, much more difficult on that specific platform to parse fact from fiction, versus what is really just unknown at this time.
Jordan:
And especially in terms of, for better or for worse, whatever you think of Twitter or X, it has been for a long time kind of where the world goes for on-the-ground images of breaking news. Maybe just tell us a little bit, for those who are lucky enough not to follow the drama surrounding an online platform: how has it changed? How has that ground shifted? What is different?
Valerie Wirtschafter:
Yeah, so I’d say Twitter, slash X, in the past was sort of this megaphone for real-time voices that gave that hyper-localized view into the world. It always had biases, always had challenges; it was not this perfect place at all. But there have been some shifts, especially with the takeover of the platform by Elon Musk, leaning more heavily into crowdsourced content moderation approaches, known as Community Notes. And that’s not inherently a problem, right? Wisdom-of-the-crowd solutions exist in a lot of places. Reddit is one. Wikipedia is pretty much wholly wisdom of the crowd. But especially in times that are fast-moving, crowdsourced solutions can really get things wrong. And so we’ve seen that in some cases, where they’ve put a community note on actual, real footage saying that it is fabricated; it doesn’t move fast enough. So the way that this particular content moderation approach operates is that it relies on consensus from what they describe as users of different perspectives.
And in a conflict where the emotional reactions are really, really strong on either side, that type of consensus is going to be really hard to find. However they’re defining consensus, it’s going to be far more difficult, I think, especially in a really polarized conflict. And then I think there are other changes, for those who, again, haven’t followed the drama as much: monetization practices that pay people for their content, for the eyeballs and the clicks on their content. Again, not inherently bad, but when you prioritize that sort of metric as a heuristic for payment, people are going to take advantage of it. They’re going to share things that are shocking to get that monetization, to get money on posts, so that the truth of the post is secondary to whether or not it goes viral. And then the last bit of this is the elimination of verified accounts.
So those were previously provided to public figures, journalists, businesses, government figures, notable commentators, things like that. And now anybody can pay for them. Not great. With that payment, you get a boost in the algorithm. And people, I think, still have a little bit of a heuristic about what these blue check marks stand for: you have something to contribute widely to the public square, as Twitter, or X, likes to think of itself. But now it’s really just anybody who pays a small fee. And so all those seemingly innocuous changes, chipped off one by one, maybe didn’t seem as consequential at the time as the collective impact of having them all exist together.
Jordan:
The other thing that you mentioned that really hit home to me is the information vacuum on the ground, which yes, existed in Ukraine, especially in the early days of the invasion. But when I think of what I’ve seen, or more importantly, I guess what I haven’t seen out of Gaza is just how few journalists are able to get information out and how many of them have been killed trying to do so.
Valerie Wirtschafter:
Yeah, and it’s horrible, because that’s the most critical thing at this point, right? Verifiable, fact-based, on-the-ground information. I think something over 60 journalists have been killed so far. I was reading that it’s something like the same number as throughout the entire Vietnam War, which is insane. The challenge is, of course, that in the past, I think before this conflict, there were around something like a thousand active journalists in Gaza, and freedom of the press is not a priority for Hamas, of course. So there are sort of two ways of getting information on the ground: you’re relying on journalists in country, very important, or you’re relying on the IDF, and neither of these is a perfect solution. The IDF provides the security, but one of their requirements is that if you come with them, they review your footage. And so, whether that review is purely to get rid of secret military operations or to hide IDF personnel so that they’re not publicly identified, something innocuous, potentially, that sort of requirement is immediately discrediting.
Whereas on the flip side, the recognition that the media environment has not been supportive of a free press, that political expression is highly restricted, and that journalists have been assaulted in the past in Gaza also does not bode well for the other side of the information space. And so the challenge is that regardless of what kind of on-the-ground voice you get, which is still very, very limited, either way it can be immediately dismissed with that degree of skepticism, given the implicit mistrust and our preexisting beliefs about the rightness and wrongness of either perspective.
Jordan:
I want to know how this happens, and I don’t mean how somebody decides to make a post, but how this cycle works. And I’m wondering if you can just take a recent, notable example, whatever the first one is that you can think of and just walk me through its life cycle from becoming allegedly breaking news out of Gaza to eventual debunking. And even after that, I guess it still lives on,
Valerie Wirtschafter:
And it often lives on even louder than the debunking. The debunking is the quiet part. The truth spreads much more slowly than something that is shocking or viral. And so we saw that with the hospital. That was sort of at the very start of the conflict, or maybe not quite at the very start, but a few weeks after: there was a rocket that landed outside or inside, the details are still complicated, but it landed at a hospital in Gaza, killing potentially hundreds. It was reported on, spread virally online, reported on by the mainstream media, by the New York Times, and it led to quite significant diplomatic consequences, in the sense that Biden was in the Middle East, he had tons of plans to meet with counterparts across the region, and those meetings were canceled. And so of course, accusations were hurled. Was this a Hamas strike? A Hamas rocket?
No, it was the IDF. No, it was a rocket. Back and forth, believe what you want based on your predisposed beliefs; people still don’t necessarily follow the evidence in this case. But the latest on that is, I believe it was Human Rights Watch that released a report, very matter-of-fact, did an investigation. And I think the finding they have leaned toward is that it did come from inside Gaza. Of course, they cannot fully, wholly conclude that, but it’s the best conclusion they’ve come to given the evidence. It’s important to get to the bottom of things, though often you can never get there fully satisfactorily. But that sort of reporting was much less important than the initial spike in vitriol that led to serious diplomatic consequences in this case. And so I think that is a particularly challenging example, one that led to actual tangible impacts, not just online but offline. Another example I can think of, more recently, is these photos of men surrendering in Gaza. They were already in their undergarments. These
Jordan:
Are circulating right now as we’re talking,
Valerie Wirtschafter:
Yeah, circulating right now. There’s a lot of talk of whether they are misleading or staged or propaganda; still unclear. Maybe this is tied to the IDF’s desire to share information, to share that they’re making progress, potentially. It’s still unclear exactly what happened or where these images come from, but potentially to show: oh, look at all the people, we’re rounding up all these Hamas fighters. But there’s this ambiguity or lack of clarity from leaders. So it kind of starts with that ambiguity or political agenda, which leads to the fomentation of conspiracies. So one of them that was circulating online was that the images were all staged, that they were different takes of the same person in this sort of staged display: look at the way the gun changed hands, et cetera. But they’re all different people. And so what we see in this case is that an ambiguity or a lack of confidence in the initial information source leads to pushback, and then it further escalates into the fomentation of conspiracies.
The evidence is taken out of context. It’s boosted by people who are trying to understand the initial context, and then the debunk comes later. Of course, these are screenshots from a larger video. And so the real video is just the same person going back and forth putting different weapons down. So it’s not anything about, oh, the gun shifted hands and these takes and this and that. But it’s kind of like a propelling cycle, or a vortex in some respects, where something is ambiguous, people latch on to alternative explanations, those spiral, and the evidence in that is suspect as well. And so it takes on a whole life of its own, and then the debunks come later, because that takes time. Do we
Jordan:
Have a full on debunk of this yet? Do we know what it actually is?
Valerie Wirtschafter:
So we have a debunk of the idea that it’s a take, but we still don’t know what the nature of the original recording was.
Jordan:
What is the average person supposed to do? And I include myself in this, because I like to think of myself as pretty media savvy. I was a journalist for a number of years. I’ve been doing this show for five years. I thought I could parse information out of Ukraine fairly well in that war, and I feel like I am lost here. And I’ll give you an example too, because the suspicion that all this misinformation and disinformation creates works the other way too. Just a warning that I’m going to describe a graphic image. A couple of weeks ago, there was an image of a Palestinian man holding what appeared to be a dead child, and it was gripping. It was awful. It was horrific. Immediately, I saw a quote-unquote debunking of that, saying, oh no, no, that’s a doll. And it made it into pretty wide circulation before, again, some reporting was done that indicated that this was obviously an actual child. And for me, as somebody who usually trusts himself in navigating this kind of stuff, it just stopped me in my tracks that I could be so mystified, minute to minute, by the stuff that’s coming out of here. And I guess all that is to say: when something like the image I described, or the clips from a video that you just described, pops up in my feed, what should I do?
Valerie Wirtschafter:
Yeah, I mean, I think it’s a challenge for everyone. It’s a challenge for me. It’s a challenge for anybody who is consuming information, period. We have our biases, and we struggle, I think, with information that goes against our prior perceptions. We are quick to endorse information that we think confirms them. And I think in the context of this conflict in particular, with those strong emotional reactions, we’re really quick to hop on narratives that seem to discredit the team we’re against or boost the team that we support. And so I think that’s really on display in this particular conflict. Beyond that, there are obviously challenges around sourcing, image origin, people capitalizing on that for those clicks, for that virality. And those are things we can catch if we take a minute, if we step back and remove ourselves from this rapidly moving, knee-jerk online space, where something pops across your feed and quickly disappears, and we might hop on it without clicking the link. There are some stats about that: the number of people who actually read beyond the headline before they share is very low. And so we saw a little bit of effort at Twitter to encourage people to read articles before they actually retweeted something.
Jordan:
That little notification that pops up, are you sure you want to share this? You haven’t read it.
Valerie Wirtschafter:
Exactly. And that was a nudge, and I’m sure they’ve collected evidence on how often that worked, but that’s that knee-jerk reaction. And it still exists, whether it’s around images or viral threads, videos, whatever it is. And so if there are ways to step back from that and remove ourselves, to be able to actually explore things a little bit, that’s, at the end of the day, not terrible. It’s hard. It’s never perfect. But there are ways to do better. There are ways to think about the sourcing of images, to look at expert credibility, things like that, to get a sense of: are we actually being fooled and sharing something that just confirms what we hope is true? Or are we sharing things that we know are true?
Jordan:
When it’s so difficult for the average person to navigate this type of stuff, it’s tough to feel like it’s all on you to discern what’s real or what’s not before you share it or send it to somebody. In the big picture, is this something that we are equipped to do something about? And by that I mean different levels of government, or regulations. And it’s always dicey when you’re talking about the sharing of information and getting government involved. But I’m just like: what should be done? Because obviously every time a new conflict arises or a new breaking news event arises, it seems to be getting worse.
Valerie Wirtschafter:
Worse, yeah. I mean, kind of thinking back to that example about the “do you really want to share this” types of nudges: encouragement at the platform level, I think, can be really valuable. Are there ways to encourage this, to tell people to pause a second before hitting share? I know Google has been rolling out some tools to help people better investigate images, the provenance of images and videos, where they’ve shown up before. So I think there are things that platforms can do that don’t really touch on questions of censorship and content moderation, which is a different discussion, but a very, very controversial and vibrant one. Just these kinds of nudges to encourage more thoughtful consumption of information. It does go against the profit incentives and the value of keeping people on platforms, keeping people engaged, keeping them clicking through their timelines.
But I think that’s hugely important. And then I think there is a really, really important role for unbiased sources of information. It’s perhaps the most difficult challenge, but I do think there are ways to at least do better, especially thinking about some of the missteps that maybe the New York Times made. And I think the baby images that you’re referring to were also picked up by seemingly reputable newspapers in Israel. So I think it’s really critical, especially at a time when trust in media is low, to take that sort of scrutiny of images, sources, information, and credibility really, really seriously. Especially at a time when the boundaries between true and false are really hard to see. And when it might not be the popular opinion, maybe everyone gets mad, but it’s the responsible thing to do, and I think it’s extremely critical to building trust in neutral arbiters of information.
Jordan:
Why is it a problem when there is so much mistrust out there? At the same time, while we debate fact from fiction, there is a real war happening, and homes are getting bombed and people are dying. And we’re sitting here wondering whether it’s true or not. And I’m not sure you’re equipped to answer this question. I know I’m not.
Valerie Wirtschafter:
Yeah, I mean, I don’t know. I think that it is really, really difficult to approach something like this, especially where some of the images are maybe challenging us to think a little differently about something that we had previously approached in a certain way. But I think it helps to have that kind of openness, and a baseline level of principles grounded in trying to get to the truth as best as we can, while also recognizing that the story is not simple. It is really complicated, and it’s probably even more complicated than the online information space. When something evokes really strong reactions, it’s doing so for a reason, and being open to understanding those reasons is really important, without contributing to the challenges at hand as well. That’s the best that I have to offer on that. But it’s really difficult, and hopefully we can at least try to make it less difficult, especially thinking about those cases where the online conversation has made diplomacy harder, for example, which I think are really, really critical.
Jordan:
Valerie, thank you so much for this.
Valerie Wirtschafter:
Thank you so much for having me.
Jordan:
Dr. Valerie Wirtschafter of the Brookings Institution in Foreign Policy and the Artificial Intelligence and Emerging Technology Initiative. That was The Big Story. For more from us, including previous reports on misinformation, which you can just search for, head to thebigstorypodcast.ca. That’s where you go. You can also send us suggestions or comments or questions or complaints; we welcome any kind of feedback, really, at hello@thebigstorypodcast.ca. If you can’t get off Twitter, me neither; you can find us there, we’re @TheBigStoryFPN. And you can call us and leave a voicemail: 416-935-5935. The Big Story is available wherever you get your podcasts, and wherever you do, don’t forget that we love ratings, we love reviews, and we especially love word of mouth, if you want to share us with a friend. Thanks for listening. I’m Jordan Heath-Rawlings. We’ll talk tomorrow.