CLIP
You are listening to a Frequency Podcast Network production in association with CityNews.
Jordan Heath Rawlings
It happened without me even realizing it. In fact, it happened to me while I was criticizing other people for doing it. See, over the past couple of years, I have attempted to make the internet work better for me. Though, as you will see, there’s another term for that. I block unpleasant people liberally on social media. I have moved further towards subscribing to newsletters to get information rather than just browsing the internet. I have followed subjects and topics I enjoy spending time with, and I have muted all posts with words and phrases that annoy or offend me. And of course, I also spend time on TikTok, which basically does all that stuff I just mentioned for me algorithmically. So my online life is now much more pleasant. And yes, as you’ve guessed, largely without any dissenting points of view.
I’ve created my own echo chamber now, and I think my echo chamber is a good one. The sources in it are reliable, the reporters are trustworthy, but it definitely is an echo chamber. And every now and then, I am sure that I miss something that I’d understand better if I were open to hearing it, and I have probably been misled by mis- or disinformation. And guess what? So have you. How am I so sure? Well, you are about to meet the man who literally wrote the book on how misinformation spreads like a virus and how we can inoculate ourselves against it. But he falls for it too. So none of us are safe. Our echo chambers, no matter how good they are, don’t help, and exposure is inevitable. So what do we do? We vaccinate ourselves using a little bit of the virus that’s all around us, and then we vaccinate our loved ones. Here’s how.
I am Jordan Heath Rawlings. This is The Big Story. Dr. Sander van der Linden is a professor of social psychology in society and director of the Cambridge Social Decision-Making Lab at the University of Cambridge. He’s also the author of Foolproof: Why Misinformation Infects Our Minds and How To Build Immunity. Hey, Sander.
Sander van der Linden
Hey, Jordan. Pleasure to be on the show.
Jordan Heath Rawlings
Well, thank you so much for joining us, and I’m hoping we can dispel a few myths and offer some practical advice. I found your book helpful, though like so many books on disinformation, a little disquieting. Well, let’s get into it. So why don’t you start by telling me about this game you built. It’s called Bad News. Why did you build it? What did that teach you? Because I feel like after reading your book that this is kind of the core of the project.
Sander van der Linden
So Bad News is kind of a fully fledged social media simulation that allows people to sort of experience the worst of the internet, and in fact actively participate in it. And Bad News is a pun. It’s supposed to show people, kind of in simulated, weakened form, how online manipulators can take advantage of us by spreading disinformation, drilling down into some of the techniques that they use to dupe people that not everyone might be familiar with. And the thing with the game is that, you know, we started out doing lab research, you know, getting people into the lab and having them read something in an experimental setup. And we just thought, well, if we’re gonna produce some practical research on misinformation and how to help people spot it, this isn’t really gonna do it. I mean, nobody’s going to be interested in it.
So what can we do to make it fun, to make it interesting, you know, so it doesn’t feel like you’re listening to some boring media literacy lecture? And that’s really how we came up with the idea of the game, which was meant to challenge people. It’s meant to be kind of sarcastic and funny, and allow people on their own terms to come to grips with some of these, as I call them, you know, the dark arts of manipulation, in a really simulated sort of setting. And for research purposes, I mean, I’m not sure how much people care about this, but for us it was also an important development, because a lot of the testing that we do can’t really take place in realistic settings, or highly realistic settings, because, you know, we don’t own social media companies. We’re not able to actually run experiments on their platforms. And so we thought, let’s build one of our own.
Jordan Heath Rawlings
And we’ll return to Bad News in a little while and definitely get into those dark arts. But first, right in the intro to the book, you write that at a basic cognitive level, we are all susceptible to misinformation. And I just have to ask, how serious are you about that? Even me? And listen, I’m not trying to brag or anything, but I am extremely online. I’ve been a journalist for 20 years. You have a bunch of quizzes in the book, and I aced them all.
Sander van der Linden
That’s good to hear.
Jordan Heath Rawlings
Yeah. I find it difficult—I’m really not trying to brag. I find it difficult to think that I am susceptible to this stuff, and I imagine a lot of people who are very well versed in social media technology and news media especially, would feel the same way about themselves.
Sander van der Linden
Yeah, it’s a great question. You know, I gave my wife the test that’s at the end of the book, and she scored in, like, the 98th percentile or something. So she did better than me, and I came up with it. So I was impressed. But you know, when I was writing the book, the editors asked me, is it really the case that you are fully immune to it because you’re an expert? And I said, no. Not really. I mean, there have been instances where I’ve been duped, just because, you know, our brains have limited capacity to process information. We’re all under pressure. We’re overloaded with information. We rely on simple rules of thumb that are not always accurate. And so it is possible even for experts to get tricked and duped.
And I give one kind of example in the book, where, you know, I think it was about the NASA Mars rover landing, and Stephen King retweeted something. And, you know, here I go trusting Stephen King to tweet, you know, reliable content. So I retweeted it. It turned out it was kind of a fake video, with fake Martian winds. And it was realistic, because it was pasted together from real landings, but it wasn’t the one that was actually about to happen. The thing is that they were able to dupe me because my brain was expecting to see something. We’d heard that there’d be an announcement of the rover landing on Mars that month. I just forgot the exact date. And so when I saw it online, my brain just connected the dots and said, well, you were expecting to hear something about Mars. Here’s a video. This must be it.
But I think the moral of the story is that we can’t be scrutinizing every piece of information at that level of detail, you know, everything that’s coming at us, right? And so manipulators can take advantage of that and exploit us, and there are certain biases that are common to everyone, no matter how smart you are. And so one example is what we call illusory truth. And illusory truth is really the phenomenon that the more often you repeat something, the more likely your brain is to think it’s true, because the brain uses a concept called fluency as an indicator of truth, and fluency has to do with the ease with which a piece of information is processed. The more often you hear it, the more familiar it becomes and the easier it is for your brain to process. And that happens automatically. So if I keep telling you something that’s false, and you see this in experiments all the time, I mean, even stuff that people have knowledge about, that they know isn’t true, just the sheer fact of repeating it leads people to give it a higher truth rating.
I mean, they might not, you know, completely believe it, but they’re still more likely to believe it than if they hadn’t heard it repeatedly. And those kinds of biases are really difficult to overcome, because they’re automatic, and everyone is susceptible to them. And that’s why good liars and manipulators repeat stuff. And there’s a host of related sort of effects. So the more often you repeat something, the less morally questionable people think it is over time. So if you start out with something outrageous, but then you keep repeating it, what you find in experiments is that people find it less morally objectionable the 10th time they’ve heard it, because now it’s starting to seem pretty familiar, and, you know, you hear it over and over again, so maybe it’s not that crazy after all. And this is why we talk about platforming extremists, right? Because those ideas get repeated over and over again.
Jordan Heath Rawlings
Absolutely.
Sander van der Linden
And people, you know, people have complex feelings about deplatforming, and I completely get that. I mean, you know, freedom of speech and everything. But what the research shows, and these are just facts, is that if you deplatform what we call super spreaders, so people who are continuously just basically a fire hose of disinformation, it’s effective in terms of reducing the amount of disinformation on the platform. I don’t think it’s, to use the title of my book, a foolproof measure, because what you see is that they then jump to other social platforms and become even more insular, and their audience can become more extreme. You know, when Trump got booted from Twitter, he started his own social network, which he now runs largely unmoderated. And so there are these, you know, side effects that you’d have to consider when it comes to deplatforming, but it does work on some level.
Jordan Heath Rawlings
I’m glad you used the term firehose, and that you pointed out the NASA example, because we can’t possibly check and debunk everything. And when you are seeing everything, from politicians’ tweets to research and space news, it can feel like you are just getting slammed with it every moment of the day, at least while you’re online. Do we have a real sense of the state of disinformation and misinformation right now? Like, can you give me a sense of the scale of it? How bad is it right now?
Sander van der Linden
Yeah, it’s a great question. I mean, in the book I spend quite some time talking about, you know, history and context. I mean, if you wanna be a devil’s advocate, you can say, well, what historical period are you comparing it against? You know, when we were lynching witches, you know, during the Middle Ages, or when we were putting Jews in gas chambers during World War II, like, what’s the benchmark? And I think that makes the question really complex, and I try to, you know, give some nuance to it. But when people ask me this question, I think most of us are thinking about the last few decades. We’re not thinking about World War II. We’re not thinking about other, you know, horrible periods in history. We’re thinking about, you know, the last few decades. And then I think things are pretty bad, comparatively. I mean, let’s talk about Canada. You know, think about the truckers’ protest in Ottawa. You had a group of people who were actually quite separate. So you had anti-vax groups, then you had neo-Nazis, then you had anti-government protestors.
They were able to coordinate and connect and basically take over a whole city. And, you know, that’s one thing. Of course, with the Capitol riots in the US, you can see how that can escalate and undermine democracy, based on the false notion that the election was fraudulent and that it was stolen, a narrative that was repeated over and over again. A new report came out from the Council of Canadian Academies recently that estimated that misinformation during the pandemic resulted in approximately 3,000 unnecessary deaths from COVID-19, and in excess of $300 million that needed to be spent on hospital care that could have been prevented if people hadn’t been duped by misinformation. And so I think in terms of public health, in terms of, you know, trust in democracy, in terms of the violence that we’re seeing as it relates to extremism, things are pretty bad right now. Even if you can come up with some historical examples that were obviously very bad, I think on a scale of things, things are not good at the moment.
Jordan Heath Rawlings
So before we get to what we can do about that, maybe you can explain a little bit about how misinformation can spread like a virus? Because we are gonna talk about vaccination.
Sander van der Linden
Yeah, absolutely. And it’s interesting, because some people who read the book ask about this, and I would stress this: it’s not a loose analogy that I use. It’s not kind of a popular thing of, like, okay, well, you know, we’re in the pandemic age and so everything is kind of a virus. That’s really not the case. So if you study this, if you study what we call information pathogens, you can actually directly use models from public health. And some of these are called the susceptible-infected-recovered models, which were used during COVID and other outbreaks. You can actually take those public health, you know, computational models and apply them directly to the spread of information in social networks. And then you can actually see that information does spread very much like a virus. And you can even calculate R-naught, you know, the number of other people somebody goes on to infect once they’ve been exposed to a falsehood.
And so the analogy is actually quite literal in terms of how misinformation spreads like a virus. And I should say, I mean, not all kinds of information spread like a virus. So the public health model is what we call a simple contagion. You’re exposed to a piece of misinformation that’s false, then you become infected in the model, and you go on to spread it to x number of other people, depending on the structure of a particular kind of social network. They’re all a little bit different. Complex contagions are a bit more difficult. So when it comes to information, unlike certain diseases, you need a bit more. You might need to be exposed multiple times, by people that you know well, in order to, quote unquote, become infected. But those parameters can be adjusted in the models to account for that. So I think, all considered, that’s basically how we look at the spread of information, and how we know that we can actually use the viral analogy quite literally.
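To make the analogy concrete, here is a minimal sketch, not taken from the book, of the classic susceptible-infected-recovered equations applied to a piece of misinformation. All parameter values are illustrative assumptions, not fitted to real data:

```python
# SIR applied to misinformation: "infected" users are actively sharing a
# falsehood; "recovered" users have stopped sharing it. Simple Euler
# integration of the standard SIR differential equations.

def simulate_sir(population, beta, gamma, steps, dt=0.1):
    """beta: transmission rate (exposures that lead to sharing);
    gamma: recovery rate (how quickly sharers lose interest)."""
    s, i, r = population - 1.0, 1.0, 0.0  # start with one spreader
    history = [(s, i, r)]
    for _ in range(steps):
        new_infections = beta * s * i / population * dt
        new_recoveries = gamma * i * dt
        s -= new_infections
        i += new_infections - new_recoveries
        r += new_recoveries
        history.append((s, i, r))
    return history

# R0 ("R-naught") = beta / gamma: the average number of further people one
# spreader goes on to "infect" in a fully susceptible network. R0 > 1 means
# the falsehood takes off; R0 < 1 means it fizzles out.
history = simulate_sir(population=10_000, beta=0.5, gamma=0.2, steps=1000)
r0 = 0.5 / 0.2
peak_spreaders = max(i for _, i, _ in history)
print(f"R0 = {r0:.1f}, peak simultaneous spreaders = {peak_spreaders:.0f}")
```

The "complex contagion" variant mentioned above can be approximated by requiring several exposures before a node flips to infected; that changes the parameters but not the overall shape of the analysis.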
Jordan Heath Rawlings
If the viral analogy is literal, how literal is inoculation? And when you talk about that in this context, what do you mean?
Sander van der Linden
Yeah, so in the book, I make the case for this theory of psychological inoculation. And it really follows the biomedical analogy exactly, in terms of the theory at least. And so the theory is that just as you expose people to a weakened or inactivated strain of a virus to trigger the production of antibodies, to help confer resistance against future infection, it turns out that you can do the same with information, and that it works the same way for the human mind. So by preemptively exposing people to a weakened dose of disinformation, and by refuting it in advance, and this is the crucial part. Sometimes people repeat this and say, expose people to a weakened dose of disinformation, but that’s only half the story. You actually have to refute it in advance, so that people can build up resistance, or mental antibodies, and become immune to similar disinformation that they might come across in the future. And just as with your biological immune system, right, the more copies your immune system has of potential invaders, the more effectively it can mount an immune response.
And really, it works the same with the human mind. The more copies, the more micro-doses you can give people of the types of disinformation that are out there, and how people can spot it and refute it, the more people are able to actually resist and neutralize it. And that’s why I think it’s so important to train people’s psychological immune system. And of course, you know, in the research that we’ve done, there are all kinds of ways that you can extend this metaphor, and sometimes it doesn’t behave like the biological analogy exactly. So for example, you know, if you get three booster shots for certain vaccines, you might be immune for life. It would be great if it worked the same way for the human mind, but unfortunately the mind is complex and we’re distracted by lots of things. Politics, social environments, there are so many things that can interfere with why people believe the things they do, that there is this, you know, natural kind of wearing off of the vaccine, and you need to continually kind of boost over time. So I’m not suggesting that people can easily attain lifelong immunity against disinformation, but a lot of it works in the same way. And what we show experimentally in our research is that it’s quite promising.
Jordan Heath Rawlings
How can you approach that with somebody who, and this is where I said we’d get to the practical part of the advice, somebody who is dear to you but has been a victim of misinformation and disinformation? I think lots of people listening have that in their lives. Your work does seem to point out that the typical stuff we might do, such as argue with them, show them evidence, et cetera, can only back them into a corner and make it worse. So how do you, and you use the term pre-bunking, how would you prime a person like that for inoculation, I guess?
Sander van der Linden
Yeah, so I use that term because it’s easier to distinguish it from debunking and also because, you know, psychological inoculation is a bit of a medical term, so people who don’t like medical terms might prefer the term pre-bunking.
Jordan Heath Rawlings
It’s also not great when you sit your older relative down and say, I’m going to psychologically inoculate you right now. It’s an awkward way to start a conversation.
Sander van der Linden
Exactly. And so pre-bunking is a bit more of a gentle term, and it can intrigue people: oh, what is that? And so, I think the first thing, and this is, you know, we’ve done a lot of research with international organizations and social media companies, producing games and videos and ways you can scale this across millions of people. But I think the benefit for regular people is actually that you have the power of conversation to find out more. And that first thing is really hard to do on the internet when we launch these interventions. So, you know, when we do this on Facebook or YouTube, a lot of these social media companies claim that they can’t give us real-time information about, quote unquote, people’s infection status, in terms of how often they’ve been exposed to misinformation, and when and how. I mean, maybe they have this data, but, you know, they don’t want to look into it, or for privacy reasons, and there might well be good reasons for that.
But when you have a conversation with people, you actually have the power to find out their infection status. And I think that’s the most important thing. So you wanna find out, am I coming into this conversation in a purely prophylactic sense, which is the ideal scenario for inoculation: someone who hasn’t come across the misinformation yet, or who hasn’t been duped yet, and now you’re going to inoculate, or pre-bunk it? Really, that’s the ideal thing. Or is this somebody who’s already been exposed a few times and is starting to think about it? They’re a bit on the fence, but they haven’t been completely convinced yet. This is where what I call therapeutic inoculation is also still a good option. Or are you talking to somebody who’s already been radicalized, somebody who deeply believes in conspiracy theories, who’s already down the rabbit hole, you know?
You’re not gonna get much traction there with inoculation, very likely. And so what you need there is more of a de-radicalization strategy, which is going to be a little bit different. And so in the book, I focus more on the people we can save, and the power of protecting everyone else. And this is kind of interesting, because when you talk about family get-togethers, it’s often not that you need to convince your crazy uncle of something else; you need to vaccinate the rest of your family against what he’s about to say. And I think that’s sometimes the strategic focus that people get confused about. Everyone asks, how can I convert my crazy uncle? But that’s not gonna be easy. What’s gonna be easier is to protect everyone else by pre-bunking it.
And the thing is, what I talk about in this book is that the people who are spreading conspiracies, or buy into these things, have tropes and narratives that go back to the 18th century. People keep repeating the same tropes over and over again, and they’re predictable. And so you can actually anticipate them and warn people in advance. And so, to get concrete, how do you do this in a conversation? The first thing you do, after you find out people’s status more or less, is warn them in advance that there are manipulators out there who are trying to dupe people. And this is really important, to activate people’s psychological immune system, because most of the time it’s sleeping. It gets activated when we perceive that somebody’s trying to manipulate us. Nobody likes to be manipulated.
And so that kind of puts the brain in a heightened state of awareness. And so you want to tell people, look, there are actors out there trying to dupe and manipulate us. They’re trying to make money off of conspiracy theories. They’re duping people, and it’s really important that we, you know, tackle these things. But then you need to also give people the tools to actually identify it. And that’s why you want to give people the weakened dose, and refute something in advance. And if you’re talking to people who’ve already been exposed, or are already a bit kind of believers, you wanna take what I call the technique-level inoculation approach, which is a bit more gentle and a bit more indirect. So rather than getting bogged down in specific facts, you want to do something much more general. And so, I’m happy to give you a practical sort of example.
Jordan Heath Rawlings
Please. Yeah.
Sander van der Linden
So one of the strategies that extremists often use, on YouTube, in misinformation and elsewhere, is something called a false dilemma, or a false dichotomy. I mean, it’s used all the time by politicians too. An example would be, like, oh, you know, we first need to address homelessness in Canada before we can think about immigrants. Or, either you join ISIS or you’re not a good Muslim, right? And so the goal of a false dilemma is to present things as if there are only two options, while in fact there are many more. The idea is to get rid of moderation, to get rid of nuance. So you’re forcing people into extremes. And for a lot of people, that sounds pretty convincing the first time they hear it. It’s like, oh yeah, you know, we have these two things, and this is what we need to prioritize. But if you think about it more, it’s actually a technique that people use to get you to be more extreme.
And so what you wanna do then is not talk about hot button issues. You don’t wanna go into immigration, climate change, vaccinations. Instead, you wanna pivot to what I call this sort of technique level, unveiling the techniques of disinformation. And so you’re gonna say, look, the people out there trying to manipulate us, they use techniques like a false dilemma. And the way we do this in research is that the weakened dose is something very innocuous. And so, you know, I tell folks whom I know grew up with Star Wars, well, remember the scene where Anakin Skywalker says to Obi-Wan Kenobi, either you’re with me, or you’re my enemy? And then Obi-Wan says, only a Sith deals in absolutes. You know, we don’t wanna be dealing in false dilemmas.
And most people will agree with that and say, yes, that’s true. And then you can gently go from there into more specific issues and say, okay, well, how is this applied in the context of vaccines or immigration? Can you spot the patterns? Can you see how this is repeated? Like, I’m not saying that you need to get vaccinated. I’m not saying that you need to support immigration policy. I’m just, you know, saying that we can all now spot this technique in different contexts. And I personally find that most people are comfortable getting on board when you abstract it like that, instead of pushing them on specific hot button issues.
Jordan Heath Rawlings
I want to specifically ask you about echo chambers, which is something you talk about in your book. It’s something that everybody who works in this field talks about, to one degree or another. And it struck me as I was reading your stuff: I don’t know if there’s anywhere in my life that I exist outside of an echo chamber anymore. You know, I curate my own experience on every social platform I’m on, and every streaming service that I’m on. I mean, I choose the guests and the topics for this show. And, you know, I’d like to think of myself as somebody who’s open to all opinions and who seeks out the facts and all that stuff, but it is rare that I am actively encountering something that disagrees with my worldview. What do we know about the effect that has on people?
Sander van der Linden
Yeah, I mean, that’s a great way of describing it. Basically, the core idea is that if everyone lives in their own personalized form of democracy, where you’re self-selecting the viewpoints that you’re exposed to, the theory is, and also what experiments show, that people become more convinced of the beliefs that are being reinforced in their own echo chamber. And if that happens to, let’s say, two camps at the same time, then what you see is they’re drifting further and further apart. If you think of it as a network structure, you have two bubbles becoming more and more disconnected, because over time they’re digging in deeper into their own echo chamber, the reinforcement is getting stronger, their beliefs are getting more extreme, and they’re drifting further apart. So what you see is, effectively, polarization.
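That drifting-apart dynamic can be sketched in a few lines. The model below is a hypothetical illustration, not one from the book or the Cambridge lab: two groups whose members average opinions only within their own bubble, plus a small reinforcement push toward whichever direction the group already leans.

```python
import random

def polarize(steps=200, push=0.05, seed=1):
    """Two disconnected bubbles on an opinion scale of -1 to +1.
    Members move toward their in-group mean, then get a small
    reinforcement nudge in the direction the group already leans."""
    random.seed(seed)
    bubble_a = [random.uniform(-0.4, 0.0) for _ in range(50)]  # mildly anti
    bubble_b = [random.uniform(0.0, 0.4) for _ in range(50)]   # mildly pro
    for _ in range(steps):
        for group in (bubble_a, bubble_b):
            mean = sum(group) / len(group)
            nudge = push if mean > 0 else -push
            for k in range(len(group)):
                group[k] += 0.3 * (mean - group[k]) + nudge
                group[k] = max(-1.0, min(1.0, group[k]))  # clamp to scale
    return sum(bubble_a) / 50, sum(bubble_b) / 50

mean_a, mean_b = polarize()
print(f"bubble A: {mean_a:+.2f}, bubble B: {mean_b:+.2f}")
```

With no edges between the bubbles, each group converges internally while the gap between them grows toward the ends of the scale, which is the polarization pattern described above.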
Jordan Heath Rawlings
And when we talk about these vaccination techniques, or inoculation techniques, and I’m not gonna get you to go through all of them, the thing that kept crossing my mind, and why I brought up echo chambers, is how do you either get inside there to inoculate somebody, or bring somebody outside of it, in order to provide any kind of accurate information or inoculation? Because to your point earlier, you know, when you’re continually getting reinforced inside these little ecosystems, it can feel like a tiny dose is just totally insufficient.
Sander van der Linden
Yeah, absolutely. I think it’s a real issue. I mean, you know, some people who have reviewed the book talk about, you know, van der Linden’s stuff is encouraging, but then echo chambers bum me out, because how are we gonna get people out, even if we want to inoculate them? And I think there…
Jordan Heath Rawlings
It feels like people have to be a little bit willing to discuss it in good faith in order for the inoculation to work. And, and that’s not what I’ve seen in the echo chambers and that’s why I mention it.
Sander van der Linden
And I think this is a valid point. I mean, I completely agree with this. And so we’ve developed some strategies to try to get at this. I mean, the game Bad News, in many ways, was an attempt to make people aware of their own echo chamber. Because what you do in the game is you curate your own echo chamber and your own filter bubble, in some of our games. And we show people how that works. We even show this technique of false amplification, which is buying a huge number of bots to amplify a narrative so that you kind of only see one side of it. And so our hope was that that actually leads people to reflect on the idea that they might be stuck in an echo chamber. I think people also need tools to be able to get out of the echo chamber. You know, we talk to social media companies all the time about transparency and letting people opt out. We were in Instagram’s headquarters in London a few weeks ago, and we floated the idea of, you know, on Netflix, when you get that notification, I mean, I’m not sure if you do, or maybe I’m revealing too much about myself…
Jordan Heath Rawlings
Do you want to stop…you’ve been watching for six hours straight.
Sander van der Linden
Exactly.
Jordan Heath Rawlings
Maybe you should stop.
Sander van der Linden
Exactly. Do you want to keep watching? And I kind of appreciate that on some level, even if it’s like, what are you saying, man? But I feel like the internet and social media companies could do something similar. And Instagram actually thought this was interesting, because, you know, some of the feedback they get is afterwards, when people have gone down some rabbit hole and they poll them about their experience, people don’t like it. I think the traditional problem has been that social media companies take cues from people’s behaviour, and so they see people engaging, and so they infer, well, they must like it, because they’re spending all their time on it. But then when they actually ask people, it turns out people don’t like it. So I think we need to get people out. But also, there are certain techniques that you can deploy.
So one’s called cognitive infiltration, which really is about going inside an echo chamber and trying to get people out by, you know, posing as an in-group member, and then essentially trying to offer an alternative or additional perspective. Perhaps another, you know, more ethical option is to use influencers. And so what we’ve been thinking with inoculation, and this maybe gets to your question, is who’s actually the right person to do this? And I think for a lot of communities who are in their own echo chamber, they need someone from within the echo chamber to deliver inoculation. Otherwise, they’re not going to think that it’s credible or accepted. And I spend a lot of my time talking to, well, they’re not quite conspiracy theorists. They’re people who are distrustful but not completely on board with conspiracy theories.
And they kind of have one foot in that world and the other foot’s still in reality, and they’re often congenial to the idea that manipulation techniques are used and that we need to make people aware of them. And they seem to be willing to spread that sort of stuff into the conspiracy echo chamber. So you need somebody with an in, you know. And we saw that all the time during COVID. Religious leaders were important communicators for distributing the vaccine within certain religious echo chambers, right? Arnold Schwarzenegger talking about climate change to Republicans in California. You need members of groups that can influence their base, and have them propagate, quote unquote, the vaccine. That’s kind of what we’re looking at now in terms of how you would actually penetrate the echo chamber.
Jordan Heath Rawlings
So, I mean, people are gonna say this sounds threatening, but you wanna pick high-value targets, essentially, who can take it from there and continue to spread the message.
Sander van der Linden
Yeah, it sounds threatening. But, you know, maybe to rephrase that in social network terms: people who are highly connected within the relevant network. Yeah.
Jordan Heath Rawlings
Super spreaders. You just want to change what they’re spreading.
Sander van der Linden
Exactly. And so if we can target them to spread more reliable or accurate content, or, you know, if people don’t want to get into debates about what’s accurate and what’s not accurate, what they can do is just make people aware of the manipulation techniques that I talk about in the book and that other people talk about. Because really, that’s not a heavy lift for most people. I’m not telling you what to believe. Here are just some techniques, like polarization, conspiracies, trolling, people using negative emotions like fear to influence people. We can all agree that those are bad things. And so I think most people are willing to take that on board and become more aware of these techniques. And ultimately, I think that just empowers people to make up their own mind, instead of telling people what they need to believe. And I think that’s an easier way to go about it. Some people say, well, that’s too indirect. But I think, you know, at the end of the day, it doesn’t seem realistic to try to confront people and convince them of what you want them to believe. I think giving people the tools to dismantle deception and manipulation and misinformation is more feasible in the long term.
Jordan Heath Rawlings
Sander, thank you so much for this. Thank you for the book. It’s got some very practical advice, and I wish you the best of luck.
Sander van der Linden
Thanks so much.
Jordan Heath Rawlings
Sander van der Linden, the author of Foolproof: Why Misinformation Infects our Minds and How to Build Immunity. That was The Big Story. You can get more at TheBigStorypodcast.ca. You can get more of us, just sharing stuff and chatting on Twitter @TheBigStoryfpn, and you can tell us more about you by writing to Hello@TheBigStorypodcast.ca. You can find this podcast absolutely everywhere, and as I so often do, I will say, if you like it and you have a friend who likes podcasts who may not have heard of us, the best thing you can do is tell them to check us out. Thanks for listening. I’m Jordan Heath Rawlings. Have a great long weekend. We’ve got a little special treat for you on the holiday Monday.