Friday, April 14, 2023

How Misinformation Acts Like a Virus

The ongoing struggle against lies and fake news

By Christie Aschwanden

“Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity,” by Sander van der Linden (W. W. Norton & Company, 368 pages).

In the early months of 2022, U.S. intelligence officials announced that Russian propagandists were preparing to release a fake video and images that purported to show Ukrainian aggression as a pretext for Russia’s planned invasion.

“The goal of these campaigns was not to convince foreign audiences that Russia’s military invasion was somehow justified, but to make it harder for people to discern fact from fiction,” writes Sander van der Linden in his new book, “Foolproof: Why Misinformation Infects Our Minds and How to Build Immunity.”

The misinformation, he says, was meant to distract political analysts and fact-checkers by forcing them to track down and debunk information that was obviously false. 

One of the obstacles facing the effort to dismantle disinformation, van der Linden suggests, is that while “lies and fake news tend to be simple and sticky, science is often presented as nuanced and complex.” In other words, where factual scientific information is full of caveats, misinformation and conspiracy theories operate in certainties.

Another hallmark of conspiracy theorists: rage. “By analysing the language used in hundreds of thousands of tweets from the most popular conspiracy theorists on Twitter, we found that they express much more negative emotions — particularly anger — compared to their popular science counterparts,” writes van der Linden, a psychologist at the University of Cambridge.

In their analysis of language patterns among conspiracy theorists, his team also found that they tended to “talk much more about other groups and power structures,” and they used swear words more often than the comparison group.

Van der Linden proposes that misinformation and conspiracy theories represent “viruses of the mind” that can latch onto the brain and hijack its “basic cognitive machinery.” Like other viruses, misinformation spreads via contact with other infected people, but van der Linden asserts that it can be stopped with a “psychological vaccine” — one that “does not require any needles, just an open mind.”

According to van der Linden, the idea that it might be possible to deliver a cognitive vaccine against propaganda was first proposed decades ago by psychologist William J. McGuire, whose research laid the groundwork, van der Linden says, for the concept of “prebunking.” 

McGuire hypothesized that if you gave people a detailed warning about the kind of propaganda they would encounter before they were exposed to it, they would be more likely to view it as misinformation and less prone to accept it. McGuire published a summary of his findings in 1970 in a Psychology Today article titled “A Vaccine for Brainwash.”

McGuire died in 2007, but van der Linden and his colleagues have built upon his work in numerous experiments. One of these tested a tool that van der Linden’s team created to train people to spot misinformation. 

“Bad News” is an online game that gives players a chance to become nefarious producers of fake news. The game allows players to try out the “six degrees of manipulation” — techniques that van der Linden’s group has identified as hallmarks of misinformation.

These include discrediting (“a technique that deflects attention away from accusations by attacking the source of the criticism”); deliberately playing on emotions like fear and moral outrage to get people riled up; provoking polarization; posing as experts or legitimate news outlets; promoting conspiracies; and trolling.

The researchers tested players’ ability to spot fake news before and after they’d taken part in the game, and in a dataset of 15,000 trials, they found that “everyone improved their ability to spot misinformation after playing the game,” van der Linden writes. 

People who’d fared worst at spotting fake news in the pre-game quiz made the biggest improvements. But the improvements were modest: “On average, players adjusted their ranking of fake headlines downwards by about half a point (0.5) on the 1-7 reliability scale after playing.”

In another experiment, van der Linden’s team exposed volunteers to a specific set of real-world misinformation about global warming, but forewarned them that “some politically motivated groups use misleading tactics” to suggest that climate scientists disagree about the causes of climate change (when in fact the vast majority agree humans are to blame). The researchers found that this warning made volunteers less vulnerable to accepting climate misinformation.

People exposed to two prebunking scenarios updated their estimates of the scientific consensus on human-caused climate change by about 6.5 to 13 percentage points, with the more detailed forewarning producing the biggest improvements. And the findings held regardless of the volunteers’ initial attitudes. “We were not just preaching to the converted,” he writes.

Van der Linden makes the case that it’s possible to inoculate people against misinformation, but if there’s one lesson that’s come from the Covid-19 pandemic, it’s that developing a vaccine is one thing; convincing people to take it presents a whole other challenge.

One promising approach would be to embed prebunking material into social media platforms. Van der Linden writes that his research group, along with cognitive scientist Stephan Lewandowsky at the University of Bristol, collaborated with Beth Goldberg at Jigsaw (a unit of Google, which owns YouTube) on some short prebunking videos that could be embedded in YouTube’s non-skippable ad space.

The videos started with a warning that the viewer might be subjected to attempts to manipulate their opinion, followed by an explanation of how to spot and counteract this variety of misinformation, and finally a “microdose” example of that kind of manipulation to help them identify it in the future.

Tests found that people who’d seen the videos “got much better at identifying which posts contained a specific manipulation strategy and they were subsequently less likely to want to share this type of content with others in their social network,” van der Linden writes.

The problem? YouTube didn’t end up using them. Although the company’s specific objections were never shared, it’s clear that stemming the flow of misinformation may not always be in social media companies’ best interests.

Van der Linden is at his best when he’s describing the problem of misinformation, but his efforts to produce actionable solutions often feel unsatisfying. The 11 “antigens” he presents to help stop the spread of misinformation read more like sage observations about the nature of misinformation than actionable steps for countering it.

Consider, for instance, antigen number 4: “Minimize the continued influence of misinformation,” since, he writes, the “longer misinformation sits in our brains the more influential it becomes.” While this may be true, it’s not easy to act on, and van der Linden offers little advice on how to do so. Similarly, antigen number 2, “Incentivize accuracy: Create an environment where people strive to be accurate rather than political,” sounds wise, but how can it be done in our current environment of political sectarianism?

The book ends with a chapter titled “How to inoculate your friends and family,” which offers advice that’s well-meaning but arguably weak. Sure, a technique that van der Linden calls “fact-based inoculation” (warning people that they are going to be exposed to misinformation about a specific topic) may work in some cases, but has it ever convinced someone’s conspiracy-touting uncle? 

And there’s only one person I know who could get away with van der Linden’s suggestion to use stand-up comedy or entertaining videos to deliver a good prebunk, and he’s a professional comic.

“Technique-based inoculation,” which warns people about specific techniques used to spread misinformation rather than trying to debunk specific ideas, might be easier to implement, and indeed van der Linden writes that he’s found in both his research and personal life that “unveiling the techniques of manipulation encounters less resistance than trying to tell people what the facts are.”

His solutions address a very difficult problem, and it’s hard to fault him for failing to provide satisfying answers. We’re facing a crisis of public trust at the very moment that we’re being bombarded with misinformation, and there are no simple fixes. Van der Linden’s book expertly lays out strategies for counteracting misinformation. If only they were easier to implement.

Still, he remains hopeful. “We are not defenceless in the fight against misinformation,” he writes. “The first step to countering it lies in your ability to spot and neutralize” its devious techniques. To that end, “Please treat this book as your guide to defeating the dark arts of manipulation.”

This article was originally published on Undark. Read the original article.