These psychologists found a better way to teach people to spot misinformation

A YouTube ad campaign prepared viewers to identify common deception tactics.
Misinformation doesn't have to be deliberate to spread on platforms like Twitter. Deposit Photos

A strong defense against online misinformation may be to administer a digital vaccine: Exposing yourself to common deception methods may help you recognize sensationalized headlines, misleading TikToks, or social media fabrications in the future. In collaboration with Google and its tech unit Jigsaw, a team of psychologists added short videos to YouTube’s ad lineup, educating people about how to spot common misinformation tactics. In an online campaign, they found these clips were an effective way to help people distinguish real news from fake.

People who watched the videos were better able to identify misinformation techniques than those who didn’t see the clips, as the team reports in a study published in the journal Science Advances today. “It’s very possible on social media to reduce vulnerability and susceptibility to being manipulated,” says Jon Roozenbeek, a postdoctoral fellow at the University of Cambridge and the lead author of the study. “Maybe not all misinformation, but you can demonstrably improve people’s ability to detect when they’re being manipulated online.”

Misinformation is the spread of false information, even when the person sharing it doesn’t intend to mislead. It crops up regularly in our daily lives, says Sabrina Romanoff, a clinical psychologist who was not affiliated with the study, and it can be something as small as misremembering something you saw on television and passing along the wrong information. “You can think of it as analogous to the childhood game of ‘telephone,’” explains Romanoff, in which small errors become magnified through repetition. But through the megaphone of social media, wrong or misleading claims can become a harmful way to distort the truth.

[Related: The biggest consumers of fake news may benefit from this one tech intervention]

Anyone can fall prey to misinformation online, Romanoff says, though people who click on a story consistent with their pre-established beliefs are more susceptible. Being prone to impulsivity and feeling an overload of information could also make you more likely to spread fake news. 

The current study focuses on inoculation theory, in which people learn to recognize misinformation techniques before encountering them in the wild. Roozenbeek compares this approach to a vaccine: Introducing a weakened virus or virus-like material primes your immune system to recognize and destroy the pathogen in the future. Unlike fact-checking, which debunks misinformation after the fact, inoculation takes a preemptive approach, aiming to keep people from being taken in by misleading content and spreading it in the first place. “The idea was to inoculate people against these tropes, because if someone can successfully recognize a false dichotomy in content they’ve never seen before, they’re more resilient to any use of that particular manipulation technique on social media,” Roozenbeek says.

Roozenbeek and his team created five 1.5-minute videos covering common tactics used in online misinformation. To avoid bias toward any one group of people, the videos were designed to be nonpolitical, fictitious, and humorous. In the lab, the team randomly assigned more than 6,000 participants to watch either a video showing how to identify misinformation techniques or a neutral video that acted as a control. Afterward, the participants were shown 10 made-up social media posts that were either manipulative or neutral.

Roozenbeek then partnered with Google to expand the study. As part of a public ad campaign on YouTube, nearly 23,000 people watched one of two anti-misinformation videos. One covered the use of negative, exaggerated emotional language to encourage clicks and belief in fake news (Sample headline: “Baby formula linked to horrific outbreak of new, terrifying disease among helpless infants. Parents despair.”). The other covered false dichotomies, which present two points of view or facts as the only available options (The headline: “Improving salaries for workers means businesses will go bankrupt. The choice is between small businesses and workers. It’s simple mathematics.”).

Within a day of seeing the ads, a randomly selected one-third of the viewers were given a test question on YouTube asking them to identify the manipulation technique used in a headline or sentence. People who had watched the videos were better able to pick out misinformation techniques and misleading content.

“Finding a significant effect was actually quite surprising,” Roozenbeek says. Unlike in a controlled laboratory setting, people on the internet can easily get distracted by other ads and videos. There was also no guarantee people actually watched the clips: while the ads could not be skipped, viewers could have muted the sound or switched to another tab. “But despite all that, we still found a large and robust effect.”

[Related: Connecticut will pay a security analyst 150k to monitor election memes for misinformation]

Roozenbeek and other psychologists are wrapping up another study that looks into how long it takes for people to forget what they’ve learned from the videos. “It’s not reasonable to expect someone to watch a video once and remember the lesson for all eternity. Human memory doesn’t work that way,” he says. Ongoing results suggest people might need a ‘booster shot,’ in the form of repeated video reminders. Another project in the works will use Twitter to see how watching these videos affects people’s behaviors, specifically how much they retweet misleading content.

To stay vigilant against misinformation as you scroll through the internet, Romanoff warns about these six common tactics:

  • Fabricated content: Completely false or made-up stories
  • Manipulated content: Information is intentionally distorted to fit a person’s agenda
  • Misleading content: A person deceives others, such as presenting an opinion as a fact
  • False context of connection: A person strings together facts to fit the narrative they are trying to convey, such as news stories using real images to create a false narrative of what happened
  • Satire content: A person creates false but comical stories as if they were true
  • Imposter content: A story is created with the branding and appearance of a legitimate news story, but is false, such as someone creating a video using someone else’s logo to seem legitimate
