
Misinformation is rampant on social media, and a new study has shed some light on why. Researchers from Yale University and the University of Southern California argue that some people simply develop a habit of sharing things on social media, whether they’re true or not. Although “individual deficits in critical reasoning and partisan bias” are commonly cited as reasons that people share fake news, the authors wrote in the paper, “the structure of online sharing built into social platforms is more important.”

Previous studies have found that some people, especially older people, just don’t consider whether something is true before sharing it. Other research has shown that some people, especially conservatives, are motivated to share news headlines that support their identity and match their existing beliefs, whether the headlines are true or not.

While the research team from Yale and USC accepts these as contributing factors to the spread of misinformation online, the researchers hypothesized that they may not be the only mechanisms that lead people to share fake news. Both the idea that people share misinformation because they lack critical thinking and the idea that it’s a result of partisan bias assume that people would share less fake news if they were sufficiently motivated or able to consider the accuracy of the headlines they share. However, the Yale-USC team’s research suggests that may not be the case.

Instead, the team argues that “misinformation sharing appears to be part of a larger pattern of frequent online sharing of information.” To support that, they found that the people in their 2,476-participant study who shared the greatest number of fake news stories also shared more true news stories. The paper is based on four related but separately conducted studies, all aimed at teasing out how habitual sharing affects the spread of misinformation.

[Related: The biggest consumers of fake news may benefit from this one tech intervention]

In the first study, 200 online participants were shown eight stories with true headlines and eight stories with false headlines and asked if they’d share them on Facebook. The researchers also measured how strong each participant’s sharing habits were on social media, using data on how frequently they had shared content in the past and a self-reported index of whether they did so without thinking.

As the researchers expected, participants with stronger sharing habits reposted more stories and were less discerning about whether those stories were true than participants with weaker habits were. The participants with the strongest habits shared 43 percent of the true headlines and 38 percent of the false headlines, while those with the weakest habits shared just 15 percent of the true headlines and 6 percent of the false ones. In total, the top 15 percent of habitual sharers were responsible for 37 percent of the false headlines shared in this study.

The second study, which included 839 participants, aimed to see whether participants would be deterred from habitual sharing after being asked to consider the accuracy of a given story.

While asking participants to assess headline accuracy before sharing reduced the number of fake headlines shared, the prompt was least effective for the most habitual sharers. When the most habitual participants had to assess a headline’s accuracy before being asked whether they would share it, they shared 42 percent of the true headlines and still shared 22 percent of the false ones. But when they were only asked whether or not they would share the stories, the most habitual participants shared 42 percent of the true headlines and 30 percent of the false ones.

[Related: These psychologists found a better way to teach people to spot misinformation]

The third study aimed to assess whether people with strong sharing habits were less sensitive to partisan bias and shared information that didn’t align with their political views. The structure was similar to the previous study, with 836 participants asked to assess whether a sample of headlines aligned with liberal or conservative politics, and whether or not they’d share them.

Again, the most habitual sharers were less discerning about what they shared. Those not asked to assess the politics of the headlines beforehand reposted 47 percent of the stories that aligned with their stated political orientation and 20 percent of the stories that didn’t. Even when asked to assess the political bias first, habitual sharers reposted 43 percent of the stories that aligned with their political views and 13 percent of the ones that didn’t. In both conditions, the least habitual sharers shared only about 22 percent of the headlines that aligned with their views and just 3 percent of the stories that didn’t.

Finally, in the fourth study, the researchers tested whether changing the reward structure on social media could change how frequently misinformation was shared. They theorized that if people experience likes and comments as rewards, those rewards encourage the formation of sharing habits, and that the reward structure could be changed.

To test this, they split 601 participants into three groups: a control, a misinformation training condition, and an accuracy training condition. In each group, participants were shown 80 trial headlines and asked whether or not they’d share them, before seeing eight true and eight false test headlines similar to those in the previous studies. In the control condition, nothing happened when participants shared a true or false headline. In the misinformation condition, participants were told they earned “+5 points” when they shared a false headline or didn’t share a true one, and in the accuracy condition they were told they earned “+5 points” when they shared a true headline or didn’t share a false one.

As predicted, both accuracy training and misinformation training were effective at changing participants’ sharing behavior compared to the controls. Participants in the accuracy condition shared 72 percent of the true headlines and 26 percent of the false headlines, compared with participants in the misinformation condition, who shared 48 percent of the true headlines and 43 percent of the false ones. (Control participants shared 45 percent of the true headlines and 19 percent of the false ones.)

The researchers conclude that their studies all show that habitual sharing is a major factor in the spread of misinformation. The most habitual 15 percent of sharers were responsible for between 30 and 40 percent of all shared misinformation across the studies. They argue that this behavior is part of the broader response patterns established by social media platforms, but that those platforms could be restructured by their engineers to promote the sharing of accurate information instead.