Even If Fake News on Facebook Is Flagged As Such, Our Bias Can Make It Skew as True
With the 2020 presidential election season moving into high gear, many people will get their political news on social media, especially Facebook. But a new study suggests that most people can't trust themselves to distinguish true news from false news on Facebook.
“We all believe that we are better than the average person at detecting fake news, but that’s simply not possible,” said lead author Dr. Patricia Moravec, assistant professor of information, risk and operations management at The University of Texas at Austin. “The environment of social media and our own biases make us all much worse than we think.”
For the study, researchers recruited 80 undergraduate students proficient with social media, who answered 10 questions about their own political beliefs. Each student was then fitted with a wireless electroencephalography (EEG) headset that tracked their brain activity during the experiment.
The students were then asked to read 50 political news headlines, presented as they would appear in a Facebook feed, and assess their credibility. Forty of the headlines were evenly divided between true and false; the remaining 10 were clearly true headlines included as controls, such as "Trump Signs New Executive Order on Immigration" and "Nominee to Lead EPA Testifies He'll Enforce Environmental Laws."
In late 2016, Facebook incorporated fact-checking into its platform and began flagging certain news articles as "disputed by third-party fact checkers." To test the effect of such flags, the researchers randomly assigned fake news flags among the 40 non-control headlines. The students rated each headline's believability, credibility, and truthfulness.
The researchers found that the students correctly assessed only 44 percent of the headlines, overwhelmingly rating headlines that aligned with their own political beliefs as true.
As they worked through the exercise, the students spent more time on headlines that supported their beliefs but were flagged as false, and showed significantly more activity in their frontal cortices, the brain area associated with arousal, memory access, and consciousness. According to the researchers, these signs of discomfort indicated cognitive dissonance: headlines supporting the students' beliefs had been marked as untrue.
But this dissonance was not enough to make the students change their minds. They overwhelmingly said that headlines conforming with their preexisting beliefs were true, regardless of whether they were flagged as potentially fake.
The flag did not change their initial response to a headline, even if it did make them pause a moment longer and study it a bit more carefully, the researchers noted.
Political affiliation made no difference in their ability to determine what was true or false, the researchers discovered.
“People’s self-reported identity as Democrat or Republican didn’t influence their ability to detect fake news,” Moravec said. “And it didn’t determine how skeptical they were about what’s news and what’s not.”
The experiment showed that social media users are highly subject to confirmation bias, the unintentional tendency to gravitate toward and process information that is consistent with existing beliefs, according to Moravec. This can result in decision-making that ignores information that is inconsistent with those beliefs.
“The fact that social media perpetuates and feeds this bias complicates people’s ability to make evidence-based decisions,” she said. “But if the facts that you do have are polluted by fake news that you truly believe, then the decisions you make are going to be much worse.”
The study was published in Management Information Systems Quarterly.
Wood, J. (2019). Even if fake news on Facebook is flagged as such, our bias can make it skew as true. Psych Central. Retrieved August 8, 2020, from https://psychcentral.com/news/2019/11/09/even-if-fake-news-on-facebook-is-flagged-as-such-our-bias-can-make-it-skew-as-true/151710.html