
Surprise: Facebook Says that Facebook A-Okay for News!

A study published last week in the journal Science claims to show that Facebook’s algorithm isn’t the cause of a hypothetical “filter bubble,” in which people see only news stories that align with their political leanings on a social network.

The only problem? It’s a Facebook study conducted by people who are employed by Facebook.

Should we really be all that surprised that Facebook’s own researchers minimize the impact of their company’s manipulations of people’s news feeds?

The study, Exposure to ideologically diverse news and opinion on Facebook, by Eytan Bakshy and colleagues at Facebook, was published last week in Science. How could this study be published in the prestigious journal Science?1 Isn’t peer review supposed to catch self-serving studies by companies that only demonstrate what the company wants the data to show?

One has to wonder how the peer reviewers let this study pass as published. Many critiques of the study have already appeared, so I’ll just do a quick roundup of the problems cited by others. This critique from Christian Sandvig pretty much sums up the problem with how Facebook’s researchers spun their own findings in the published study:

On the subject of comparisons, the study goes on to say that:

“we conclusively establish that…individual choices more than algorithms limit exposure to attitude-challenging content.”

“compared to algorithmic ranking, individuals’ choices about what to consume had a stronger effect”

Alarm bells are ringing for me. The tobacco industry might once have funded a study that says that smoking is less dangerous than coal mining, but here we have a study about coal miners smoking. Probably while they are in the coal mine.

What I mean to say is that there is no scenario in which “user choices” vs. “the algorithm” can be traded off, because they happen together […]. Users select from what the algorithm already filtered for them. It is a sequence.

I think the proper statement about these two things is that they’re both bad — they both increase polarization and selectivity. As I said above, the algorithm appears to modestly increase the selectivity of users.

Indeed, you’d think the Facebook researchers who, you know, work at Facebook, would understand that you can’t tease apart the algorithm and the user’s behavior, since one is dependent upon the other. Without manipulating the algorithm (which Facebook researcher Adam Kramer discovered is not a good thing to do without first obtaining users’ informed consent), you can’t really say why users are clicking on one thing more than something else.
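To make that sequencing point concrete, here is a toy sketch in Python. It is purely illustrative and not the study’s actual pipeline; the story titles, leanings, and three-item attention span are all invented. What it demonstrates is that the “user choice” stage only ever operates on whatever the ranking stage has already surfaced.

```python
def algorithm_rank(stories, user_leaning):
    """Stage 1: the newsfeed algorithm decides what appears and where.
    Toy rule: push attitude-challenging stories toward the bottom."""
    return sorted(stories, key=lambda s: s["leaning"] != user_leaning)

def user_choice(feed, attention_span=3):
    """Stage 2: the user clicks from the feed as shown; they can only
    'choose' among what stage 1 surfaced near the top."""
    return feed[:attention_span]

# Invented example data, for illustration only.
stories = [
    {"title": "story A", "leaning": "liberal"},
    {"title": "story B", "leaning": "conservative"},
    {"title": "story C", "leaning": "liberal"},
    {"title": "story D", "leaning": "conservative"},
]

feed = algorithm_rank(stories, user_leaning="liberal")
clicked = user_choice(feed)
# Any "user choice" effect measured on `clicked` is computed on the output
# of `algorithm_rank`, so the two effects compound rather than compete.
```

In a pipeline like this, asking whether the ranking function or the click behavior “matters more” is a question about two stages of the same funnel, which is exactly Sandvig’s objection.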

But Zeynep Tufekci, a professor at the University of North Carolina at Chapel Hill, wrote in her review of the Facebook study that what the study’s data really found was buried in the study’s appendix:

The higher the link, more (a lot more) likely it will be clicked on. You live and die by placement, determined by the newsfeed algorithm. (The effect, as Sean J. Taylor correctly notes, is a combination of placement, and the fact that the algorithm is guessing what you would like).

This was already known, mostly, but it’s great to have it confirmed by Facebook researchers (the study was solely authored by Facebook employees). […]

One novel finding is that the newsfeed algorithm (modestly) suppresses diverse content, and another crucial and also novel finding is that placement in the feed is (strongly) influential of click-through rates.

Facebook is showing you news you’re more likely to read (because it agrees with your political point of view), and the higher a story appears in your newsfeed, the more likely it is that you’ll click on it.

In other words, Facebook’s manipulation of your newsfeed continues to be an important contributing factor in determining what you will likely click on. And Facebook continues to manipulate that feed to show you politically aligned news stories more than it would if there were no bias whatsoever in its algorithm.

And, as Tufekci importantly notes, the researchers studied a tiny, select group of Facebook users, only because it was more convenient (and cheaper) for them to do so. They looked only at users who self-identified their political affiliation on Facebook and who regularly log onto the service, about 4 percent of total Facebook users (only 9 percent of users declare their political affiliation on Facebook, meaning this is not a sample you can generalize anything from). The study says nothing about Facebook users who don’t declare their political affiliation on Facebook, which is most of us.
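A quick back-of-the-envelope calculation, using only the two figures cited above, shows just how narrow that slice is. (The exact activity cutoff Facebook used isn’t specified here, so the implied breakdown is merely illustrative.)

```python
# Figures cited above; everything else is simple arithmetic.
share_declaring_affiliation = 0.09  # ~9% of users list a political affiliation
share_in_sample = 0.04              # ~4% of all users ended up in the study

# Implied fraction of affiliation-declaring users who also met the
# "logs on regularly" requirement (roughly 4% / 9%):
implied_active_fraction = share_in_sample / share_declaring_affiliation

print(f"~{implied_active_fraction:.0%} of affiliation-declaring users qualified")
print(f"~{1 - share_in_sample:.0%} of Facebook users are not represented at all")
```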

Could Facebook have conducted a more robust study? Sure, but it would’ve required more traditional research techniques, such as survey recruitment on the site combined with randomized email recruitment off-site.

So, here’s the unfiltered, unbiased finding from the data (courtesy of Tufekci) that Facebook’s researchers seemingly spun to say that what you see in their newsfeed is not their fault:

Facebook researchers conclusively show that Facebook’s newsfeed algorithm decreases ideologically diverse, cross-cutting content people see from their social networks on Facebook by a measurable amount. The researchers report that exposure to diverse content is suppressed by Facebook’s algorithm by 8% for self-identified liberals and by 5% for self-identified conservatives.

Or, as Christian Sandvig puts it, “the algorithm filters out 1 in 20 cross-cutting hard news stories that a self-identified conservative sees (or 5%) and 1 in 13 cross-cutting hard news stories that a self-identified liberal sees (8%).” You are seeing fewer of the news items shared by your friends that you’d disagree with, because the algorithm is not showing them to you.
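Those two framings are the same numbers expressed two ways; here is a minimal arithmetic check, using only the 1-in-13 and 1-in-20 figures quoted above:

```python
# Sandvig's "1 in N" figures, converted to the percentages Tufekci cites.
one_in_n = {
    "self-identified liberals": 13,       # ~1 in 13 cross-cutting stories filtered out
    "self-identified conservatives": 20,  # 1 in 20 cross-cutting stories filtered out
}

for group, n in one_in_n.items():
    rate = 1 / n
    # 1/13 ≈ 7.7% (reported as 8%); 1/20 = 5% exactly.
    print(f"{group}: 1 in {n} ≈ {rate:.1%} of cross-cutting hard news stories suppressed")
```

However you frame it, a measurable share of the cross-cutting stories your friends share never reaches your feed.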

Unfortunately, outside of the New York Times and a select few other outlets, most mainstream news media sites just reported what the Facebook researchers claimed in their study, without skepticism.

But now you know the truth — Facebook studied a small, non-representative sample, then downplayed what their data actually showed to emphasize a result that was more positive toward the company. To me, this is another example of Facebook researchers not really understanding the point of scientific research — to share knowledge with the world that isn’t biased and manipulated.

 

For Further Reading…

Zeynep Tufekci: How Facebook’s Algorithm Suppresses Content Diversity (Modestly) and How the Newsfeed Rules Your Clicks

Christian Sandvig: The Facebook “It’s Not Our Fault” Study

NY Times: Facebook Use Polarizing? Site Begs to Differ


Footnotes:

  1. Sadly, I now have to add another journal to my list of respected scientific publications that need further scrutiny. As long-time readers of World of Psychology know, that list gets longer every year: Pediatrics (which publishes studies about technology that are apparently reviewed by nobody familiar with basic research on causative relationships), Cyberpsychology, Behavior, and Social Networking (which apparently publishes any study about Internet addiction or social networking, regardless of its quality), and now, Science.


John M. Grohol, Psy.D.

Dr. John Grohol is the founder and Editor-in-Chief of Psych Central. He is a psychologist, author, researcher, and expert in mental health online, and has been writing about online behavior, mental health and psychology issues since 1995. Dr. Grohol has a Master's degree and doctorate in clinical psychology from Nova Southeastern University. Dr. Grohol sits on the editorial board of the journal Computers in Human Behavior and is a founding board member of the Society for Participatory Medicine. You can learn more about Dr. John Grohol here.

