Even Facebook Doesn’t Understand Facebook’s Algorithms

After all the hand-wringing that came from the “fake news” spectacle courtesy of Facebook’s news feed — the content you see when you log into Facebook from your phone or laptop — one thing has become abundantly clear. Even Facebook doesn’t understand Facebook.

And that’s the problem with relying on an algorithmic artificial intelligence (AI) that has been built (or more accurately, pieced together) over the years by hundreds of different developers and programmers.

This all became clear to me over the past few days as I mulled over the things I learned at the latest HealtheVoices 2017 conference, and after reading an excellent article about Facebook by Farhad Manjoo in the New York Times Magazine.

At the conference, a Facebook representative faced a somewhat frustrated (and at times, nearly hostile) questioning crowd about why the stuff they wrote as health activists and advocates seemed rarely to surface on other people’s Facebook news feeds. The only way it seemed to get engagement, said people in the audience that day, was to purchase it through Facebook (via a paid-for mechanism known as “boosting” a post).

The Facebook representative had no answer to these questions about why seemingly high-quality content wasn’t being showcased in people’s news feeds. Yet every single participant I spoke with — passionate, engaged health advocates — saw it as a problem with Facebook. But even Facebook couldn’t explain how to fix it.

It’s no wonder. The Times Magazine article shed light on the “why”: the Facebook news feed algorithm sorts through, on average, 2,000 pieces of possible content every time a person first loads Facebook and on every refresh after that.

There are so many variables that go into that complex, dark, and proprietary sorting algorithm that not even Facebook can say why something will or will not show up in someone’s news feed. It’s much the same kind of pain webmasters have long felt in dealing with Google and its search-indexing algorithm.
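To make that opacity concrete, here is a toy sketch of how this class of ranking algorithm behaves. All signal names and weights below are invented for illustration; Facebook’s actual model is proprietary and vastly larger. The point is that every candidate post collapses into a single score built from many weighted signals, so no single signal, and no single engineer, can fully explain why one post sank while another surfaced.

```python
# Toy news-feed ranker. Signal names and weights are invented for
# illustration; Facebook's real model is proprietary and far larger.

def score_post(post, weights):
    """Collapse many engagement signals into one opaque number."""
    return sum(w * post.get(signal, 0) for signal, w in weights.items())

def rank_feed(candidates, weights, limit=25):
    """Sort ~2,000 candidate posts by score and keep only the top slice."""
    return sorted(candidates, key=lambda p: score_post(p, weights), reverse=True)[:limit]

# Hypothetical signals; note the outsized weight on the paid signal.
weights = {"friend_affinity": 3.0, "recency": 1.5, "clicks": 0.5, "is_boosted": 10.0}
candidates = [
    {"id": "health-advocacy-post", "friend_affinity": 2, "recency": 1, "clicks": 4},
    {"id": "boosted-tan-listicle", "friend_affinity": 0, "recency": 1, "is_boosted": 1},
]
feed = rank_feed(candidates, weights, limit=1)
# The boosted post outranks the organic one purely on the paid signal.
```

Even in this tiny example, explaining a ranking outcome means tracing every weight and every signal value; at Facebook’s scale, with hundreds of signals tuned by hundreds of developers over years, that explanation is effectively out of reach.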

The fact that the algorithm can easily be bypassed by simply purchasing access to people’s news feeds is no surprise. It’s a large part of the reason Facebook makes so much money year after year. Even the poorest-quality content can buy its way onto your supposedly personalized news feed.

News Feed: Still a Work in Progress

You’d think that after so many years of hard work, attention, developers’ hours, and research into this algorithm, it would get some basic things right. But two recent anecdotes show me how far Facebook’s lauded AI still has to go.

The first is the concert meme that took hold of Facebook’s news feed in late April 2017. In this meme, participants list 10 concerts they’ve supposedly attended in their lifetime, but one of them is fake. It’s up to their friends to spot the concert they never actually attended and comment on it.

My initial thought when I first saw this pop up was, “Who cares?” because, frankly, I don’t care what concerts my friends have attended. If I were engaged in an actual conversation with my friends about the topic of “concerts we’ve seen,” it could potentially be cool. But as a conversation starter, I found it nonsensical because it comes out of the blue — socially connected to nothing. So I hit “Hide this post” on the first one I saw in my news feed.

Did that help stop the influx of this content onto my news feed over the next few days? Not one iota. Over the next two days, no fewer than a dozen such posts flooded my feed (even after I clicked “Hide this post” on at least two more), all of them of zero interest to me. So much for Facebook’s AI working.

The second is Bloom County. I’m a huge fan and rejoiced when Berkeley Breathed started drawing again. I look forward to each day when a new comic of his shows up in my feed. Yet mysteriously, Facebook has no clue about this. It stopped showing me his comics a few weeks ago, even though I studiously clicked on most of them in order to read them more easily in their enlarged form.

How could so many clicks result in a “not interested, stop showing this to this user” tag? Facebook couldn’t answer this question if it wanted to, because it has no idea how its news feed algorithm actually works — or doesn’t work — for individual users. This would be funny if it wasn’t such a big deal. According to the Pew Research Center, more than half of Americans get their news from Facebook.
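One plausible, and purely hypothetical, way so many clicks could still read as disinterest: if the ranking pipeline only counts an allow-list of event types as “interest,” then clicks that fall outside that list never reach the interest score at all. The event names and allow-list below are my invention, not Facebook’s actual logging.

```python
# Hypothetical interest pipeline: only an allow-list of event types counts.
# "view_photo" (clicking to enlarge a comic) is not on the list, so a page
# the user opens every day can still register as total disinterest.

INTEREST_EVENTS = {"like", "comment", "share"}  # invented allow-list

def interest_score(events):
    """Count only the event types the ranker is wired to notice."""
    return sum(1 for event in events if event in INTEREST_EVENTS)

# A user who enlarged every Bloom County comic but never "liked" one:
daily_reader = ["view_photo", "view_photo", "view_photo", "view_photo"]
casual_liker = ["like", "share"]
```

Under this sketch, the daily reader scores a flat zero while the casual liker scores 2, which is exactly the kind of inversion that would make a system quietly stop showing someone the comics they click on every day.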

Feedback Forms that Give Little Feedback to Users

Facebook claims it could assess (and work to fix) a lot of problems if users just used its feedback forms more often. But there’s a reason users hate giving Facebook feedback: its feedback system just sucks.

Facebook’s feedback forms give users very little feedback in return. Instead, they lead users down an array of “Here’s how to fix this yourself, idiot” questions that always seem to suggest that most of the problems you come across are yours to fix — not Facebook’s. There’s no human at the end of those forms, and no human response you’ll ever receive. This is the very definition of inhuman, which is ironic for a company that sees itself “building a global community” connecting everyone more socially. How can a company do this while eschewing any human contact with its users?1

AI is So Complex, Humans Need to Help

Many developers tell themselves that AI can pretty much solve any human problem, if given enough variables, datasets, and tweaking. But what Facebook has clearly shown is that whatever it’s doing to help solve its news feed problem, its AI is not working well for many, many people.

Nobody feels like Facebook is even listening. As I noted earlier, health advocates and activists I spoke with at the conference said they are not being heard, and social media companies like Facebook don’t seem to care. And on a closed Facebook group for news publishers, I hear frustration daily over how many respected news organizations watch their in-depth, investigative journalism languish on Facebook without paid-for boosting, while a breezy “5 Amazing, Quick, and Easy Ways to a Summer Tan” throwaway article racks up hit after hit.

Parting Shots on Today’s AI

Apple’s lauded Siri is often held up as a useful example of AI that works in today’s modern world. But as I was writing this article, I said, “Hey Siri, tell Nancy that I love her.”

[Screenshot: “Tell her I love her”]

Siri, in her infinite wisdom, got the recipient of the message correct (luckily, since I know only one Nancy). But the text she sent was simply, “I love her.” Siri obviously had no understanding of the actual meaning of my sentence; instead, she used rudimentary filters to figure out that I wanted to send a text to a person in my contacts with the literal message, “I love her.”
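A guess at what those “rudimentary filters” might look like under the hood (hypothetical; Apple has not published Siri’s pipeline): a shallow pattern match that pulls out a contact name and treats everything after “that” as the literal message body, with no model of pronouns or meaning at all.

```python
import re

# A shallow, hypothetical command parser: grab a contact name, then treat
# everything after "that" as the literal message body. With no understanding
# of pronouns, "I love her" gets sent verbatim instead of "I love you."

def parse_message_command(utterance):
    match = re.match(r"tell (\w+) that (.+)", utterance, re.IGNORECASE)
    if not match:
        return None
    return {"recipient": match.group(1), "body": match.group(2)}

cmd = parse_message_command("tell Nancy that I love her")
# cmd["recipient"] is "Nancy"; cmd["body"] is the literal string "I love her".
```

A pattern match like this gets the recipient right and the meaning wrong, which is precisely the behavior described above.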

This is the level of AI we’re working with today across big technology companies such as Facebook, Google, and Apple — somewhat useful, but frustratingly uneven in its quality and implementation.

I really hope that Facebook figures this out, because every day I find myself using it less and less as it becomes increasingly irrelevant to my actual interests and daily life. I may be in the minority today, but I suspect that will change if Facebook doesn’t resolve these problems of relevancy and interest soon.



  1. This is one of the reasons Facebook presenters at conferences often get pelted with so many questions — it’s the sole human contact most of us have ever had with this technology giant.

John M. Grohol, Psy.D.

Dr. John Grohol is the founder of Psych Central. He is a psychologist, author, researcher, and expert in mental health online, and has been writing about online behavior, mental health and psychology issues since 1995. Dr. Grohol has a Master's degree and doctorate in clinical psychology from Nova Southeastern University. Dr. Grohol sits on the editorial board of the journal Computers in Human Behavior and is a founding board member of the Society for Participatory Medicine. You can learn more about Dr. John Grohol here.

Last updated: 8 Jul 2018 (Originally: 2 May 2017)
Last reviewed: By a member of our scientific advisory board on 8 Jul 2018
Published on Psych Central. All rights reserved.