Blog Ipsa Loquitur

Published under Fourth Estate Chronicles

Facebook’s researchers published a study last week about content diversity on Facebook’s News Feed, where ‘content diversity’ means roughly ‘links to news stories written from a political ideology with which users disagree.’ For example, a diehard Republican’s News Feed with stories from Rachel Maddow, or a Democrat’s News Feed with Fox News links. The horror: reading something on the Internet that could challenge your worldview, or broaden your perspective! You could even be proven wrong.

Ha! Just kidding. When your political worldview is challenged, your original beliefs actually get stronger. And it’s not just politics: it’s any belief. It’s really, really hard to change your mind.

But still: lots of folks deliberately avoid news from a political worldview with which they disagree. I certainly don’t spend a lot of time reading what those idiots on the other side of the political spectrum think. I’m too busy nodding furiously at the articles gently massaging whatever part of my brain stores all my confirmation bias endorphins.

Facebook’s study, then, was about how people don’t click on News Feed stories from the Other End of the political spectrum.

Sidebar: now would be a good time to note that of the many Facebook Friends you have made, and the many Facebook Pages you have Liked, Facebook selects a subset of the updates/links/photos/etc. that those Friends and Pages post to show you. You and I could have the exact same 50 Facebook Friends and see wildly different Facebook News Feeds because the News Feed algorithm thinks we’re interested in different things.

For example, if you click on cat photos and I click on dog photos, the algorithm shows you more cat photos and fewer dog photos, and shows me more dog photos and fewer cat photos. Facebook (thinks it) knows what we want, and will (attempt to) give more of it to us.
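To make that feedback loop concrete, here’s a toy sketch in Python. It is not Facebook’s actual ranking code, which is proprietary and vastly more complicated; it just shows how “rank by what you clicked before” turns the exact same set of Friends into two very different feeds.

```python
from collections import Counter

# Toy model only: this is not Facebook's News Feed algorithm, just an
# illustration of the "you clicked it, so you get more of it" loop.

def rank_feed(candidate_stories, click_history):
    """Order candidate stories by how often this user clicked that topic before."""
    topic_clicks = Counter(story["topic"] for story in click_history)
    return sorted(
        candidate_stories,
        key=lambda story: topic_clicks[story["topic"]],
        reverse=True,
    )

# Two users with the exact same Friends (so the same candidate stories),
# but different click histories, get differently ordered feeds:
candidates = [{"id": 1, "topic": "cats"}, {"id": 2, "topic": "dogs"}]

your_clicks = [{"topic": "cats"}] * 5   # you click cat photos
my_clicks = [{"topic": "dogs"}] * 5     # I click dog photos

print([s["id"] for s in rank_feed(candidates, your_clicks)])  # [1, 2]: cats first
print([s["id"] for s in rank_feed(candidates, my_clicks)])    # [2, 1]: dogs first
```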

The Shocking Twist

You see where this is going: Facebook’s News Feed algorithm, carefully engineered to keep your eyeballs glued to Facebook dotcom, filters out those Other End stories and replaces them with more promising links. But only because you don’t want them there! Therefore, it’s not Facebook’s fault that people live in an ideological bubble, you see? They don’t want to see Other End stories and so Facebook doesn’t show them Other End stories. Done.

But don’t you need to have some kind of control group? Don’t you need to have some people who see all the stuff and some people who see less of the stuff, and then measure who clicks what stuff?
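For what it’s worth, a controlled version of the question might look something like the sketch below. This is purely hypothetical, not anything the study actually did: randomly split users into an everything-goes feed and the usual ranked feed, then compare how often each group clicks Other End stories.

```python
import random

# Hypothetical design sketch; this is not what the Facebook study did.
# Randomly assign users to an unfiltered feed (control) or the ranked
# feed (treatment), then compare click-through rates on Other End stories.

def assign_groups(user_ids, seed=42):
    rng = random.Random(seed)
    control, treatment = [], []
    for uid in user_ids:
        (control if rng.random() < 0.5 else treatment).append(uid)
    return control, treatment

def click_through_rate(clicks_on_other_end, other_end_stories_shown):
    """Share of Other End stories a group actually clicked, of those shown."""
    if other_end_stories_shown == 0:
        return 0.0
    return clicks_on_other_end / other_end_stories_shown

# If the control group (shown everything) clicks Other End stories at a
# meaningfully higher rate than the treatment group, the algorithm, not
# just user choice, is doing some of the filtering.
```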

Facebook’s approach doesn’t make a whole lot of sense to me, and it doesn’t make a lot of sense to Professor Zeynep Tufekci either, who writes:

As Christian Sandvig states in this post, and Nathan Jurgenson in this important post here, and David Lazer in the introduction to the piece in Science explore deeply, the Facebook researchers are not studying some neutral phenomenon that exists outside of Facebook’s control. The algorithm is designed by Facebook, and is occasionally re-arranged, sometimes to the devastation of groups who cannot pay-to-play for that all important positioning.

Essentially, Facebook is acting like the only variable in this study is the rate at which people click on stories. But the rate at which News Feed shows you Other End stories is also a variable: it changes per person, per day, per click, in a thousand ways we’ll never know, because the algorithm is super secret.
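Here’s the tangle in toy-arithmetic form, with every number made up: the clicks we observe are the product of the user’s willingness to click and the algorithm’s willingness to show.

```python
# Toy arithmetic for the tangled variables: what we observe (clicks on Other
# End stories) depends on the user's willingness to click AND on how many
# such stories the algorithm decided to show. All numbers are made up.

def observed_clicks(posted, exposure_rate, click_rate):
    shown = posted * exposure_rate   # set by the algorithm, per person, per day
    return shown * click_rate        # set by the user

# Same user (click_rate = 0.2), two different algorithmic exposure rates:
print(observed_clicks(posted=100, exposure_rate=0.8, click_rate=0.2))  # 16.0
print(observed_clicks(posted=100, exposure_rate=0.3, click_rate=0.2))  # 6.0

# The click counts differ even though the user's behavior didn't change.
# Without knowing how exposure_rate is set (and it's secret), you can't
# separate "people don't want Other End stories" from "the feed didn't
# show them Other End stories."
```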

As noted internet scholar and friend of the blog Professor James Grimmelmann said, “the study’s independent and dependent variables are hopelessly snarled.”

Tufekci again:

I’m glad that Facebook is choosing to publish such findings, but I cannot but shake my head about how the real findings are buried, and irrelevant comparisons take up the conclusion. Overall, from all aspects, this study confirms that for this slice of politically-engaged sub-population, Facebook’s algorithm is a modest suppressor of diversity of content people see on Facebook, and that newsfeed placement is a profoundly powerful gatekeeper for click-through rates. This, not all the roundabout conversation about people’s choices, is the news.

It’s telling that these “findings” were published in an appendix instead of front and center in the paper itself. The researchers might understand that this isn’t a good thing. Ideological bubbles where people are insulated from anything resembling a challenge to their viewpoint create badly polarized institutions. Those are probably Bad For Democracy.