BuzzFeed News on how algorithms designed to increase engagement at all costs end up sending users to bizarre conspiracy theories:
How many clicks through YouTube’s “Up Next” recommendations does it take to go from an anodyne PBS clip about the 116th United States Congress to an anti-immigrant video from a designated hate organization? Thanks to the site’s recommendation algorithm, just nine.
The video in question is “A Day in the Life of an Arizona Rancher.” It features a man named Richard Humphries recalling an incident in which a crying woman begged him not to report her to Border Patrol, though, unbeknownst to her, he had already done so. It’s been viewed over 47,000 times. Its top comment: “Illegals are our enemies , FLUSH them out or we are doomed.”
The Center for Immigration Studies, a think tank the Southern Poverty Law Center classified as an anti-immigrant hate group in 2016, posted the video to YouTube in 2011. But that designation didn’t stop YouTube’s Up Next from recommending it earlier this month after a search for “us house of representatives” conducted in a fresh search session with no viewing history, personal data, or browser cookies. YouTube’s top result for this query was a PBS NewsHour clip, but after eight clicks through the platform’s top Up Next recommendations, it offered the Arizona rancher video alongside content from the Atlantic, the Wall Street Journal, and PragerU, a right-wing online “university.”
It’s not just one anodyne TV clip from PBS; BuzzFeed News documents, step by step, what happens after you finish watching a video. Even with a fresh user account that YouTube has never seen before, the algorithm steers people down bizarre political conspiracy theory rabbit holes. When asked why, the YouTubes and Facebooks of the world say ‘ah, you see, but this content does not violate our community standards, and also we are an international media company that cannot be certain whether VP Mike Pence is a time-traveling lizard Nazi or not.’
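If you want a feel for how such a click-through chain can be traced, here is a minimal sketch against the YouTube Data API v3. It assumes a hypothetical API_KEY placeholder, and it leans on the search.list endpoint’s relatedToVideoId parameter (which Google has since deprecated) as a stand-in for the real “Up Next” sidebar, whose personalized ranking was never exposed through any public API. Treat it as an approximation of BuzzFeed’s manual method, not a reproduction of it.

```python
# Rough sketch: walk a chain of "related video" recommendations, nine hops
# deep, starting from the top search result for a query. Assumes a valid
# YouTube Data API v3 key; the related-videos signal only approximates the
# logged-out "Up Next" sidebar BuzzFeed News clicked through by hand.
import requests

API_KEY = "YOUR_API_KEY"  # hypothetical placeholder
SEARCH_URL = "https://www.googleapis.com/youtube/v3/search"

def top_result(query):
    """Return the video ID of the top search result for a query."""
    params = {"part": "snippet", "q": query, "type": "video",
              "maxResults": 1, "key": API_KEY}
    items = requests.get(SEARCH_URL, params=params).json()["items"]
    return items[0]["id"]["videoId"]

def next_recommendation(video_id):
    """Return the top related video -- a stand-in for 'Up Next'."""
    params = {"part": "snippet", "relatedToVideoId": video_id,
              "type": "video", "maxResults": 1, "key": API_KEY}
    items = requests.get(SEARCH_URL, params=params).json()["items"]
    return items[0]["id"]["videoId"]

# Start from the same query BuzzFeed News used, then follow nine hops.
vid = top_result("us house of representatives")
for hop in range(1, 10):
    vid = next_recommendation(vid)
    print(hop, "https://www.youtube.com/watch?v=" + vid)
```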
Which is why I’m glad BuzzFeed News also catalogued the videos YouTube recommended for a while before eventually deleting them.
The platform does, at least, seem capable of finding and removing this content — though not, it would seem, until after its own recommendation system has helped these videos accrue tens of thousands of views. Clearly, the people behind these channels have figured out how to game YouTube’s recommendation algorithm faster than YouTube can chase them down, leading the company to recommend their pirated videos before eventually deleting them.
I’m not sure if “eventually deleting” is an improvement or not, but I hope YouTube can figure this out before it turns us all into unhinged conspiracy theorists.
…unless they want us to be conspiracy theorists?