The Guardian just published an excerpt from World Without Mind: The Existential Threat of Big Tech by Franklin Foer, titled Facebook’s war on free will. This bit, about the ubiquity of algorithms and the harms that ubiquity necessarily implies, really resonated with me:
Still, even as an algorithm mindlessly implements its procedures – and even as it learns to see new patterns in the data – it reflects the minds of its creators, the motives of its trainers. Amazon and Netflix use algorithms to make recommendations about books and films. (One-third of purchases on Amazon come from these recommendations.) These algorithms seek to understand our tastes, and the tastes of like-minded consumers of culture.
Yet the algorithms make fundamentally different recommendations. Amazon steers you to the sorts of books that you’ve seen before. Netflix directs users to the unfamiliar. There’s a business reason for this difference. Blockbuster movies cost Netflix more to stream. Greater profit arrives when you decide to watch more obscure fare. Computer scientists have an aphorism that describes how algorithms relentlessly hunt for patterns: they talk about torturing the data until it confesses. Yet this metaphor contains unexamined implications. Data, like victims of torture, tells its interrogator what it wants to hear.
These algorithms aren’t fundamental laws of nature, like the rate at which stars exhaust their fuel, or the time it’ll take the Moon to fall out of the sky and rejoin the Earth. People build these algorithms with a purpose, and it’s good to be mindful of that.
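To make that point concrete, here is a minimal, entirely hypothetical sketch of how a builder's motive can be baked into a ranking function. Nothing here is Amazon's or Netflix's actual code; the titles, ratings, costs, and the `cost_weight` knob are all invented for illustration:

```python
# Hypothetical sketch: a recommender's ranking reflects its builder's
# objective, not just the viewer's taste. Titles, ratings, and costs
# below are invented for illustration.

def rank(titles, cost_weight):
    """Rank titles by predicted appeal minus a cost penalty.

    cost_weight encodes the business motive: at 0, only predicted
    taste matters; raise it and cheaper (more obscure) titles
    float to the top of the recommendations.
    """
    return sorted(
        titles,
        key=lambda t: t["predicted_rating"] - cost_weight * t["stream_cost"],
        reverse=True,
    )

catalog = [
    {"title": "Blockbuster", "predicted_rating": 4.5, "stream_cost": 3.0},
    {"title": "Obscure gem", "predicted_rating": 4.0, "stream_cost": 0.5},
]

# Pure taste: the blockbuster wins.
print(rank(catalog, cost_weight=0.0)[0]["title"])  # Blockbuster
# Factor in streaming cost: the obscure title is promoted instead.
print(rank(catalog, cost_weight=0.5)[0]["title"])  # Obscure gem
```

Same viewer, same predicted tastes; the only thing that changed is a parameter someone chose on purpose.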
Tangentially, here’s a wonderfully researched bit of debunking by Maciej Cegłowski. He writes about a series of news articles that went viral recently, the premise of which is roughly ‘so many people are buying bomb-making components on Amazon, the Amazon algorithm now recommends users purchase bomb-making materials in conjunction with one another.’
In a word, no:
When I contacted the author of one of these pieces to express my concerns, they explained that the piece had been written on short deadline that morning, and they were already working on an unrelated article. The author cited coverage in other mainstream outlets (including the New York Times) as justification for republishing and not correcting the assertions made in the original Channel 4 report.
The real story in this mess is not the threat that algorithms pose to Amazon shoppers, but the threat that algorithms pose to journalism. By forcing reporters to optimize every story for clicks, not giving them time to check or contextualize their reporting, and requiring them to race to publish follow-on articles on every topic, the clickbait economics of online media encourage carelessness and drama. This is particularly true for technical topics outside the reporter’s area of expertise.
And reporters have no choice but to chase clicks. Because Google and Facebook have a duopoly on online advertising, the only measure of success in publishing is whether a story goes viral on social media. Authors are evaluated by how individual stories perform online, and face constant pressure to make them more arresting. Highly technical pieces are farmed out to junior freelancers working under strict time limits. Corrections, if they happen at all, are inserted quietly through ‘ninja edits’ after the fact.
Got that? Facebook is an attempt to undermine your free will as a service; it also undermines journalism as a service; and here’s one for the road: Facebook appears to have accidentally undermined American democracy as a service.
On the one hand, I will not shed a tear for Facebook when some upstart service takes its place. On the other, it’s hard to imagine Facebook’s replacement will be any more benign.