I must confess my love for short-form text social media platforms. I don’t like little videos. Photos are okay. But I can’t tell you how many hours I’ve whiled away reading other people’s quick thoughts on an endless scroll. The reading is a total crapshoot. Some of it is insightful, and some of it makes me feel like humanity is doomed. But I can’t stop.
Just the other morning, I woke up too early and didn’t have it in me to get out of bed. Not yet. So I picked up my phone. I checked my email quickly, which is another compulsion I’ve developed as a freelance writer. I am always waiting for that life-changing email to arrive in my inbox. It still wasn’t there. So then, without even really thinking about it, I opened Threads. And then I spent the next two hours reading and scrolling (and I know it was two hours because, in a fit of self-disgust, I checked my usage afterward).
Two hours! Of my one precious life! And in all that scrolling I didn’t even find anything good or memorable to read.
Finding something interesting to read is my primary motivator for opening these apps. I got hooked on Twitter because for years it was the place where journalists and writers promoted their work and people had conversations about it. But that started to change when Elon Musk bought it: to keep people on the app, reading the endless feed, the company changed the algorithm to stop surfacing posts with external links. I’ve heard Facebook does the same thing, and I suspect the Threads algorithm does too. Because although there is no shortage of text to read, very little of it is about whatever recent essay or article or short story the smart people have published elsewhere.
In fact, since Elon Musk bought Twitter, my social media life has experienced some turmoil. And that has heightened my awareness of something I had always been perfectly content not thinking about: the algorithms that control my experiences on these sites.
Algorithms, as we all probably know, are the protocols that determine most of our online experiences. They’re the code that hooks us by showing us more and more of what they think we’ll like. Or what they think will prevent us from clicking away. In the beginning, when Threads was new and fresh, there was a lot of talk about the algorithm controlling what updates everyone could see. Comments like, “I’m still getting used to the algorithm,” and “This algorithm is so sensitive,” were very common. More common than I’d ever seen on Twitter.
This algorithm chatter culminated in a Threads trend (perhaps the first) that I will call the “Dear Algorithm Post.” A Dear Algorithm Post is exactly what it sounds like: an update in the form of a letter to the algorithm, asking it for specific types of content to appear in the feed. Almost everyone on Threads wrote one of these. For days my feed was one right after another. Lists of personality types, hobbies, preferences, standards, and identifiers. On the whole, they were earnest attempts at finding community and like-minded conversations. But something about them grated against my sensibilities. Dear Algorithm? Seriously?
I couldn’t bring myself to write one. I tried to break through by making a joke about it, something like, “Dear Algorithm, please send me the jokers, the smokers, and the midnight tokers.” Like most of the jokes I make online, it fell flat. I think one guy got it. Maybe Threads is not the platform of classic rock, or maybe my reach on there is terrible, or maybe I’m too old for the crowd, or maybe I’m just an internet bitch who doesn’t know how to play along. Probably it’s a combination of all of the above. But what exactly does it mean that algorithms are so prevalent in our lives that we feel compelled to address them directly?
Kyle Chayka’s awesome and enlightening book, Filterworld: How Algorithms Flattened Culture, suggests there’s good reason to question our relationship to these mysterious algorithmic forces. Algorithms are everywhere, not only controlling our social media feeds but also the songs that play on the streaming platforms, the movies and shows that the streaming apps suggest we watch, what we buy, where we go out to eat, and where we travel. Chayka describes Filterworld as the web of various algorithms that influence modern life.
The algorithms were created to organize the vast and constantly growing amount of information online. For decades, Google has organized and prioritized everything on the internet to make searching it easier. When social media sites got crowded with content, recommender algorithms replaced chronological feeds, surfacing content not necessarily from the people you follow but whatever the algorithm predicts you’ll like based on your previous actions. And users have adapted, like it or not.
What’s important to understand about these algorithms is that they exist to make money for the companies that develop and own them. That means keeping us hooked to the feed. Algorithms are decision-making digitized and scaled, and they are designed by human beings to behave a certain way. Chayka uses the example of the man behind the curtain in The Wizard of Oz—something that seems like an all-knowing and mysterious entity is revealed to be something mundane and comprehensible.
Algorithms, because they’re made to promote whatever immediately attracts attention, also affect culture. All the decisions about what gets shown and what gets attention used to be made by people, like newspaper and magazine editors with subject-matter expertise deciding what to publish. But now, algorithms designed by engineers at monopolistic corporations control what we see and what gets attention. Everyone’s feed is made up of content that lots of other people have already interacted with and that, judging by your past behavior, you will probably pay attention to as well. And this is significant because artistic merit is not the same thing as mass attention.
Chayka writes that “the culture that thrives in Filterworld tends to be accessible, replicable, participatory, and ambient. It can be shared across wide audiences and retain its meaning across different groups who tweak it slightly to their own ends.” Algorithms promote sameness, in other words.
Chayka gives the example of the white-tile, distressed-wood, industrial-lighting aesthetic that’s so popular for coffee shops. In his travels for work, Chayka writes, he’s always able to find (through algorithmic search results) a similar place wherever he is in the world. Because this aesthetic is so easily palatable, it performs well on Instagram and drives more attention to the places that look the same. It perpetuates itself, reinforcing the idea of its own appeal.
Most people would scoff at the idea of going to a chain restaurant like McDonald’s while traveling, because if you’ve been to one, you’ve been to them all. Travel is supposed to be about new experiences. But the Instagram aesthetic is the same thing. Why would you want the coffee shop you visit on the other side of the world to look the same as the one back home? Chayka, who visited these places often, writes, “I wasn’t surprising myself with the unfamiliar during traveling, just reaffirming the superiority of my own sense of taste by finding it in a new place.” This sameness was comfortable at first, he found, but eventually felt hollow and inescapable.
The implications of this are real when you consider that Instagrammable coffee shops are replacing ones with local character and charm, places that don’t happen to have a photo backdrop that performs well with the algorithm.
When we passively consume what the algorithms recommend, we naturally seek out culture that soothes rather than surprises or challenges. That passivity depletes our curiosity and mutes our ability to be moved. It devalues cultural innovation, forcing creators to conform to the feed or get left behind. When culture doesn’t conform, it gets choked off from exposure and financing, because money flows easily to whatever performs well in the algorithms.
The platforms that use the algorithms aren’t stable. The algorithms can change anytime and without notice. They are guided by the whims of people who care primarily about squeezing as much money from our attention as possible.
The algorithms aren’t transparent. There is no way for us to know how they work or what factors their results are based on. They are unregulated. No one can be held responsible if an algorithm causes harm: feeding content that glamorizes anorexia to a person with an eating disorder, for example, or content that romanticizes depression and suicide to a person with clinical depression. And we can’t tell the algorithms when they’re wrong. No matter how many Dear Algorithm Posts we write, we really have no way of talking back.
If this makes you feel like your head is going to explode, Chayka’s book offers some hope. Technology has been affecting culture and the way we experience it forever. It’s ultimately a neutral force. And the age of algorithms is too new to know what the long-term effects will be. But Chayka suggests that awareness can help us become less passive and more conscious of what we’re consuming. We can seek out human curation. We can step out of our phone’s predetermined paths. We can, as I recently did, delete the apps from our phones and read the newspaper instead.
Thank you for reading!
Melinda