In the weeks leading up to the US presidential election, Kacey Smith was feeling hopeful. Smith, who supported Vice President Kamala Harris’ campaign, says she knew it would be a close race between the Democratic nominee and Republican Donald Trump. But as she scrolled TikTok, she believed Harris would be victorious.
But as Election Day approached, she started to sense red flags in that positivity. She recalls TikTok answering her enthusiasm for reproductive choice with videos urging “women’s rights over gas prices” — implying, she thought, a false “either/or” choice. The rhetoric fit well inside her feed full of strangers, but as a campaign strategy, it felt limiting and risky. “When I started seeing that messaging play out,” Smith says, “I started getting a little uneasy.” Her fears were borne out: Harris lost both the popular vote and the Electoral College and conceded the election to President-elect Trump.
Filter bubbles like TikTok’s recommendation algorithm are a common point of concern among tech critics. The feeds can create the impression of a bespoke reality, letting users avoid things they find unpleasant — like the real people in Smith’s life who supported Trump. But while there are frequent complaints that algorithmic feeds could serve users misinformation or lull them into complacency, that’s not exactly what happened here. Voters like Smith understood the facts and the odds. They just underestimated how convincingly something like TikTok’s feed could build a world that didn’t quite exist — and in the wake of Harris’ defeat, they’re mourning its loss, too.
TikTok’s algorithm is hyperpersonalized, like a TV station calibrated exactly to a user’s brain. Its For You page serves content based on what you’ve previously watched or scrolled away from, and breaking out of these recommendations into other circles of the app isn’t easy. It’s a phenomenon political activists must figure out how to adapt to, says Cristina Tzintzún Ramirez, president of progressive youth voter organization NextGen America.
“It not only makes it harder for us to do our job, I think it makes it harder for candidates to do their jobs. It makes it harder for news media to do their job, because now you’re talking about having to inform a public that has so many different sources of information,” she says.
From the outset, the Harris campaign seemed to understand the power of these silos. On TikTok, where the Kamala HQ account has 5.7 million followers, an all-Gen Z team of staffers produced video after video that was, at times, indecipherable to the average person. If you saw a video stringing together clips of Harris saying things like “Donald Trump was fired by 81 million people” and “I have a Glock” with a gentle Aphex Twin song as the soundtrack, would you understand it as “hopecore”? The campaign bet that it didn’t really matter because the TikTok algorithm would carry it to people who did understand it. And at least to some extent, they were right.
Smith, like other TikTok users, knows that the platform recommends content to her based on what she watches, saves, comments on, or likes. When pro-Trump content came across her For You page, Smith would purposely not engage and simply scroll away.
“I don’t want my algorithm to think that I’m a Trump supporter, so I just want to scroll up and ignore it,” she says.
In hindsight, Smith wonders if that was the right thing to do or if a mix of different types of political content may have given her more insight into what the other side was saying, doing, and thinking. She likens it to being a liberal or progressive who consumes news from right-wing outlets like Breitbart or Fox News — not because you agree with the material, but because it’s helpful to know what messages are resonating with other types of voters.
The echo chamber effect isn’t limited to politics: we don’t even really know what is popular on TikTok generally. Some of what we see may not be guided by our preferences at all. A report by The Washington Post found that male users — even liberal men — were more likely to be served Trump content on TikTok than women. According to data from Pew Research Center, about 4 in 10 young people regularly get news from TikTok.
TikTok obviously isn’t the only filter bubble out there. Two years into Elon Musk’s purchase of Twitter, now called X, the platform has morphed into a right-wing echo chamber, with content boosted by Musk himself. While TikTok is simply (as far as we know) serving people things they like in order to sell ads, the slant on X was a deliberate electoral strategy that paid off handsomely for Musk.
“I don’t think we know the full implications of X’s algorithm being rigged to feed us right wing propaganda,” Tzintzún Ramirez of NextGen America says. A recent Washington Post analysis found that right-wing accounts have come to dominate visibility and engagement on X. That includes an algorithmic boost to Musk’s own posts, as the billionaire angles for influence with the incoming administration.
Unlike somebody drinking from Musk’s algorithmic fire hose, a young person deep in a pro-Harris TikTok bubble likely wasn’t being fed racist “great replacement” theory stories or false claims about election fraud. Instead, they were probably seeing videos from some of the hundreds of content creators the Democratic Party worked with. Though the direct impact of influencers on electoral politics is difficult to measure, NextGen America’s own research suggests that influencer content may turn out more first-time voters.
“I should know better than to be fooled”
Alexis Williams is the type of influencer that Democrats were hoping could carry their message to followers. For the last several years, Williams has made content about politics and social issues and attended the Democratic National Convention this year as a content creator, sharing her reflections with 400,000 followers across TikTok and Instagram. Though Harris wasn’t a perfect candidate in Williams’ eyes, she felt Harris would win the presidency in the days leading up to the election.
“As someone with a literal engineering degree, I should know better than to be fooled,” Williams says. She was fed TikToks about a bombshell poll showing Harris ahead in Iowa; young women in Pennsylvania going to the polls in support of Harris; analysis about why it was actually going to be a landslide. Professional polls consistently showed a dead heat between Trump and Harris — but after watching TikTok after TikTok, it was easy to shake off any uncertainty. It was a world full of what’s frequently dubbed “hopium”: media meant to fuel what would, in retrospect, look like unreasonable optimism.
TikTok and the Harris campaign didn’t respond to The Verge’s requests for comment.
For many voters on TikTok, the Kamala HQ content fit in seamlessly with other videos. The campaign used the same trending sound clips and music and a casual way of talking to viewers that seemed, at times, borderline unserious. (The Trump campaign also used popular songs and post formats but didn’t seem as native to the platform — more like a politician’s attempt at TikTok.) But Smith says that even as a Harris supporter, there was a limit to how much of that she could stomach. At a certain point, the trends get old, the songs get overplayed, and the line between a political campaign and everything else on TikTok starts to get blurry. Kamala HQ, Smith says, started to feel like just another brand.
Williams’ confidence began to break down on Election Day, as she walked to a watch party. “I know what I’m seeing on the internet and everything, but I still had [something] in my heart that was like, I don’t see us having another Donald Trump presidency, but I also don’t see a world where a Black woman gets elected for president right now,” she says. She started to wonder whether that much had changed in the eight years since the last female presidential candidate. “You’re seeing all this stuff, and people are getting so excited, but this could be just a mirage.”
Filter bubbles are not a new phenomenon, and voters have a wide range of places to get hyperpartisan news apart from TikTok: blogs, talk radio, podcasts, TV. Whether on the right or the left, there’s a tendency to look around at what you see and assume it’s representative. But the false sense of certainty that TikTok brings is perhaps even more powerful. What we see on the platform is both uncomfortably personal and incredibly global: a video about something that happened on our neighborhood block might be followed by one from someone across the country voting for the same candidate for the same reasons. It creates the illusion that you are receiving a diverse assortment of content and voices.
As social media algorithms have gotten more precise, our window into their inner workings has gotten even smaller. This summer, Meta shut down CrowdTangle, a research tool used to track viral content on Facebook. A public TikTok feature called Creative Center — which allowed advertisers to measure trending hashtags — was abruptly restricted by the company after reporters used it to report on the Israel-Hamas war. It is harder than ever to understand what’s happening on social media, especially outside of our bubbles.
“As technology gets more advanced and more convincing, our idea of a communal reality might genuinely become archaic,” Williams says. “This election has really taught me that we are very much sucked into these worlds that we create on our phone, when the real world is right in front of us.”