

While sometimes there’s a big piece of misinformation that a lot of people latch onto—like The Rapture or the existence of “MedBeds”—the fractured nature of the information sphere has all but killed the overarching conspiracy theory. No longer do big ideas like “we never went to the moon” unite the dumbest minds; instead, the algorithm creates bespoke conspiracy theories. So instead of joining the Flat Earth Society, you might think the actual year is 1728, or that AI secretly imagined a British comedian from the 1980s and seeded the web with evidence of his existence.

But how does it start? And how quickly can social media platforms transform someone from a seeker-of-knowledge into a believer-in-bullshit? YouTuber Benaminute recently posted a video where he dug in to find out. His question: If you start with a benign, broad, randomly chosen subject, and you only watch videos having to do with that subject, how long will it take until TikTok, YouTube Shorts, and Instagram Reels feed you a conspiracy theory video? The answer: not long at all.

Different topics all lead to the same place (more or less)

For the experiment, Benaminute created “blank” social media profiles and behaved like someone who was innocently curious about one of three topics—dinosaurs, The Vietnam War, and the 2000 presidential election. He put the keyword in each platform’s search bar and only watched and liked videos about the initial subject.

Dinosaurs

The Vietnam War

Things get worse for people interested in historical or political events. On all short-form platforms, an interest in Vietnam will lead you pretty quickly to right-leaning content, which leads you to conspiracy theories.

The 2000 election

The election of 2000 is still a charged topic, but it’s been a while, so maybe cooler heads and verified information will win the day? Spoiler: nope.

Which social media app leads to conspiracy theories fastest?

The champion of “normal search to conspiracy theory” speed runs is TikTok, with an average of 114 videos, or 57 minutes of watching. YouTube Shorts comes in second at 230 videos, or 1 hour and 57 minutes, and Reels takes 275 videos, or 2 hours and 18 minutes. It’s a distinction without a difference, however: all three platforms lead to conspiracies in the time it takes to watch a Marvel movie.

What does it all mean?

It would be easy to conclude that the massive tech companies behind YouTube, Instagram, and TikTok weight their recommendation engines so viewers are led to fake stories. Maybe they have specific political aims and are trying to sway votes, or maybe (as Benaminute posits in a semi-tongue-in-cheek way) these apps are built to “keep us angry, divided, and distracted” from realizing the conflict isn’t between Left and Right, but between “up and down.”

This is also a conspiracy theory, however. I’m not saying he’s wrong, but we don’t have enough information to know why algorithms recommend conspiracy content. It could be because bad actors at the top demand specific results for some purpose, but it seems more likely to me that TikTok et al. don’t have an agenda beyond making money.

I have no doubt that a social media platform featuring an algorithm that weighs the truth heavily would fail pretty quickly; the truth is boring compared to conspiracy theories. Conspiracy theories, broadly, make believers feel special, like they have inside knowledge the rest of us lack. People scroll TikTok to have fun; the truth usually isn’t fun. Conspiracy theorists can say things like “UFOs are here!” or “They’re turning the frogs gay!” Meanwhile, if you’re devoted to the truth, you mostly have to go with “the best evidence suggests…” or “it seems logical that…” and who wants to hear that?