
Images, left to right: Moon landing, Canva; Taylor Swift, Wikimedia Commons; The Royal Family Mother’s Day photo, @KensingtonRoyal Twitter; Star Wars Episode IX,

The theory could have been cooked up in a lab, specifically for a disappointed Star Wars fan like me: in late 2019, a Twitter user declared that the ending of Episode IX: The Rise of Skywalker was so unsatisfying because it was cobbled together at the last minute by re-using footage from an earlier scene. Their proof was two blurry but similar screenshots of lead actor Daisy Ridley. It kicked off a flurry of replies and clickbait articles, all credulously repeating and amplifying the theory.

Image: @im_organa. Source: Twitter.

In QAnon, the bizarre conspiracy theory and political movement that has hovered over American politics since 2017, this would be called ‘baking’. The lore of QAnon is deep (and unhinged), and relatively little actually comes from the titular Q (a self-declared mole in the US government). Instead, the conspiracies are crowdsourced by ‘bakers’ who follow the ‘breadcrumbs’ in Q’s rambling diatribes, decoding them by mixing in real news stories and old theories. This mashup of ‘evidence’ is presented as proof of a global satanic cabal, and anything that seems to contradict it is spun into proof of a cover-up.

While it’s easy to laugh at the conclusions of QAnon’s bakers (pizzagate, frazzledrip, lizard people etc.), that process of baking—of hunting for hidden information, combing an oversaturated internet to untangle the truth and then sharing your findings—is addictive. And it’s taking over pop culture.

Mash-up of ‘evidence’ is presented as proof of a global satanic conspiracy.

On TikTok, theorists decode Britney Spears’ Instagram posts. On X, users rushed to find the ‘real source’ of a photoshopped picture of Kate Middleton and her kids. On Reddit, fans of the TV series Our Flag Means Death interpreted the new SAG-AFTRA residuals agreement as the real reason the show was cancelled—not because it hadn’t hit viewership targets but because it had been watched too much and the residual payments were at risk of bankrupting HBO Max.

And across all platforms, few celebrities have encouraged baking quite like Taylor Swift, who has long included cryptic clues or ‘Easter eggs’ in her work. The pop superstar has created an entire coded narrative of her own life, with all its sinners (Scooter Braun, Kim Kardashian, Kanye West) and saints (herself, and Charlie Puth). By inviting people to search for secret meaning in her work, Swift has unwittingly become Schrödinger’s political figure. In 2016, neo-Nazis declared Swift a covert Aryan redpilling the masses. In 2024, the New York Times published an opinion piece claiming Swift was secretly queer. Both offered ‘evidence’ of their beliefs by reading into her work, and both assumed that Swift was just waiting for the right moment to come out: either as queer or as a Nazi.

Rolling Stone on Taylor Swift conspiracies.

I am not immune to the allure of baking: I actually made a podcast that tried to answer the fairly conspiratorial question ‘What really happened behind the scenes on Rogue One?’ I have over-analysed photos, drawn up timelines and sought out the hidden meaning behind a creative’s coded words (like Rogue One screenwriter Tony Gilroy referencing the rules that govern how a film is credited to avoid saying outright how much of the film he rewrote).

But what really sets baking apart from actual research is its commitment to a predetermined conclusion: a search for evidence to support a theory, rather than a weighing of evidence to find the truth. Take the song ‘thanK you aIMee’ from Taylor Swift’s latest album: even as Swift sings ‘I changed your name and any real defining clues’, the capitalisation in the title leaves no doubt about who the real bully in the song is meant to be. That hasn’t stopped theories that the song is actually about Karlie Kloss (with whom most ‘Gaylor’ theorists believe Swift had a secret relationship), or that KIM stands for Knowledge Information Management and the song is actually about the news media. The theorists insist that the obvious answer is simply too obvious: it’s a distraction, a simple answer for simple minds. Taylor’s word is malleable, endlessly re-interpretable: just like Q’s.

What really sets baking apart from actual research is its commitment to a predetermined conclusion.

This rise of baking in pop-culture spaces has many complicated causes, but the most obvious is algorithmic: social media platforms prioritise time spent on the app above all other metrics, recommending whatever content is most likely to keep users on the platform. Baking videos, with their confident ‘factual’ case-building style, are compelling viewing, and they often point to other videos, previous instalments of the conspiracy or other posts you need to see. Justin Grandinetti, writing for the Global Network on Extremism and Technology, argued that TikTok has little incentive to slow the spread of conspiratorial content because it—like all media enterprises—operates on the logic that eyeballs are currency.

Since the internet has centralised to a handful of apps, these theories also now live on the same platforms we spend most of our lives on. They are presented alongside news, memes and our friends and family, in short digestible chunks fed directly to us. And for content creators, apps like TikTok have made baking videos easy to produce—as well as potentially lucrative. Creators who have one hit theory will often pivot their entire output to chase engagement: the theories around Kate Middleton’s ‘disappearance’ have initiated a wave of new royal decoders.

Kate Middleton doctored photo headline. Image: Business Insider.

However, it would be an oversimplification to blame the rise of baking on just the algorithm: the point of the algorithm is to push content that interests users. Therefore, these videos must offer something we crave. One argument is that they endure because they help alleviate the pressures of reality, telling us that we are right to think something is wrong in the world. Feelings of alienation, anxiety and powerlessness can all exacerbate conspiratorial thinking. If a young, closeted queer fan finds solace in the music of Taylor Swift, it’s comforting to think that Swift understands them. If you’re bitterly disappointed by a Star Wars film killing off your favourite character, it’s reassuring to think that wasn’t the original plan.

And, as with all conspiracy theories, we’re offered evil masterminds pulling the strings: Taylor Swift’s manager Tree Paine, who supposedly keeps her closeted; Star Wars creatives who don’t understand the franchise and are deliberately destroying it (either JJ Abrams or Rian Johnson, depending on which sequel films you like, or producer Kathleen Kennedy if you hate all of them). Conspiracy theories have always offered this reassurance and these villains, though for broader issues: fear of governmental control becomes a faked moon landing; an unequal capitalist society is nefariously explained through antisemitic conspiracies.

Thanks to hyper-specific online spaces, these new lower-stakes pop-culture conspiracies don’t need to be broad. You don’t need an event on the scale of the moon landing or a global pandemic. They can meet people where they already are, tapping into specific feelings of disappointment and powerlessness. But even seemingly harmless conspiracies can radicalise. Fandom podcast Rewriting Ripley found that hatred of Episode VIII: The Last Jedi was weaponised into a recruitment tool by the alt-right: because if you believe that Star Wars has been ruined by Laura Dern with purple hair, then you’ll probably be open to some other very cool and normal beliefs about women. Taylor Swift’s presence at the Super Bowl spawned a sprawling theory that she was part of an attempt to influence the upcoming US presidential election, a theory that a third of surveyed Republicans claimed to believe and that, perhaps more worryingly, Joe Biden’s official X account joined in on. Russian disinformation network Doppelganger jumped onto ‘Kate Gate’ and funnelled traffic towards pro-Russia content, often regarding the ongoing invasion of Ukraine.

Thanks to hyper-specific online spaces, these new lower-stakes pop-culture conspiracies don’t need to be broad.

I wish there were an easy, pithy conclusion to draw here: that I could piece all the evidence together into one all-encompassing solution for the insidious normalisation of conspiratorial thinking. But reality is far too complicated for that, and since anxiety and powerlessness push people towards conspiracy, this is likely just the beginning of the trend. It was previously believed that encouraging critical thinking skills could reduce conspiratorial thinking, but those findings have recently been called into question after failing to replicate. What’s more, recent analysis has found a U-shaped relationship between conspiratorial belief, socio-economic status and education, with university-educated wealthy white men particularly drawn to ‘taboo’ conspiracies.

Perhaps the only solution is a kind of counterbaking: taking the ‘evidence’ of a conspiracy and applying Occam’s razor.

The theory that the ending of Episode IX: The Rise of Skywalker was made out of reused footage got so much traction that eventually the film’s editor, Maryann Brandon, was asked about it. Her answer was blunt: the two scenes looked similar because they’d been shot in the same location. Filming in a desert is expensive, so of course you’re going to shoot all your desert scenes one after another. That’s what it came down to: not a conspiracy, not an evil director out to destroy a beloved franchise, but a practical shoot schedule.

More often than not, the obvious answer isn’t a distraction. It’s just the answer.