It’s Easter Sunday. To celebrate, I took the Windrush line up to Hoxton, had festive Swedish meatballs at the Curious Yellow Kafe, then wandered through Shoreditch under that thin, deceptive spring sun London offered today. On the train back south, the miracle arrived in my idle-time phone scrolling. It happened somewhere between Whitechapel and where the internet cuts out when the train goes underground for a few stops. A tech bro cured his dog’s cancer with ChatGPT. It’s an Easter story, neatly packaged for the feed. Like, share, move on. Don’t miss your stop.

[Image: A man smiling next to a dog, with a message about using AI to create a cancer vaccine to save his dog.]

Read the story? Why do that? There was a preview card. A toothy, smiling white man and his happy Staffy mix. The headline completes the Hallmark Channel story for our times: “He used AI to create a cancer vaccine to save his dying dog.” The ChatGPT logo floats just above the dog’s head, like a halo in a Renaissance painting of a saint. The story completes itself without the need to click through, or to deal with cookie consent forms, various pop-ups, a paywall, an e-newsletter subscription request, or whatever else nearly every commercial news site throws at you with JavaScript. We are living in miraculous times.

[Image: A smiling man sits beside his dog, who is wearing a bandana, with text about using ChatGPT to help develop a custom vaccine for the dog's cancer.]

This is how most information moves now. Not through articles, but through surfaces. Preview cards, thumbnails, captions. Carefully assembled fragments designed to survive the scroll. The article itself exists, somewhere beneath, but it’s now almost incidental. By the time you might click, you’ve already decided what happened.

And this one lands because it’s engineered to. Cancer does the heavy lifting. The dog does the rest. It disarms you, makes scepticism feel inappropriate. It’s Easter after all.

But here’s the twist! It’s not just medicine, but AI! The bot we can all access to lazily respond to emails we’d like to ignore is being used by some wunderkind Down Under to cure cancer. We all have access to the thing that cures cancer. Or something like that. Or rather, no. It’s nothing like that.

Read these or don’t, can’t say you didn’t get the chance: To varying degrees, these are all hype headlines. The articles themselves vary in quality and detail. They all contain elements of the mythology that makes them shareable. The UNSW headline is especially egregious; it’s a university, for fuck’s sake, have some standards.

What actually happened is slower, messier, and much less cinematic. It’s not even a particularly good Netflix series. Multiple rounds of conventional treatment didn’t work. The dog’s owner, with access, privilege and resources, pushed further into the system rather than bypassing it. DNA sequencing, researchers, lab work, a bespoke mRNA construct manufactured by specialists (not bots), layered with another form of immunotherapy. Ethics approvals. The AI is there throughout, but as a tool in the process, helping navigate research and make sense of data, not designing, manufacturing, or delivering treatment. The result isn’t a cure. It’s a partial response. It’s a treatment. Uneven, uncertain, and still unfolding.

That version of the story doesn’t travel. First, it’s too complicated. Second, there’s no tidy hero element. No archetype pulled from the offspring of an Ayn Rand character template and a Robert F. Kennedy health policy. There’s a reason Elizabeth Holmes conned people for so long. We’re conditioned to believe in unicorns. It slots neatly into something older. The founder myth: the outsider who breaks through where experts failed. The idea that you don’t need institutions, just ingenuity and the right tools. Being a dropout is even better. Not knowing the field is somehow an advantage, not a limitation. We want the dropout to win. It’s a comfortable tale because we’ve seen it before, in different forms, attached to different sectors and technologies, selling different shortcuts. The thing about unicorns, though, is that they aren’t real. It’s a team of special effects people.

“When an Australian tech entrepreneur with no background in biology or medicine said ChatGPT helped save his dog from cancer, the story couldn’t help but spread,” wrote Robert Hart in The Verge. “It’s the kind of validation Big Tech has long craved: proof that AI will revolutionize medicine and take on one of its deadliest diseases. The reality, as usual, is more complicated.”

That Verge article gets it. “Not only was Rosie not cured of cancer, it’s not clear the mRNA vaccine was responsible for her improvement.” But it’s lobbing the truth bombs on the wrong side of a paywall. Misinformation runs free online while facts, context and details often need a monthly credit card payment. But even when an article isn’t paywalled, there’s an increasing tendency to share before reading. A person could take that Verge article URL and knock it into archive.ph and see the whole thing. But who knows that? How many people will do it? How many people will see the article at all compared to the more SEO-tasty clickbait headlines that conform to our mythologies about tech founder genius? The funnel chart narrows pretty fast.

As is the custom on LinkedIn, it became fodder for everyone’s personal TED Talk script in the form of very long posts, often with single-sentence paragraphs. “This sounds like science fiction… but it actually happened,” wrote one person. “This is what can happen when a data scientist refuses to give up on his dog,” gushed another. Sorry folks, not this time.

“ChatGPT did not design or create Rosie’s treatment; human researchers did. At most, the chatbot served as a research assistant helping Conyngham parse medical literature — impressive, but a far cry from the breakthrough implied.” — Robert Hart, The Verge

This isn’t about AI. It’s about belief. Right now The Discourse is fermenting. AI enthusiasts are banging the drum. Utopia is nigh! AI bashers are pointing out that the hype machine has its new poster critter. It’s not that these technologies aren’t useful in medical research; they demonstrably are: “These technological innovations not only improve vaccine design but also enhance pharmacokinetics and pharmacodynamics, offering promising avenues for personalized cancer immunotherapy.”

Humans don’t do lossless data compression. Information drops. It goes like this… Some event happens, a medical or technical breakthrough of some kind, let’s say. It’s complicated and contingent. Institutions frame it through teams of reviewers, cautiously, but optimistically. Companies try to leverage it for shareholder value. Media compresses it into something clickable to trigger as many monetisation scripts as possible before page exits hit. Social platforms format it into something that propels engagement and reduces departure. And then people take it, reshape it, and pass it on again for whatever reason. At each step, something is lost in a sort of social web non-random natural selection process. Nuance, complexity and uncertainty drop out of the pool early. Collaborative efforts are recessive, hero elements are dominant. What remains is the part that travels. To understand why it works this way, read fewer blog posts on social media engagement strategies and pick up some Joseph Campbell.

This isn’t Cambridge Analytica shenanigans. Those happen, but they’re something else. This is default mode transmission: it comes with each transaction. The tools are technical, but the behaviour is human. It doesn’t just spread information; it reshapes it into something that can move faster with each reshare. And in doing so, it often removes the parts needed to understand whether it’s true. It’s not necessarily false, but it’s often not accurate. And it’s optimised for people to be wrong.