Mallinson said he watched with dismay as his old social media posts supporting Democrats were dredged up and recirculated as supposed evidence of his dark political intent. Assigning political blame – even where no evidence yet exists – is, by now, another certainty after tragedy.
“Social media is a machine that runs on takes and scoops,” said Mike Rothschild, an independent journalist and author who studies conspiracy movements. “Nobody has anything but takes, and they’re getting more outlandish as the hours go by because the alternative is to just say nothing. And that doesn’t generate clicks.”
Kash Patel, director of the FBI and a former podcast host, then announced on X – erroneously, it was later revealed – that police had apprehended a man believed to be the shooter.
An hour later, Patel clarified that the man had been released, implying that the shooter remained at large and unknown to the authorities.
“Our investigation continues,” Patel wrote. The FBI did not immediately respond to a request for comment.
US President Donald Trump waded into the political fray with a video of his own. He shared his condolences but also blamed the “radical left” for fomenting a climate of rage, despite having no apparent knowledge of the shooter’s motives.
Many of the posts containing mistakes and falsehoods remained online long after they were proved false. One about Mallinson received more than 3 million views on X despite having a fact-check notice appended to the bottom. Such corrections on X, called Community Notes, are Elon Musk’s preferred solution to the deluge of misinformation on the platform he owns. Studies have shown, though, that nearly all Community Notes are appended after a post has reached almost everyone it will ever reach.
On fringe social media websites where far-right agitators congregate, the mix of news and wild speculation stoked outrage to a fever pitch, Holt said. A similar mix of content is starting to appear on more mainstream websites like X, Holt added, which he said was a sign that ideas once relegated to the margins were finding traction among everyday social media users.
The confusion was also amplified by artificial intelligence tools.
Phoney news websites devoted to generating clickbait content sprang into action, publishing hastily written articles in a bid to rank highly in search engines. Those articles are sometimes written with the help of AI, which can take threadbare information such as the supposed name of a shooter and spit out realistic-sounding news articles.
AI-powered chatbots introduced their own falsehoods and errors. Two chatbots deployed on X sometimes repeated falsehoods or made mistakes in their responses. Grok, an AI chatbot created by xAI, one of Musk’s companies, dismissed footage of Kirk’s killing as “staged satire or a form of sarcasm” in one early post, and named Mallinson as the shooter in another. The day after Kirk’s death, a bot made by Perplexity, another AI company, claimed that he was alive.
What comes next is likewise clear to disinformation researchers who study social media: the claims, fictions and speculations begin to snowball into full-throated conspiracy theories, which are promoted by influencers eager for larger and larger audiences.
RT, the Kremlin-backed news network, was among dozens of accounts suggesting that “unusual gestures” by men standing behind Kirk before the attack – one touching his hat and his ear, another waving his hand – were worthy of deeper scrutiny. “What do the signs mean?” the account asked ominously.
Roger Stone, a close ally of Trump and a self-proclaimed “dirty trickster”, added his own speculation on X, writing that Kirk was “killed by a skilled shooter in a professional hit either by a nation state, rogue elements of our own government or a terrorist organisation”. He provided no evidence to support the claim, other than noting that he had written a bestselling book about the assassination of former president John F. Kennedy.
The post received more than 1.5 million views.
This article originally appeared in The New York Times.