If there’s one thing that everybody knows about the internet, it’s that it’s full of lies. This much has been understood for such a long time that, even as far back as 1993, a New Yorker cartoon could use it as a punchline. While we’ve grown used to the idea that “on the internet, nobody knows you’re a dog,” nowadays you’re more likely to get fooled by an internet-wide conspiracy, possibly bankrolled by the Russian state, claiming that dogs are made of hollow cardboard tubes. From flat-earthers to anti-vaxxers to people convinced the Australian Prime Minister once soiled himself in a suburban McDonald's, the denizens of the internet have clearly abandoned logic and reason for their respective filter bubbles, spinning frictionless from the truth.
Such has been the received wisdom since the political upheavals of 2016. Guardian editor Katharine Viner published her seminal long-read, 'How technology disrupted the truth,' a couple of weeks after the Brexit referendum in the UK; a few months later, Trump's victory in the presidential election would move the US discourse to roughly the same terrain. Michiko Kakutani's 2018 book The Death of Truth, for instance, takes the fact of the corrupt, lying Trump administration as its central emergency (other books covering the issue include works with such diverse titles as Post-Truth by Matthew d'Ancona, Post-Truth by Evan Davis, and Post-Truth by James Ball).
These works typically share a number of themes: a fear of technological change; a suspicion that online disinformation must all be the result of foreign meddling; and a vague nostalgia for an earlier, centrist, political consensus when apparently politicians didn't lie quite so much – or, even if they did, it mattered more when they were caught (nobody tell Bill Clinton, I guess. Or George Bush. Or Tony Blair.). However, the most important theme cutting through these works is a fundamental distinction between objective truth – which is good – and fantasy – which is bad.
Viner, for instance, laments that the “listless Remain campaign attempted to fight fantasy with facts, but quickly found that the currency of fact had been badly debased... When a fact begins to resemble whatever you feel is true, it becomes very difficult for anyone to tell the difference between facts that are true and 'facts' that are not.”
Kakutani, meanwhile, insists that while we can and should debate policies and issues, “these debates must be based on common facts rather than raw appeals to emotion and fear through polarizing rhetoric and fabrications. Not only is there such a thing as objective truth, failing to tell the truth matters... [we should not] normalize an indifference to truth.”
The political diagnosis is clear: people are no longer really interested in objective truth, which they ought to be beholden to; instead, fantasy is winning.
This way of setting out the problem has undoubtedly installed itself as the dominant view in the respectable commentariat. However, as an explanation of the poor state of political discourse by professional political discourse-ers, it manages to be simultaneously self-serving and self-defeating. Self-serving in that it allows the commentariat to cast themselves as the last bastion of truth in a world gone mad; but self-defeating in that the very existence of the problem is an admission of failure. Why, if objective truth is good and fantasy is bad, should an apparent majority of voters be more interested in the fantasies instead?
One candidate theory is, of course, that people are just too ignorant to be able to tell truth from lies, handily reinforcing the common stereotype that Trump and Brexit voters are just too stupid to be helped. But here's an interesting counterexample to that theory. In 1971, the New York Times printed the ‘Pentagon Papers’: a leaked secret history of US operations in Vietnam from 1945 to 1967, commissioned by then Secretary of Defense Robert McNamara. The philosopher and political theorist Hannah Arendt offered a compelling analysis of these revelations in her essay, 'Lying in Politics.' Arendt describes a process in which some of the most qualified and best-informed people in the USA became caught up in a vast and disastrous network of deception and falsehood, which they both perpetuated and fell prey to. Far from an ignorant, angry mob, it was the “problem-solvers,” men like McNamara himself drawn into government from universities, think tanks, and the upper echelons of industry, who found themselves trapped inside one of the most consequential echo chambers of the 20th century. In carrying out the war in Vietnam it was these men, these serious, educated men of power and stature, who were the most easily manipulated to slavishly follow a self-deceptive road to ruin. During the course of the war, the “problem-solvers” found themselves undone by their systemic inability to recognize that strategic decisions really ought to have something to do with the facts on the ground (compare, for instance, the problem-solvers' insistence that they were fighting a war of liberation against marauding communist forces with how actual Vietnamese people viewed the situation).
Significantly, Arendt does not contrast the blank, unvarnished truth with fantastical, malicious lies. Rather, she identifies two distinct ways that people can fail to act truthfully. One, a “passive susceptibility to falling prey to error, illusion, the distortions of memory...” and, the other, the “active, aggressive capacity” to “deny in thought and word whatever happens to be the case.”
The upshot of this is that it allows Arendt to relate systematic untruth robustly to human action. “A characteristic of human action,” Arendt writes, “is that it always begins something new.... In order to make room for one's own action, something that was there before must be removed or destroyed, and things as they were before are changed.” But “such change would be impossible,” she tells us, “if we could not mentally remove ourselves from where we physically are located and imagine that things might as well be different from what they actually are.” “In other words,” Arendt claims, “the deliberate denial of factual truth – the ability to lie – and the capacity to change facts – the ability to act – are interconnected: they owe their existence to the same source: imagination.”
This link between imagination and how we perceive and construct the world is, incidentally, one that by no means has its origins in Arendt: in De Anima, where he sets out his philosophy of perception, Aristotle notes that imagination – the Greek phantasia, which obviously is also related to fantasy – “has taken its name from light (phaos), because without light it is not possible to see.” At any rate: for this reason, Arendt concludes, “the lie did not creep into politics by some accident of human sinfulness,” and “moral outrage... is not likely to make it disappear.” “The deliberate falsehood deals with contingent facts; that is, with matters that carry no inherent truths within themselves... Factual truths are never compellingly true.”
Admittedly, Arendt notes, “under normal circumstances” the liar will be “defeated by reality” – eventually people will figure out that their fantasies either do not cohere, or could not cohere, with how things actually are. But there are nevertheless several reasons why reality may not triumph after all. One, which Arendt herself mentions, is the material interests of the actors involved. In totalitarian states, audiences are often “forced to disregard altogether the distinguishing line between truth and falsehood in order to survive.” But the experience of the problem-solvers in the US government shows us that, if you want to enforce general conditions of falsehood, show trials and forced labor camps are hardly necessary. All you need to do is make participating in a big, stupid lie a condition that must be met in order for people to keep their (perhaps very prestigious) jobs.
A second factor I'd like to mention is not quite so explicit in Arendt – although it can in some sense be related to the first. It is, however, very readily apparent in a number of recent post-truth controversies. This factor is desire: how people feel inclined to imagine the world is, based on how – for whatever reason – they would like it to be.
Katharine Viner opens her article, for instance, with a discussion of the Piggate scandal. For the uninitiated (as it were), this was a bizarre episode in British politics from September 2015, when an unauthorized biography of the then-Prime Minister David Cameron suggested, perhaps unreliably, that he had been involved in a hazing ritual at Oxford in which he did something obscene with a dead pig's head. For Viner, Piggate is an example of fake news catching on – but as someone who loved sharing and lampooning the story myself, I find it hard to believe that anyone involved thought they had actually seen any reliable evidence that it was true. The point, rather, was that Cameron had previously, thanks to his disastrous government receiving what then seemed like unaccountably favorable media treatment (we now know better than to expect scrutiny of Tory governments by establishment journalists, of course), seemed impregnable, unsmearable – but now there was a story out there that (a) was very embarrassing to him and (b), let's face it, seemed, given everything we knew about Cameron, like it ought to be true. I wanted Piggate to be true because it would have helped reality as such make more sense to me than it did – I desired to do everything I could to help it catch on. It was how I desired to perceive things that made me spread it.
While the Piggate example is relatively light-hearted, as fake news stories go, the power of desire to create and spread untruth can be far more consequential. An illustrative example is an incident that took place during the 2019 UK general election campaign, when a conspiracy theory started spreading about a photo which had been published in the Daily Mirror, of a four-year-old boy who had been admitted to Leeds General Infirmary with suspected pneumonia, wearing an oxygen mask and lying – due to a shortage of beds – on a pile of coats. The photo was taken as damning of the Tories' record in charge of the NHS – which was exactly why conspiracy theorists, from random users of Facebook and Twitter to the Daily Telegraph columnist Allison Pearson and the former England cricket captain Kevin Pietersen, decided that it had to be a leftie Labour hoax. The conspiracy theorists didn't want to be living in a world where they had to care about access to potentially life-saving medical treatment for the under-5s being compromised as a result of Tory cuts – and so they created, for themselves, a world where they didn't have to.
To conclude, then: it is never enough to simply state or present the facts. Blankly presented, the facts are roughly as powerful as the wartime train station posters described by Wittgenstein in a notebook entry included in Culture and Value, which asked “Is your journey really necessary?” (“As though someone who read this would think 'On second thoughts no.'”).
The challenge for those who want to combat disinformation is not to simply make the truth apparent but to make it appealing. Material interests, desires, and the facts – such as they are – must be made to align.
Tom Whyman is an academic philosopher and writer from the UK