A visual map of the internet, circa 2006 — image by The Opte Project, Wikimedia Commons CC BY 2.5

Too much data has made it impossible for us to construct narratives that give our lives meaning

‘It is possible to live in a world of data but no facts.’ — William Davies, NY Times

Postmodernism, I learned at university, means the demise of metanarratives. Religion, nationalism, liberalism and even science could no longer provide satisfying overarching narratives to give meaning to our lives. There was a time when I believed that we were all like protagonists waiting for a story to become a part of. Everyone felt like an extra waiting for their scene; eventually someone or something would come along, and a new narrative would be created which would sweep us all along in its inexorable plot. Now, I’m not so sure.

Mark Fisher famously identified the dull lack of imagination inherent in contemporary capitalism by quoting one of Postmodernism’s main theorists, Jameson (and Žižek), who suggested that it is easier to imagine the end of the world than the end of capitalism. There is no alternative, as we were told by dull political managers from Thatcher onwards. But what if this lack of imagination is a product of our inability to process the increasing amount of information we are presented with, and to build any kind of convincing narrative that tells us what we should do with it?

Yuval Harari has a very interesting theory in his book Homo Deus. He says that humanity is one big data processing system, with individuals acting as processors. The success of one society or ideology over another depends on its efficiency at processing data. This way of looking at history and society is a challenge to liberal humanism, because human dignity and feelings are no longer of central importance. Information is elevated in this worldview: what information wants is to be free and to flow around the system. Humans are secondary, useful only insofar as we can process the data.

Harari calls the philosophy associated with this worldview ‘Dataism’: “Dataism declares that the universe consists of data flows, and the value of any phenomenon or entity is determined by its contribution to data processing”. Organisms are simply algorithms, Harari says, and if we can invent algorithms that perform tasks faster and more efficiently than humans can, then maybe we should. Perhaps with sufficient data inputs, algorithms could choose better romantic partners for us, or produce better political outcomes than democratic elections.

Growth and digitisation of global information storage capacity — image by Myworkforwiki, Wikimedia Commons CC BY-SA 3.0

According to IBM, by 2020 there will be 300 times more information in the world than existed in 2005. Much of that information is valuable, but much of it is not. Take bots, for example: they can automate tasks that save humans a lot of boring work. Bots on Wikipedia are absolutely valuable: they check for vandalism and revert it, they correct grammar and vocabulary errors, and they can even create articles that don’t yet exist. But they can also be abused. Bots can be used to spread political propaganda and sway public opinion, scalp gig tickets and commit billions of dollars of advertising fraud.
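To make the useful side of that concrete, here is a minimal sketch in Python of the kind of check an anti-vandalism bot might run: it polls Wikipedia’s public recent-changes feed and flags edits that delete most of a page, a crude stand-in for blanking vandalism. The MediaWiki endpoint is real, but the heuristic and the threshold are illustrative assumptions, not how any actual Wikipedia bot works.

```python
# Toy anti-vandalism check: flag recent edits that remove most of a page.
# The 80% "blanking" threshold is an arbitrary assumption for illustration.
import requests

API = "https://en.wikipedia.org/w/api.php"

def suspicious_edits(limit=50, blanking_ratio=0.8):
    params = {
        "action": "query",
        "list": "recentchanges",
        "rctype": "edit",
        "rcprop": "title|user|comment|sizes",
        "rclimit": limit,
        "format": "json",
    }
    resp = requests.get(API, params=params, timeout=10)
    for change in resp.json()["query"]["recentchanges"]:
        old, new = change["oldlen"], change["newlen"]
        if old > 0 and (old - new) / old > blanking_ratio:
            # A real bot would inspect the diff and the editor's history
            # before reverting; here we simply report the edit.
            yield change["title"], change["user"], change.get("comment", "")

if __name__ == "__main__":
    for title, user, comment in suspicious_edits():
        print(f"Possible blanking of '{title}' by {user}: {comment!r}")
```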

A battle is being waged between people who want to safeguard and empower us, and those who want to exploit our cognitive biases for political and economic ends. As we have begun to understand the algorithms and heuristics that make up the architecture of human psychology, cognitive biases have become an important part of behavioural economics, and people in advertising and PR have become very adept at exploiting the simple rules of thumb our brains use to make sense of the world. Take back control. Learn to live within our means. Immigrants are taking our jobs. These are simple ideas that fail to reflect the complexity of reality.

We are now bombarded with so much conflicting information that our brains, still better adapted to hunter-gatherer societies than to ones containing mass transit, mass communication and masses of people, simply cannot keep up. That plays into the hands of those who want to exploit us far more than it helps those who want to protect us.

Hannah Arendt famously stated that ‘The ideal subject of totalitarian rule is not the convinced Nazi or the convinced Communist, but people for whom the distinction between fact and fiction and between true and false no longer exist.’ Destabilising reality used to be a political technique employed by the Nazis and then the Stasi against individual political opponents, to make them think they were going mad; the Stasi called it Zersetzung. It is now being deployed on a mass scale by people like the notorious Putin adviser Vladislav Surkov, and may even be influencing Donald Trump’s approach to US politics, especially in his attempt to hollow out the very idea of fake news by applying the label to any media outlet he dislikes.

Our narratives have become balkanised into niche communities. Disparate online groups of flat-earthers, the alt-right, the new atheists, the true believers of Bernie/Hillary/Trump/Corbyn, ISIS or Assad fanboys, and political extremists of a million other varieties silo themselves away from conflicting information that might trouble their worldview. This is a realisation of the atomised individual that neoliberalism posits as its ideal subject. Disconnected from community and society, we become unable to understand anything on a systemic level, or to take action in concert with our fellow citizens.

Harari argues in Homo Deus that the excess of information destabilising our reality is itself a form of censorship. “In the past, censorship worked by blocking the flow of information”, he says. “In the twenty-first century, censorship works by flooding people with irrelevant information.” We desperately need to find new ways to verify information, establish facts and build the political consensus that allows us to make better decisions. Otherwise we may decide that handing control of these things over to computer systems is preferable to making poor decisions ourselves.

NASA computer scientist Margaret Hamilton with the code for the Apollo mission, enlarged arbitrarily in Photoshop — original image by the US Government, Public Domain (CC0)

Many people are working hard to do just that, with technologies like the blockchain (or its alternative, Holochain), which create open, distributed archives of interactions that cannot be tampered with. The verifiable records they produce can be used not just for cryptocurrencies like Bitcoin, but for collaboratively writing constitutions, tracking identities without your documents needing to be repeatedly checked, recording events and medical records, and even voting. At the moment, though, the explosion of data has got ahead of our ability to manage it, and it will take time to build these systems.
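The tamper-evidence these systems promise rests on a simple idea: each record stores a cryptographic hash of the record before it, so altering anything in the past breaks every link that follows. The toy Python sketch below shows that idea in isolation; it deliberately leaves out the consensus rules, signatures and peer-to-peer replication that make real blockchains distributed and trustworthy.

```python
# Toy hash-chained ledger: each block commits to the previous block's hash,
# so any edit to history makes the chain fail verification.
import hashlib
import json

def block_hash(data, prev_hash):
    payload = json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def make_block(data, prev_hash):
    return {"data": data, "prev_hash": prev_hash, "hash": block_hash(data, prev_hash)}

def is_valid(chain):
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block["data"], block["prev_hash"]):
            return False                    # block contents were altered
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False                    # chain linkage was broken
    return True

chain = [make_block("alice pays bob 5", "0" * 64)]
chain.append(make_block("bob votes yes on clause 3", chain[-1]["hash"]))
print(is_valid(chain))                      # True

chain[0]["data"] = "alice pays bob 500"     # try to rewrite history
print(is_valid(chain))                      # False: the tampering is detectable
```

In a real network, many independent nodes hold copies of the chain and must agree on new blocks, which is what makes quietly rewriting the record impractical rather than merely detectable.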

We also need to learn to be more than data processors. We spend so much of our lives plugged into data networks, pushing information around systems that reward us with Likes and Retweets, while they scrape our data to build ever more accurate algorithms that know who we are and what we want better than we do. We need to recognise how we are enmeshed in these systems so that we can make better choices about what information we give away for free. And we need to take time to be human beings who think for ourselves and sit in quiet contemplation, rather than renting our minds out as data-processing nodes in a vast system too complex for us to understand.

And lastly, we need to create not just data, but art. Our brains have a deep desire to create stories. It’s the myths, fables, fairy tales, epic poems and hero’s journeys that define not only our cultural identities but our individual lives. The brain has dual systems for experiencing reality and for remembering and constructing a narrative out of those experiences. We need art more than ever because we currently lack the archetypal stories that allow us to understand what it is to be a human being in a complex digital age. Bots and algorithms may be able to do many things more efficiently than we can, but they cannot weave stories that tell us how to live, how to love, or what we should aim to achieve, either for ourselves or as members of a complex global society.

If we could build machines that run algorithms more efficiently than our brains can, freeing us up to create the art that makes sense of the complexity in which we live, then perhaps the future would not look as dystopian as it currently does.
