Sebastian Groes and Nick Lavery

The human mind has become a site of contestation once again: major power struggles are playing themselves out, and scientists and scholars are staking their claims. Our contemporary period is characterised by a multiplicity of revolutions that together are radically reshaping the context of our thinking about what it means to be a human being. Globalisation, overpopulation, climate change, an ageing demographic, ongoing scientific breakthroughs, and the dominance of new technologies and social media in our lives are just a few examples of developments that are having a major impact upon our understanding of ourselves, the world, other people and nature. The current explosion of interest in, and concern about, cognition and, more specifically, memory, is striking because memory is vital to the formation of our understanding of who we are, and of how our culture and world work. Do we, in the digital era, have the right to be forgotten after we die? Do external memory storage facilities and tools such as Google, Facebook and GPS make our brains lazy? What happens to memory when climate change may make mankind extinct and our collective future is shrinking before our very eyes? How dependent is our thinking about memory on cultural factors? Does the increasingly chaotic, unpredictable world we live in undermine the capacity of memory to help us imagine future narratives for mankind? These were some of the questions that The Memory Network, an AHRC-funded research networking project that brings together scientists, scholars, artists and writers, addressed at four events organised in partnership with Cheltenham Literature Festival 2013. Taken together, the events spelled out a clear message: the future of memory is at stake. Below is an assessment of the first two events.
‘Like a bad smell in cyberspace’: Memory in the Digital Age
The event ‘Re-Wired: Memory in the Digital Age’ tackled questions about how, in an increasingly digital context, we are remembered after we die. We send out thousands of emails, upload photos of our loved ones onto Facebook, and broadcast a host of other communication via social media sites such as Twitter. This raises various questions. What happens to our digital selves after we die? How is our individual identity (re-)shaped in this new context? How is the relationship between man and machine changing? Identity and the mind are digitally extended and/or offloaded, and our experience is more and more a shared one, with the individual only partly controlling the process. Yet, through archiving sites such as Facebook we also (appear to) have more power in shaping future memories of ourselves. And how is our conception of human nature changing when the digital world in which we are all immersed turns us into cyborgs and makes us partly virtual? Not only has the traditional division between our private lives and public personas been blurred, but we are now restructured through the algorithmic formulas that both create and analyse our world. As Gilles Deleuze presciently said in 1990: ‘We no longer find ourselves dealing with the mass/individual pair. Individuals have become “dividuals,” and masses, samples, data, markets, and “banks”.’ This raises questions concerning the ethics of this context, about ownership and power, and about who controls and has access to our private, public and digital lives.

Socio-digital interaction specialist Wendy Moncur (Dundee University) gave a talk about Digital Inheritance, and identified some of the contradictions facing us. Our physical, social and digital lives are all becoming increasingly complex, but also increasingly interwoven with one another.
Moncur warned that ‘there is no off-switch for digital death; there is no mechanism by which we can switch ourselves off at the end of our digital life, and we can linger on, a bit like a bad smell in cyberspace.’ Moncur looked at the different forms of value in our digital lives: people with an eBay account have ‘financial value’ invested in their online selves, posting pictures of loved ones has ‘emotional value’, and writing blogs has ‘intellectual value’; all these different types of value can carry on even after our physical lives end. The result is the ‘post-self’, which consists of memories, mementos, memorials, reputations and other kinds of memory, and which often changes, as she pointed out: ‘Think for instance of Jimmy Savile, who died some years ago but whose reputation is continuing to change and people’s memories of him are shifting. So the post-self is a dynamic being. And we may now be involved in our post-self: there are tools that allow us to send out posthumous emails and we can be involved in our own funerals, thus shaping how the post-self is constructed. The digital self thus outlives our physical life-span’. Literature that has responded to this context includes Lottie Moggach’s Kiss Me First (2013) and James Smythe’s The Machine (2013). Science fiction novelist Adam Roberts confided that, although he held out against social media for a long time, he is now addicted to Twitter, which he calls the ‘crack cocaine of the Internet.’ Now that his love for the digital goes so deep that he reads novels on the Kindle app on his iPhone, Roberts wonders how this is perceived by his children, and what kind of role model he sets. Roberts also identified a new, paradoxical form of human behaviour: ‘Social media are memory. Everything you write is engraved in virtual stone forever.
Oral communication is different: once you speak, your words vanish into the ether, yet when you communicate face to face with people you behave more politely. The rudest communication happens on social media, which is weird because every word you say is recorded. I think this happens because, for instance, Twitter gives the illusion of fluency and there is no face-to-face contact.’

Adam Roberts (Royal Holloway) explaining how the digital age is essentially a democratising force that empowers people, with Wendy Moncur (Dundee)
Roberts told us about the inspiration for his novel New Model Army (2010), which projects into the future the idea that an army could be de-hierarchised and democratised by using technology that allows all soldiers to be involved in strategic decision-making, their collective knowledge operating via wikis. Roberts got the idea partly from Apple founder Steve Jobs: ‘Jobs was fond of saying: “the entire accumulated body of knowledge generated by mankind from the beginning of time to the year 2000; we now generate that amount of data every year.” It’s not, strictly speaking, “knowledge”, of course, but all you need to turn that intelligence into wisdom are algorithms and search engines that are sophisticated enough. That new situation levels and democratises human interaction as a whole, and that is a good thing: rather than the tyranny of the previous hierarchical structures, democracy is the best way of reaching decisions.’ Roberts’ novel thus explores through science fiction the implication that both consciousness and memory are currently becoming less and less an isolated act contained within the individual human mind, and more and more a shared, collective activity and process. The digital revolution is, for Roberts, an age that empowers people, as we move ever closer to an opportunity to tap into, and use, the wisdom of the crowd. One of the panellists, Stacey Pitsillides (Goldsmiths), a doctoral student working on ‘Digital Death’, quoted Professor José van Dijck (University of Amsterdam) on why people are flocking to the digital with such intensity: ‘the anxiety of forgetting is implied in the desire not to be forgotten […] the most important beneficiaries of this software product are your descendants.
Immortality through software cultivation appears to be an attractive prospect in which to invest.’ Van Dijck seems to hint at an (unconscious) link between our ability to remember and our desire not to be forgotten, but perhaps the irony is that the tools that grant immortality (sites such as Facebook) might actually make us forget, because the act of storing a photo of ourselves on Facebook implicitly licenses us to forget that content as well. The discussion of digital afterlives also invoked a question about religion: with the waning of religion and of the comforting idea of the afterlife, can the success of social media be explained by the digital afterlife they give us? Moncur did not agree with this assumption: ‘We’re seeing people use the digital to memorialise. You can create a memorial on Facebook. The 9/11 memorial in the States also has a digital form so that you can visit remotely. But we’re also seeing digital memorials with a religious element. It’s not about religion disappearing; it perhaps reflects how we’re behaving offline: some people are religious, some people are not, some people are embracing new forms of spirituality. The digital is just another form of expressing how we are as humans. We have many different forms for remembering people.’ Adam Roberts added to this: ‘Atheism is a big deal at the minute, but it seems to me that digital media are the apotheosis of particular religious traditions. If you look at Christianity, Judaism and Islam, there are two aspects that unite them: they are structured according to books, and religion creates a community of people congregating. Social media are exactly those two things.
It’s about writing, but also communicating on a much larger, global scale.’

Nick Lavery interviews science fiction writer Adam Roberts, who wondered whether one day he will give up the thousands of material books in his library and replace them with electronic ones

If we could predict the future it would be memory: ‘Memory, Prediction and the Invisible Future’
The ‘Memory, Prediction and the Invisible Future’ event started from the perhaps counterintuitive assumption that memory is not just for looking backward, but that it is a tool we need in order to predict the future. From an evolutionary point of view, individual memory probably evolved to allow us to plan ahead: in order to make choices (like, say, whether hunting an animal is likely to result in food for the tribe, or whether yesterday’s traffic jam should make us change our route today), we need memory as a database that allows us to predict outcomes accurately and successfully in the imagination. On a collective level, we used to think that the future was simply an extension of the past, and that the world would continue much as it had done before. After the Industrial Revolution we thought we could even make our future lives better; we could make progress, and much of the effort to create progress depended on taking away risk. During the twentieth century, however, many of the established linear, causal models of thinking were undermined, and made way for new theories that focused on the many uncertainties and instabilities at the heart of the universe. Now we find ourselves in an increasingly unstable world, marked by climate change, globalisation, an unstable economy and the technological singularity, the latter being the idea that the nature of the technological revolutions we are living through makes it impossible to predict the form and content of our lives in, say, the next twenty years. In 1985, no one could have predicted that we would all walk around with a pocket-sized computer that allows us to make calls, navigate cities, check the weather and retrieve any information we desire.
In the twenty-first century, our ability to use memory to predict is, paradoxically, both enhanced, because we have increasingly detailed models with which to calculate the future, and undermined, because we also realise that there are many contingencies and chance factors built into our world that make it increasingly difficult to predict the future. One key book that explores such instability is Nassim Taleb’s The Black Swan (2007), which shows that the complexity of the financial world has made it deeply unstable: a seemingly minor event can trigger huge consequences on the other side of the world, whilst highly improbable events actually do take place more and more often. Thus, one might argue that the role of memory and history is weakening in our contemporary culture: our knowledge and the databases of the past (at the level of the individual as well as socially, at a global level) are less useful because simple, linear causality has been undermined. This makes our world deeply fragile and our lives profoundly precarious.
The Memory Network put together a diverse panel, including the director of Bunnyfoot, a company that specialises in analysing and predicting consumer behaviour. First, Jessica Bland, technology futures analyst at Nesta, opened with the claim that ‘the future is not always about prediction; looking forward is about imagining plausible future scenarios in order to help us understand and debate things today’. She described her work on the potential humanitarian uses of drones as a way of ‘shifting that future that we can imagine’, suggesting a correspondence between a changeable future and a past left unfixed by the plasticity of human memory. Bland concluded that ‘we don’t have to remember things perfectly for memory to be a useful thing in our life, in the way that we don’t have to predict the future perfectly for it to be a useful thing in our world’. Kevin Fong introduced himself as a ‘prediction nihilist’, asserting that while ‘we tend to remember the good decisions we made that were predictive’, our ability to predict what we are going to need as a society is ‘near zero’. From the example of Robert Falcon Scott’s Antarctic expedition, whose failure belied the vital importance of Antarctic research in the study of climate change, he took the idea that ‘in our forays into the unknown continents of the body and the mind, we don’t know where we’re going’, and that our focus is not so much on looking to the future as on ‘grappling with the present, trying to mitigate the consequences of the thing we did recently and hoping that somehow that will move us forward’. Rob Stevens, founder of Bunnyfoot, then showed how eye-tracking technology is used to predict consumer behaviour by running an experiment with Memory Network administrator Nick Lavery.
Nick was equipped with eye-tracking goggles (glasses that measure eye position and eye movement) and asked to find the letter ‘P’ in two versions of the alphabet: one in its conventional order and one scattered randomly across the screen. Stevens elaborated on his notion of the brain as a ‘memory-based prediction tool’, using a demonstration of his work with eye-tracking technology to show the importance of existing mental models in perception and prediction, and tracing the vital function of memory from the survival and evolution of single-celled organisms to his current work in advertising, which plays on disjunctions between the firing of our memories in the neocortex and sensory input coming the other way, from reality. Discussing their own predictions for the future, the panel balanced a measured understanding of the limits of this ability with a sense of optimism over humanity’s relationship to technology. Jessica Bland summed up the panel’s discussion of prediction and memory by pointing out that ‘if we could predict the future it would be memory’; while the fundamentally plastic, human nature of memory imposes limitations on our ability to predict, it will also play a part in mitigating the consequences of, and shaping our response to, whatever we encounter in the future.

Videos and audio of all the events will be posted on the Memory Network website, www.thememorynetwork.com, shortly. The Memory Network would like to thank Sarah Smyth and Rose Stuart for co-organising The Memory Network @ Cheltenham events.