The human mind has become a site of contestation once again: major power struggles are playing themselves out, and scientists and scholars are staking their claims. Our contemporary period is characterised by a multiplicity of revolutions that together are radically reshaping the context of our thinking about what it means to be a human being. Globalisation, overpopulation, climate change, an ageing demography, ongoing scientific breakthroughs, and the dominance of new technologies and social media in our lives are just a few examples of developments that are having a major impact upon our understanding of ourselves, the world, other people and nature. The current explosion of interest in, and concern about, cognition and, more specifically, memory is striking because memory is vital to our understanding of who we are and of how our culture and world work.
Do we, in the digital era, have the right to be forgotten after we die? Do external memory storage facilities and tools such as Google, Facebook and GPS make our brains lazy? What happens to memory when climate change threatens to make mankind extinct and our collective future is shrinking before our very eyes? How dependent is our thinking about memory on cultural factors? Does the increasingly chaotic, unpredictable world we live in undermine memory's capacity to help us imagine future narratives for mankind? These were some of the questions that The Memory Network addressed at four events organised in partnership with Cheltenham Literature Festival 2013. Taken together, the events spelled out a clear message: the future of memory is at stake.
On 9th October 2013, the Memory Network invited Wendy Moncur, Reader in Socio-Digital Interaction at the University of Dundee; Stacey Pitsillides, a PhD candidate in Design at Goldsmiths working on digital death; and the writer Adam Roberts to discuss ‘Memory in the Digital Age’ at the Cheltenham Literature Festival.
The event ‘Re-wired: Memory in the Digital Age’ tackled questions about how, in an increasingly digital context, we are remembered after we die. We send out thousands of emails, upload photos of our loved ones onto Facebook, and produce a host of other communications via social media sites such as Twitter. This raises various questions. What happens to our digital selves after we die? How is our individual identity (re-)shaped in this new context? How is the relationship between man and machine changing? Identity and the mind are digitally extended and/or offloaded, and our experience is increasingly shared, with the individual only partly controlling the process. Yet, through archiving sites such as Facebook, we also (appear to) have more power in shaping future memories of ourselves. And how is our conception of human nature changing when the digital world in which we are all immersed turns us into cyborgs and makes us partly virtual? Not only has the traditional division between our private lives and public personas been blurred, but we are now restructured through the algorithmic formulas that both create and analyse our world. As Gilles Deleuze presciently wrote in 1990: ‘We no longer find ourselves dealing with the mass/individual pair. Individuals have become “dividuals,” and masses, samples, data, markets, and “banks”.’ This raises ethical questions about ownership and power: who controls, and has access to, our private, public and digital lives?
Socio-digital interaction specialist Wendy Moncur (University of Dundee) gave a talk about digital inheritance and identified some of the contradictions facing us. Our physical, social and digital lives are all becoming increasingly complex, but also interwoven with one another. Moncur warned: ‘There is no off-switch for digital death; there is no mechanism by which we can switch ourselves off at the end of our digital life, and we can linger on, a bit like a bad smell in cyberspace.’ Moncur looked at the different forms of value in our digital lives: people with an eBay account have ‘financial value’ invested in their online selves, posting pictures of loved ones has ‘emotional value’, and writing blogs has ‘intellectual value’; all these types of value can carry on even after our physical lives end. This is the ‘post-self’, which consists of memories, mementos, memorials, reputations and other kinds of memory, and it often changes. ‘Think for instance of Jimmy Savile, who died some years ago, but his reputation is continuing to change and people’s memories of him are shifting. So the post-self is a dynamic being. And we may now be involved in our post-self: there are tools that allow us to send out posthumous emails, and we can be involved in our own funerals, thus shaping how the post-self is constructed. The digital self thus outlives our physical lifespan.’ Literature that has responded to this context includes Lottie Moggach’s Kiss Me First (2013) and James Smythe’s The Machine (2013).
Science fiction novelist Adam Roberts confided that, although he had held out against social media for a long time, he is now addicted to Twitter, the ‘crack cocaine of the Internet’. His love for the digital goes so deep that he reads novels on the Kindle app on his iPhone, and he wondered how this is perceived by his children and what kind of role model it makes him. Roberts also identified a new, paradoxical form of human behaviour: ‘Social media are memory. Everything you write is engraved in virtual stone forever. Oral communication is different: once spoken, your words vanish into the ether, yet when you communicate face to face with people you behave more politely. The rudest communication happens on social media, which is weird because every word you say is recorded. I think this happens because Twitter, for instance, gives the illusion of fluency and there is no face-to-face contact.’
Roberts told us about the inspiration for his novel New Model Army (2010), which projects into the future the idea that an army could be dehierarchised and democratised by using technology that allows all soldiers to be involved in strategic decision making; their collective knowledge operates via wikis. Roberts got the idea partly from Apple founder Steve Jobs: ‘Jobs was fond of saying: “The entire accumulated body of knowledge generated by mankind from the beginning of time to the year 2000: we now generate that amount of data every year.” It’s not strictly speaking “knowledge”, of course, but all you need to turn that data into wisdom are algorithms and search engines that are sophisticated enough. That new situation levels and democratises human interaction as a whole, and that is a good thing: rather than the tyranny of the previous hierarchical structures, democracy is the best way of reaching decisions.’ Roberts’s novel thus explores through sf the implication that both consciousness and memory are becoming less and less an isolated act contained within the individual human mind, and more and more a shared, collective activity and process. The digital revolution is, for Roberts, an age that empowers people, as we move ever closer to an opportunity to tap into, and use, the wisdom of the crowd.
Panellist Stacey Pitsillides (Goldsmiths), a doctoral student working on ‘Digital Death’, quoted Professor José van Dijck (University of Amsterdam) on why people are flocking to the digital with such intensity: ‘the anxiety of forgetting is implied in the desire not to be forgotten […] the most important beneficiaries of this software product are your descendants. Immortality through software cultivation appears to be an attractive prospect in which to invest.’ Van Dijck seems to imply an (unconscious) link between our ability to remember and our desire not to be forgotten, but perhaps the irony is that the tools that grant us immortality (sites such as Facebook) might actually make us forget, because the act of storing a photo of ourselves on Facebook implicitly allows us to forget that content as well.
The discussion about digital afterlives also invoked a question about religion: with the waning of religion and the comforting idea of the afterlife, can the success of social media be explained through the digital afterlife they give us? Moncur did not agree with this assumption: ‘We’re seeing people use the digital to memorialise. You can create a memorial on Facebook. The 9/11 memorial in the States also has a digital form so that you can visit it remotely. But we’re also seeing digital memorials with a religious element to them. It’s not about religion disappearing; it perhaps reflects how we’re behaving offline: some people are religious, some people are not, some people are embracing new forms of spirituality. The digital is just another form of expressing how we are as humans. We have many different forms for remembering people.’ Adam Roberts added to this: ‘Atheism is a big deal at the minute, but it seems to me that digital media are the apotheosis of particular religious traditions. If you look at Christianity, Judaism and Islam, there are two aspects that unite them: they are structured according to books, and religion creates a community of people congregating. Social media are exactly those two things. It’s about writing, but also communicating on a much larger, global scale.’
On Saturday 5th October 2013, the Memory Network invited Jessica Bland (Nesta), Dr Kevin Fong and entrepreneur Robert Stevens (Bunnyfoot) to discuss the role memory plays in anticipating the future, as part of our series of events at the Cheltenham Literature Festival.
The ‘Memory, Prediction and the Invisible Future’ event at the Cheltenham Festival started from the perhaps counterintuitive assumption that memory is not just for looking backward: it is a tool we need in order to predict the future. From an evolutionary point of view, individual memory probably evolved to let us plan ahead: in order to make choices (say, whether hunting an animal is likely to result in food for the tribe, or whether yesterday’s traffic jam will make us change our route today), we need memory as a database that allows us to predict outcomes accurately in the imagination. On a collective level, we used to think that the future was simply an extension of the past: the world would continue much as it had before. After the Industrial Revolution we thought we could even make our future lives better; we could make progress, and much of the effort to create progress depended on taking away risk. During the twentieth century, however, many of these linear, causal models of thinking were undermined and made way for new theories that focused on the uncertainties and instabilities at the heart of the universe. Now we find ourselves in an increasingly unstable world, shaped by climate change, globalisation, a volatile economy and the technological singularity (the idea that the technological revolutions we are living through make it impossible to predict the form and content of our lives in, say, twenty years’ time: in 1985, no one could have predicted that we would all walk around with a pocket-sized computer that lets us make calls, navigate cities, tell the weather and find any information we desire).
In the twenty-first century, our ability to use memory to predict is, paradoxically, both enhanced and diminished: we have increasingly detailed models with which to calculate the future, but we also realise that many contingencies and chance factors are built into our world that make it increasingly difficult to predict. One key book that explores such instability is Nassim Taleb’s The Black Swan (2007), which shows that the complexity of the financial world makes it highly unstable: a seemingly minor event can trigger huge consequences on the other side of the world, while highly improbable events actually take place more and more often. Thus, one might argue that the role of memory and history is weakening in our contemporary culture: our knowledge and databases of the past (at the level of the individual as well as globally) are less useful because simple, linear causality is undermined. This makes our world deeply fragile and our lives profoundly precarious.
The Memory Network had put together a diverse panel, including the director of Bunnyfoot, a company that specialises in analysing and predicting consumer behaviour. First, however, Jessica Bland, technology futures analyst at Nesta, opened with the claim that ‘the future is not always about prediction; looking forward is about imagining plausible future scenarios in order to help us understand and debate things today’. She described her work on the potential humanitarian uses of drones as a way of ‘shifting that future that we can imagine’, suggesting a correspondence between a changeable future and a past left unfixed by the plasticity of human memory. Bland concluded that ‘we don’t have to remember things perfectly for memory to be a useful thing in our life in the way that we don’t have to predict the future perfectly for it to be a useful thing in our world’. Kevin Fong introduced himself as a ‘prediction nihilist’, asserting that while ‘we tend to remember the good decisions we made that were predictive’, our ability to predict what we are going to need as a society is ‘near zero’. He took from the example of Robert Falcon Scott’s Antarctic expedition, whose failure belied the vital importance of Antarctic research in the study of climate change, the idea that ‘in our forays into the unknown continents of the body and the mind, we don’t know where we’re going’, and that our focus is not about looking to the future so much as ‘grappling with the present, trying to mitigate the consequences of the thing we did recently and hoping that somehow that will move us forward’.
Rob Stevens, founder of Bunnyfoot, showed how eye-tracking technology is used to predict consumer behaviour by conducting an experiment with Memory Network administrator Nick Lavery. Nick was equipped with eye-tracking goggles (glasses that measure eye position and movement) and asked to find the letter ‘P’ in two versions of the alphabet: one in alphabetical order and one scattered randomly across the screen. The result is evident in the film below:
Stevens elaborated on his notion of the brain as a ‘memory-based prediction tool’, using a demonstration of his work with eye-tracking technology to show the importance of existing mental models in perception and prediction, and tracing the vital function of memory from the survival and evolution of single-celled organisms to his current work in advertising, which plays on disjunctions between the firing of our memories in the neocortex and sensory input coming the other way, from reality. Discussing their own future predictions, the panel balanced a measured understanding of the limits of this ability with a sense of optimism about humanity’s relationship to technology. Jessica Bland summed up the panel’s discussion of prediction and memory by pointing out that ‘if we could predict the future it would be memory’; while the fundamentally plastic, human nature of memory imposes limitations on our ability to predict, it will also play a part in mitigating the consequences of, and shaping our response to, whatever we encounter in the future.