As tributes poured in after the death of Prince last month, a member of the Minnesota Lynx women’s basketball team spoke on US television about her visit to the musician’s Paisley Park studios. She recalled how the players, driven out by limo at his invitation to celebrate their victory in a crucial match, had been asked on arrival to leave cameras and phones outside. The party was amazing but, she regretted, “all we have are our memories”.
For millions of people, technological devices have become essential tools in keeping memories alive — to the point where it can feel as though events without an impression in silicon have somehow not been fully experienced. In under three decades, the web has expanded to contain more than a billion sites. Every day about 300m digital photographs, more than 100 terabytes’ worth, are uploaded to Facebook. An estimated 204m emails are sent every minute and, with 5bn mobile devices in existence, the generation of new content looks set to continue its rapid growth.
We celebrate this growth, and rightly. Today knowledge is created and consumed at a rate that would have been inconceivable a generation ago; instant access to the fruits of millennia of civilisation now seems like a natural state of affairs. Yet we overlook — at our peril — just how unstable and transient much of this information is. Amid the proliferation there is also constant decay: phenomena such as “bit rot” (the degradation of software programs over time), “data rot” (the deterioration of digital storage media) and “link rot” (web links pointing to online resources that have become permanently unavailable) can render information inaccessible. This affects everything from holiday photos and email correspondence to official records: to give just one example, a Harvard study published in 2013 found that 50 per cent of the links cited in US Supreme Court opinions no longer worked.
Are we creating a problem that future generations will not be able to solve? Could the early decades of the 21st century even come to seem, in the words of the internet pioneer Vint Cerf, like a “digital Dark Age”? Whether or not such fears are realised, it is becoming increasingly clear that the migration of knowledge to formats permitting rapid and low-cost copying and dissemination, but in which the base information cannot survive without complex and expensive intervention, requires that we choose, more actively than ever before, what to remember and what to forget.
How we should go about doing this is a question explored in different ways by three recent books: You Could Look It Up: The Reference Shelf From Ancient Babylon to Wikipedia, by the literary scholar Jack Lynch; When We Are No More: How Digital Memory Is Shaping Our Future, by the information historian Abby Smith Rumsey; and Ctrl+Z: The Right to be Forgotten, by Meg Leta Jones, a researcher working at the intersection of law and technology.
Mankind has been seeking ways of making our memories more permanent for over 3,000 years, as Lynch shows in his informative and entertaining history of reference works. The story begins with the clay tablet writing systems established by Mesopotamian civilisations to record important information — taxation and commercial data for the most part, although scientific, medical and literary texts also survive. Tens of thousands of these records have outlived the civilisations that created them by several millennia, marking them out as a remarkably durable memory system.
After clay came more portable and flexible materials — papyrus, parchment and paper above all — with the tablet being replaced by the scroll and then by the codex. Societies developed systems of collecting, storing and preserving information that, to varying degrees, we are still using today. Authors create information, publishers disseminate it, librarians and archivists appraise, select, collect and preserve what is considered valuable for the present and future needs of society. For 500 years or more, paper has been the predominant material used for these purposes and, as the technology thinker and advocate Clifford Lynch has remarked, it survives pretty well under “regimes of benign neglect”. Benign neglect, however, is not a viable approach when it comes to preserving digital information.
Abby Smith Rumsey’s excellent When We Are No More takes a similarly long view of our contemporary anxieties over knowledge preservation. As a writer with direct experience of both the practical and philosophical issues involved, gained through roles at the Library of Congress and a variety of research organisations, Rumsey has been an active participant in these debates since the early days of the internet. Her book is especially good at charting the changing shape of the institutions to which we have entrusted (or outsourced) our collective memory.
“Information inflations”, as she terms them, have always disturbed the landscape of memory. Socrates distrusted writing because he feared it would make us lazy in remembering, while throughout the Middle Ages and Renaissance, individuals used mental techniques such as memory palaces to retain access to large amounts of information. In Rumsey’s discussion of attitudes to knowledge since the Enlightenment, she reminds us of how an educated citizenry has long been seen as a precondition of democracy — and why attacks on libraries, from the destruction of the fledgling Library of Congress by British forces in 1814 to the 1992 shelling of the National Library of Bosnia and Herzegovina in Sarajevo, are essentially political acts.
What we are struggling with today are the social, economic and legal structures for managing digital memory. There are a number of different players — large technology companies, not-for-profits, and “memory organisations” such as libraries and archives. Google’s self-stated mission, for example, is “to organise the world’s information and make it universally accessible and useful”. A few miles up the road in San Francisco, the non-profit Internet Archive similarly aims to build an “internet library” offering “permanent access for researchers . . . and the general public to historical collections that exist in digital format”. In the UK, meanwhile, organisations such as the Digital Preservation Coalition are working with institutions ranging from the University of Oxford’s Bodleian Libraries to the Bank of England to provide advice and training.
Perhaps the biggest unanswered question is whether the commercial imperatives of Google and its peers are compatible with their roles as guardians of digital memory. In her book Ctrl + Z, Meg Leta Jones documents how the growth of the internet has stimulated a rush for access to personal data. Where providing commercial advantage in search was the original aim for many companies, the tracking and monetisation of digital behaviour is now seen as the key business goal, with information even being traded in “data exchanges”.
One byproduct of these tendencies has been what Leta Jones calls the notion of “digital redemption”. As it becomes ever more difficult not to share one’s personal life online, the notion that it is a human right to have these records erased has emerged. There are differing approaches to “the right to be forgotten” across legal jurisdictions, with the European Union developing a framework in 2012 (the Data Protection Regulation, Article 17) that seems very alien from a US perspective. But everywhere, the emerging fear is that “in a connected world, a life can be ruined in a matter of minutes, and a person, frozen in time”.
Ctrl + Z argues powerfully that we should all take the advice of Google’s Eric Schmidt and be more careful about how we interact with one another online. Or as Leta Jones puts it: “We must all be stewards. Before you delete your next Facebook post, tweet, blog, comment, email, set of cookies, or chat, consider whether you are destroying history or exercising your power to participate in your digital identity.” University libraries have offered information literacy services to students for many years — perhaps now is the time for these to be more widely offered.
In all three studies, “filtering” is at the centre of potential solutions. This process involves the intervention of human or machine processes — such as specialist search engines, curatorial selection or archival appraisal — to control the enormous abundance of digital information. Filtering by librarians and archivists becomes all the more necessary as the traditional gatekeeping role of publishers is eroded, something that Rumsey strangely does not consider.
The filtering of publishers can have profound implications for society. Lynch describes the Diagnostic and Statistical Manual of Mental Disorders (DSM), first published by the American Psychiatric Association in 1952. This work provides a taxonomy of mental disorders used by a wide range of healthcare and other professionals — and which has slowly changed to reflect societal changes, such as the removal of the classification of homosexuality as a “disorder”.
Lynch, though a frequent user of tools such as Google and Wikipedia, is attuned to their failings. He draws attention to the commercial manipulation of search result rankings and the unevenness of the crowdsourced Wikipedia — pointing out, for example, that the English-language entries for Zoroaster and Zoroastrianism are less than half the length of the entry for Lady Gaga. It is a nice illustration of why many libraries, including my own, are working to improve the resource by employing “Wikimedians in residence”.
Rumsey’s book also ends with important messages. She emphasises the need for knowledge to be kept safe over very long periods, citing the role of libraries in transferring “forgotten” works of Greek science and medicine to the Islamic and Christian worlds. The commercial and not-for-profit internet organisations have no real track record in the long-term preservation of information. Why should we trust them over those bodies that can show centuries of successful stewardship?
This issue is partly one of resources. Libraries and archives are evolving rapidly, and their staff are being reskilled, but they struggle to cope with the “data deluge” that the tech giants have unleashed. Meanwhile, their ability to keep up is hampered by growing pressure on the funding they receive — witness the financial crisis being faced by the Library of Birmingham. Could a “memory tax”, reapportioning some of the profits generated by digital information services to the cause of preservation, be one solution to this problem?
What is clear is that libraries have a vital role to play as impartial stewards of our digital memories. As Rumsey puts it: “We cannot know what the future value of any archaic or seemingly irrelevant body of knowledge may be. Our obligation to future generations is to ensure that they can decide for themselves what is valuable.”
You Could Look It Up: The Reference Shelf from Ancient Babylon to Wikipedia, by Jack Lynch, Bloomsbury, RRP£25/$30, 464 pages
When We Are No More: How Digital Memory Is Shaping Our Future, by Abby Smith Rumsey, Bloomsbury, RRP£18.99/$28, 240 pages
Ctrl + Z: The Right to Be Forgotten, by Meg Leta Jones, NYU Press, RRP£20.99/$29.95, 284 pages
Richard Ovenden is Bodley’s Librarian at the University of Oxford’s Bodleian Libraries and president of the Digital Preservation Coalition