When We Are No More




  for

  David Rumsey

  Hero of Time and Space

  CONTENTS

  Part One: Where We Come From

  Chapter 1: Memory on Display

  Chapter 2: How Curiosity Created Culture

  Chapter 3: What the Greeks Thought: From Accounting to Aesthetics

  Chapter 4: Where Dead People Talk

  Chapter 5: The Dream of the Universal Library

  Part Two: Where We Are

  Chapter 6: Materialism: The World Is Very Old and Knows Everything

  Chapter 7: The Science of Memory and the Art of Forgetting

  Chapter 8: Imagination: Memory in the Future Tense

  Chapter 9: Mastering Memory in the Digital Age

  Part Three: Where We Are Going

  Chapter 10: By Memory of Ourselves

  Acknowledgments

  Notes

  Selected Sources

  Illustration Credits

  Index

  A Note on the Author

  Plate Section

  PART ONE

  WHERE WE COME FROM

  I imagine the earth when I am no more:

  Nothing happens, no loss, it’s still a strange pageant,

  Women’s dresses, dewy lilacs, a song in the valley.

  Yet the books will be there on the shelves, well born,

  Derived from people, but also from radiance, heights.

  —CZESLAW MILOSZ, “AND YET THE BOOKS,” 1986

  CHAPTER ONE

  MEMORY ON DISPLAY

  Over forty thousand years ago, humans discovered how to cheat death. They transferred their thoughts, feelings, dreams, fears, and hopes to physical materials that did not die. They painted on the walls of caves, carved animal bones, and sculpted stones that carried their mental and spiritual lives into the future. Over generations we have created sophisticated technologies for outsourcing the contents of our minds to ever more durable, compact, and portable objects. Each breakthrough in recording technology, from the creation of clay tablets six thousand years ago to the invention of papyrus scrolls, printing, photography, audio recording, and now ultracompact, portable, and extremely fragile digital media, has added to the vast stores of knowledge that hold the key to our success as a species. In the digital age we are dramatically expanding our capacity to record information, freeing us to pursue our curiosity at will and seek answers to ever more ambitious questions.

  But every once in a while, we outsmart ourselves, and we have to scramble to catch up with our inventions. This is such a moment. The carrying capacity of our memory systems is falling dramatically behind our capacity to generate information. Since the creation of the World Wide Web in the 1990s and the growth of social media in the last decade, we feel increasingly overwhelmed by information. At the same time, we are intrigued—if not downright infatuated—with the power and promise of this abundance. We demand more and more—Big and Bigger Data. Yet it seems the more information we have, the less we feel in control of what we know. How do we catch up with ourselves now?

  This is not the first time humanity has felt overwhelmed by the riches created by our ingenious inventions. Every innovation in information technology, going back to ancient Mesopotamians’ invention of cuneiform tablets, precipitates a period of overproduction, an information inflation that overpowers our ability to manage what we produce. Having more knowledge than we know what to do with while still eager to acquire more is simply part of the human condition, a product of our native curiosity.

  But this moment is different in quality as well as quantity. We can no longer rely on the skills we have honed over millennia to manage our knowledge by managing physical objects, be they papyrus scrolls or paperback books. Instead, we must learn to master electrical grids, computer code, and the massive machines that create, store, and read our memory for us. What this mastery looks like and how we achieve it is today’s frontier of knowledge.

  The digital landscape before us is largely unmapped, terra incognita that we can only know by entering into it and exploring. Fortunately, vast as the unknown territory may be, digital technology itself helps speed communication of new knowledge between those rushing ahead to explore the unknown and those traveling at a slower pace who are settling the new landscape and making it productive. As the frontier retreats quickly before us, we can already see that our age-old understanding of humanity’s collective memory as something fixed to durable objects and constrained by the limits of time and space is obsolete. Digital memory is ubiquitous yet unimaginably fragile, limitless in scope yet inherently unstable. Mastery of digital memory means grappling with its vulnerabilities as well as developing its strengths. We will explore both as we examine the future of memory in the digital age.

  The consequences of going digital for the future of human memory came into sharp focus for me in 1997, when I was leading a team of curators at the Library of Congress assembling a comprehensive exhibition of its collections for the first time in living memory. The library had just acquired its one hundred millionth item. From this abundance we were to select several hundred items that would tell the two-hundred-year story of the Library of Congress and, by extension, the American people. We had much—too much—to choose from. Home to the United States Copyright Office and faithful to its founder Thomas Jefferson’s vision of creating a universal and comprehensive collection of human knowledge, the library has records in virtually every medium capable of carrying information, from rice paper and palm leaves to mimeographed sheets and onionskin paper, whalebones and deer hides, audio wax cylinders, early television kinescopes, silent movies on nitrate film, maps on vellum, photographic negatives on glass plates the size of tabletops—and, of course, computer code on tape, floppy disks, and hard drives.

  It was remarkably easy to select several hundred objects out of one hundred million because each object tells a tale. To tell the story of how the Republic was born, for example, we displayed the Rough Draft of the Declaration of Independence, crafted over a few days in June 1776 by Thomas Jefferson and edited by Benjamin Franklin, John Adams, Roger Sherman, and Robert Livingston. It is written in Jefferson’s eminently legible hand. Yet several passages are boldly struck through with lines of heavy black ink and emended with the changes made by Adams and Franklin.

  The sight of Jefferson’s venerated text so vividly edited always draws people up short. They are startled to see that the most famous phrase in this most famous document—“we hold these truths to be self-evident, that all men are created equal”—is not what Jefferson wrote. He wrote that the truths are “sacred and undeniable.” The words we know so well today are in fact a correction suggested by Benjamin Franklin. The jarring yet oddly familiar sight of the Declaration of Independence in full Track Changes mode makes self-evident the disagreements among the Founders and the compromises they agreed on. The original document renders the past strangely new—the events dramatic, the motives of the actors complicated, the conclusion unpredictable.

  Historians continue to mine the Rough Draft’s four pages of tangible evidence for clues to the early stages of the colonial rebellion. As a historian, I was familiar with the excitement of working with original documents. I also knew how stirring—at times emotional—it is to work directly with originals. A physical connection between the present and past is wondrously forged through the medium of time-stained paper. Yet what I remember most vividly is the impact of the Rough Draft on tourists. Many of the visitors had stopped by the library simply as one more station on a whirlwind circuit of the capital. They were often tired and hot and not keen on history in the best of circumstances. But this was different. They would grow quiet as they approached the exhibit case. They lowered their heads toward the glass, focused on lines of text struck through to make out the words scribbled between lines, and began to grasp what they were looking at. Their reactions were visceral. Even dimly lit and safely encased in bulletproof glass, the Rough Draft emanates an aura of the “sacred and undeniable.”

  It was then that I started to think seriously about the future of memory in the digital age—though worry is the more accurate word. What would my successor show in two hundred years’ time—or even fifty years? How would people feel that distinctive visceral connection with people from the past if the past had no undeniable physical presence? What we displayed in 1997 had withstood the test of time. It was already self-evident that there would be no test of time for digital information. At that time, web pages lasted an average of forty-four days before changing or disappearing altogether. We seemed to be moving at breakneck speed from a knowledge economy of relative scarcity of output to one of limitless abundance. By the latest count, in 2015, the Library of Congress had well over 160 million items, already a startling increase over the 100 million it counted in 1997. But relative to what circulates on the web, its collections could be described as, if not scarce, at least tractable. Engineers building the largest radio telescope in the world, the Square Kilometre Array, estimate that when the telescope is up and running, it will produce “up to one exabyte (10^18 bytes) of data per day, roughly the amount handled by the entire Internet in 2000.” And the web itself grows inexorably. One data-storage company estimates that worldwide web data grew from 2.7 billion terabytes in 2012 to 8 billion terabytes in 2015. But nobody really knows—or even agrees on how we should be counting bits.

  How are we to keep from being drowned in the data deluge? In the past, the costs of writing materials, of the human labor of copying, and of disseminating and providing access to books, atlases, photographs, films, and recorded sound were very high. These costs imposed limits on the rate of production, in effect filtering what knowledge and creative expression were accessible and to whom. Maintaining vast and redundant stores of physical artifacts was expensive, limiting what could be collected and preserved for long-term access. The question had always been: “What can we afford to save?”

  Now, suddenly, those filters are gone and information travels at the speed of electrons, virtually free of friction. Now everyone with a computer can publish their own book, release their own movie, stream their own music, and distribute what is on their hard drive or smartphone across the globe instantaneously. The question today is: “What can we afford to lose?”

  Though this seems a daunting question, we have a lot of information from the past about how people have made these choices before, in other periods of information inflation—and there have been many. They routinely follow every innovation in recording technologies. It happened when Sumerians first invented writing to store information about grain harvests and found themselves puzzled by where to put so many clay tablets so they would be safe from damage or theft but also easy to get to when needed. It happened when Europeans invented printing and the marketplace filled up with competing and contradictory versions of canonical texts like the Bible. It happened again when we created audio recordings on platters that would break if handled roughly and moving images on nitrate film stock that could ignite and blow up, even in the absence of oxygen. Each innovation prompted a rethinking of how to use these astonishing new powers of communication, each full of unknown potentials that could be uncovered only through experimentation. And each advance required a very costly retooling of the information infrastructure already in place. Creators, publishers, librarians, and archivists all scrambled to catch up. But it was always worth the price, no matter how high it seemed at the time, because we gained the freedom to reimagine our collective memory, confident that we could capture so much more of the human experience.

  This growing body of shared knowledge and know-how decisively shapes our fate as a species, distinct from all others. Over generations, as we perfected the technologies of recording and created more resilient and compact media to hold our knowledge, we gained dominion over the planet. Our culture and technologies are the ultimate power tool, enabling adaptive strategies that far outpace the strictly biological evolution other species must make do with. Yet quite abruptly and without warning, at the beginning of the twenty-first century we embarked on a vast natural experiment, rendering obsolete our forty-thousand-year project to cheat death by using objects to hold the contents of our minds. Gone is the promise of preserving knowledge forever. We are replacing books, maps, and audiovisual recordings with computer code that is less stable than human memory itself. Code is rapidly overwritten or rendered obsolete by new code. Digital data are completely dependent on machines to render them accessible to human perception. In turn, those machines are completely dependent on uninterrupted supplies of energy to run the server farms that store and serve digital data.

  How do we guarantee that this uncontrolled experiment with human memory will turn out well for us? In our search for the answers, we will look back, exploring the history of how we have mastered the challenges of information inflation before. And we will look inward, into the human mind, to gain often surprising and frankly counterintuitive insights into how the brain’s natural filtering systems manage to determine what information to save and what to dump without any help at all from our conscious minds. Both historical experience and contemporary science provide insights critical for sustaining the collective memory of humanity and managing our own personal digital archives.

  WHAT IS THE BIG IDEA?

  Two reigning misconceptions stand in the way of a happy ending to our experiment in reimagining memory for an economy of digital abundance. First is the notion that today’s abundance is a new phenomenon, unseen in human history, which began with computers and is driven by technology. This is like blaming a mirror for the blemish you see on your cheek. Technology is an instrument of the human will, not vice versa. True, the pace of information production has accelerated, and the fact that computers make perfect copies so easily certainly accounts for the growth in redundant information that we store. It is effortless and seemingly inconsequential to hit Forward and further inflate the ballooning data universe. But the current information inflation began not in the 1990s, when the Internet was opened to commerce, nor in the 1940s, when computers were built by the military. It began in the first half of the nineteenth century. And it was not a technical innovation that set us on the present course, but an idea. That was the radically transformative idea that the universe and all that exists is no more and no less than the material effect of material causes.

  This idea, known in philosophy as materialism, is itself ancient. It was central to the thinking of the Greek Democritus (ca. 460–ca. 370 B.C.), immortalized in a poem by Lucretius (ca. 99–ca. 55 B.C.) called On the Nature of Things (De rerum natura), and can be found in ancient Indian and Chinese philosophies. But in the hands of Western men of science (and they were mostly men), the view of matter as both cause and effect no longer served the sole purpose of understanding the world, as it had for the philosophers. They sought new knowledge to master Nature’s secrets, and their political counterparts sought to use that knowledge to change the world. And so, by the 1830s, the great hunt for physical evidence was on. The rapid invention of tools of investigation resulted in a proliferation of new information technologies. From the daguerreotype, invented in 1838, to the powerful imaging technology at the heart of the Large Hadron Collider that detected traces of a new subatomic particle in 2013, our information technologies all derive from the single insight that matter records the history of the universe because it is a slow, cold form of information. The universe writes its own autobiography in atoms. The evolution of our collective memory from Paleolithic cave paintings to the World Wide Web is the story of how and why this idea of matter as memory took hold and what it means for us today.

  Culture evolves in fits and starts. History is studded with false promises and dead ends, experiments that work for a while then prove unfit as circumstances change. But there are also moments of rapid change, inflection points when forces coalesce to accelerate and alter the trajectory of events. Four inflection points in particular precede and enable the scientific advances of the nineteenth century that inaugurated today’s information inflation: (1) the development of writing in Mesopotamia for administrative and business purposes, together with professional management of the collections; (2) the ancient Greeks’ development of libraries as sites for the cultivation of knowledge for its own sake; (3) the Renaissance recovery of Greek and Roman writings and the invention of movable type, which together helped to propel the West into the modern age; and (4) the Enlightenment of the eighteenth century, which refashioned knowledge into an action verb—progress—and expanded the responsibilities of the state to ensure access to information.

  These inflection points all lead up to the critical moment around the turn of the nineteenth century when some curious geologists discovered that rocks are like clocks. Properly read, they could be used to tell the time of the Earth. And the Earth turned out to be far older than people thought. That was the moment when science moved from the Age of Reason to the present Age of Matter and the great quest for physical evidence about all that exists began. The digital era is merely the most current installment in the unfolding saga of our desire to know more about the world and ourselves. The origins of today’s global information economy of abundance lie here, at this inflection point in the history of Western thought. For it is the West that created the global inscription used today—the digital code that travels on worldwide networks.

  WHAT IS MEMORY?

  The second misconception is our antiquated view of memory itself. The computer is not an accurate model for the brain. Scientists now understand that natural memory—the kind that hedgehogs and humans have, as opposed to the artificial kind we use for storing information, such as books and silicon chips—is the primary mechanism animals rely on to adapt to their environment. Memory is the entire repertoire of knowledge an animal acquires in its lifetime for the purpose of survival in an ever-changing world—essentially everything it knows that does not come preprogrammed in its DNA. Given the complexity of the world, memory takes a less-is-more approach. It is sparing, even provident, in its focus on information that may come in handy later. Like a traveler packing for a week who tries to squeeze all necessities into an overnight bag, the brain compacts big loads of information into small spaces by combining and compressing similar information through elaborate networks of association.