This article is part of Future Tense, a partnership of Slate, New America, and Arizona State University. On Thursday, Jan. 15, Future Tense will hold an event in Washington, D.C., titled “How Will Human Ingenuity Handle a Warming Planet?” For more information and to RSVP, visit the New America website.
We modern folk could learn a thing or two from Dorothy, who took only a few minutes to figure out that she wasn’t in Kansas anymore.
Our expanding technological prowess and civilizational footprint are increasingly transforming the natural world, from the climate to biodiversity to fundamental systems such as the nitrogen, phosphorus, and hydrologic cycles. These planetary-scale changes are sufficiently profound that one can reasonably make the case that we’ve entered a new geological epoch, the Anthropocene—roughly, the Age of Humans.
But such a pronouncement is valuable only if it reflects a clear understanding of the changes afoot. The essence of the Anthropocene is not really about humanity’s planetary-scale impact, but about the beginnings of a radical destabilization of the core human ideas and institutions that made this impact possible. And unlike Dorothy, who quickly recognized that the rules of her old world were no longer valid, it seems that few people recognize the true implications of this new era. Like Dorothy, we need to open-mindedly and courageously explore and understand this Oz we find ourselves in rather than continuing to act like we’re still in Kansas.
No better example of the existence and consequences of this collective denial can be found than the ongoing debate over climate change, in which public discourse has plunged to levels of name-calling and character assassination that would humble a nursery school class. At the core of this debate is an idea that has been central to the development of our identity as humans for the past 400 years: that the key to solving the problem is rational action dictated by scientific knowledge.
Give it up, folks.
Three powerful Anthropocene trends are remaking the relationships among humans, our knowledge of the world we inhabit, and the choices we make about how to try to make the world better:
First: Science ain’t what it used to be. Our ideal of science is of a highly structured activity for establishing cause-and-effect relationships that can be tested in the field and the laboratory. Now the focus is increasingly on computational models and scenarios aimed at exploring complex phenomena (such as climate change) that unfold on scales from the global to the molecular.

Second: Information, which used to be scarce and closely guarded, is now everywhere, accessible to everyone. Once, the Catholic Church had a lock on what counted as knowledge and its interpretation. Then scientists took over. Today no individual or institution can ever have a monopoly on knowledge or expertise.

Third: Therefore, the boundary between authoritative knowledge on one hand, and the subjective worlds of policy, ethics, and even religion on the other, grows increasingly fuzzy and meaningless.
Taken individually, any of these changes would be a significant challenge to our current models of rational policymaking based on scientific principles; as a whole, they signal the most profound shift in social and cultural understanding of the role of science since the Scientific Revolution and the early Enlightenment, with its emphasis on formal knowledge as a basis for solving problems.
Modern science did not spring fully formed from a single source. Rather, it grew over centuries, from early roots in ancient Babylon, the Islamic world, China, and Greece. It owes much to the intellectual traditions of the Catholic Church itself. (Roger Bacon and William of Ockham, two medieval founders of what became modern scientific thinking, were both friars.) The process of observation, hypothesis development, and testing has come to seem like the embodiment of rationality itself. At the core of this first phase of scientific culture were reductionism—understanding things by studying their component parts—controlled experimentation, and the confirmation and replication of results. If I didn’t believe your story that fish died in water with high lead content, I could repeat the experiment, changing nothing but the lead concentration in the water. However complex the water chemistry, the fish physiology, and my experimental conditions, I could vary that one factor and, over time, establish a clear pattern: more lead, more mortality.
But this method works only for simple, controlled, and closed systems, in which pulling out one variable to experiment with is possible. It does not apply to complex adaptive systems, in which the very process of separating out a single variable changes the underlying system unpredictably. Such systems cannot be replicated, and therefore cannot be subject to standard scientific processes of confirmation. No one can replicate global environmental conditions in such a way as to experimentally test climate change. For such complex systems, the best we can do is create complicated computer models. But creating a model necessarily involves generating a set of rules that determines what we include in the model and what we exclude. And any set of rules coherent enough to let us model a complex system necessarily gives us a model that is partial and arbitrary—hence the common refrain that “all models are wrong, but some are useful.” We can use a model to generate multiple scenarios of the future that are consistent with scientific understanding, but we cannot use it to predict the behavior of the underlying system itself. The complexity of the Anthropocene—in which, for example, climate change is an emergent phenomenon of 300 years of industrialism—is not subject to the sort of verifiable and predictive understanding that characterized science of the sort that Copernicus, Newton, or even Einstein practiced.
Science and the technological revolution it enabled have obviously been fantastically successful, and one consequence of that success is information overload. Eric Schmidt, the chairman of Google, has famously claimed that today we create more information every two days than was created in all of human history up to 2003. You can quibble with the specifics, but the main point, that information growth is accelerating and unprecedented, is uncontroversial. Google gives everyone with access to the Web the accumulated memory of our civilization; information generation and processing power accelerate geometrically even as we fall further and further behind in understanding the implications. The complexity of the physical world, which has escaped the sort of understanding that conventional science promises, is easily matched by the complexity of our information world.
Both individuals and institutions struggle to adjust to this new, and historically unprecedented, level of information flow. Attention is an increasingly scarce commodity even as claims of authority proliferate. Through the din, climate scientists and activists, certain that they have special knowledge about the world that demands urgent, focused attention, find that they simply are not heard given the tsunamis of information that engulf the public. To be heard above this cacophony, to even hope to be relevant beyond a small group of already committed individuals, they grow increasingly loud, scary, and simplistic.
In this way the politics of fighting climate change share some similarities with those of fighting terrorism. Both have involved campaigns that use apocalyptic and extreme language in an attempt to create fear and insecurity among the public. Both seek to re-engineer society: In the case of global warming, for example, an important goal is to force broad changes in consumption and production patterns in the name of such meaningless goals as “saving the planet.” We should recall, though, that the threat of terrorism—a very real threat, as events in Paris remind us, and indeed in many ways much more tangible than climate change—was used to justify a radical erosion of the privacy of Americans, and to rationalize the invasion of Iraq, an action meant to stabilize conditions but whose destabilizing consequences continue to disastrously unfold.
Does anyone out there think that radically transforming the global energy system will be easier and more predictable than turning Iraq into a democracy? Or that the evidence for doing so is more compelling than the evidence in favor of eliminating Saddam Hussein? Remember, in the Anthropocene, everything is more complicated. Our computer models can give us a thousand scenarios of how the climate may change. But remember that global warming is an unintended consequence of 300 years of industrialism—why would we think that equally momentous unintended consequences would not accompany the enormous social changes pursued in our effort to control the future behavior of the climate?
There is indeed a cruel dilemma here: In order for the science to matter, it must be heard; in order to be heard, it must be translated into catastrophic visions and simplistic policy formulations that are absurd abstractions of the complexity that we inhabit. Thus, the third condition of the Anthropocene: Science moves from being a mutually accepted foundation for debating action in the world to being the tool of one or another group of partisans, wielded in the settings of politics as if it were as clear and inescapable as the equations Newton used to describe falling objects. The necessary oversimplification, the urgent appeal to fear and insecurity, the insistence on predictive certainty, and the direct linkage to an explicit social agenda that would create huge new groups of winners and losers (and is thus inherently divisive) obliterate the boundary between science and politics.
What we will need above all to manage complexity in the Anthropocene is humility all around. We are not in Kansas anymore, where things are simple, the truth is clear, and we know what we know. Everything really is connected to everything else now, and the biggest mistake we can make is to focus too narrowly on one thing or one way of doing things. That’s the most important lesson of the abject failure of climate change policy and politics, and it’s one that we must learn if we are to effectively confront the new world that we have and will continue to create. Climate change is not a problem of our old way of doing things—it’s a symptom of our new condition.