You have probably heard the story of Darwin’s intrepid voyage to the Galapagos Islands. On those rocky outcroppings far off the South American coast, Darwin noticed small variations in the beaks of a few finches, unlocking, we are told, the mystery of life’s variation over time and space. “The struggle for life,” Darwin deduced, would naturally select those beings whose hereditary mutations made them most fit to a specific environment. Over successive generations, scientists came to see the driving force behind evolution as perpetual competition between discrete individuals, a biological arms race to eat and reproduce in a world of scarcity.
Though Darwin articulated his theories of evolution over decades, and though he traveled far and wide during his years on the HMS Beagle, few accounts of his theories fail to mention the Galapagos, their wild remoteness and exotic biota. It’s important to us that Darwin went somewhere “out of the way” to discover the nature of life. We like to imagine Darwin observing nature directly, unmediated by human interference.
Yet, like all humans, Darwin brought culture with him wherever he traveled. His descriptions of the workings of nature bear resemblance to prevailing thinking on human society within elite, English circles at the time. This is not a mere coincidence, and tracing his influences is worthwhile. It was, after all, the heyday of classical liberalism, dominated by thinkers like Adam Smith, David Hume, and Thomas Malthus, who valorized an unregulated market. They were debating minor points within a consensus on the virtues of competition. In an especially humble (and revealing) moment, Darwin characterized the principles underlying his thinking as naught but “the doctrine of Malthus, applied with manifold force to the whole animal and vegetable kingdoms.”
Fast forward a century and a half, and “survival of the fittest”—the expression social theorist Herbert Spencer coined to sum up Darwin’s thinking—is as much a cultural cliché as it is a scientific theory. Hell, your worst colleague at the office might even offer it as a justification for his one-upmanship. More than just a cliché, though, the supposed naturalness of competition has played a central role in substantiating the laissez-faire variety of capitalism the majority of the American political spectrum has championed for the past four or so decades. Indeed, any non-market-based solution to social issues usually falls prey to claims of utopianism, of ignoring the fundamental selfishness of the human species. Advocates for welfare programs, for instance, often run up against criticism that their policy proposals fail to understand the importance of “losing,” that they lessen the stakes of the competition innate to human social life. Similarly, collectively owned spaces or institutions (like communal land trusts or co-ops) are often presumed short-lived or inefficient, doomed to suffer the “tragedy of the commons” as the innate self-interest of each member leads to an overuse of collective resources—a thesis that has been debunked again and again since its first articulation by Garrett Hardin in 1968. To put it simply, we have let Darwinism set the horizon of possibility for human behavior. Competition has become a supposed basic feature of all life, something immutable, universal, natural.
Yet new research from across various fields of study is throwing the putative scientific basis of this consensus into doubt. Mind you, there have always been people, scientists and otherwise, who conceived of life outside a Darwinian paradigm—the discipline of evolutionary biology is and has been a conversation among a mostly white and male global elite. Yet, even within centers of institutional power, like universities in North America, competition’s position as the central force driving evolution has been seriously challenged recently. In fact, criticisms have been mounting at least since biologist Lynn Margulis began publishing in the late ’60s.
In a deeply heretical 1967 paper, Margulis argued that mitochondria and chloroplasts—two organelles within eukaryotic cells—were once independent organisms that, at some point in the very distant past, merged with ancestral prokaryotic cells in a mutually enriching, symbiotic relationship. Rather than competition, it was collaboration, she argued, that constituted the origins of eukaryotic cells, which is to say, all complex life on planet Earth. Though her paper was rejected by as many as 10 journals before it was published in the Journal of Theoretical Biology, Margulis’ endosymbiont theory for the origin of eukaryotic cells is now the scientific consensus.
Since then, attention to microbial life has revealed a world of bewildering interdependence. You probably know that we (and most other animals) do not digest alone. Cows, as one example, do not have the genetic information required to encode proteins fit to digest grass. It’s the symbiotic community of bacteria in their guts that does that. If you’ve ever had a stomachache after a course of antibiotics, you know intimately that life is much less comfortable with a diminished community of bacterial collaborators. But bacteria’s role in the body far exceeds digestion. The National Institutes of Health recently found that over 10,000 microbial species occupy what they call “the human ecosystem,” outnumbering human cells 10 to 1 and doing diverse kinds of work at almost every level of the body’s processes. Bacteria, for instance, may make as much as 95 percent of the serotonin in our bloodstreams, meaning you have a diverse symbiont community to thank for your pleasant mood.
It gets a great deal stranger. The bobtail squid, for instance, is famous for its shimmering bioluminescent bottom, a trait that it is not born with but only develops thanks to a glowing bacterium called Vibrio fischeri that it invites into a productive collaboration. This critical trait, in other words, emerges not through genetic mutation selected by competition but through skillful collaboration across difference. Symbiosis isn’t a mere matter of two species collaborating; recent studies on mealybugs (Planococcus) reveal nested layers of interdependence. Mealybugs can only synthesize certain amino acids because of a bacterial symbiont (Tremblaya) that contains its own bacterial symbiont (Moranella). Given all this, biologists like Scott Gilbert argue that animals, humans included, are really multispecies events, composite byproducts of collaboration.
Scientists are also unearthing a densely collaborative world beneath our feet, radically shifting Western scientific thinking on plant life. Ecologist Suzanne Simard, as one example, has spent the past 2½ decades studying the symbiotic fungal networks that nurture and connect trees. Thin tendrils that tangle around plants’ roots, called mycorrhizal fungi, provide increased water and nutrient absorption capabilities to plants and receive carbohydrates from photosynthesis in return. Almost all vascular plants (around 90 to 95 percent) are in such a mutually beneficial relation with fungi. Simard’s work has revealed that these fungal collaborators actually connect their plant symbionts together in networks of reciprocal care, and that trees share nutrients with younger or weaker trees through their fungal symbionts, even across species. A healthy forest requires a dense patchwork of reciprocity, an insight that, Simard notes, the people of various First Nations communities in her study area of British Columbia have known for generations.
There is a great deal more, too. Scientists are increasingly telling stories of the partnership of coral and algae, forged some 200 million years ago, that created colorful coral reefs; of termites that lack the genes for digesting wood but do it anyway with the help of Mixotricha paradoxa, a composite organism that contains a protist and at least four different kinds of bacteria; of mutualist relations between ants and acacia trees, where the former remove leaf-eating pests and inhibit pathogens with the help of their own bacterial symbionts, and the latter provide shelter in their hollow thorns and food in the form of sweet nectar.
Put simply, life is beginning to look ever more complex and ever more collaborative. All this has fractured Western biology’s consensus on Darwin. In response to all these new insights, some biologists instinctively defend Darwin, an ingrained impulse from years of championing his work against creationists. Others, like Margulis herself, feel Darwin had something to offer, at least in understanding the animal world, but argue his theories were simplified and elevated to a doctrine in the generations after his passing. Others are chartering research projects that depart from established Darwinian thinking in fundamental ways—like ornithologist Richard Prum, who recently authored a book on the ways beauty, rather than any utilitarian measure of fitness, shapes evolution. Indeed, alongside the research I have explored here, works by scientists like Carl Woese on horizontal gene transfer and new insights from epigenetics have pushed some to advocate for an as-yet-unseen “Third Way,” a theory for life that is neither creationism nor Neo-Darwinian evolution.
This lack of agreement isn’t such a bad thing. Leaving the Darwinian consensus behind means a more capacious, diverse, and ultimately more rigorous science. The recent dissensus has opened up more room for important, heterodox voices like Robin Wall Kimmerer, a botanist and member of the Citizen Potawatomi Nation. Kimmerer speaks of plants as highly intelligent beings and teachers, a sharp departure from the reductionist, utilitarian approach to plant and animal life that passed as scientific rigor within the Darwinian framework. Much of the recent research I have highlighted might count as what Kim TallBear, a scholar and enrolled member of the Sisseton-Wahpeton Oyate, calls “settler epiphanies”—belated “discoveries” by settlers of Indigenous knowledge that was either ignored or outright suppressed by colonial land appropriation and attempted genocide.
Darwin’s legacy aside, though, one critical takeaway from all this is that we must learn to recognize the impulse to naturalize a given human behavior as a political maneuver. Competition is not natural, or at least not more so than collaboration.
This insight could hardly come at a more opportune time. With our climate crisis mounting, we dearly need new ways to think about our relationships to the diverse entities that share our planet. Far too many environmentalists assume that people, driven by innate self-interest, are bound to harm ecology, that we will inevitably clear-cut, extract, consume, so long as it gives us an advantage over the next guy. This leaves us deeply disempowered, with few solutions to climate change outside limiting humanity’s impact through some kind of population control. When competitive self-interest is revealed to be a mutable behavior, the causes of climate change come into greater clarity: not human nature, but an economic system that demands competition, that distributes resources such that a tiny elite can live tremendously carbon-intensive lifestyles while the rest of us struggle for a pittance. Leaving competition behind, we can also imagine richer solutions: climate policies that problematize the tremendous wealth of the few, that build economies concerned with collective well-being and sustainability.
Science can play a critical role in liberating our imagination from competition’s grip. It can show us all the symbioses that make life possible. Such a science can remind us that we can act and be otherwise—that the shortsighted self-interest that motivates, for instance, continued fossil fuel extraction is endemic to capitalism, not to our species, much less to life itself. We can find ways to live collaboratively with the bewildering array of life that roots and scurries across our planet, but only if we reckon with competition’s hold on our thinking—for if we see life as merely a competitive struggle to survive, we will make it one.