In suburban New England, gobbling gangs roam the streets. Wild Turkeys, each weighing in at 10 or 20 pounds, loiter in driveways, trapping residents inside their homes. They lounge on decks, damage gardens, and jump on car hoods. Flocks of 20 or 30 birds roost in backyards, while particularly plucky turkeys chase down mailmen and the occasional police cruiser. They even fly (granted, not very well) across highways; one left a turkey-size dent in an ornithologist’s windshield. So far in 2018, the Massachusetts Division of Fisheries & Wildlife, or MassWildlife, has received 150 turkey-related calls and complaints, primarily from residents of densely populated counties in the southeast and Cape Cod. These are the Wild Turkeys of New England, and they’ve taken over.
The turkeys’ subjugation of New England residents is a relatively recent phenomenon. Just 50 years ago, the Wild Turkey population in New England was essentially non-existent, and had been for over a century. Then, an extensive, coordinated effort to trap and transfer turkeys across state lines rejuvenated the population—a comeback lauded by wildlife biologists and agencies as a conservation triumph. “It was an all-hands-on-deck restoration effort,” says Chris Bernier, a wildlife biologist at the Vermont Fish & Wildlife Department. “It’s a fabulous success story.” But now, with turkeys practically running the show, agencies must find a balance between celebrating the Wild Turkey revival and ensuring that human and bird get along. “We’re at opposite ends of the spectrum from where we were 50 years ago,” says wildlife biologist David Scarpitti, who leads the Turkey & Upland Game Project at MassWildlife. “It’s gone from a conservation success story to a wildlife-management situation.”
Before Europeans first colonized New England in the 17th century, an estimated 10 million Wild Turkeys stretched from southern Maine to Florida to the Rocky Mountains. As settlers spread out across the continent, they cut down forests as they went—and New England took the biggest hit. Forest area decreased 70 to 80 percent in Massachusetts alone in the first half of the 19th century, says Jim Cardoza, a retired wildlife biologist who led the Turkey & Upland Game Project at MassWildlife during the 1970s conservation effort.
As a result, the birds lost not only the cover of their habitat but also their food supply of acorns and chestnuts. Without hunting restrictions, hunters picked off any Wild Turkeys that survived the deforestation. By the mid-1850s, New England’s turkeys had all but disappeared. In the 1930s, biologists released hundreds of captive-bred turkeys into the region to try and resuscitate the species, but these domesticated birds couldn’t survive in the wild.
In the 1960s, biologists began to explore the idea of trapping Wild Turkeys, primarily from New York, and transporting them for release in New England. Biologists like Cardoza and his team sat in their trucks on cold winter mornings, sometimes for eight hours, waiting for Wild Turkeys to follow the trail of cracked corn, wheat, and oats to an open farmyard or pasture. Once 20 or so birds had gathered, Cardoza fired a 2,625-square-foot cannon-net towards the gaggle to capture them before tagging the birds for relocation.
Massachusetts captured 37 Wild Turkeys from New York’s Adirondacks in the 1970s and released them in the Berkshires. Vermont relocated 31 New York turkeys in the mid-1960s, and Connecticut, Maine, and New Hampshire participated in similar programs. By that time, the New England human population had migrated and condensed into cities, and forests and food had returned to much of the abandoned farmlands. Turkey predators like cougars and wolves had been extirpated, and the entire region created hunting restrictions to protect the birds. All the while, trapping and relocation continued between and within states—and soon New England’s Wild Turkeys, once considered extinct, were resurgent.
“Their population just exploded, quite literally,” Bernier says. Today, the Wild Turkey population in Massachusetts exceeds 25,000 birds. There are 45,000 Wild Turkeys in Vermont, 40,000 in New Hampshire, and almost 60,000 in Maine—almost all of which descended from those few dozen relocated birds, Bernier says. They now cover more terrain than they did before they disappeared; some Wild Turkeys even filled in pockets of previously uninhabited land on their own, something that researchers didn’t expect. “They did better than anybody thought that they would,” says Matthew DiBona, wildlife biologist with the National Wild Turkey Federation. The U.S. population is back up to roughly 6.2 million birds, he says.
In the mid-2000s, however, the turkeys started colliding with humans. New England is one of the most densely populated regions in the United States, and as people began putting out birdfeeders and growing gardens, turkeys found ample food. And the Wild Turkeys in suburbia, unlike skittish rural-roaming turkeys, quickly grew accustomed to humans. “No one had any idea that these birds would be showing up in suburbs,” says Marion Larson, the chief of information and education at MassWildlife. “There was no precedent for it.”
Overall, locals don’t mind the company. Some eager residents even go out of their way to attract the birds by scattering nuts, seeds, and berries on backyard platforms or intentionally growing nut-producing trees. But that warm welcome sometimes fades as the turkey-human scuffles continue to mount, and residents claim that the birds are a nuisance.
Encounters with the four-foot-tall turkeys can be dangerous, especially to a household pet or a small child. The birds can act aggressively towards humans by charging at them, pecking at them, or otherwise intimidating them. They also attack reflective surfaces that they mistake for other turkeys. That’s because the birds, usually male, are trying—and succeeding—to establish themselves at the top of the town’s pecking order. “This is the way they deal with socialization,” Larson says. “They’re treating people as if they’re turkeys.”
Outside of cities, Wild Turkey populations, such as in some southeastern and midwestern states, are on the decline as other forests are converted to farmland. But the urban birds continue to flourish in New England. Now wildlife agencies across the region are tasked with managing both the Wild Turkeys and their human neighbors to make sure encounters don’t go awry. The answer, biologists say, is simple: “We just need to stop feeding them,” Scarpitti says.
That’s what he tells local residents when he’s called to mediate neighborly disputes: Don’t feed the birds, and don’t show fear. But people hardly ever listen, and so for the foreseeable future, Wild Turkeys will continue to rule the neighborhoods of New England.
I have seen many things (corpses, the Northern Lights, a beached whale), but a few sights have left a particularly vivid impression. One is of a boy I spotted in Istanbul eighteen years ago. He was fifteen or so, with a pathetic wispy moustache, wearing a suit for what appeared to be the first time. We were in the textile district of Zeytinburnu, and it seemed to me he was likely beginning a new life in his father’s small business, though I could be wrong. Whatever the occasion, the boy had deemed it fitting to commission the labor of an even smaller boy, nine years old or so, to shine his shoes. The shoeshine kid was kneeling on the ground, scrubbing away with rags and polish from his portable kit, a borderline-homeless street gamin for whom all of our rhetoric about the sacred innocence of childhood means nothing at all. The fifteen-year-old stared down haughtily, like a small sovereign, and the nine-year-old, knowing his place, did not dare even to look up.
Such is the way of the world, our collective, complacency-inducing clichés invite us to think on such occasions. Curiously, such a thought comes to us most naturally when we are observing an instance of domination as it were from above. The haughty kid dared to look down on the lowly kid, and yet if he had noticed he was being observed his haughtiness could quickly have curdled into shame. The further haughtiness of the ultimate observer, in turn —in the event, me (as far as I know I was not being observed myself)— seems to arise from the passive and prejudicial presumption that the world of Turkish textile merchants and their sons is somehow a more accurate approximation of the mythical state of nature than what we are used to seeing in, say, a fast-food drive-through or a Costco self-checkout.
But this is of course an illusion. The fifteen-year-old was channeling particular historical forces that pressed down upon him unawares, and that are far less distinct from anything I do at a Starbucks, with all the historical forces that press down upon me, than they are from the actions of a Paleolithic mammoth hunter or a New Guinean hunter of heads. The boy had perhaps seen his father humiliated by creditors; his father had perhaps humiliated him in a similar way; and now the boy was just passing the humiliation downward along the great chain of social being. This chain however is one that is ultimately formed by capital, by debt, by sedentism, by the state tyranny of which domestic tyranny is to some extent only a microcosm, and by a number of other factors that place even the most brazenly lupine behavior of man towards man in quite a different context than the state of nature. No, that’s entirely the wrong frame of reference.
2.
What people then may we more veridically hold up as living in such a state? The Sakha people of the Lena river basin in northeastern Siberia, whom I’ve come to know rather well, both through books and through people, over the past years, appear to be a splinter group that settled in this extreme climate region in the middle ages in order to evade the tyranny of the rising Mongol Empire. Mongol tax collectors couldn’t be bothered to enter the coldest inhabited region on earth, and so the newly formed Sakha ethnie in turn adapted to the new exigencies of life and lived in relative freedom from outside domination, though with a complex hierarchy from within. They had advanced metallurgy, a revered warrior class with armor and swords, and four to five months of seasonable temperatures each year that enabled them to build their economy around livestock. Those who went even further north however, perhaps to escape not just Mongol domination but domination from within by their fellow Sakha, largely adopted the geographically determined lifeways of the Indigenous Tungusic (e.g., Evens and Evenks) and Paleo-Siberian (e.g., Yukaghir) peoples: reindeer husbandry, in particular, but also seal-hunting and other circumpolar forms of subsistence common also to Greenland and Canada. At the same time as some Sakha were arriving from the south and taking on Arctic habits, socially marginal ethnic Russians were arriving from the west and doing the same, leading to a convergence in forms of life across people with different phenotypes and different historical trajectories that brought them into the same region and into the same destiny.
Who among these groups is “Indigenous”? We might in this case feel this is the wrong question to ask, but this feeling may in turn help to prime us for the further realization that the encounter zone of the Slavic, Turkic, Tungusic, and Paleo-Siberian peoples is in fact fairly representative of every corner of the inhabited globe, even those we take to be the most hermetic and (therefore?) the most pristinely representative of humanity in its original state. In their half-posthumous new book, the anthropologist David Graeber (1961-2020) and the archeologist David Wengrow (1972-) suggest that “even” the pre-contact Amazonian groups we generally take to conform most closely to the definition of “tribe” or “band” were likely aware of the Andean empires to their west, and may also have had, at an earlier time, relatively complex state structures that they consciously abandoned because they were lucid enough to come to see these as inimical to human thriving. The groups Europeans first encountered in the rainforest, in other words, may also have been splinters that broke away from tyrannies, just like the Sakha fleeing the Mongols, and to some extent also like the Mountain Time Zone libertarians grumbling about the tax agents from the mythical city of Washington.
It may be that more or less all societies that appear to us as “pre-state” would be more accurately described as “post-state” — even if the people who constitute them are not in fact fleeing from the center to the margins of a real tyranny, they are nonetheless living out their statelessness as a conscious implementation of an ideal of the human good. Even if they have not observed Inca ceremonies through the forest thicket from across a mountain ravine, they already know enough about tyranny simply from the expression of innate personality tendencies of individual members of their group —boastfulness, bullying, pride—, and have developed rational mechanisms to ensure that these traits are countered by ridicule, dismissiveness, and other mechanisms that keep any would-be tyrant in his place.
This is the sense of Pierre Clastres’s “society against the state”: societies that lack state structures are not in the “pre-” stage of anything, but are in fact actively working to keep such structures from rising up and taking permanent hold. They do this to differing degrees, with many societies around the world exhibiting a sort of seasonal duality in which they are subject to tyranny during the months of the buffalo hunt or the rainy season or the period of potlatch or inter-clan commerce, and then the hierarchy dismantles itself again and they all become as it were “anarchists in the off season”. As Graeber and Wengrow write of the Kwakiutl of the Pacific Northwest:
[I]t was winter —not summer— that was the time when society crystallized into its most hierarchical forms, and spectacularly so. Plank-built palaces sprang to life along the coastline of British Columbia, with hereditary nobles holding court over compatriots classified as commoners and slaves, and hosting the great banquets known as potlatch. Yet these aristocratic courts broke apart for the summer work of the fishing season, resorting to smaller clan formations — still ranked, but with entirely different and much less formal structures. In this case, people actually adopted different names in summer and winter — literally becoming someone else, depending on the time of year.
3.
On the authors’ telling, it is really only in the 1950s and ‘60s, with the quantitatively precise work on daily calorie intake and other such measurables spearheaded by such anthropologists as V. Gordon Childe (1892-1957), that the idea of “man the hunter” took hold, and the default setting of the species was taken to be a seasonally invariant, efficiency-maximizing, and culturally lifeless prehistory. When “man” in “his” “natural” condition is determined to be doing but one thing, a basic flexibility between forms of life, adaptability to both expected seasonal variation and to longer-term unforeseen changes, become correspondingly less salient for research. And when these are screened out, the narrative of monolithic unidirectional progress from bands to states becomes vastly easier to maintain.
In their opposition to this narrative, Graeber and Wengrow are building most immediately on the crucial work of James C. Scott, who has shown that repeatedly and in several different places in human “pre-history”, societies reverted from agriculture back to hunting and foraging, that they did so by choice, and that for several millennia farming existed alongside other viable forms of subsistence in the absence of any well-defined state structure with all its usual indices of inequality. This adaptability should not be at all surprising, given that there are many societies still in existence that alternate seasonally between sedentism and nomadism, as they move their grazing animals up and down in elevation in harmony with changing patterns of vegetation. But the prevailing view is that there can be no states without sedentism, and that states succeed bands as a “higher” stage of development, and that therefore transhumance must be some sort of “transitional” stage on the way to finally “settling down”. The part of the year that is spent wandering complicates the narrative of sedentization that is presupposed by the narrative of progress from bands to states.
And in fact we don’t even need to look as far as semi-nomadic pastoralists; until very recently it was common in Western Europe to “flip” the social order every now and then, in a way that was also determined ultimately by the cycles of the agrarian calendar. When things went à rebours for a limited time, nobodies got to act like kings, and sometimes kings had to submit to humiliation by nobodies, even allowing psychopaths and criminals to sit on the throne and to act the part for a time (or more specifically, people socially recognized as psychopaths and criminals, as the real king may in fact be both of these things himself, even if this fact is ordinarily only acknowledged sotto voce). We know the last dregs of such reversals from festivals such as Carnival or Halloween, which have something to do with popular religion, but are also periodically suppressed in the name of that same religion when the lines of its authority have become blurred with those of the state.
In other words, even absolute monarchies with fully sedentary subjects have been known to practice anarchy “for a limited time only”: controlled anarchy that is both in the service of the state but also a full-fledged parallel reality, like dreams or story-telling, anarchy that continues to exist alongside or in alternation with the state. It is only in the most recent era of totalization of civic life —where some of us now have eyeball-tracking software that follows our faces eight-to-ten hours a day as we work from “home” in order to ensure that we aren’t doing anything anarchic on company time— that this parallel reality has been monitored and administered out of existence, or at least reduced to the hours of sleep, in which we just can’t do otherwise than hallucinate a topsy-turvy world, and the state so far has not been able to come up with a way to stop us.
4.
When the “quants” such as Childe took over a certain portion of the discipline, they left the interpretation of culture to those anthropologists who welcomed a corresponding retreat from any claim to scientificity. An earlier generation of work in cultural anthropology, notably the rich legacies of Marcel Mauss (1872-1950) and Franz Boas (1858-1942), helped to solidify the long-dominant narrative according to which the history of humanity is a progression from pre-state to state-based societies, and that where there are states there is also inequality, but this inequality is compensated by a leisure of the mind that is more conducive to creativity and to the efflorescence of material and symbolic culture. Meanwhile hermeneuticists like Clifford Geertz (1926-2006), having retreated from any claim to the sort of authority that can tell you “how things really were”, to invoke the positivist definition of historical research offered by Leopold von Ranke, were in no position to dispute the claims of the quants.
While Mauss and Boas had been sensitive to the ultimate arbitrariness of cultural expressions —in the end, people do stuff because they want to, not because it maximizes calorie efficiency or body-surface heat-dispersal or some such thing—, mid-twentieth-century scholars such as Childe came to treat human “prehistory” as if it were a branch of engineering. This is very much in line with the reigning behaviorism of the era (to which Chomsky dealt a fatal blow as early as 1959, though the methods and biases remained broadly entrenched across all the disciplines with an interest in what it is to be a human), and ironically it ended up eliminating from view among human societies the very sort of diversity and creativity (add scare-quotes according to taste) that evolutionary theorists continued to recognize among biological species: we kept right on observing island dwarfism right alongside island gigantism, for example, or bright patterns on one species’ skin offering a “dishonest signal” of toxicity, inhabiting the same ecosystem as another related species that instead uses camouflage to avoid getting eaten.
There is no single efficiency-maximizing formula at work in such cases. Sometimes, when isolated on an island, a population of animals will get really big; sometimes it will get really small. Both directions can do a fairly good job of helping it survive. Similarly, one group of Inuit may exhibit “reverse seasonality” in relation to its neighbors, crystallizing into an elaborate hierarchy in winter, and dispersing into “anarchism” in the summer, while the others do the opposite. Some hierarchical societies measure a person’s place in the chain of social being by how much wealth that person has permanently hoarded; others require anyone who seeks a high rank to hoard only temporarily, and soon to give it all away, potlatch-style, in a ritual of ceremonial magnanimity. Such a person is not “generous” — he’s just doing what he has to do to stay on top, according to the specific rules of the game that have taken hold in his cultural context and that cannot possibly be rationalized from the outside in terms of energy efficiency or any model of rational agency familiar to economists.
5.
Our inability to conceive of pre-modern peoples as existing all along in complex networks of long-distance exchange, as defining themselves against one another through what Gregory Bateson (1904-1980) called “schismogenesis”, as knowing what states are even when they construct and maintain their societies “against the state”, has as its corollary an equally handicapping inability to exercise the historical imagination in a way that fully appreciates the individual humanity of those who inhabit the deep past.
We know that anatomically modern humans have been around for 100,000 to 200,000 years, while we find little evidence of symbolic thought until roughly 40,000 to 50,000 years ago. This could be a mere result of not looking in the right places, or of the eventual return to nature of all artificial constructions. But most still agree that something changed in the Paleolithic and that human beings began to externalize their inner lives in new ways. Yet we also know that any anatomically modern person has by definition the same brain we do, and while some theorists have speculated on late-stage “mutations”, some as recently as 3,000 years ago, that brought about a sudden propulsion forward in our capacity for abstract thought and our “transcendental” apprehension of our own selfhood, a generous interpretation has to suppose that any AMH’s life mattered as much to that AMH as your own life matters to you, even if she was not manifesting this mattering through woven fabrics, a sharp sense of fashion, or a proliferation of selfies. Same brains, same subjectivities — even through wildly different cultural expressions.
Perhaps Graeber and Wengrow’s most affecting accomplishment lies not so much in their new “theory” of the human past, which in any case is only a synthesis of already existing research, as rather in their sympathetic plaidoyer for the singular reality of lives lived in the past, their commitment to the idea that these were real people, as weird and idiosyncratic and unfathomable by quantitative methods as you and I. The alternative view is Hobbesian by default, and it is overwhelmingly more popular in our culture. I go to the Whole Foods with my elderly mother, and she looks at the produce section: “Aren’t we lucky to live in this era,” she says, “when there is such a variety and abundance of foods. It really enables us to enjoy it all, rather than just to survive.” This is a sort of entailment from the hypothesis that life before the state was “nasty, poor, brutish, and short”, an existential condition in which, we ordinarily presume, no one had the “luxury” of preferring one food over another, of ever getting their “favorite” for supper.
Yet we know of no human culture that does not have strict rules about what may be eaten, no culture that outside of periods of famine does not have a long list of perfectly edible species of animal that they nonetheless categorically refuse to eat, no culture that fails to organize itself around preferences enshrined into a scheme of values. “Aren’t we lucky to be here at this festin à tout manger,” every Cree who ever got to participate in an “eat-all feast” is likely to have thought or muttered aloud, each perhaps content with the piece of beaver meat doled out to him in accordance with his social rank, or perhaps aspiring to eat the brains out of the skull some day like the chiefs do, but either way experiencing a condition of abundance and leisure at least as intense as that known by any Whole Foods shopper.
Leisure, like calorie consumption, is something that can be measured from the outside, and the anti-Hobbesian descendants of Jean-Jacques Rousseau, for whom man was born free but is everywhere in chains, have long attempted, as in Marshall Sahlins’s magnificent Stone Age Economics of 1972, to show that through our successive revolutions in agriculture and industry leisure has progressively given way to labor as constituting the principal part of human life. It may be true that hunter-gatherers (as they used to be called) engaged in no more than four hours of “work” per day and dedicated the rest of their time to leisure activities, as Sahlins contends. But this neo-Rousseauian estimation is generally proffered in the same spirit in which we talk about the daily cycles of the lives of lions or koalas — to mention just two other species that spend most of their time sitting around. What is lost is the “why” of the leisure, the fact that all those people were sitting around not just because it is their species-specific condition to do so (though perhaps that too), but because they like to do so, because it is “fun”.
6.
It’s a weird thing to have to insist on: that there is something that it was like to be a member of the prehistoric leisure class, which is to say to have been a prehistoric human being. Graeber and Wengrow’s reanimation effort for past humans echoes the former author’s earlier plaidoyer for currently living poor humans, notably in his magnum opus Debt: The First 5,000 Years (2011). In order to have a big wedding blowout, poor people might have to take out loans against which any rational financial advisor would sternly counsel them. Yet they just keep doing it, going into debt, wearing ruffled blue tuxedoes, and loving one another as much as any human being has ever loved another. That’s culture against credit, so to speak. In the course of a mortal life, a good wedding matters more than good credit; poor people have generally been able to keep this in mind whereas upstanding accountants have forgotten it.
A wedding is a ritual enactment of mythical, world-structuring motifs, and to this extent it is a form of heaven on earth, along with all the other high-ceremonial occasions for music, dance, and heightened speech. This is the stuff people live for. We know from the discovery of windholes drilled into an avian femur that even before the arrival of AMH’s in Europe, Neanderthals were performing music, and thus also, presumably, engaging in forms of ceremonial ecstasy during which the imaginations of all participants must have been fully activated and alive. Such ecstatic joy as a basic mode of existence is of course hardly compatible with the “nasty, poor, brutish, and short” scenario. It is also, curiously, an experience that typically is not acknowledged when affluent “thought leaders” and policy makers turn to consider the lives of the living poor.
In this case as in the case of our prehistoric conspecifics, what we are witnessing is dehumanization. The two make a natural pair, as both are symptoms of the general ideological delusion that bourgeois modernity is the only way to go, and anyone who fails to do bourgeois modernity right must be to some degree “poor-in-world”, as Heidegger said of animals, must really not have that much “going on in there”. Graeber spent his life combatting this conceit on all fronts, and was lucid enough to understand that it is a unified project. Many who defend the poor against the predations of the rich might imagine it’s not exactly a pressing matter to reconstruct what the lives of Paleolithic peoples were really like. But Graeber and Wengrow’s accomplishment in this book is to show how, in prejudice too, ontogeny recapitulates phylogeny, or, rather, the way in which the social reproduction of inequality with each new generation has something to do with our presumption of progress, which is to say our presumption of the inequality, the not-quite-humanness, of our ancestors in relation to us. The past is not so much a foreign country, as it is an Indian reservation, where, if it looks from the outside like the inhabitants are not thriving, one has the convenience of imagining that this is because they don’t know what thriving is.
7.
But if prehistoric people were like us, as Graeber and Wengrow insist they were, it also follows that they were not like each other, since we are ourselves, among the living, not like one another. The authors are particularly sensitive to the past existence of “anomalous” individuals, both those who have some special social distinction in view of physical abnormalities such as albinism or blindness, and those who are simply characterologically quirky in a way that marks them out for a special social role as a shaman, a prophet, a seer — as someone who is permanently in touch with a parallel reality that the average run of people is able to access only through ritual, if at all.
Contrary to the common idea that such people were typically eliminated through euthanasia or otherwise neglected until they perished, the archeological record clearly shows that they were often accorded special treatment. Much of the earliest evidence we have of ritual burial yields up skeletons of people who had evident deformities. This is likely not because skeletal deformities were common, and postmortem taphonomic deformation can also be ruled out. The simplest explanation is that these people were buried because they were revered, an explanation that at the same time does away with the idea that the earliest burials are at once evidence of the earliest emergence of social inequality. While the skeletons are often found adorned with riches and what might be interpreted as “royal” accoutrements such as antler crowns or sceptres carved from mammoth ivory, we know plainly that the deformities of those who have been buried are not hereditary, and thus that their special status in society could not have been traduced down to them across the generations through noble lineage.
What would it have been like to have been anomalous in prehistory? You would probably still enjoy stories and music (though there were also no doubt some reserved and awkward people who shied away from communal activities), but you would be exempted from typical adult responsibilities, and expected mostly just to “do your thing”. Graeber and Wengrow vividly imagine a prehistoric epileptic who passes his days “hanging upside down while arranging and rearranging snail shells”, the patterns of whose arrangements are attended to by his loved ones and neighbors, who keep him well fed and shower him with affection. They might have thought the snail-shell patterns literally held cryptic messages passed down through the epileptic man from another plane of reality; they might just have thought that the man is better off when he’s left to do what makes him comfortable, and if we are sensitive to his comfort, to what he’s muttering, under what circumstances, we might be better able to take the measure of our own well-being. The truth is probably somewhere in between, just as it always is when we are trying to determine whether some unfamiliar conduct is an instance of practical rationality or rather of natural magic.
8.
Graeber and Wengrow’s return to grand anthropological theory in the vein of Boas and Mauss parallels a similar return in the work of Philippe Descola. Like the French proponent of the “ontological turn”, they are notably at ease with the methodology of recovering Indigenous voices from European sources. One must of course read historical sources, written by Europeans implicated in their individual ways in the centuries-long process of conquest and domination, with considerable caution. But to suppose that these sources trap the reader as well in the same colonial “gaze” as the author, is to give up too easily and retreat into skepticism. Descola is prepared to say that travel reports from early modern Brazil, even the ones that portray sheer cannibalistic brutality as in the famous engravings of Hans Staden, tell us at least something about the form of life prevailing in that place in the early contact period. When reports consistently echo similar themes across several different European languages and multiple generations of trans-Atlantic encounter, it is reasonable to presume the Europeans were identifying something real, even where that real thing is filtered through ungrounded contempt.
Graeber and Wengrow are prepared to go much further than Descola, and to see real elements of American social reality filtering into European texts from the sixteenth to the eighteenth centuries, even where no direct encounter is being related, and even sometimes where the author himself recognizes the work as a pure product of his own imagination. This is most of all the case in the European “discovery of freedom” via the Americans. According to Graeber and Wengrow, while by the time of Rousseau the people living in a purported “state of nature” are mostly of interest in view of what they can tell us about the origins of inequality, for roughly the first two hundred years of encounter what was of primary interest was not the apparent equality of les naturels, as Americans were often called by the French, but rather their freedom, their ability to live according to their individual desire without fear of repression.
In his remarkable book Native Pragmatism: Rethinking the Roots of American Philosophy (2002), Scott L. Pratt makes a convincing case that early American philosophy was forged through significant cultural exchange between colonists, missionaries, and Native Americans, with those on the European side often compelled to learn Native languages and to restructure their conceptual schemes in accordance with what is sayable in, say, Iroquois or Narragansett. Graeber and Wengrow revisit this argument, and add that the exchange lay not just in the acquisition of languages, but most importantly in the adoption of the free deliberative and discursive practices that constituted the social “glue” and obviated any need for explicit systems of punitive justice in many Native American societies. This sort of deliberation would filter into such familiar colonial American settings as the Quaker meeting hall, and more generally would come to force freedom into European consciousness as an ideal.
The authors draw our attention in particular to a certain Kondarionk (c. 1649-1701), a Huron elder who may have engaged in lengthy dialogical exchange with the French traveler Louis-Armand de Lom d’Arce de Lahontan (1666-1716), who wrote in his later life in Amsterdam the Dialogues avec le sauvage Adario (aka Kondarionk) describing the views of the Native chief he had known at Michilimackinac, a region of what is today called Michigan. It is certain that there was a real Kondarionk, but how faithful Lahontan’s representation of his views is remains a matter of pure speculation.
One common view, which we see in the work of Jean-Pierre Vernant, is that in the early modern period Europeans, and particularly French authors such as Pierre-François-Xavier de Charlevoix (1682-1761), began to cast the Indigenous people of eastern North America in the role of political philosophers, in view of a perceived analogy between their “primitive” condition and the “pristine” or elemental structures of Old World society that had arisen in antiquity but had long since slipped away: thus, America as a window onto Europe’s past. Graeber and Wengrow however, like Descola, reasonably suppose that there was not only this sort of projection, but also a considerable amount of bidirectional flow of cultural practices, norms, and ideals. This “flow” may be more evident in French colonial history than in the English case. It sounds untimely to put it this way, but it is clear to anyone who reads the primary sources that the French sank more deeply into the American continent, in a transformation comparable in its thoroughness to that of the Yakutized Russians of the Taimyr Peninsula, than those who would eventually usurp them in all but a single Canadian province. Until recently there was a lingering French dialect community in Michigan that spoke what was known as “Muskrat French”; “Kondarionk” in turn is a Huron word for “muskrat”, and the Huron chief was sometimes known in French simply as Le Rat. The French language would “rebrand” as one of supremely urban-centered universalism by the late eighteenth century, but when Lahontan and Kondarionk discoursed, it seemed natural to do so in French, in the wetlands, in the company, and perhaps in the imagined likeness, of the rat musqué.
The verisimilitude of Lahontan’s depiction may seem threatened by the fact that his Kondarionk speaks mostly in clichés, familiar already in Michel de Montaigne’s essay “On Cannibals”, and still a common rhetorical trope on the social-media-based lazy wing of the left, which loves nothing more than to play on the topsy-turvy trope of calling the United States a “failed state” that urgently needs election monitors sent from South Sudan, etc. Throughout these disparate expressions, the rhetorical aim is to show that “the real savages are us”, that, seen from an external point of view, European ideas about “justice”, and European tolerance of gross inequality, are more barbarian than anything encountered in the parts of the world to which the Europeans pretend to bring civilization. But this trope only makes sense if we are already familiar with, and inhabiting, the presumed default view according to which Europe represents civilization, and Americans (for example) represent savagery. If you aren’t already presuming the accepted view, suggesting that South Sudan send election monitors to the US doesn’t sound like a powerful rhetorical inversion; it just sounds like a bad idea.
The tropes that Kondarionk channels are in turn put in the mouths of countless other species of foreigner over the course of the eighteenth century, notably in Montesquieu’s Persian Letters, where the outsiders, far from being American savages, are simply the proximate neighbors of Europe from the Muslim world, which is fully acknowledged to be a center of great civilizations and to share the same points de repère of Europe and of Christendom in, for example, Greek philosophy and in the revealed scriptures of the Abrahamic faiths. In light of this fashion for inversion in European Enlightenment texts, conjuring up exotic franc-tireurs not only from the pre-state Americas but also from the “Oriental despotisms”, it seems even more of a stretch to suppose that Lahontan’s effort to “look in from without” may be traced back to the real views of a particular Native American political philosopher named Kondarionk. The half-Inca half-Spanish author Garcilaso de la Vega (1539-1616), writing in sixteenth-century Spain, whom Graeber and Wengrow do not mention, likely gives us something closer to a “political philosophy from the Americas”. But significantly the political system he happens to know best is not at all one of the “societies against the state” among which we might perhaps, with qualifications, include the Hurons; it is rather the Inca Empire, and the world that de la Vega depicts is sooner ripe for comparison with Plato’s vision of an ideal totalitarian regime than with any idea of a return to Edenic freedom.
Still, we are surely better off looking to Lahontan’s fictional Kondarionk, along with any other sources we can get our hands on, in order to come to as full a picture as possible of the “Columbian exchange of light”, of the full cultural impact in Europe of the encounter with Americans over the first 250 years or so, than we are simply dismissing a work such as this out of hand on the grounds that it is nothing more than an ideological construct and a sheer fantasy. We still need to know why and by what influences authors such as Lahontan came to have the fantasies they did, and it seems certain that Kondarionk, or someone like him, played a role in this history. Graeber and Wengrow are to be credited for helping to relegitimize this necessary component of historical anthropology, which for better or worse is born out of the history of the missions and of early modern global commerce.
9.
Curiously enough in our aggressively presentist era, “big histories” of the world seem to be reliably popular. Yet until Graeber and Wengrow’s intervention, most of the people who tried their hands at this genre —which has its “scientific” origins in the cosmographical works of early modern authors such as Sebastian Münster (1488-1552), and which has always been, as Graeber and Wengrow acknowledge, its own sort of mythmaking about origins— have a scholarly formation that prepares them poorly for the undertaking. Steven Pinker is a psychologist, a field that hardly has any special grasp of how culture works, and that is no better equipped to understand methodological and epistemological challenges in reconstructing the distant past than is, say, structural linguistics. Jared Diamond is principally an ornithologist, and in his leap from the birds of New Guinea to the peoples of the earth, he makes considerable speculative errors. Yuval Noah Harari is some kind of “big historian” and a disciple of Diamond. What has been missing is the kind of expertise that comes from anthropology, which at its best straddles the boundary between hermeneutical art in its application to culture on the one hand, and on the other exact science, while sometimes slipping onto the one side of this divide and self-destructively disowning the other.
What anthropology is particularly well-suited to discern is the way in which culture works “against the state” (though of course not all anthropologists do so, and some even work directly for the state themselves, spinning out ethnographic narratives on Indigenous peoples that validate state encroachment on their lands and life-ways). This is because anthropology focuses on human beings, who, even when they live within state structures, continue to have carnivals, fail to pay their taxes, hide out in the borderlands, waste their money on weddings and go into debt. When the state attempts to incorporate new territories, sometimes it finds the inhabitants there totally indifferent to its efforts to get them to recognize it, as Graeber vividly showed in Debt with his example from French colonial Madagascar, where the French kept trying to give the Malgaches money, and the Malgaches kept burning it in lively ceremonies that only made sense from the inside: it’s not that they did not value money, it’s just that they valued burning it.
In this respect, anthropology is fundamentally an anarchist project, as it zeroes in on levels of social reality where the state, even when it exists, is not the most salient factor in accounting for why human beings do what they do. When this anarchist spirit is embraced, significant new conceptual insights may be had about the place of the state in human history. We have long attempted to bracket all “pre-state” societies into a chronological period known as “prehistory”, so that it comes out as trivially true that for as long as there has been history, there has been the state. But Graeber and Wengrow have made the most significant case yet that there is no good reason to do this. In fact the state is itself as adaptable as its supposed subjects; it can wax and wane with the seasons; it is not nearly as monolithic and necessary as we typically take it to be, and as all the other recent “big histories” of humanity have supposed.
The Dawn of Everything is clearly packaged and published as a conscious intervention in a discussion that has been dominated over recent years by Pinker, Diamond, and Harari. Sometimes it is annoying in the same way their works are, for reasons that, one suspects, were imposed in the editorial process and that have nothing to do with the authors’ natural styles. It is a welcome intervention, and a strong reason for hope that anarchist anthropology may have its place, alongside —what shall we call it?— plutocratic psychology and related endeavors, in helping us to understand what humanity is and how we got to be this way.
Despite definitive legal cases that have established the unconstitutionality of teaching intelligent design or creationist ideology in science class, the theory of evolution remains consistently under attack.
Creationist arguments are notoriously errant or based on a misunderstanding of evolutionary science and evidence.
Hundreds of studies verify the facts of evolution, at both the microevolutionary and macroevolutionary scale—from the origin of new traits and new species to the underpinnings of the complexity we see in life and the statistical probability of such complexity arising.
When Charles Darwin introduced the theory of evolution through natural selection 158 years ago, the scientists of the day argued over it fiercely, but the massing evidence from paleontology, genetics, zoology, molecular biology and other fields gradually established evolution's truth beyond reasonable doubt. Today that battle has been won everywhere—except in the public imagination. Embarrassingly, in the 21st century, in the most scientifically advanced nation the world has ever known, creationists can still persuade politicians, judges and ordinary citizens that evolution is a flawed, poorly supported fantasy. They lobby for creationist ideas such as “intelligent design” to be taught as alternatives to evolution in science classrooms. When this article first went to press in 2002, the Ohio Board of Education was debating whether to mandate such a change. Prominent antievolutionists of the day, such as Philip E. Johnson, a law professor at the University of California, Berkeley, and author of Darwin on Trial, admitted that they intended for intelligent-design theory to serve as a “wedge” for reopening science classrooms to discussions of God.
The good news is that in 2005 the landmark legal case Kitzmiller v. Dover in Harrisburg, Pa., set binding precedent that the teaching of intelligent design in U.S. public schools is unconstitutional because the idea is fundamentally religious, not scientific. The bad news is that in response, creationists have reinvented their movement and pressed on. When they lost the ability to claim that creationist ideas are valid science, they switched to arguing that they were only supporting “academic freedom.” Worse, to further obscure the religious roots of their resistance, they now push for “critical analysis” of climate change, cloning research and other scientific endeavors that they paint as culturally oppressive.
Consequently, besieged teachers and others are still likely to find themselves on the spot to defend evolution and refute creationism, by whatever name. Creationists' arguments are typically specious and based on misunderstandings of (or outright lies about) evolution. Nevertheless, even if their objections are flimsy, the number and diversity of the objections can put even well-informed people at a disadvantage. The following list recaps and rebuts some of the most common “scientific” arguments raised against evolution. It also directs readers to further sources for information and explains why creation science has no place in the classroom. These answers by themselves probably will not change the minds of those set against evolution. But they may help inform those who are genuinely open to argument, and they can aid anyone who wants to engage constructively in this important struggle for the scientific integrity of our civilization.
1. Evolution is only a theory. It is not a fact or a scientific law.
Many people learned in elementary school that a theory falls in the middle of a hierarchy of certainty—above a mere hypothesis but below a law. Scientists do not use the terms that way, however. According to the National Academy of Sciences (NAS), a scientific theory is “a well-substantiated explanation of some aspect of the natural world that can incorporate facts, laws, inferences, and tested hypotheses.” No amount of validation changes a theory into a law, which is a descriptive generalization about nature. So when scientists talk about the theory of evolution—or the atomic theory or the theory of relativity, for that matter—they are not expressing reservations about its truth.
In addition to the theory of evolution, meaning the idea of descent with modification, one may also speak of the fact of evolution. The NAS defines a fact as “an observation that has been repeatedly confirmed and for all practical purposes is accepted as ‘true.’” The fossil record and abundant other evidence testify that organisms have evolved through time. Although no one observed those transformations, the indirect evidence is clear, unambiguous and compelling.
All sciences frequently rely on indirect evidence. Physicists cannot see subatomic particles directly, for instance, so they verify their existence by watching for telltale tracks that the particles leave in cloud chambers. The absence of direct observation does not make physicists' conclusions less certain.
2. Natural selection is based on circular reasoning: the fittest are those who survive, and those who survive are deemed fittest.
“Survival of the fittest” is a conversational way to describe natural selection, but a more technical description speaks of differential rates of survival and reproduction. That is, rather than labeling species as more or less fit, one can describe how many offspring they are likely to leave under given circumstances. Drop a fast-breeding pair of small-beaked finches and a slower-breeding pair of large-beaked finches onto an island full of food seeds. Within a few generations the fast breeders may control more of the food resources. Yet if large beaks more easily crush seeds, the advantage may tip to the slow breeders. In pioneering studies of finches on the Galápagos Islands, Peter Grant and Rosemary Grant of Princeton University observed these kinds of population shifts in the wild.
The key is that adaptive fitness can be defined without reference to survival: large beaks are better adapted for crushing seeds, irrespective of whether that trait has survival value under the circumstances.
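To make the thought experiment concrete, here is a toy numerical sketch in Python. The growth rates and the crush_bonus parameter are invented for illustration (they are not drawn from the Grants' data); the sketch simply shows how which finch type comes to dominate depends on whether fast breeding or seed-crushing ability confers the larger reproductive advantage in a given environment.

```python
# Toy model of "differential rates of survival and reproduction."
# All numbers are made up for illustration; they are not real finch data.

def simulate(generations=10, crush_bonus=0.0):
    """Return the final fraction of small-beaked fast breeders in the population.

    crush_bonus: extra per-generation growth that large-beaked birds gain
    when hard-to-crush seeds dominate the food supply (an assumed parameter).
    """
    small, large = 10.0, 10.0        # starting numbers of each finch type
    for _ in range(generations):
        small *= 1.5                 # fast breeders: 1.5x growth per generation
        large *= 1.2 + crush_bonus   # slow breeders, plus any seed-crushing advantage
    return small / (small + large)

print("Soft seeds (no crushing advantage):", round(simulate(crush_bonus=0.0), 2))  # ~0.90
print("Hard seeds (crushing advantage):   ", round(simulate(crush_bonus=0.5), 2))  # ~0.22
```

The point of the sketch is that "fitness" is a statement about expected reproductive output under particular circumstances, not a circular restatement of who happened to survive.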
3. Evolution is unscientific because it is not testable or falsifiable. It makes claims about events that were not observed and can never be re-created.
This blanket dismissal of evolution ignores important distinctions that divide the field into at least two broad areas: microevolution and macroevolution. Microevolution looks at changes within species over time—changes that may be preludes to speciation, the origin of new species. Macroevolution studies how taxonomic groups above the level of species change. Its evidence draws frequently from the fossil record and DNA comparisons to reconstruct how various organisms may be related.
These days even most creationists acknowledge that microevolution has been upheld by tests in the laboratory (as in studies of cells, plants and fruit flies) and in the field (as in the Grants' studies of evolving beak shapes among Galápagos finches). Natural selection and other mechanisms—such as chromosomal changes, symbiosis and hybridization—can drive profound changes in populations over time.
The historical nature of macroevolutionary study involves inference from fossils and DNA rather than direct observation. Yet in the historical sciences (which include astronomy, geology and archaeology, as well as evolutionary biology), hypotheses can still be tested by checking whether they accord with physical evidence and whether they lead to verifiable predictions about future discoveries. For instance, evolution implies that between the earliest known ancestors of humans (roughly five million years old) and the appearance of anatomically modern humans (about 200,000 years ago), one should find a succession of hominin creatures with features progressively less apelike and more modern, which is indeed what the fossil record shows. But one should not—and does not—find modern human fossils embedded in strata from the Jurassic period (roughly 145 million years ago). Evolutionary biology routinely makes predictions far more refined and precise than this, and researchers test them constantly.
Evolution could be disproved in other ways, too. If we could document the spontaneous generation of just one complex life-form from inanimate matter, then at least a few creatures seen in the fossil record might have originated this way. If superintelligent aliens appeared and claimed credit for creating life on Earth (or even particular species), the purely evolutionary explanation would be cast in doubt. But no one has yet produced such evidence.
It should be noted that the idea of falsifiability as the defining characteristic of science originated with philosopher Karl Popper in the 1930s. More recent elaborations on his thinking have expanded the narrowest interpretation of his principle precisely because it would eliminate too many branches of clearly scientific endeavor.
4. Increasingly, scientists doubt the truth of evolution.
No evidence suggests that evolution is losing adherents. Pick up any issue of a peer-reviewed biological journal, and you will find articles that support and extend evolutionary studies or that embrace evolution as a fundamental concept.
Conversely, serious scientific publications disputing evolution are all but nonexistent. In the mid-1990s George W. Gilchrist, then at the University of Washington, surveyed thousands of journals in the primary literature, seeking articles on intelligent design or creation science. Among those hundreds of thousands of scientific reports, he found none. Surveys done independently by Barbara Forrest of Southeastern Louisiana University and Lawrence M. Krauss, now at Arizona State University, were similarly fruitless.
Creationists retort that a closed-minded scientific community rejects their evidence. Yet according to the editors of Nature, Science and other leading journals, few antievolution manuscripts are even submitted. Some antievolution authors have published papers in serious journals. Those papers, however, rarely attack evolution directly or advance creationist arguments; at best, they identify certain evolutionary problems as unsolved and difficult (which no one disputes). In short, creationists are not giving the scientific world good reason to take them seriously.
5. The disagreements among even evolutionary biologists show how little solid science supports evolution.
Evolutionary biologists passionately debate diverse topics: how speciation happens, the rates of evolutionary change, the ancestral relationships of birds and dinosaurs, whether Neandertals were a species apart from modern humans, and much more. These disputes are like those found in all other branches of science. Acceptance of evolution as a factual occurrence and a guiding principle is nonetheless universal in biology.
Unfortunately, dishonest creationists have shown a willingness to take scientists' comments out of context to exaggerate and distort the disagreements. Anyone acquainted with the works of paleontologist Stephen Jay Gould of Harvard University knows that in addition to co-authoring the punctuated-equilibrium model, Gould was one of the most eloquent defenders and articulators of evolution. (Punctuated equilibrium explains patterns in the fossil record by suggesting that most evolutionary changes occur within geologically brief intervals—which may nonetheless amount to hundreds of generations.) Yet creationists delight in dissecting out phrases from Gould's voluminous prose to make him sound as though he had doubted evolution, and they present punctuated equilibrium as though it allows new species to materialize overnight or birds to be born from reptile eggs.
When confronted with a quotation from a scientific authority that seems to question evolution, insist on seeing the statement in context. Almost invariably, the attack on evolution will prove illusory.
6. If humans descended from monkeys, why are there still monkeys?
This surprisingly common argument reflects several levels of ignorance about evolution. The first mistake is that evolution does not teach that humans descended from monkeys; it states that both have a common ancestor.
The deeper error is that this objection is tantamount to asking, “If children descended from adults, why are there still adults?” New species evolve by splintering off from established ones, when populations of organisms become isolated from the main branch of their family and acquire sufficient differences to remain forever distinct. The parent species may survive indefinitely thereafter, or it may become extinct.
7. Evolution cannot explain how life first appeared on Earth.
The origin of life remains very much a mystery, but biochemists have learned about how primitive nucleic acids, amino acids and other building blocks of life could have formed and organized themselves into self-replicating, self-sustaining units, laying the foundation for cellular biochemistry. Astrochemical analyses hint that quantities of these compounds might have originated in space and fallen to Earth in comets, a scenario that may solve the problem of how those constituents arose under the conditions that prevailed when our planet was young.
Creationists sometimes try to invalidate all of evolution by pointing to science's current inability to explain the origin of life. But even if life on Earth turned out to have a nonevolutionary origin (for instance, if aliens introduced the first cells billions of years ago), evolution since then would be robustly confirmed by countless microevolutionary and macroevolutionary studies.
8. Mathematically, it is inconceivable that anything as complex as a protein, let alone a living cell or a human, could spring up by chance.
Chance plays a part in evolution (for example, in the random mutations that can give rise to new traits), but evolution does not depend on chance to create organisms, proteins or other entities. Quite the opposite: natural selection, the principal known mechanism of evolution, harnesses nonrandom change by preserving “desirable” (adaptive) features and eliminating “undesirable” (nonadaptive) ones. As long as the forces of selection stay constant, natural selection can push evolution in one direction and produce sophisticated structures in surprisingly short times.
As an analogy, consider the 13-letter sequence “TOBEORNOTTOBE.” A million hypothetical monkeys, each typing out one phrase a second on a keyboard, could take as long as 78,800 years to find it among the 26¹³ sequences of that length. But in the 1980s Richard Hardison, then at Glendale College, wrote a computer program that generated phrases randomly while preserving the positions of individual letters that happened to be correctly placed (in effect, selecting for phrases more like Hamlet's). On average, the program re-created the phrase in just 336 iterations, less than 90 seconds. Even more amazing, it could reconstruct Shakespeare's entire play in just four and a half days.
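Readers who want to see the principle for themselves can reproduce it in a few lines. The sketch below is not Hardison's program (whose details are not given here) but a minimal Python rendition of the same idea: random letters are proposed one at a time, and any letter that happens to match the target phrase is kept rather than re-rolled. Counting single letter draws as "iterations" is our own assumption, chosen because it lands in the same ballpark as the figure quoted above.

```python
import random
import string

# A toy re-creation of the cumulative-selection demonstration described
# above -- a minimal sketch, not Hardison's original code.

TARGET = "TOBEORNOTTOBE"
ALPHABET = string.ascii_uppercase  # the 26 capital letters


def cumulative_selection(target, rng=random):
    """Count random letter draws needed to rebuild `target` when every
    correctly placed letter is locked in as soon as it appears."""
    solved = [False] * len(target)
    draws = 0
    while not all(solved):
        for i, wanted in enumerate(target):
            if not solved[i]:
                draws += 1
                if rng.choice(ALPHABET) == wanted:
                    solved[i] = True  # "selection" preserves the match
    return draws


# Each position needs about 26 draws on average, so 13 positions take
# roughly 26 * 13 = 338 draws -- close to the ~336 iterations quoted
# above, and nothing like the 26**13 (about 2.5 x 10**18) guesses that
# blind chance alone would have to sift through.
trials = [cumulative_selection(TARGET) for _ in range(1000)]
print("average draws with selection:", sum(trials) / len(trials))
print("possible 13-letter sequences:", 26 ** len(TARGET))
```

The point of the toy is the contrast it makes visible: pure chance must search the full space of sequences, whereas chance filtered by selection converges in a few hundred steps.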
9. The Second Law of Thermodynamics says that systems must become more disordered over time. Living cells therefore could not have evolved from inanimate chemicals, and multicellular life could not have evolved from protozoa.
This argument derives from a misunderstanding of the Second Law. If it were valid, mineral crystals and snowflakes would also be impossible, because they, too, are complex structures that form spontaneously from disordered parts.
The Second Law actually states that the total entropy of a closed system (one that no energy or matter leaves or enters) cannot decrease. Entropy is a physical concept often casually described as disorder, but it differs significantly from the conversational use of the word.
More important, however, the Second Law permits parts of a system to decrease in entropy as long as other parts experience an offsetting increase. Thus, our planet as a whole can grow more complex because the sun pours heat and light onto it, and the greater entropy associated with the sun's nuclear fusion more than rebalances the scales. Simple organisms can fuel their rise toward complexity by consuming other forms of life and nonliving materials.
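Put in symbols (a standard textbook statement of the law, not a formula from this article), the constraint applies only to the total:

$$\Delta S_{\text{Earth}} + \Delta S_{\text{surroundings}} \geq 0,$$

so the Earth-side term may be negative, with order increasing locally, as long as the surroundings' entropy rises by at least as much, which it does as the sun radiates energy into space.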
10. Mutations are essential to evolution theory, but mutations can only eliminate traits. They cannot produce new features.
On the contrary, biology has catalogued many traits produced by point mutations (changes at precise positions in an organism's DNA)—bacterial resistance to antibiotics, for example.
Mutations that arise in the homeobox (Hox) family of development-regulating genes in animals can also have complex effects. Hox genes direct where legs, wings, antennae and body segments should grow. In fruit flies, for instance, the mutation called Antennapedia causes legs to sprout where antennae should grow. These abnormal limbs are not functional, but their existence demonstrates that genetic mistakes can produce complex structures, which natural selection can then test for possible uses.
Moreover, molecular biology has discovered mechanisms for genetic change that go beyond point mutations, and these expand the ways in which new traits can appear. Functional modules within genes can be spliced together in novel ways. Whole genes can be accidentally duplicated in an organism's DNA, and the duplicates are free to mutate into genes for new, complex features. Comparisons of the DNA from a wide variety of organisms indicate that this is how the globin family of blood proteins evolved over millions of years.
11. Natural selection might explain microevolution, but it cannot explain the origin of new species and higher orders of life.
Evolutionary biologists have written extensively about how natural selection could produce new species. For instance, in the model called allopatry, developed by Ernst Mayr of Harvard University, if a population of organisms were isolated from the rest of its species by geographical boundaries, it might be subjected to different selective pressures. Changes would accumulate in the isolated population. If those changes became so significant that the splinter group could not or routinely would not breed with the original stock, then the splinter group would be reproductively isolated and on its way toward becoming a new species.