Monday, November 2, 2020

3435. The Deep Anthropocene

By Lucas Stephens, Erle Ellis and Dorian Fuller, Aeon, October 1, 2020


Humanity’s transition from hunting and gathering to agriculture is one of the most important developments in human and Earth history. Human societies, plant and animal populations, the makeup of the atmosphere, even the Earth’s surface – all were irreversibly transformed.

When asked about this transition, some people might be able to name the Neolithic Revolution or point to the Fertile Crescent on a map. This widespread understanding is the product of years of toil by archaeologists, who diligently unearthed the sickles, grinding stones and storage vessels that spoke to the birth of new technologies for growing crops and domesticating animals. The story they constructed went something like this: beginning in the Near East some 11,000 years ago, humans discovered how to control the reproduction of wheat and barley, which precipitated a rapid switch to farming. Within 500 to 1,000 years, a scattering of small farming villages sprang up, each with several hundred inhabitants eating bread, chickpeas and lentils, soon also herding sheep and goats in the hills, some keeping cattle.

This sedentary lifestyle spread, as farmers migrated from the Fertile Crescent through Turkey and, from there, over the Bosporus and across the Mediterranean into Europe. They moved east from Iran into South Asia and the Indian subcontinent, and south from the Levant into eastern Africa. As farmers and herders populated new areas, they cleared forests to make fields and brought their animals with them, forever changing local environments. Over time, agricultural advances allowed ever larger and denser settlements to flourish, eventually giving rise to cities and civilisations, such as those in Mesopotamia, Egypt, the Indus and later others throughout the Mediterranean and elsewhere.

For many decades, the study of early agriculture centred on only a few other regions apart from the Fertile Crescent. In China, millet, rice and pigs gave rise to the first Chinese cities and dynasties. In southern Mexico, it was maize, squash and beans that were first cultivated and supported later civilisations such as the Olmecs or the Puebloans of the American Southwest. In Peru, native potato, quinoa and llamas were among species domesticated by 5,000 years ago that made later civilisations in the Andes possible. In each of these regions, the transition to agriculture set off trends of rising human populations and growing settlements that required increasing amounts of wood, clay and other raw materials from the surrounding environments.

Yet for all its sweep and influence, this picture of the spread of agriculture is incomplete. New technologies have changed how archaeology is practised, from the way we examine ancient food scraps at a molecular level, to the use of satellite photography to trace patterns of irrigation across entire landscapes. Recent discoveries are expanding our awareness of just how early, extensive and transformative humans’ use of land has been. The rise of agriculture was not a ‘point in time’ revolution that occurred only in a few regions, but rather a pervasive, socioecological shifting back and forth across fuzzy thresholds in many locations.

Bringing together the collective knowledge of more than 250 archaeologists, the ArchaeoGLOBE project in which we participated is the first global, crowdsourced database of archaeological expertise on land use over the past 10,000 years. It tells a completely different story of Earth’s transformation than is commonly acknowledged in the natural sciences. ArchaeoGLOBE reveals that human societies modified most of Earth’s biosphere much earlier and more profoundly than we thought – an insight that has serious implications for how we understand humanity’s relationship to nature and the planet as a whole.

Just as recent archaeological research has challenged old definitions of agriculture and blurred the lines between farmers and hunter-gatherers, it’s also leading us to rethink what nature means and where it is. The deep roots of how humanity transformed the globe pose a challenge to the emerging Anthropocene paradigm, in which human-caused environmental change is typically seen as a 20th-century or industrial-era phenomenon. Instead, it’s clearer than ever before that most places we think of as ‘pristine’ or ‘untouched’ have long relied on human societies to fill crucial ecological roles. As a consequence, trying to disentangle ‘natural’ ecosystems from those that people have managed for millennia is becoming less and less realistic, let alone desirable.

Our understanding of early agriculture derives mostly from the material remains of food – seeds, other plant remains and animal bones. Archaeologists traditionally document these finds from excavated sites and use them to track dates and distribution of different people and practices. Over the past several decades, though, practitioners have become more skilled at spotting the earliest signatures of domestication, relying on cutting-edge advances in chemistry, biology, imaging and computer science.

Archaeologists have greatly improved their capacity to trace the evolution of crops, thanks to advances in our ability to recover minute plant remains – from silica microfossils to the tiny attachment scars where cereal seeds attach to the rest of the plant. Along with early crops, agricultural weeds and storage pests such as mice and weevils also appeared. Increasingly, we can identify a broader biotic community that emerged around the first villages and spread with agriculture. For example, weeds that originated in the Fertile Crescent alongside early wheat and barley crops also show up in the earliest agricultural communities in places such as Germany and Pakistan.

Collections of animal bones provide evidence of how herded creatures changed physically through the process of domestication. Butchering marks on bones can help reconstruct culling strategies. From the ages and sizes of animals, archaeologists can deduce the populations of herds in terms of age and sex ratios, all of which reveals how herding differed from hunting. Herding systems themselves also vary, with some focused only on producing meat, and others on milk and wool too.

Measurements of bones and seeds have made great strides with technologies such as geometric morphometrics – complex mathematical shape analysis that allows for a more nuanced understanding of how varieties evolved and moved between regions. Biomolecular methods have also multiplied. The recovery of amino acid profiles from fragmented animal bones, for example, has allowed us to discern which animals they came from, even when they’re too degraded for visual identification. The increasingly sophisticated use and analysis of ancient DNA now allows researchers to track the development and distribution of domesticated animals and crops in great detail.

Archaeologists have also used mass spectrometry – which identifies molecules by ionising them and sorting the ions by mass – to pinpoint which species were cooked together based on the presence of biomolecules such as lipids. Stable isotopes of carbon and nitrogen from animal bones and seeds give insight into where and how plants and animals were managed – allowing us to more fully sketch out ancient foodwebs from soil conditions to human consumption. Strontium isotopes in human and animal bones, meanwhile, allow us to identify migrations across a single organism’s lifetime, revealing more and earlier long-distance interconnections than previously imagined. Radiocarbon dating has been possible since the 1950s – but recent improvements that have reduced sample sizes and error margins allow us to build fine-grained chronologies and directly date individual crops.

With all these fresh data, it’s now possible to tell a much richer, more diverse story about the gradual evolutions and dispersals of early agriculture. By 6,000 years ago, the British Isles were being transformed by an imported collection of crops, weeds and livestock that had originated millennia earlier in the Near East. Similarly, millet, rice and pigs from central China had been spread as far as Thailand by 4,000 years ago, and began transforming much of the region’s tropical woodland to agricultural fields. New stories are constantly emerging too – including that sorghum, a grain crop, was domesticated in the savannahs of eastern Sudan more than 5,000 years ago, before the arrival of domesticated sheep or goats in that area. Once combined with Near Eastern sheep, goats and cattle, agropastoralism spread rapidly throughout most of sub-Saharan Africa by 2,000 years ago.

Advances in the study of plant silica micro-fossils (phytoliths) have helped trace banana cultivation from the island of New Guinea more than 7,000 years ago – from where it spread through Island Southeast Asia, and eventually across the Indian Ocean to Africa, more than a millennium before Vasco da Gama navigated from Africa to India. These techniques have also revealed unforeseen agricultural origins – such as the forgotten cereal, browntop millet. It was the first staple crop of South India, before it was largely replaced by crops such as sorghum that were translocated from Africa. Many people might be surprised to learn that the early farming tradition in the Mississippi basin relied on pitseed goosefoot, erect knotweed and marsh elder some 3,000-4,000 years ago, long before maize agriculture arrived in the American Midwest.

Archaeologists don’t just study materials painstakingly uncovered in excavations. They also examine landscapes, patterns of settlement, and the built infrastructure of past societies to get a sense of the accumulated changes that humans have made to our environments. They have developed a repertoire of techniques that allow them to study the traces of ancient people on scales much larger than an individual site: from simply walking and documenting the density of broken pottery on the ground, to examining satellite imagery, using lidar (light detection and ranging) and drones to build 3D models, even searching for subsurface magnetic anomalies to plot out the walls of buried cities.

As a result, new revelations about our deep past are constantly emerging. Recent discoveries in southwestern Amazonia showed that people were cultivating squash and manioc more than 10,000 years ago, and maize only a few thousand years later. They did so living in an engineered landscape consisting of thousands of artificial forested islands, within a seasonally flooded savannah.

Some of the most stunning discoveries have come from the application of lidar around Maya cities, buried underneath the tropical canopy in Central America. Lasers can penetrate this canopy to define the shapes of mounds, plazas, ceremonial platforms and long causeways that were previously indistinguishable from the topography of the jungle. A recent example in Mexico pushed back the time period for monumental construction to what we used to consider the very beginning of Maya civilisation – 3,000 years ago – and suggests the monuments were more widespread than previously believed.

In 2003, the climatologist William Ruddiman introduced the ‘early anthropogenic hypothesis’: the idea that agricultural land use began warming Earth’s climate thousands of years ago. While some aspects of this early global climate change remain unsettled among scientists, there’s strong consensus that land-use change was the greatest driver of global climate change until the 1950s, and remains a major driver of climate change today. As a result, global maps of historical changes in land use, and their effects on vegetation cover, soils and greenhouse gas emissions, are a critical component of all contemporary models for forecasting Earth’s future climate.

Deforestation, tilling the land and other agricultural practices alter regional and global climate because they release greenhouse gases from vegetation and soils, as well as altering the exchange of heat and moisture across Earth. These effects reverse when land is abandoned and vegetation recovers or is restored. Early changes in agricultural land use therefore have major implications in understanding climate changes of the past, present and future.

The main global map of historical land use deployed in climate models is HYDE (the History Database of the Global Environment), which combines contemporary and historical patterns of land use and population across the planet over the past 12,000 years. Despite this huge span of space and time, HYDE is, with notable exceptions, based largely on historical census data that reach back only to 1960, mostly from Europe.

HYDE’s creator, a collaborator in ArchaeoGLOBE, has long requested help from historians, scientists and archaeologists to build a stronger empirical basis for HYDE’s global maps – especially for the deep past, where data are especially lacking. The data needed to improve the HYDE database exist, but reside in a format that’s difficult to access – the expert knowledge of archaeologists working in sites and regions around the world. The problem is that no single archaeologist has the breadth or time-depth of knowledge required.

Archaeologists typically study individual regions and time periods, and have only background knowledge on wider areas. Research methods and terminology also aren’t standardised worldwide, making syntheses difficult, rare and subjective. To construct a comprehensive global database of past land use, you need to gather information from hundreds of regional specialists and collate it, allowing this mosaic of individual studies to emerge as a single picture. This was exactly what we did for ArchaeoGLOBE.

In 2018, we surveyed more than 1,300 archaeologists around the world, and synthesised their responses into ArchaeoGLOBE. The format of our questionnaire was based on 10 time-slices from history (from 10,000 years ago, roughly the beginning of agriculture, to 1850 CE, the industrial era in Europe); 146 geographic regions; four levels of land-use prevalence; and five land-use categories (foraging/hunting/gathering/fishing; pastoralism; extensive agriculture; intensive agriculture; urbanism).
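To make that structure concrete, here is a minimal sketch, in Python, of what a single regional assessment might look like as a data record. The field names, time-slice spacing and prevalence labels are our own illustrative assumptions, not the project’s actual data format.

```python
# Illustrative sketch only: a possible data model for one ArchaeoGLOBE-style
# regional assessment. Field names, time-slice values and prevalence labels
# are assumptions for illustration, not the project's actual format.
from dataclasses import dataclass

# Ten time slices spanning 10,000 years ago to 1850 CE (~100 years before present);
# the exact spacing shown here is assumed.
TIME_SLICES_BP = [10000, 8000, 6000, 4000, 3000, 2000, 1000, 500, 250, 100]

# The five land-use categories named in the survey.
LAND_USE_CATEGORIES = [
    "foraging/hunting/gathering/fishing",
    "pastoralism",
    "extensive agriculture",
    "intensive agriculture",
    "urbanism",
]

# Four levels of land-use prevalence; these labels are assumed.
PREVALENCE_LEVELS = ["none", "minimal", "common", "widespread"]

@dataclass
class RegionalAssessment:
    """One expert's land-use assessment for one region, time slice and category."""
    region_id: str             # one of the 146 geographic regions
    years_before_present: int  # one of the 10 time slices
    category: str              # one of the five land-use categories
    prevalence: str            # one of the four prevalence levels

# Hypothetical example record: extensive agriculture assessed as common
# in the British Isles 6,000 years ago.
example = RegionalAssessment("British_Isles", 6000, "extensive agriculture", "common")
assert example.category in LAND_USE_CATEGORIES and example.prevalence in PREVALENCE_LEVELS
```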

We ended up receiving 711 regional assessments from 255 individual archaeologists – resulting in a globally complete, if uneven, map of archaeological knowledge. After synthesis and careful analysis, our results, co-authored with 117 other researchers, were published in 2019 in Science. We also made all our data and analysis available online, at every stage of the research process – even before we had finished collecting it – in an effort to stimulate a culture of open knowledge-sharing in archaeology as a discipline.

The resulting data-trove allows researchers to compare land-use systems over time and in different regions, as well as to aggregate their cumulative, global impacts at different points over the past 10,000 years. When we compared ArchaeoGLOBE results with HYDE, we found that archaeological assessments showed much earlier and more widespread agricultural land use than HYDE suggested – and, therefore, more intensive land use than had been factored into climate change assessments. Indeed, the beginnings of intensive agriculture in ArchaeoGLOBE were earlier than HYDE’s across more than half of Earth’s current agricultural regions, often by 1,000 years or more.
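To give a flavour of the kind of comparison involved, the sketch below uses made-up numbers and deliberately simplified logic – it is not the published analysis. It collates several experts’ prevalence ratings for each region into a consensus, finds the earliest time slice with significant intensive agriculture, and compares that onset with a HYDE-style estimate.

```python
# Illustrative sketch only: collate expert assessments per region and
# compare the onset of intensive agriculture with a HYDE-style estimate.
# All values below are invented; the real analysis is in the 2019 Science paper.
from collections import Counter

# Hypothetical assessments of intensive agriculture:
# region -> {years before present: [prevalence ratings given by different experts]}
archaeoglobe = {
    "Region_A": {10000: ["none", "none"], 6000: ["common", "widespread", "common"], 3000: ["widespread"]},
    "Region_B": {10000: ["none"], 6000: ["none", "minimal"], 3000: ["common", "common"]},
}

# Hypothetical HYDE-style onset of intensive agriculture (years before present).
hyde_onset = {"Region_A": 4000, "Region_B": 3000}

SIGNIFICANT = {"common", "widespread"}

def consensus(ratings):
    """Take the most frequent rating among the experts for a region/time slice."""
    return Counter(ratings).most_common(1)[0][0]

def earliest_significant(slices):
    """Earliest (largest years-before-present) slice whose consensus is significant."""
    times = [t for t, ratings in slices.items() if consensus(ratings) in SIGNIFICANT]
    return max(times) if times else None

for region, slices in archaeoglobe.items():
    onset = earliest_significant(slices)
    if onset is not None and onset > hyde_onset[region]:
        print(f"{region}: archaeological onset ~{onset} BP, "
              f"{onset - hyde_onset[region]} years earlier than the HYDE-style estimate")
```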

By 3,000 years ago, Earth’s terrestrial ecology was already largely transformed by hunter-gatherers, farmers and pastoralists – with more than half of regions assessed engaged in significant levels of agriculture or pastoralism. For example, the Kopaic Basin in the Greek region of Boeotia was drained and converted from wetland to agricultural land in the 13th century BCE. This plain, roughly 1,500 hectares (15 sq km) in size and surrounded by steep limestone hills, had been a large, shallow lake since the end of the last Ice Age. Late Bronze Age residents of the area, members of what we call the Mycenaean culture, constructed a hydraulic infrastructural system on a massive scale to drain the wetland and claim it for agriculture. They channelised rivers, dug drainage canals, built long dikes and expanded natural sinkholes to direct the water off what would have been nutrient-rich soil. Eventually, when the Mycenaean civilisation collapsed at the end of the Bronze Age, the basin flooded again and returned to its previous wetland state. Legend has it that Heracles filled in the sinkholes as revenge against a local king. The area was not successfully drained again until the 20th century.

Examples like this highlight a general trend we found: agriculture and pastoralism gradually replaced foraging-hunting-gathering around the world. But the data also show that there were reversals, and that different subsistence economies, from foraging to farming, operated in parallel in some places. Moreover, agriculture and pastoralism are not the only practices that transform environments. Hunter-gatherer land use was already widespread across the globe (82 per cent of regions) by 10,000 years ago. Through the selective harvest and translocation of favoured species, hunting (sometimes to extinction) and the use of fire to dramatically alter landscapes, most of the terrestrial biosphere was already significantly influenced by human activities, even before the domestication of plants and animals.

ArchaeoGLOBE is both a cause and a consequence of a dramatic change in perspective about how early land use produced long-term global environmental change. Archaeological knowledge is increasingly becoming a crucial instrument for understanding humanity’s cumulative effect on ecology and the Earth system, including global changes in climate and biodiversity. The mindset of archaeology as a discipline stands in contrast to earlier perspectives grounded in the natural sciences, which have long emphasised a dichotomy between humans and nature.

In the ‘pristine myth’ paradigm from the natural sciences, as the geographer William Denevan called it, human societies are recent destroyers, or at the very least disturbers, of a mostly pristine natural world. Denevan was reacting against the portrayal of pre-1492 America as an untouched paradise, and he used the substantial evidence of indigenous landscape modification to argue that the human presence was perhaps more visible in 1492 than 1750. Recent popular conceptions of the Anthropocene risk making a similar mistake, drawing a thin bright line at 1950 and describing what comes after as a new, modern form of ecological disaster. Human changes to the environment are cumulative and were substantial at different scales throughout our history. The deep trajectory of land use revealed by ArchaeoGLOBE runs counter to the idea of pinpointing a single catalytic moment that fundamentally changed the relationship between humanity and the Earth system.

The pristine myth also accounts for why places without contemporary intensive land use are often dubbed ‘wilderness’ – such as areas of the Americas depopulated by the great post-Columbian die-off. Such interpretations, perpetuated by scientists, have long supported colonial narratives in which indigenous hunter-gatherer and even agricultural lands are portrayed as unused and ripe for productive use by colonial settlers.

The notion of a pristine Earth also pervaded the thinking of early conservationists in the United States such as John Muir. Some were intent on preserving what they saw as the nobility of nature from a mob of lesser natural life; others were eager to manage wilderness areas to maintain the trophy animals they enjoyed hunting. For example, the governor of California violently forced Indigenous peoples out of Yosemite Valley in the 19th century, making way for wilderness conservation. These ideas went hand-in-hand with a white supremacist view of humanity that cast immigrants and the poor as a type of invasive species. It was not a great leap of theorising to move from a notion of pristine nature to seeing much of humanity as the opposite – a contaminated, marring mass. In both realms, the human and the natural, the object was to exclude undesirable people to preserve bastions of the unspoilt world. These extreme expressions of a dichotomous view of nature and society are possible only by ignoring the growing evidence of long-term human changes to Earth’s ecology – humans were, and are still, essential components of most ‘natural’ ecosystems.

Humans have continually altered biodiversity on many scales. We have changed the local mix of species, their ranges, habitats and niches for thousands of years. Long before agriculture, selective human predation of many non-domesticated species shaped their evolutionary course. Even the relatively small hunter-gatherer populations of the late Pleistocene were capable of negatively affecting animal populations – driving many megafauna and island species to extinction, or to the brink of it. But there have also been widespread social and ecological adaptations to these changes: human management can even increase the biodiversity of landscapes and sustain these increases for thousands of years. For example, pastoralism might have helped defer climate-driven aridification of the Sahara, maintaining mixed forests and grassland ecosystems in the region for centuries.

This recognition should cause us to rethink what ‘nature’ and ‘wilderness’ really are. If by ‘nature’ we mean something divorced from or untouched by humans, there’s almost nowhere on Earth where such conditions exist, or have existed for thousands of years. The same can be said of Earth’s climate. If early agricultural land use began warming our climate thousands of years ago, as the early anthropogenic hypothesis suggests, it implies that no ‘natural’ climate has existed for millennia.

A clear-eyed appreciation for the deep entanglement of the human and natural worlds is vital if we are to grapple with the unprecedented ecological challenges of our times. Naively romanticising a pristine Earth, on the other hand, will hold us back. Grasping that nature is inextricably linked with human societies is fundamental to the worldview of many Indigenous cultures – but it remains a novel and often controversial perspective within the natural sciences. Thankfully, it’s now gaining prominence within conservation circles, where it’s shifting attitudes about how to enable sustainable and resilient stewardship of land and ecosystems.

Viewing humans and nature as entwined doesn’t mean that we should shrug our shoulders at current climatic trends, unchecked deforestation, accelerating extinction rates or widespread industrial waste. Indeed, archaeology supplies numerous examples of societal and ecosystem collapse: a warning of what happens if we ignore the consequences of human-caused environmental change.

But ecological crises are not inevitable. Humans have long maintained sustainable environments by adapting and transforming their societies. As our work demonstrates, humans have shaped the ecology of this planet for thousands of years, and continue to shape it.

We live at a unique time in history, in which our awareness of our role in changing the planet is increasing at the precise moment when we’re causing it to change at an alarming rate. It’s ironic that technological advances are simultaneously accelerating both global environmental change and our ability to understand humans’ role in shaping life on Earth. Ultimately, though, a deeper appreciation of how the Earth’s environments are connected to human cultural values helps us make better decisions – and also places the responsibility for the planet’s future squarely on our shoulders.

About the authors: 

Lucas Stephens is a senior research analyst at the Environmental Law and Policy Center in Chicago. He was a specialist researcher on the ArchaeoGLOBE project.

Erle Ellis is a professor of geography and environmental systems at the University of Maryland, Baltimore County. He is a member of the Anthropocene Working Group, a fellow of the Global Land Programme, a senior fellow of the Breakthrough Institute, and an advisor to the Nature Needs Half movement. He is the author of Anthropocene: A Very Short Introduction (2018).

Dorian Fuller is professor of archaeobotany at University College London.
