Thursday, September 24, 2020

3429. Dear News Media, Stop Covering the US as If It’s a Democracy

By Rob Wijnberg, The Correspondent, September 16, 2020


[For an audio of this text please visit: https://soundcloud.com/thecorrespondent]


The problem with the fall of a democracy is that it doesn’t simply happen, like a rain shower or a thunderstorm. It unfolds, like the slow and steady warming of the climate.


Liberties aren’t eliminated, they are restricted and violated – until they erode. Rights aren’t abolished, they are undermined and trampled – until they become privileges. Truths aren’t buried, they are mocked and twisted – until everyone has their own.


A democracy doesn’t stumble and fall; it slides into decline.


The problem with daily news is that it obsesses over what’s happening, making it harder to grasp what unfolds. Breaking news, by its nature, is ill-equipped to cover the demise of democracy – just as the weather report never really shows us the climate is changing. 


Breaking news shows the world as a place of sheer madness without rhyme or reason – a non-stop series of unrelated events. It’s like a diary without a memory or a notion of the future: it tells us of today, has forgotten all about yesterday, and pretends there’s no tomorrow. It warns and warns and warns, but immediately forgets what it’s warning against – thus never learning from its own wailing sirens.


For the past four years, ever since Donald J Trump took presidential office, this fundamental flaw in the fabric of news has hit harder than ever before.


For four years, US news has been what you get when you combine a North Korean obsession with the head of state with Rupert Murdoch’s business model. A deranged cult of personality, interrupted only by commercial breaks. A presidential hypnosis, paid for by Procter & Gamble and Amazon. A totalitarian Twitterocracy in which we lurch from incident, to riot, to tweet, to disaster, to lunacy, to lie, to crisis, to disbelief, to attack, to mudslinging, to insult, to conspiracy theory, without facing the consequences of the pattern – the steady slide into decline.


The disturbing story behind all this frenzied chaos of news is that of a country that is a democracy in name only, a kleptocracy in actual practice, and well on its way to becoming an autocracy full stop.


A country that has seen its public sphere crumble into neoliberal rubble over the past 40 years. That has seen its conservative party transform into a fact-free sectarian movement. That has seen the gap between poor and rich, black and white, urban and rural grow into a chasm so wide that it is “United” in name only. That is held together only by worn-out myths – myths about free and fair elections, about social mobility, about being a beacon to the world. About itself.


Or, as the late stand-up comedian George Carlin once said: “It’s called the American Dream for a reason. You have to be asleep to believe it.”


The United States of America, a republic without the ‘public’


It would be a major misconception to assume that the downfall of US democracy started in November 2016, when Trump was elected. In fact, it’s the other way around: the first openly kleptocratic president moving into the White House marked the consummation of its decay, not its initial conception.


Born from theft, built on slavery, held together by self-deception, the United States has grown to become the richest poor country in the history of humankind. It is a country that has violence in its DNA, inequality embedded in its genes, and a completely mythical self-image as its national identity.


It’s a country with the world’s highest GDP, where 40 million people live below the poverty line. 


The only industrialised nation on the planet without universal healthcare, any real social welfare system or decent retirement provisions. The only free nation where 1 in 40 adults is behind bars, and where more guns are in circulation than there are people living within its borders. The only western economy where the richest three inhabitants hold more wealth than the poorest half of the entire population.


The US is, in short, a country without a social contract. It’s a republic that has stripped away the “public”. And it’s led by a political party that can no longer be called a party in any real sense.


Because while many journalists still tend to refer to it as such, the Republican party is no longer a political party at all – it’s become a sectarian movement. Its transformation is rooted in an increasingly intimate alliance with the country’s corporate elite, tied to an increasingly radical type of identity politics.


By this point, political scientists Thomas E Mann and Norman J Ornstein conclude, the party has become a “radical insurgency” – “ideologically extreme; scornful of compromise; unmoved by conventional understanding of facts, evidence and science; and dismissive of the legitimacy of its political opposition.” In fact, there are almost no moderate Republicans left.





And there’s no need for them, either, because the GOP doesn’t really represent the people. Its power base rests solely on the illusion of a democratic mandate. For the American spectacle that dominates our front-pages every four years has very little to do with “free” or “fair” elections.


It takes millions and millions of dollars to even run for president in the first place, and candidates need at least half a billion dollars to fund their PR campaigns to have a decent shot at winning. That funding is largely solicited from the country’s biggest banks, titans of technology, pharmaceutical firms, oil companies and individual billionaires, whose financial support ensures their political influence.


The victory paid for with these hundreds of millions of lobby dollars can hardly be considered representative of the American people. To be sure, democracies are never perfect – all reflect the structural inequalities in which they exist, and the ways in which the powerful can tug the systems in their favour. But almost nowhere in the world is the gap between the political preferences of ordinary voters and the priorities of the elite as great as it is in the United States. And almost nowhere else do the results of the election differ so dramatically from how people cast their ballots in the voting booth.


For the final result is based not on the popular vote, but on the electoral college: a system that was, by design, heavily biased in favour of the less densely populated Southern states as part of a workaround for the persistence of slavery. 


This system has been made even less representative in every election by means of a process known as gerrymandering. 


No wonder voter turnout in the US is among the lowest in the industrialised world. Nearly half of eligible voters do not take part in the elections. This isn’t just because of political apathy; it is also caused by deliberate voter suppression. 


 Millions of US Americans – most of them living in poverty and black – are systematically blocked from voting. Those who do venture to try usually end up spending hours standing in an excruciatingly long line, only to cast their ballot on a highly unreliable voting machine. 


In other words, the US lacks nearly all the elements of a functioning democracy: a social contract, a representative electoral system, free and fair elections, political parties that follow democratic practices, and universal suffrage. 


Instead, it has an electability threshold starting at hundreds of millions of dollars, a political process completely determined by billionaires and large corporations, an electoral system that is fundamentally skewed, and a discouraged and sabotaged electorate.

Democracy? What democracy?


How the US became a kleptocracy


From that perspective, the ascendance of Donald J Trump to the seat of US power isn’t an astounding deviation from the natural order of things, but rather the completely logical outcome of a development that has been progressing for decades.


A country without any sense of the common good, grown fat on exploitation, held together by fundamental falsehoods will ultimately get a leader who suits that setting perfectly: a leader without a coherent ideology, driven by greed and self-enrichment, owing no fealty to fact.


What is new, however, is how openly kleptocratic the US has become. Behind the inimitable vagaries of Trump’s personality and political agenda, one factor is consistently present: the systematic enrichment of himself, his family and the elite to which they belong – and the shamelessness with which it all takes place.


This shamelessness was apparent from day one. At Trump’s very first press conference as president, he stood behind the lectern and gestured to a table covered in stacks of paper. Those folders, he said, were full of documents describing how he had relinquished control of his business empire.


Those documents, it turned out later, were all blank. 


The empty sheets of paper that represented his first official lie as head of state would be filled by more than 20,000 demonstrable lies in the four years that followed.



In the meantime, Trump rerouted millions of US tax dollars into his own business accounts. In the three-year period from 2015 to 2018 alone, over $16m in campaign funds and taxpayer money went to Trump’s own businesses. The president frequently spent the night in his own hotels, tucking at least a million in public funds into his own pocket. Staying in those selfsame hotels, the US Secret Service paid as much as $650 per room per night to accommodate the president’s security detail, expending hundreds of thousands of additional tax dollars that went to Trump-owned properties.   


Trump also appointed his immediate family members and their spouses to positions of power – on an “unofficial” basis, so as to circumvent the rules against nepotism. His daughter Ivanka Trump became senior adviser to the president, as did her husband Jared Kushner; Andrew Giuliani, the son of Trump’s personal attorney, was appointed his “Public Liaison Assistant”. His sons Don Jr and Eric Trump were put in charge of the Trump Organization. Since their father’s inauguration, they have made over a hundred million dollars in property deals – some of which required approval from the Trump administration itself.


Those same family members turned this year’s Republican National Convention, the political party’s main event leading up to the national elections, into a parade of brazen kleptocracy: six of the 12 key speakers (!) shared the president’s last name.


Meanwhile, Trump cut $6bn from the federal budget for subsidised housing – direct competition to his own property empire – and slashed federal taxes by nearly $1.5tn, with over a third of that going to the richest 1% of US citizens – meaning multimillionaires just like him. 70% of the tax cut went directly into the pockets of the top 20% earners in the country. 


As the cherry on top of the kleptocratic cake, Trump used his presidential powers to pardon 25 convicted criminals – nearly all of them friends, acquaintances or political supporters, and nearly all unusually early in his presidency, letting his “inner circle” know that they have nothing to fear in terms of legal prosecution.


“No other president has exercised the clemency power for such a patently personal and self-serving purpose,” warned House committee members calling for an investigation. But their warnings fell on deaf ears, since Trump had by then single-handedly derailed the safeguards put in place to prevent abuses of power.


Case in point? The US justice department is currently moving to shield the president from prosecution for rape, a level of political intervention in the judicial process that is unprecedented even by US standards. 


Democracy in the dark, concealed behind the curtain of news-as-usual

All of these things, each and every one, story after story, have been in the news.


Even the pattern hasn’t gone unnoticed.


Four weeks after Donald Trump was sworn into office as president of the United States of America, The Washington Post unveiled its new slogan: “Democracy Dies in Darkness”. It was the first official slogan in the newspaper’s long history, referencing a phrase used by its most famous investigative journalist Bob Woodward, who broke the Watergate affair.


The slogan isn’t merely a slogan anymore – it’s basically the theme of most of the reporting, from the work being done by the Post’s own David Fahrenthold to that of his New York Times counterparts David Barstow, Susanne Craig and Russ Buettner, author Andrea Bernstein, journalist Katherine Sullivan and others.


And while a pandemic rages across the nation unimpeded, the country moves closer to the brink of a deep economic crisis, and tensions in the streets flare higher and higher every day, the president is openly inciting violence, politicising the Department of Justice and federal law enforcement agencies for personal gain, and preemptively questioning the legitimacy of the upcoming national elections, just in case.


All those signals point in only one direction: the US is rapidly becoming an authoritarian state.


Those who warn of the impending autocracy can only ever be alarmist. Either we’re proven wrong, or our warnings are already too late. For a democracy doesn’t fall, it slides into decline. Its demise cannot be predicted, only revealed in retrospect.


The fact that people can still sound the warning means that it is not too late. A free press is indeed the light shining in the darkness that keeps a democracy alive. But if a free press is the shining beacon, then some of its most pernicious habits are the curtain concealing its light.


There is a popular theory that divides the American news landscape into two sides: left-wing liberal vs right-wing conservative media.


And indeed, when you switch back and forth between Fox News and CNN, or between Breitbart and the New York Times, it’s like hopping from Mars to Venus and back again; even the gravity is different. Not hard to guess who’s the devil and who’s the messiah on these two disparate planets.


So, yes, there’s a lot of truth to the theory. But because the theory focuses mainly on the differences between news media, it neatly evades their similarities.


Beneath those superficial differences is one fundamental commonality: a shared definition of news. To put it more simply, left-wing and right-wing media are talking differently about the same things. Crazy, sensational, unusual, bad things that happened today. Current affairs plus absurdity times outrage.


That’s one of the reasons that Masha Gessen, Russian-American journalist and one of the world’s leading experts on how authoritarian regimes work, argues in Surviving Autocracy that the media should cover “Trumpism not as news, but as a system.”


Many news media outlets are still operating on default settings, covering a democracy rather than reporting on an emergent authoritarian regime. Even now, they’re still attending the daily White House press briefings as if they were normal press conferences rather than a vehicle for systematically disseminating lies and misinformation. Even now, they’re still quoting patent falsehoods in headlines and articles as if they were solid statements by legitimate government sources rather than deliberately misleading propaganda from the mouths of so-called spokespersons. Even now, they conflate impartiality and “false balance”, as if every truth lies somewhere in the middle. 


Even now, they are still broadcasting Trump’s campaign rallies live, although they know full well those rallies will contain incitements to violence, showcase conspiracy theories and pose a genuine hazard to public health. (Remember when Trump suggested injecting bleach as a possible response to the coronavirus?) 


Even now, they are still referring to the Republicans as a political party, their gatherings as party conventions, and their lies as campaign promises. Even now, they are still hoping their leader will display “presidential” conduct, as if things might return to “normal” at some point.


Even now, they still talk about “the White House” as if it has the same meaning it once had.


But as journalism professor Jay Rosen puts it, “There is no White House. Not in the way journalists came to use that term. [...] Those words, ‘the White House’ are still used, but there is no clear referent. The metonymy broke.”


An election about democracy itself

The reality is that we can no longer report on US politics, and on these elections in particular, through the lens of news-as-usual. An emergent autocracy demands fundamentally different journalistic standards and practices.


It demands a journalism that leaves no room for “bothsidism” – a feigned semblance of “neutrality” by portraying politics as a conflict between two parties that are similar in nature. A journalism in which lies are not first spread as “quotes”, only to be debunked later by a fact-checker. A journalism in which deliberate propaganda and misleading claims are no longer referred to as a “press conference”, “briefing” or “convention”. 


In short, we need a journalism in which news media are united not in their shared obsession with breaking news, but in their joint defence of democracy.


For on the ballot in the upcoming elections isn’t merely a choice between left or right, progressive or conservative, Trump or Biden. On the ballot this time are the elections themselves.


Translated from Dutch by Joy Phillips.

Monday, September 21, 2020

3428. Book Review: Amber Waves: The Extraordinary Biography of Wheat

By Bee Wilson, London Review of Books, September 24, 2020



Not many people have heard of Norman Borlaug, but his invention – the high-yield, short-straw wheat that fuelled the Green Revolution – is consumed every day by the majority of humans on the planet. Without Borlaug’s wheat, there would be no modern food as we know it. Everything from sandwiches to pizza to soy sauce to animal feed is manufactured from wheats adapted from Borlaug’s. ‘Wheat is in everything!’ a friend of mine exclaimed with fury after being diagnosed with coeliac disease. To those of us who live far from the land, wheat seems a changeless and universal part of the countryside, the stuff of harvest festivals and corn dollies. We don’t imagine it was or could be any different. All we ask is that it should be there to feed us. 


After lockdown started, neighbours on my street in Cambridge formed a WhatsApp group. It soon became apparent that one of the group’s main functions would be to pool information about flour. Participants shared sightings of plain white flour in local shops or online suppliers with the secretive thrill of foragers who’ve just discovered a patch of wild garlic or chanterelles. When one neighbour managed to get hold of some, it would be portioned up and distributed or bartered for other rare treasures – yeast or a jar of sourdough starter. There was excitement when someone discovered an online source that promised to deliver bags of organic plain flour in only two working days. But sometimes, as with foraging tips, you would find the source stripped by the time you got there; other people in other streets were flour-fixated too. 


The pandemic flour shortages – which weren’t unique to Britain – were driven not just by regular consumers stocking up but by people who never normally buy flour. In April, a representative for British and Irish millers said that even with millers working ‘round the clock’ there was only enough capacity for 15 per cent of UK households to buy a bag of flour a week. Plain flour has never in recent decades been something for which demand exceeds supply, not least because our shops are full of items ready-made from industrial wheat, from croissants to muffins, bagels to noodles. One of the curious things about the pandemic flour shortages is that items made from wheat were never in short supply. Even at the height of panic buying there were plenty of flour-based products in British shops, but somehow none of them stopped people wanting to buy flour itself.


The Harvesters (detail), by Pieter Bruegel the Elder, 1565. The Metropolitan Museum of Art, Rogers Fund, 1919


If you want to kill an hour or so making a loaf of banana bread – or a few days making sourdough – you need to start with a bag of flour. Plenty of other forms of time-consuming cookery could have been used to pass the hours and days of the pandemic. We could have chosen to pickle vegetables or to roll tiny meatballs by hand or to spend hours skimming and clarifying consommé. But few other forms of cookery have anything like the mass appeal of wheat-based baking (unless it’s wheat-based boiling in the form of pasta). 


Plain white flour has many drawbacks as a food, one of which is lack of flavour. Most mass-produced raw white flour tastes of almost nothing, although if you try very hard, you may notice a faint aroma of wallpaper paste. It’s also lacking in nutrients, even if, unlike coeliacs, you are able to tolerate gluten. As the journalist Wendell Steavenson writes, white flour is ‘a pure starch so nutritionally void’ that by law vitamins must be added back into it. White flour must be fortified with calcium, iron, thiamin and niacin to make up for the fact that the nutritious part of the wheat has been taken away during the milling process. And yet what wheat flour lacks in flavour and nutrients, it makes up for in the gratification it gives in the mouth and the stomach after you combine it with other ingredients and apply heat. Flour can be engineered into a series of deeply likeable textures, from the softness of sponge cake to the crispness of a cracker to the custardy satisfaction of a Yorkshire pudding. Perhaps the fear and uncertainty of the current situation made people want to get back to our staple food in its purest and most basic form. But plain flour is neither pure nor basic: it is the endpoint of a series of technological processes and inputs, incorporating plant breeding and chemical fertilisers as well as advances in milling and globalised distribution networks. In 2019, wheat was grown on more land than any other food crop: 538 million acres across the globe. On average, it contributes the largest amount of calories to the human diet of any foodstuff, according to data from the CIAT (the International Centre for Tropical Agriculture), a research group for the Food and Agriculture Organization.


In 2009, the average human had access to 498 calories a day from wheat compared with 349 calories from oils, 333 calories from rice and 281 calories from sugar and other sweeteners. In some countries, such as Turkey and France, per capita wheat consumption is a great deal higher and in others, such as Cameroon (where maize is the staple food) or the Philippines (rice), much lower. But it’s striking that wheat consumption has been increasing fast since the 1960s, even in traditional rice economies such as China and Japan. The supply of wheat in China rose from fewer than 200 calories per person a day in 1961 to nearly 600 in 2009. Across Asia, the gradual substitution of wheat for rice has been a near universal marker of economic development. 


The human relationship with wheat is the subject of Catherine Zabinski’s short book Amber Waves, which presents itself as a ‘biography’ of the grain, although she reminds us on page three that ‘wheat isn’t a person’ in case we were liable to be confused. Zabinski, a plant and soil ecologist at Montana State University, seeks to tell ‘a story of a group of grasses whose existence became complicated by its convergence with our own species and our never-ending need for more food’. The vast consumption of wheat today is linked to the fact that it is the main ingredient in so many convenience foods. If you want to satisfy hunger quickly and cheaply, the odds are that you will turn to a wheat-based food (unless you opt for potatoes, in the form of crisps or chips). You might buy a healthy wrap or an unhealthy burger or a pie or a sandwich or a slice of pizza or a tub of instant ramen or a samosa or a slice of toast or a bowl of bran flakes. Whichever choice you make, you will end up eating the same industrial wheat. No other grain comes in such a vast range of ready-to-eat foods. Yet it must have taken great perseverance and ingenuity for our Neolithic ancestors to add wheat to their diets. The calories it contains are remarkably difficult to access compared with other items in the hunter-gatherer diet such as wild fruits and nuts and honey and meat. Wheat was originally a wild grass, as Zabinski explains, and ‘grass seeds are small and hard and impenetrable’. 


In evolutionary terms, wild wheat seeds do not want to be eaten, because as soon as they are broken open, they cease to be a seed. In this, grains differ from wild fruits, which positively invite animals to eat them. Fruit is luscious and sweet in order to appeal to creatures that will eat the flesh and excrete the seeds, thus dispersing them. Wild wheat seeds, by contrast, have extremely hard hulls to deter predators. Every seed, as Thor Hanson put it in The Triumph of Seeds (2015), consists of three elements: a baby, lunch and a box. The ‘baby’ is the embryo of the new plant. The ‘lunch’ is the nutritive tissue that provides energy reserves until the seed can start to absorb nutrients from the soil. In the case of wheat seeds, this is a combination of protein and carbohydrate, while in oil seeds such as sunflower seeds the lunch is mostly fat. Finally, every seed is contained in a ‘box’: a defence mechanism to protect the germ from hungry animals. In theory, a chilli seed stops anyone from eating it by burning them. An almond kernel defends itself by being bitter, and having a slightly poisonous taste (which backfired when humans acquired a love of that curious marzipan flavour). A wheat seed protects itself with a series of viciously hard layers: first a hull, and then a layer of bran, made up of a fruit coat and a seed coat fused together. Only when both of these layers have been penetrated do you reach the wheat germ (the baby embryo) and the wheat starch (the lunch). These defences might have been enough to put off most herbivores, but humans – omnivores in possession of tools – were not so easily deterred. 


Stone, fire and water were the three methods used to get inside a wheat seed. When the seeds proved too hard to crack, hunter-gatherers would burn or soak them to soften the hull. Some early wheat eaters settled in the Fertile Crescent of the Levant, in Abu Hureyra, a site in modern-day Syria first excavated in 1971. These people – who were not farmers – lived in small circular huts with hearths for cooking outside. Archaeologists have found evidence, from around 13,000 bc, that they hunted a range of animals for food, including gazelles, asses, boars, hares, foxes and birds of various kinds. They also left traces of more than 120 plant foods including ‘wild grapes, figs, pears, hackberries, mahaleb cherries, sour wild plums, yellow hawthorn, wild capers, juniper berries’. Near the hearths, archaeologists also found traces of charred wheat seeds.


When you pick a blackberry, you can enjoy it just as someone in Abu Hureyra did a wild plum or cherry 15,000 years ago. But with wheat, multiple problems need to be solved before it can be eaten. Zabinski invites us to imagine being a hungry forager faced with a patch of wild wheat. At least these grass seeds don’t try to escape, unlike a gazelle or a wild bird. They can be stored for many months, unlike a plum or a pear which needs to be eaten quickly, unless you can find a way to preserve it. The challenge with a wheat seed is how to get at the goodness inside it. Grinding technologies were needed. In Abu Hureyra, the grindstone used was the saddle quern, which Zabinski describes as a ‘two-part grinding tool’, though in truth it consisted of three parts: two stones and one woman. The first part was the lower stone on which the grain was placed: a flat saddle-shaped piece of rock. The second, much smaller piece was the rubbing stone – like the pestle in a pestle and mortar. The final and most important part was the woman, who kneeled behind the quern and used her weight to crush the seeds with the rubbing stone. Eventually they broke down into flour. Over time, the woman’s body began to wear down too. Female bones at Abu Hureyra show strain to the toes, hips, knees and shoulders from hours spent at the grindstone. 


What did the ancients do with their hard-won flour? Unlike a haunch of meat, a handful of flour can’t easily be cooked in the fire. People seem to have sometimes eaten roasted or raw whole grains, but these were tough on the digestion and the teeth. The possibilities of wheat cookery expanded hugely with the invention of pottery, in which soft porridge-type dishes could be cooked. In Abu Hureyra, pottery arrived around 8000 years ago. The new porridgey diet meant that more people survived into adulthood and those who survived had better teeth, as Zabinski notes. Pottery was one of the vital conditions for the human dependence on grain. 


Some say that humans domesticated wheat; others that wheat domesticated humans. In Sapiens, Yuval Noah Harari argued that this wild grass succeeded in completely changing the human way of life, in ways that weren’t always beneficial for humans. With the adoption of wheat, the communal living and varied diet of hunter-gatherer societies was exchanged for the back-breaking labour and relatively monotonous food produced by farming. What wheat offered in return, Harari wrote, was population growth: ‘the ability to keep more people alive’. If wheat shortages have often been the precursor of revolution, surpluses are a prerequisite of political power and security. 


Whether wheat was the cause or the effect of farming, it’s certainly true that the human relationship with wheat – as well as with other grains like rice, millet, barley, rye and oats – changed dramatically with the beginnings of agriculture. Many of the earliest cities and civilisations were founded on wheat farming. Mesopotamia, Egypt, Greece and Rome were all wheat cultures. The state’s security depended in part on its control of the granaries. Those civilisations, such as ancient Greece, whose soil wasn’t suited to wheat needed to make sure they could import it from elsewhere, trading it for wine and oil. In hunter-gatherer societies, food production is shared across the community and food is valued for its own sake. In farming societies, by contrast, some work the land while others pursue different occupations. The people who actually ploughed the fields and planted the seeds no longer held high status because the food they produced was taken for granted. The adoption of wheat farming was the first stage in a long process of human disconnection from responsibility for food production. 


The very earliest types of wheat were two wild varieties: einkorn and emmer. These have now reappeared in health food shops, where they are sold as ‘ancient grains’. Einkorn and emmer are much higher in protein than modern wheat: modern bread flour is 12-14 per cent protein and modern cake flour is only 7-11 per cent protein, but einkorn and emmer have a protein concentration between 16 and 28 per cent (an egg is 13 per cent protein by mass). Einkorn, which is native to the Levant, was slightly easier to grow in cooler climates and on less fertile soils but could only be ground very coarsely and yields just one grain of wheat per flower (hence ‘einkorn’, or ‘one grain’). In the hot climate of Egypt, emmer was preferred. It is genetically similar to the durum wheat used today to make pasta and couscous (and which makes up 5-8 per cent of modern wheat production) although it has harder hulls. The Egyptians took emmer flour, mixed it with salt and water and cooked the mixture on hot stone slabs. Bread!


At some point, in a field of emmer in the Levant, a new kind of wheat started to grow: the ancestor of modern bread wheat. Given its subsequent history, the most surprising thing about bread wheat is that the original hybrid wasn’t engineered by humans but arose spontaneously. An emmer plant in the wild crossed itself with a goatgrass to produce wheat a bit like modern-day spelt. ‘Fourteen chromosomes from goatgrass plus 28 chromosomes from emmer equals a new hybrid with 42 chromosomes,’ Zabinski writes. These seeds were quickly adopted by early farmers. Apart from the fact that it tasted good, this grain had the huge advantage of having softer hulls than emmer, although bread wheat was still a demanding crop. A series of clay tablets survives from Mesopotamia, describing – in cuneiform – the elaborate and meticulous stages of wheat farming: the oxen which trampled the soil after the spring flooding; the workers who broke up the clumps of soil; the farmers who planted the seeds in rows at the right depth and at exactly the right spacing; the careful use of floodwater for irrigation. 


Many aspects of pre-industrial wheat technology have left traces in our language. We may still speak of having our ‘nose to the grindstone’ or of ‘ploughing our own furrow’. Few of us would know chaff if we saw it, but the sheer labour that once went into wheat harvesting and threshing has left traces in our collective memory. When Spaniards colonised Peru they brought with them not just wheat, rye, barley and oats but hoes, spades, sickles, mills, carts and ards – large hooks attached to a wooden beam to which animals were yoked to plough the fields. As wheat travelled the world, the seeds needed to be adapted to many different soils and climates. When wheat arrived in North America with the Puritans in the 17th century, it didn’t seem suited to the cold winters. Over time, the colonists developed different seeds for different parts of the vast continent. ‘There was spring wheat and winter wheat, red wheat and white wheat, hard wheat and soft wheat.’ Soft wheat is easier to grind and makes light cakes and pastries but hard wheat – higher in a sticky protein called gluten – is easier to make bread out of, particularly if you want it to rise (gluten traps air bubbles during the fermentation process). 


Experiments in wheat breeding only started in earnest in the mid 19th century. Before then, all wheats were landraces: highly localised variants adapted to particular terrains and environments. One of the key features of an 18th-century field of landrace wheat was its diversity. A single field would have contained many different varieties of seed. What landraces sometimes lacked in yield they made up for in resilience. ‘Landraces,’ Zabinski explains, ‘are valuable because ... the smaller, scraggly, less productive individuals may also hold the genes for greater tolerance to abnormal rainfall or late frost or fungal pathogens.’ For the past 170 or so years, however, mainstream wheat breeding has aimed at getting rid of this biodiversity by selecting seed that has certain consistent traits, such as yield or disease resistance. ‘Much of the effort behind breeding wheat varieties in the late 19th and early 20th centuries was aimed at finding varieties that could thrive in the diverse climates of North America, and varieties that were resistant to the rusts, smuts and insects that fed on wheat.’ 


With modern plant breeding came the idea that wheat could become one single ideal substance rather than a series of interrelated and localised species. Much of the focus of early grain breeding was on keeping a constant supply of hardy wheat even in years of prolonged frost or drought. As Mark Carleton, the top ‘cerealist’ in the US in the late 1800s, remarked: ‘It isn’t what a wheat yields in the best years – it’s how it stands the worst ones.’ The first hugely successful wheat breeder of the 20th century was the Canadian Charles Saunders, who is mentioned nowhere in Zabinski’s book, although she does mention his greatest invention, Marquis wheat, launched in 1904. Saunders designed Marquis wheat to be high yielding, robust, early maturing and very high in gluten. There were many failed attempts before he hit on the perfect formula, which resulted from crossing Red Fife wheat (already popular in North America) with Hard Red Calcutta from northern India. This new wheat ripened earlier than Red Fife but was similarly good for baking. Marquis, Noel Kingsbury writes in Hybrid: The History and Science of Plant Breeding (2009), ‘set the standard for bread wheat quality globally’. In 1920, it accounted for 90 per cent of all wheat grown in Canada. 


Saunders’s achievements were eclipsed by those of Norman Borlaug, who was hired after the Second World War by the Rockefeller Foundation to lead a programme designed to increase wheat production in Mexico. He pursued this task with meticulous and single-minded devotion. The first thing he had to do was to make the wheat more resistant to rust, a fungal disease. For three years straight, the Mexican wheat harvest had been reduced by half as a result of rust. Borlaug experimented with more than two hundred crosses before, in 1948, he was satisfied that he had found four early-maturing varieties that could withstand rust attacks. Zabinski describes the delicate work involved in cross-pollinating wheat:  


If you want to cross two plants, you must prevent self-fertilisation by opening the tiny scales of the floret, and with a pair of fine tweezers plucking the three anthers without losing any of the pollen in the process. Then in a day or two, when the stigma is mature (it will resemble a plume), you add pollen to the plant by carefully shaking the anthers from the plant you want to be the other parent over the recipient plant’s stigma, bagging the flowering head to prevent any other pollen from entering, and hoping that the cross worked. 


Then Borlaug applied himself to his real interest, which was finding a wheat capable of feeding the world. It has often been said that a billion lives were saved (or made possible) by Borlaug’s work, for which he won a Nobel Prize in 1970. He wanted to find a wheat that Mexican farmers could grow intensively using nitrogen fertilisers and irrigation. The problem with most wheat varieties, he believed, was that they wasted too much energy in growing tall. He heard about some semi-dwarf varieties of wheat that had been brought from Japan to the US: mutants with much shorter straw than normal wheats. 


Borlaug transformed a tall and long-maturing crop into one that was short, stubby, highly productive and quick maturing (but very hungry for water and fertiliser). To get a sense of how radically wheat fields changed because of him, Kingsbury suggests looking at Brueghel’s The Harvesters. This scene is completely unlike a modern wheatfield. In Brueghel’s painting, the wheat is as tall as a child: a maze of yellow you could lose yourself in. The grasses reach almost to the top of a man’s head as he trudges past carrying an earthenware jug. In the distance, women can be glimpsed walking through a corridor of wheat that reaches up to their shoulders. The wheatfields in East Anglia I sometimes walk through with my dog are puny by comparison. By 1962, Borlaug had developed two new semi-dwarf wheats: Penjamo 620 and Pitic 62. In combination with industrial farming techniques, they were so successful that Mexico was able to become a net exporter of wheat. It was in India and Pakistan, however, that his dwarf wheat had the biggest impact, as part of the Green Revolution, in which grain yields saw unprecedented increases. Wheat production in Pakistan rose by 60 per cent between 1967 and 1969, and by 1974 India was self-sufficient in cereals. Wheat harvests increased so rapidly in India in 1968 that schools had to be closed to free up extra space for warehousing the grain.

[Image: Brueghel’s ‘The Harvesters’ (1565)]


Some say that Borlaug’s dwarf wheats were the greatest invention in human history – how many others can claim that their work saved a billion lives? – and yet his work is implicated in many of the problems with the global food supply, from its tendency to perpetuate social inequalities to its lack of biodiversity. Borlaug’s short-straw wheats, and the variants that came afterwards, can only achieve their high yields in conjunction with industrial pesticides, fertilisers and irrigation. Some studies have suggested that the Green Revolution widened the gap between rich and poor farmers in countries such as India because not every farmer could afford the necessary inputs – such as the cost of irrigation and machinery – needed to grow Borlaug’s wheat. Modern wheat farming is also damaging to the soil. In part, as Zabinski explains, this is because it is an annual crop which completes its entire lifecycle in a single year, taking nutrients from the soil while giving very little back. Traditional wheat farmers used crop rotation to address this problem, alternating a year of wheat with a year of peas or beans to fix nitrogen in the soil. High-yield wheat, by contrast, can lead to soil exhaustion. 


In economic terms, everything is a trade-off. But even on its own terms – as a cure for human hunger – Borlaug’s wheat has not succeeded. We can’t blame him for the fact that there are still nearly 800 million acutely malnourished people in the world. A more pertinent question is why so many people alive today have access to more than enough calories from wheat and yet are still malnourished, lacking in basic micronutrients such as iron and B vitamins. Borlaug’s wheat was designed to produce the maximum amount of energy per field and he focused on this to the exclusion of other questions such as whether it would deliver the nutrients humans need. He did not foresee a future – one his wheat helped bring about – in which millions of poorer consumers worldwide would be obese and yet also suffering from ‘hidden hunger’ because their diets are low in protein and essential micronutrients. 


Industrial wheat is a very efficient system if you ignore the consequences, the baker Andrew Whitley argued in Cereal, an excellent six-part audio series about wheat, broadcast last year through the Farmerama podcast. He covered many questions more or less ignored by Zabinski’s book, such as the taste of wheat and how it is milled. The theme of the programme – which featured interviews with bakers, millers, farmers and food activists – was that a sequence of logical and reasonable-seeming steps has resulted in a dysfunctional food system. The puffy sliced bread sold in every supermarket seems cheap only when you ignore the external costs, which include not just the ecological problems associated with intensive farming but the fact that many eaters can’t digest bread made by the ‘Chorleywood process’ (which uses large quantities of yeast and additives to replace the slow fermentation of traditional bread). Whitley calls sliced white bread ‘Peter Pan bread’ because it doesn’t age in the usual way. Both modern bread and modern flour are designed to be ‘shelf-stable’, something modern milling techniques have made possible. 


Other than Borlaug’s experiments in plant breeding, the key development in modern wheat was the invention of roller milling in the late 19th century, which isn’t mentioned by Zabinski. As the chef Dan Barber explains in The Third Plate (2014), it was roller milling which made possible the emergence of white flour as a flavourless commodity that could be stored for long periods of time and transported long distances. As hunter-gatherers discovered, three parts of wheat are edible: the husky bran on the outside, the germ and the endosperm. When grain was stone milled, all three parts were ground together and their oils and nutrients intermingled. White flour was originally made by taking wholemeal flour and sieving out the bran, but it retained some of the goodness of the wholegrain. Because of the oil, the flour could go rancid quickly and needed to be consumed while it was fresh. Steel roller milling – pioneered in the 1860s – was a completely different process. Instead of grinding all three layers of wheat together, the outer layers are gradually stripped off, leaving only the white endosperm behind. Roller milling gave bakers a much finer flour to work with, ideal for making featherlight pastry and soft white bread, but far less nutritious – and less flavourful – than the flours of the 18th century. 


I wrote to Premier Foods, which owns McDougalls, the UK leader in retail flour, to ask what varieties of wheat go into its plain flour, where it is grown and what criteria are used for selecting the wheat. Someone promised to get back to me, but after three weeks and three reminders, they still hadn’t answered my questions. Finally, someone wrote to say that ‘the wheat used in our McDougalls plain flour is UK grown, variety will depend on what the farmer feels he can grow to meet our specification requirements.’ They didn’t say what those specification requirements are. British flour tends to be low in gluten, so much of the flour used in our baked goods is imported, mostly from Canada, Denmark, Germany, Latvia and the US. The leading miller in the UK, Whitworths, has urged farmers to grow more high-protein bread wheats to cushion the blow of Brexit, but most of the wheat grown in the UK is low-quality grain exported for animal feed. In 2018, according to Cereal, 6.5 million tonnes of British wheat was used to feed livestock. The harder high-gluten wheats that are considered the best for bread don’t grow particularly well in the wet British climate. It is possible to make good bread with the softer varieties of wheat that grow well here, but you have to adjust your expectations of what good bread actually is. 


Can the current wheat system be reformed to make it better at delivering bread that is both nutritious and good to eat? At the moment, every link in the chain is premised on cheapness and uniformity. Changing one link would mean changing everything. Someone who experimented with creating a whole new chain was Martin Wolfe, a plant pathologist who died last year. At his farm near Fressingfield in Suffolk, Wolfe developed cereal populations based on the principle of diversity rather than uniformity. He became convinced that the solution to disease resistance in cereals was not using increasing amounts of pesticides but growing diverse fields of grain. Wolfe’s most celebrated experiment with wheat was called the YQ (Yield Quality) project. The idea was to try to preserve the high yield of Borlaug’s short-straw wheat but to cross it with varieties that had better eating and baking qualities. Wolfe took twenty wheats and crossed them. Half were chosen for yield, the other half for quality. The resulting YQ wheat had the diversity and resilience of the old landraces but a much higher yield per acre and a rich nutty taste. 


The first baker to use it to make a commercial loaf of bread was Kimberley Bell, who makes and sells a variety of YQ loaves at the Small Food Bakery in Nottingham. I met Bell at a conference in Barcelona last year and was struck to hear her describe the sheer variety of flavours she can detect in different wheats: malty or nutty and even tasting uncannily like meat. I wished I could go out and buy some of the wheats she described, but this isn’t easy to do. This summer, after lockdown ended, I finally made it to the Small Food Bakery, where I bought several loaves of YQ bread as well as one baked from a Nordic landrace variety called Øland. Bell also bakes vast freeform loaves which she then cuts into smaller pieces and sells by weight. This makes for an extra-damp crumb so you can taste the cereal even more clearly, distinct from the caramel flavours of the crust. When I got home, I cut off a fat slice and inhaled, trying to work out what the bread reminded me of. It smelled rich and heady, like a piece of wildflower honeycomb. It was delicious. But, as Cereal makes clear, this bread cannot easily be delivered by our existing food economy because it is the end product of a completely different series of operations, from farm to mill to oven. And so we are left with our dusty bags of nondescript white powder.

Tuesday, September 15, 2020

3427. Comparing the U.S. and China’s Response to COVID-19

By Vijay Prashad and John Ross, Monthly Review Online, September 14, 2020

In Washington Post reporter Bob Woodward’s new book, Rage, he reports on interviews he did in February and March with U.S. President Donald Trump about the coronavirus. Trump admitted that the virus was virulent, but he decided to underplay its danger. “I wanted to always play it down,” Trump said, “because I don’t want to create a panic.” Despite months of warnings from the Chinese authorities, Trump and his health secretary Alex Azar completely failed to prepare for the global pandemic.
The United States continues to have the largest total number of cases of COVID-19. The government continues to flounder as the number of cases escalates. Not one state in the country seems immune to the spread of the disease.
Meanwhile, in China, ever since the virus was crushed in Wuhan, the government merely has had to contain small-scale localized outbreaks; in the last month, China has had zero domestically transmitted COVID-19 cases. Martin Wolf wrote in the Financial Times on March 31 that China was successful in “bringing the disease under control in Hubei and halting its spread across China.” There was never a pan-China outbreak. It is more accurate to call it a Hubei outbreak.

Measuring People’s Lives

While Trump lied to his own citizens about the disease, China’s president Xi Jinping said that his government would be “putting people first.” China hastily subordinated its economic priorities to the task of saving lives.
As a consequence of a science-based approach, China’s government broke the chain of infection very quickly. By early September, this country of 1.4 billion had 85,194 COVID-19 cases and 4,634 deaths (India, with a comparable population, had 4.8 million cases and 80,026 deaths; India is losing more lives each week than the total deaths in China).
The United States, meanwhile, has suffered 198,680 deaths and 6.7 million cases. In absolute numbers, U.S. deaths are about 43 times China’s and the case number is about 79 times higher.
The U.S. government, unlike the government in China, hesitated to properly craft a lockdown and test the population. That is why, in per capita terms, U.S. deaths are about 186 times higher than those in China and the cases are about 343 times higher.
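The absolute and per-capita ratios above can be checked with simple arithmetic. A minimal sketch of that calculation follows, using the article’s case and death figures; the population figures (roughly 331 million for the US and 1.44 billion for China) are assumptions not stated in the article, so the per-capita results come out close to, but not exactly at, the quoted 186 and 343:

```python
# Back-of-the-envelope check of the article's ratios.
# Case and death counts are the article's (early September 2020);
# the population figures are assumed, not taken from the article.
us_deaths, us_cases = 198_680, 6_700_000
cn_deaths, cn_cases = 4_634, 85_194
us_pop, cn_pop = 331e6, 1.439e9  # assumed mid-2020 populations

# Absolute ratios
print(round(us_deaths / cn_deaths))  # ~43
print(round(us_cases / cn_cases))    # ~79

# Per-capita ratios: (US rate) / (China rate)
deaths_ratio = (us_deaths / us_pop) / (cn_deaths / cn_pop)
cases_ratio = (us_cases / us_pop) / (cn_cases / cn_pop)
print(round(deaths_ratio))  # ~186
print(round(cases_ratio))   # ~342, near the article's 343
```

The exact per-capita figures depend on which population estimates are used, which likely explains the small gap between this sketch and the article’s numbers.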
Trump’s racist attempt to pin the blame on China is pure diversion. China contained the virus. The U.S. has totally failed to do so. The enormous number of U.S. deaths was ‘Made in Washington,’ not ‘Made in China.’

Measuring the Economy

In the first quarter of 2020, the Chinese gross domestic product (GDP) fell by 6.8 percent compared to a year earlier. Due to the fast elimination of domestic transmission of the virus, economic recovery in China has been rapid. By the second quarter, China’s GDP was up 3.2 percent compared to the same period in 2019. The International Monetary Fund projects that China will be the only major economy to experience positive growth.
How did China’s economy rebound so fast? The answer is clear: the socialist character of the economy. By July, China’s state investment was 3.8 percent above its level of a year ago, while private investment was still 5.7 percent below its 2019 level. China has used its powerful state sector to boost itself out of recession. This illustrates the macro-efficiency of the state sector.
In mid-August, the Communist Party of China’s theoretical journal Qiushi (Seeking Truth) published a speech by Xi Jinping, in which he said, “The foundation of China’s political economy can only be a Marxist political economy, and not be based on other economic theories.” The main principle of this is “people-centered development thinking.” This was the foundation of the government’s response to the pandemic and, in that context, to the economy.
Trump, meanwhile, made it very clear that his administration would not conduct anything near a national lockdown; it seems his priority was to protect the economy over American lives. As early as March, when there was no sign that the pandemic could be controlled in the United States, Trump announced, “America will again and soon be open for business—very soon.”

Disaster in the United States

Inefficient policies in the United States resulted in runaway COVID-19 infection rates. The basic protocols—masks, hand sanitizer—were not taken seriously. And the impact on the U.S. economy has been catastrophic.
The U.S. made it clear that it was not going to pursue anything near a people-centered approach. Trump’s entire emphasis was to keep the economy open, largely because he remains of the view that his election victory will come via the pocketbook; the human cost of this policy is ignored. The U.S. only had half a lockdown, and little testing and contact tracing.
The GDP of the United States in the second quarter fell by 9.5 percent as compared to a year earlier. There is no indication of strong improvement. The IMF estimates that U.S. economic contraction will be about 6.6 percent for the year. The “risk ahead,” writes the IMF, “is that a large share of the U.S. population will have to contend with an important deterioration of living standards and significant economic hardship for several years to come.” The disruption will have long-term implications. These problems are laid out clearly by the IMF: “preventing the accumulation of human capital, eroding labor force participation, or contributing to social unrest.” This is the exact opposite of the scenario unfolding in China.
It is as if we live on two planets. On one planet, there is outrage about the hypocrisy in what Trump said to Woodward, and outrage about the collapse of both the health system and the economy—with a harsh road forward to rebuild either. On the other planet, the chain of infection has been broken, although the Chinese government remains vigilant and is willing to sacrifice short-term economic growth to save the lives of its citizenry.
Trump’s attack on China, his threats to decouple the United States from China, his racist noises about the “Chinese virus”—all this is bluster designed as part of an information war to delegitimize China. Xi Jinping, meanwhile, has focused on “dual circulation,” which means domestic measures to raise living standards and eliminate poverty, and on the Belt and Road Initiative; both of these will lessen Chinese dependence on the United States.
Two planets might begin to drift apart, one moving in the direction of the future, the other out of control.