Check back in five years – but as it stands, this is one of the most important books I’ve come across. What started as a study on “work” quickly turned into one on human nature.
It took me two decades (emphasis: TWENTY – years!) to pause and think critically about work. Culture is one hell of a drug. Like virtually everyone alive, I was so busy “surviving” – or “accomplishing”… until slowly seeing the vapor of it all. Satisfying in brief moments, then gone… leaving one to ponder the deeper meaning of things. Of course, the obvious reflex is doubling down, setting sights on new, loftier summit(s), and plunging back in. Yet this too, eventually, boomerangs. I had no choice but to confront my concerns in writing, which coincidentally catalyzed this blog two years ago. As I write this, I am months away from leaving work indefinitely.
How Did We Get Here?
What started this 5-day, 40ish-hour-week religion? What might life feel like if we weren’t so dominated by it? Is this even possible to taste? Are work and life synonymous? How can we ever know? What’s the point of prosperity, if we only taste it on nights and weekends? Even then, are we really tasting it, or just filling our time with distraction? These questions fascinate me.
To be clear: I don’t believe work should be avoided or abolished, nor our market economy burned down in favor of something revolutionary. Merely that we each owe it to ourselves to DEEPLY understand our relationship with work before we voluntarily commit to a lifetime of it, under the dim guise of “it’s what society does”.
Which brings me to this book. Suzman takes us BEHIND the question – into the origin of humanity – telling the story of what led us here, rooted in rigorous anthropological insight, both his own and that of hundreds of references.
In July 2021, the book’s excerpts alone gave me reason to share. Now, aptly on Labor Day 2022, my full read concludes. This book may as well be titled “An Important History Lesson on Humanity” – which, incidentally, covers work. My narrow expectations were far exceeded – I have a new angle from which to consider human effort, and new horizons of curiosity to explore next… which will materially influence my life. Is this not the best possible outcome of any book?
I highly suggest the full tour… below are the excerpts and themes that meant the most to me.
Can Humanity Escape ‘Work’?
None made the case as comprehensively as the twentieth century’s most influential economist, John Maynard Keynes. He predicted in 1930 that, by the early twenty-first century, capital growth, improving productivity, and technological advances should have brought us to the foothills of an economic “promised land” in which everybody’s basic needs were easily satisfied and where, as a result, nobody worked more than fifteen hours in a week.
Most of us still work just as hard as our grandparents and great-grandparents did, and our governments remain as fixated on economic growth and employment creation as at any point in our recent history. More than this, with private and state pension funds groaning under the weight of their obligations to increasingly aged populations, many of us are expected to work almost a decade longer than we did half a century ago; and despite unprecedented advances in technology and productivity in some of the world’s most advanced economies like Japan and South Korea, hundreds of avoidable deaths every year are now officially attributed to people logging eye-watering levels of overtime. Humankind, it seems, is not yet ready to claim its collective pension. Understanding why requires recognizing that our relationship with work is far more interesting and involved than most traditional economists would have us believe.
The (Misguided) ‘Economic Problem’
What classical economists refer to as the “economic problem” – sometimes also as the “problem of scarcity” – holds that we are rational creatures cursed with insatiable appetites, and that because there are simply not enough resources to satisfy everybody’s wants, everything is scarce. The idea that we have infinite wants but that all resources are limited sits at the beating heart of the definition of economics as the study of how people allocate scarce resources to meet their needs and desires.
To economists, then, scarcity is what drives us to work, for it is only by working—by making, producing, and trading scarce resources—that we can ever begin to bridge the gap between our apparently infinite desires and our limited means.
But the problem of scarcity offers a bleak assessment of our species. It insists that evolution has molded us into selfish creatures, cursed to be forever burdened by desires that we can never satisfy. And as much as this assumption about human nature may seem obvious and self-evident to many in the industrialized world, to many others, like the Ju/’hoansi “Bushmen” of southern Africa’s Kalahari, who still lived as hunter-gatherers through to the late twentieth century, it does not ring true. I have been documenting their often traumatic encounter with a relentlessly expanding global economy since the early 1990s. It is an often brutal story, set in a frontier between two profoundly different ways of life, each grounded in very different social and economic philosophies based on very different assumptions about the nature of scarcity. For the Ju/’hoansi, the market economy and the assumptions about human nature that underwrite it are as bewildering as they are frustrating. They are not alone in this. Other societies who continued to hunt and gather into the twentieth century, from the Hadzabe of East Africa to the Inuit in the Arctic, have similarly struggled to make sense of and adapt to norms of an economic system predicated on eternal scarcity.
They did not routinely store food, cared little for accumulating wealth or status, and worked almost exclusively to meet only their short-term material needs. Their economic life was organized around the presumption of abundance rather than a preoccupation with scarcity. And this being so, there is good reason to believe that because our ancestors hunted and gathered for well over 95 percent of Homo sapiens’ 300,000-year-old history, the assumptions about human nature in the problem of scarcity and our attitudes to work have their roots in farming.
When economists define work as the time and effort we spend meeting our needs and wants, they dodge an obvious problem: beyond the energy we expend to secure our most basic needs—food, water, air, warmth, companionship, and safety—there is very little that is universal about what constitutes a necessity. More than this, necessity often merges so imperceptibly with desire that it can be impossible to separate them.
Abandoning the idea that the economic problem is the eternal condition of the human race does more than extend the definition of work beyond how we make a living. It provides us with a new lens through which to view our deep historical relationship with work from the very beginnings of life through to our busy present. It also raises a series of new questions. Why do we now afford work so much more importance than our hunting and gathering ancestors did? Why, in an era of unprecedented abundance, do we remain so preoccupied with scarcity?
What Motivates Work
At its most fundamental, work is an energy transaction, and the capacity to do certain kinds of work is what distinguishes living organisms from dead, inanimate matter. For only living things actively seek out and capture energy specifically to live, to grow, and to reproduce. And we are not the only species who are routinely profligate with energy, or who become listless, depressed, and demoralized when deprived of purpose and there is no work to do.
Humanity’s 4 Leaps – Which Influenced Effort to Live, or Work
1) Fire

Possibly as long as a million years ago, humans mastered fire. In learning how to outsource some of their energy needs to flames, they acquired the gift of more time free from the food-quest, the means to stay warm in the cold, and the ability to vastly extend their diets, so fueling the growth of ever more energy-hungry, harder-working brains.
2) Farming & Food Storage
Around 12,000 years ago our ancestors began to routinely store food and experiment with cultivation, transforming their relationships with their environments, with each other, with scarcity, and with work.
3) Cities

Around 8,000 years ago, some agricultural societies started to generate big enough food surpluses to sustain large urban populations. This too represents a major new chapter in the history of work—one defined not by the need to capture energy by working in the fields, but rather by the demands of spending it.
Because most urban people’s material needs were met by farmers who produced food in the countryside, they focused their restless energy in pursuit of status, wealth, pleasure, leisure, and power. Cities quickly became crucibles of inequality, a process that was accelerated by the fact that within cities people were not bound together by the same intimate kinship and social ties that were characteristic of small rural communities. As a result, people living in cities increasingly began to bind their social identity ever more tightly to the work they did and find community among others who pursued the same trade as them.
4) Factories & Fossil Fuel
18th century – factories and mills belching smoke from great chimneys appeared, as populations in Western Europe learned to unlock ancient stores of energy from fossil fuels and transform them into hitherto unimaginable material prosperity. Cities became far busier as a result of the turbocharging of our collective preoccupation with scarcity and work—paradoxically, as a result of there being more stuff than ever before. And while it is still too early to tell, it is hard to avoid the suspicion that future historians will not distinguish between the first, second, third, and fourth industrial revolutions, but will instead consider this extended moment as critical as any other in our species’ relationship with work.
Mythological Roots – ‘Work’ Protects Us From Chaos
The diverse mythologies of the world reveal some things that are universal to human experience – that our world—no matter how perfect it was at the moment of creation—is subject to chaotic forces and that humans must work to keep these in check.
A missionary speaking to 200 Ju/’hoansi Bushmen, some of the last living foragers, reminded them that because humans were created in God’s image, they too were expected to toil for six days and on the seventh to rest, and offer gratitude for the uncountable blessings that the Lord had bestowed upon them.
Where the Ju/’hoansi associated the creator God with order, predictability, rules, manners, and continuity, G//aua was associated with randomness, chaos, ambiguity, discord, and disorder. And the Ju/’hoansi detected G//aua’s devilish hand at work in all sorts of different things. They noticed it, for instance, when lions behaved uncharacteristically; when someone fell mysteriously ill; when a bowstring frayed or a spear snapped; or when they were persuaded by a mysterious inner voice to sleep with someone else’s spouse while being only too aware of the discord this would cause.
A Grecian story describes an angry Zeus punishing humankind by concealing from them the knowledge of how to sustain themselves for a year on the basis of only a day’s labor. Its teller, Hesiod, insists that the gods are angered by “the man who lives in idleness” and moreover that it was only through hard work that “men become rich in flocks and wealthy.”
Origin of the Word, ‘Work’
It was in 1828 that Coriolis first introduced the term “work” to describe the application of a force to move an object over a particular distance.
“Work” is now used to describe all transfers of energy, from those that occur on a celestial scale when galaxies and stars form to those that take place at a subatomic level.
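Coriolis’s coinage survives in physics today. As a side note (a standard textbook formula, not from the book): for a constant force acting along an object’s path, the work done is simply force times distance, measured in joules.

```latex
% Work done by a constant force F moving an object a distance d along the force's direction
W = F \, d
\qquad \text{where } 1\,\mathrm{J} = 1\,\mathrm{N \cdot m}
```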
Life Serves Entropy
Life actively works to survive, grow, and reproduce potentially in spite of what some physicists consider to be the “supreme law of the universe”: the second law of thermodynamics, also known as the law of entropy.
Living organisms were all thermodynamic engines: like steam engines they required fuel in the form of food, air, and water to work, and in working they also converted some of this fuel into heat that was subsequently lost to the universe.
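For context, the second law can be stated loosely as follows (a textbook formulation, added here, not the book’s own wording): the total entropy of an isolated system never decreases. An organism can keep its own internal order only by exporting at least as much entropy, largely as the waste heat described above, to its surroundings.

```latex
% Second law: the entropy of the universe never decreases
\Delta S_{\text{universe}}
  = \Delta S_{\text{organism}} + \Delta S_{\text{surroundings}}
  \ge 0
% So \Delta S_{\text{organism}} < 0 is permitted only when the heat an organism
% sheds raises \Delta S_{\text{surroundings}} by at least as much.
```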
Life has been busy harvesting free energy, storing it in ATP molecules, and then putting it to work on our planet for a very long time.
More on this idea here.
Earliest Life (And Work)
The first creatures with both tissue and proper nervous systems are thought to have evolved in the oceans around 700 million years ago. But it was not until around 540 million years ago during the Cambrian explosion that animal life really started to flourish.
Additional energy from increasing atmospheric and marine oxygen certainly played a role in kick-starting the Cambrian explosion. But what likely played a more important role was that evolution began to positively select in favor of some life forms that harvested their energy from a novel, much richer source of free energy than oxygen: they consumed other living things that had already gone to the trouble of collecting and concentrating energy and vital nutrients in their flesh, organs, shells, and bones.
Purposive vs Purposeful Behavior
Human work is purposeful, they insisted, whereas work done by animals is only ever purposive. It is an important distinction. A builder working purposefully to build a wall for a garage extension has a clear idea what the finished wall will look like, and he has mentally rehearsed all the steps necessary to build it as per the architect’s plans. But he is not mixing cement and laying bricks in the summer heat for this purpose alone. It is, after all, neither his wall nor his design. He is doing this work because he is motivated by a whole series of second- and third-order ambitions.
When a pack of lions stalk a wildebeest, their base motivation is to secure the energy necessary to survive. But in responding to their instinct, they act far more purposefully than, for example, intestinal bacteria seeking out a carbohydrate molecule. They use cover to stalk their prey, work as a team, deploy a strategy of sorts, and make decisions throughout the process of the hunt, based on which outcome they imagine would best satisfy their purposive urge to chew on the flesh and organs of another creature.
The Origin of Distance Running, Hunting, Language
A persistence hunt is simple in theory. It involves finding a suitable animal, ideally one weighed down with heavy horns, and then pursuing it relentlessly, offering it no opportunity to rest, rehydrate, or cool down, until eventually the dehydrated, overheating, and delirious animal freezes, a ghost of itself, and invites the hunter to walk up casually and take its life.
This method of hunting was not unique to southern Africa. Paiute and Navajo Native Americans used to run down pronghorn antelope in this way; Tarahumara hunters in Mexico ran down deer that, once exhausted, they suffocated with their bare hands; and some Australian Aboriginals occasionally made use of this technique when hunting kangaroo.
Homo erectus must have hunted in this way, and this form of hunting must also have played a part in making us bipedal—in molding our bodies for long-distance running, in developing the ability to cool our bodies with sweat, and in adapting our minds to the challenges of inferring meaning from animal tracks, the most ancient form of writing. Hunting was almost certainly among the selective pressures that encouraged the development of our ancestors’ ability to develop complex language.
Animals’ Fixed vs. Humans’ Plastic Skills
Where most animal species have evolved a series of highly specialized capabilities honed over generations of natural selection, enabling them to exploit specific environments, our ancestors shortcut this process by becoming progressively more plastic and more versatile. In other words, they became skilled at acquiring skills.
Homo sapiens’ ability to acquire and master skills as different as shooting arrows with lethal accuracy and performing microsurgery is written into our hands, arms, eyes, and body shapes. Not only are we the product of the different kinds of work our ancestors did and the skills they acquired, but we are also shaped progressively over the course of our lives by the different kinds of work we do.
Whales and other cetaceans, who have comparable lifespans to humans when not being harpooned for high-end steak and “scientific research,” are born competent swimmers; most hoofed mammals can walk, and all infant primates—save humans—are able to cling to their mother’s back or neck with fierce determination from the moment they leave the womb. Homo sapiens newborns, by contrast, are helpless.
One item recovered from Kathu Pan is the unimaginatively named “Kathu Pan hand-ax.” Found adjacent to the tooth-plates of an extinct species of elephant, it was probably made by a relative of Homo erectus sometime between 750,000 and 800,000 years ago.
Homo erectus and others diligently made hand-axes while operating on instinctive autopilot with only a vague sense of why, until some 300,000 years ago they suddenly crossed a critical genetic Rubicon that spontaneously ushered in a new era of innovation.
Born To Inhale Information
With our super-plastic neocortices and well-organized senses, Homo sapiens are the gluttons of the informavore world. We are uniquely skilled at acquiring, processing, and ordering information, and uniquely versatile when it comes to letting that information shape who we are. And when we are deprived of sensory information, like a prisoner in solitary confinement, we conjure sometimes fantastical information-rich worlds from the darkness to feed our inner informavore.
When we sleep we dream; when we are awake we constantly seek out stimulation and engagement; and when we are deprived of information we suffer.
Fire, Cooked Food = Big Brains, Free Time
Our brains only constitute 2 percent of our total body weight but they consume around 20 percent of our energy resources. For chimpanzees, whose brains are roughly one-third the size of our own, the energy used is closer to 12 percent, and for most other mammals it is between 5 and 10 percent. Building and maintaining such big brains on the basis of a foraged raw-food, vegetarian diet would have been impossible.
By “predigesting” foods through the process of cooking, fire made a significant proportion of digestive plumbing redundant. Cooking also helped redesign our faces. Eating softer, cooked foods meant that having big-muscled jaws ceased to be a selective advantage. So as our ancestors’ brains grew, their jaws shrank.
If by mastering fire and cooking, Homo erectus secured greater energy returns for less physical effort, then as their brains grew so did the amount of time available to them to apply their intelligence and energy to activities other than finding, consuming, and digesting food.
Boredom: the Catalyst of Leisure, Social Skills
As Nietzsche (who also credited boredom with breathing life into some of his most influential ideas) put it, “for thinkers and sensitive spirits, boredom is that disagreeable windless calm of the soul that precedes a happy voyage and cheerful winds.” Nietzsche was almost certainly right. The only obvious adaptive advantage of boredom is its ability to inspire the creativity, curiosity, and restlessness that motivates us to explore, seek novel experiences, and take risks. Psychologists also remind us that boredom is a more fertile mother of invention than necessity, and that it can stimulate very un-Nietzschean pro-social thoughts as well as a heightened sense of self-awareness, a perspective that is theologized in Zen Buddhism.
Boredom’s ability to induce fidgeting, ferreting, and creativity must also have played a role in persuading our ancestors to make art, an activity that is simultaneously work and leisure, that is emotionally, intellectually, and aesthetically functional, but of no practical value to foragers in terms of the food quest.
As our ancestors gained more free time, making or keeping peace by humoring, entertaining, persuading, and engaging others—rather than beating them into submission—will have become an ever more important skill. To do this would have required emotional engagement, empathy, and, above all, the ability to communicate.
The need to occupy ever more restless minds during free time was an evolutionary pressure that likely selected in favor of those who could liberate others from the burden of boredom: the socially able, the articulate, the imaginative, the musical, and the verbally astute—those who could use language to tell stories, entertain, charm, calm, amuse, inspire, and seduce.
Social Intelligence Overpowers Muscle
When our ancestors outsourced some of their energy requirements to fire, they took the first steps toward creating a world where the physically powerful sometimes play second fiddle to the articulate and charismatic.
Where is ‘Work’ Leading Us?
Lee’s desire to experience hunter-gatherer life was not shaped solely by academic curiosity. Like many others whose earliest childhood memories were forged during the Second World War, Lee struggled to buy wholeheartedly into the narrative of progress that had shaped his parents’ and grandparents’ attitudes to life, work, and well-being. He wondered whether a better understanding of how our hunter-gatherer ancestors lived might offer some insights into the fundamental nature of our species “stripped of the accretions and complications brought about by agriculture, urbanization, advanced technology, and national and class conflict.” “It is still an open question,” wrote Lee, “whether man will be able to survive the exceedingly complex and unstable ecological conditions he has created for himself” and whether “the efflorescence of technology” that followed the agricultural revolution would lead us to Utopia or “to extinction.”
Foragers = A Good Long Life, 15 Hr Work Weeks
In October 1963, Richard Borshay Lee, a doctoral student enrolled in the anthropology program at the University of California, set up a makeshift camp near a waterhole in the remote desert in northeast Botswana. He was there to spend time among one of the last of the world’s few largely isolated hunting and gathering societies, the northern Ju/’hoansi, or as he referred to them at the time, the “!Kung Bushmen.”
Lee told his audience that despite the fact that he conducted his research during a drought so severe that most of rural Botswana’s farming population only survived courtesy of emergency food-aid drops, the Ju/’hoansi needed no external assistance and sustained themselves easily on wild foods and hunting. He said that each individual in the band he followed consumed an average of 2,140 calories per day, a figure close to 10 percent higher than the recommended daily intake for people of their stature. What was most remarkable was that the Ju/’hoansi were able to acquire all the food they needed on the basis of “a modest effort”—so modest, in fact, that they had far more “free time” than people in full-time employment in the industrialized world.
He calculated that economically active adults spent an average of just over seventeen hours per week on the food quest, plus roughly twenty more hours per week on other chores like preparing food, gathering firewood, erecting shelters, and making or fixing tools. This was less than half the time employed Americans spent at work, getting to work, and on domestic chores.
What interested Sahlins the most was not how much more leisure time hunter-gatherers enjoyed compared to stressed-out workers in agriculture or industry, but the “modesty of their material requirements.” “Wants may be easily satisfied,” Sahlins noted, “either by producing much or desiring little.” Hunter-gatherers, he argued, achieved this by desiring little and so, in their own way, were more affluent than a Wall Street banker.
Delayed vs Immediate-Return Economies
In the summer of 1957, James Woodburn became the first social anthropologist to develop a long-term relationship with the Hadzabe. Just like Richard Lee, he was struck by how little effort it took for the bow-hunting Hadzabe to feed themselves. He also noted that, like the Ju/’hoansi, they met nutritional needs easily, “without much effort, much forethought, much equipment or much organization.” Woodburn described the Hadzabe as having an “immediate return economy.”
He contrasted this with the “delayed return economies” of industrial and farming societies. Those who settled in frostier climates, where seasons were more starkly pronounced than they were for African and other foragers in the humid tropics and subtropics, had to take a different approach to work, at least for part of the year. Hunter-gatherer societies of America’s Pacific Northwest coast, like the Kwakwaka’wakw and Coast Salish and Tsimshian, lived in large permanent settlements, stored food on a large scale, and were deeply preoccupied with achieving social rank, which they did through lavish discharges of gifts. Their fisheries were so seasonally productive that for much of the year people in these societies spent most of their time and energy developing a rich artistic tradition, playing politics, holding elaborate ceremonies, and hosting sumptuous ritual feasts.
Living in these environments not only demanded that people did more work but also that they organized their working lives differently, for part of the year at least. Preparing for winter required significantly more planning for them than it did for African foragers.
It is unlikely to be a coincidence that the efflorescence of artwork in Europe and Asia that archaeologists and anthropologists once assumed indicated Homo sapiens crossing a crucial cognitive threshold may well have been the progeny of long winter months.
The Emergence of ‘Future Planning’
In occasionally storing food and organizing their working year to accommodate intense seasonal variations, European and Asian foraging populations took an important step toward adopting a longer-term, more future-focused relationship with work. In doing so, they also developed a different relationship with scarcity, one that resembles that which shapes our economic life now in some important respects.
For Some, Work Defaults as Life’s Sole Purpose
His letter to William Grimes was, above all, an unemotional meditation on the meaninglessness of a life without useful work to do. Born in Sydney in 1892, Childe was the foremost prehistorian of the interwar years, publishing hundreds of influential papers and twenty books over the course of his career. But at the age of sixty-four he had reached the dismal conclusion that he had no “further useful contributions to make”. On the evening of Saturday, October 19, 1957, hikers negotiating the cliffs near Govett’s Leap in Australia’s Blue Mountains found a pair of spectacles, a pipe, a compass, and a hat, all neatly arranged on top of a folded mackintosh raincoat.
Surplus Energy = Focus on Culture, Art, Transcendence, Specialized Jobs
Göbekli Tepe will always cling to its deepest secrets. But its importance in the history of our species’ relationship with work is clear. It is the first evidence anywhere of people securing sufficient surplus energy to work over many consecutive generations to achieve a grand vision unrelated to the immediate challenge of securing more energy, and one that was intended to endure long beyond the lives of its builders. It may not be anything near the scale and complexity of the Egyptian pyramids or Mayan temples, but its construction must have demanded a similarly complex division of labor and skilled masons, artists, carvers, designers, and carpenters, who depended on others to feed them. It is, in other words, the first unambiguous evidence of a society in which many people had something resembling full-time, highly specialized jobs.
Early Farming = Hard and Miserable
The agricultural revolution not only enabled the rapid growth of the human population but also fundamentally transformed how people engaged with the world around them: how they reckoned their place in the cosmos and their relationships with the gods, with their land, with their environments, and with each other. By 6,000 years ago farming was a well-established subsistence strategy across many parts of Asia, Arabia, and North, South, and Central America.
Newer research has since reaffirmed that climate-change-induced scarcity played an important role in pushing some human populations down the path toward being food producers.
Farming, Initially, Was More Work, Worse Quality of Life, More Vulnerable
As farming societies grew more productive and captured more energy from their environments, energy appeared to be scarcer and people had to work harder to meet their basic needs. This was because, up until the Industrial Revolution, any gains in productivity farming peoples generated as a result of working harder, adopting new technologies, techniques, or crops, or acquiring new land were always soon gobbled up by populations that quickly grew to numbers that could not be sustained. As a result, while agricultural societies continued to expand, prosperity was usually only ever fleeting, and scarcity evolved from an occasional inconvenience that foragers stoically endured every once in a while to a near perennial problem. In many respects, the hundreds of generations of farmers who lived before the fossil-fuel revolution paid for our extended lifespans and expanded waistlines now by enduring lives that were mostly shorter, bleaker, and harder than ours, and almost certainly tougher than those of their foraging ancestors.
Those of the foraging Ju/’hoansi and Hadzabe who reached puberty would be considered very unlucky if they did not live well beyond sixty. There is broad consensus that before the Industrial Revolution kicked into gear and significant advances in medicine began to make an impact, the agricultural revolution did nothing at all to extend the lifespan of the average person, and indeed in many instances shortened it relative to the lifespans of remote foragers like the Ju/’hoansi. Graveyards from all the world’s great agricultural civilizations through to the Industrial Revolution tell an enduring tale of systematic nutritional deficiencies, anemia, episodic famines, and bone deformations as a result of repetitive, arduous labor, in addition to an alarming array of horrendous and sometimes fatal work-induced injuries.
Over long periods of time, farming societies were far more likely to suffer severe, existentially threatening famines than foragers. Foraging may be much less productive and generate far lower energy yields than farming, but it is also much less risky: first, because foragers tended to live well within the natural limits imposed by their environments, and second, because where subsistence farmers relied on one or two staple crops, foragers in even the bleakest environments relied on dozens of different food sources.
Clear evidence of farming collapses has been found in our genomes. Comparisons of ancient and modern genomes in Europe, for example, point to sequences of catastrophes that wiped out between 40 and 60 percent of established populations, dramatically reducing the genetic diversity of their descendants. These genetic bottleneck events clearly coincided with the expansion of farming societies through central Europe around 7,500 years ago, and then later into northwestern Europe about 6,000 years ago. Depleted soils, diseases, famines, and later conflicts were recurrent causes of catastrophe in farming societies. But these only ever briefly stalled the rise of agriculture. Even despite these challenges, farming was ultimately much more productive than foraging, and populations almost always recovered within a few generations, so sowing the seeds for a future collapse, amplifying their anxieties about scarcity, and encouraging their expansion into new space.
Why Did Farming Only Seem To Get Harder, Yet Expand?
If the trajectory of human history was shaped by farming societies with the highest-yielding, energy-rich crops, why was life in these societies so much more laborious than it was for foragers? This was a question that preoccupied the Reverend Thomas Robert Malthus. Why, he wondered, after centuries of incremental progress that raised agricultural productivity, did most people still work so hard and yet live in poverty?
First, it is now conventional to describe any case in which an improvement in a society’s agricultural or economic output is diluted by population growth as a “Malthusian trap.” Economic historians who like to reduce global history to the dull metric of “real incomes” have found no shortage of good evidence of Malthusian traps catching out unsuspecting societies all over the world before the Industrial Revolution. And in every instance, they note, where a surge in agricultural productivity as a result of a clever new technology made one or two lucky generations thrive, population growth quickly restored everything back to a more miserly baseline.
Second, for most farmers the only obvious solution to the labor problem was to procreate, but each new child was not only an additional mouth to feed; after a point, more children also meant a noticeable decline in food yields per person. This left farmers with few options: go hungry, take land from a neighbor, or expand into virgin territory. The history of agriculture’s rapid spread through Asia, Europe, and Africa shows that in many instances they chose the last.
The busy algorithms set loose by the paleogeneticists have offered new insights into agriculture’s expansion. Taken in conjunction with archaeological data and oral histories, the story they tell in most cases is one of the displacement, replacement, and even genocide of established hunter-gatherer populations by rapidly growing populations of agriculturalists on the run from Malthusian traps.
Comparison of DNA extracted from the bones of Europe’s early farmers with DNA extracted from the bones of Europe’s ancient hunting and gathering populations shows that agriculture in Europe spread courtesy of populations of farmers expanding into new lands, displacing and eventually replacing established hunter-gatherer populations rather than assimilating them.
By 9,600 years ago, much of the Middle East had been transformed into a network of small agricultural settlements. As communities across the Middle East grew more dependent on farmed grains, their fields and farms displaced wild animal and plant populations, making it increasingly hard for even the most determined foragers to sustain themselves by hunting and gathering alone.
Marriage With Time: The Calendar, The Future, and “Urgency”
Hunter-gatherers enjoy the rewards of their labor immediately in the form of a meal and the pleasure of feeding others, whereas a warehouse packer only ever secures the promise of future reward in the form of a token that can later be exchanged for something useful or used to pay off a debt.
The people who constructed Stonehenge and several other grand monuments were the beneficiaries of thousands of years of slowly improving agricultural productivity, and so were among the first to generate sufficiently splendid surpluses to abandon their fields for months at a time, dragging huge rocks over mountains and valleys and then assembling them into monumental structures. What is also certain is that Stonehenge is a massive—albeit low-resolution—calendar.
Farmers have to align working lives to the reproductive and growth cycles of their livestock and plants, which in turn are aligned to those of the environments that feed them. Whenever work urgently needed to be done, the consequences of not doing so were almost always considerably greater for farmers than they were for foragers. Failing to mend a broken fence could translate into days in pursuit of lost sheep. Failing to irrigate a thirsty crop, deal with pests, or remove weeds at the earliest possible opportunity might mean no harvest at all. To produce food requires that you live at once in the past, present, and future.
To focus most of your effort for future rewards is also to dwell in a universe of endless possibilities—some good, some hard to call, and many bad. So when farmers imagined great harvests, they simultaneously invoked images of droughts and floods, pests, disease-ridden livestock, predators. Where foragers stoically accepted occasional hardships, farmers persuaded themselves that things could always be better if they worked a little harder.
To farm you have to set yourself apart from your environment. While farmers recognized that their livelihoods depended on their ability to harness natural forces and operate within natural cycles, they also believed wherever nature intruded unbidden into domesticated spaces it became a pest.
Deeper Marriage With Time: Money
Farmers saw themselves as exchanging their labor with the environment for the promise of future food. In a sense, they considered the work they did to make land productive to mean that the land owed them a harvest and in effect was in their debt. Unsurprisingly, farmers tended to extend the labor/debt relationship they had with their land to their relationships with each other.
While Benjamin Franklin believed that money must have been invented to facilitate barter, his experiences negotiating treaties with the Iroquois suggested to him that they were not interested in trading to accumulate wealth. He believed that they had other priorities, which gave him cause to question some of his own: “Our laborious manner of life,” he wrote, “they see as slavish and base.” Where [colonists] were hostage to “infinite artificial wants, no less craving than those of Nature” that were often “difficult to satisfy,” the Indians had only “few . . . wants,” which were easily met by “the spontaneous productions of nature with the addition of very little labor, if hunting and fishing may indeed be called labor when Game is so plenty.” As a result, Franklin noted somewhat enviously, the Indians enjoyed an “abundance of leisure,” which they used for debate, reflection, and refining their oratorical skills.
When Caroline Humphrey, a professor of anthropology at Cambridge, conducted an exhaustive review of the ethnographic and historical literature looking for societies with barter systems like those described by Adam Smith, she eventually gave up, concluding that “no example of a barter economy, pure and simple, has ever been described, let alone the emergence from it of money,” and that “all available ethnography suggests that there never has been such a thing.”
There is overwhelming evidence that while money may be used principally as a “store of value” and a medium of exchange, its origins do not lie in barter, but rather in the credit and debt arrangements that arose between farmers—who were, in effect, waiting for their land to pay them.
The earliest Mesopotamian city-states, like Uruk, were almost certainly the first societies in which farmers were productive enough to sustain significant urban populations who didn’t want or need to muddy their feet digging in the fields. These were also the first places for which there is solid evidence of money, in the form of inscribed clay ledgers. And while this currency was enumerated in silver and grain, it rarely changed hands in physical form. Many transactions took the form of IOUs logged by temple accountants, thus enabling value to change hands virtually.
People made exchanges based on credit. Over the course of the year, when farmers took credit from beer-brewers, merchants, and temple officials, they were in effect simply transferring onward the debts owed to them by their land.
The word “capital” stems from the Latin capitalis, which in turn comes from the Proto-Indo-European word kaput, meaning head, which to this day remains the principal term used when denominating livestock. The word “fee” likewise is an elaboration on the old Proto-Germanic and Gothic word for cattle—feoh—just as the word “pecuniary” and currencies like the peso have their roots in the Latin term pecu, meaning cattle or flock, which itself is thought to share similar origins with the Sanskrit term pasu, which also refers to cattle.
MORE Energy Capture – Animals, Slaves, Cities, Privilege
Fifteen millennia ago humans and domesticated animals comprised a barely measurable fraction of a percent of the total mammalian biomass on earth. Today they comprise a remarkable 96 percent. Humans account for 36 percent of that total, and the livestock account for 60 percent.
Domestic animals also played a vital role in determining which agricultural societies captured the most energy and grew the fastest. When it came to heavy-duty tasks like plowing, a single good ox could do the work of five burly men. The domestication of cattle was important not because it provided people with protein, but because it enabled the greater intensification of grain farming and provided a means to transport surpluses from the country to the city. What’s more, cattle did so mainly by capturing energy from plants that humans couldn’t eat, and through their labor, manure, and in the final instance their flesh, converting it into forms that humans could use.
Humans have evolved the ability to be selective in deploying the empathy that underwrites our social natures. For workers in large slaughterhouses, denying empathy is relatively easy to do because, unlike hunters who often saw their prey at their magnificent best, butchers often see livestock at their diminished worst, inhaling the smells of death while standing in pens outside the slaughterhouse. Farming societies adopted a variety of different approaches to dealing with the ethical problem of killing animals. Some simply chose to hide the messy business. This is the approach we take in many cities now.
Rome’s agricultural economy changed from one where small-scale freeman farmers provided the bulk of grain to one where large farming estates called latifundia dominated production. Each of these estates depended almost entirely on slaves, who were enumerated alongside livestock in farm inventories. For the four centuries between 200 BC and AD 200, it is thought that between a quarter and a third of the population of Rome and greater Italy were slaves.
Emergence of “Privilege”
Aristotle believed animals to possess diminished souls. Like Descartes long after him, he insisted that animals lacked reason, and that because of this it was fine to kill and consume them without qualm. To his mind, this was all part of the natural order: “Plants are for the sake of animals, and . . . other animals are for the sake of human beings.”
Well-to-do Romans were more likely than Greeks to kill and torture their slaves for trivial indiscretions. But otherwise they expressed similar attitudes to slavery and work as the ancient Greeks and, like Victorian Britons nearly two millennia later, considered themselves to be the inheritors of the ancient Greeks’ civilization. They too considered manual work demeaning, and working for a living to be vulgar.
Those with lots of capital and lots of slaves were able to amass wealth many orders of magnitude larger than poorer Roman citizens, who had to work for a living in a labor marketplace in which competent slaves would always be the economic choice.
Rome at its peak hosted a million citizens and was able to maintain its legions, armies of bureaucrats, senators, slaves, guilds, and colosseums by sucking in energy surpluses generated by farmers across the empire. Rome’s eventual collapse was ultimately hastened by the corrosive inequality at its heart. The trouble Rome’s leaders went to in order to keep their citizens distracted foreshadows the congregation of ever more people in big cities and towns, places where for the first time in human history a majority of people’s work did not focus on the procurement of the energy resources they needed to survive.
Mass Migration to Cities – The “Urban Revolution”
In 1991, close to three-quarters of all Namibians still lived in the countryside. Since then, Namibia’s total population has doubled: while the rural population has increased by one-fifth, the urban population has quadrupled in size. In 2008, more people lived in cities than in the countryside for the first time in our species’ history. Up to 1.6 billion people now live in slums and shanty towns [surrounding cities that do not yet have infrastructure to support their load]. The largest—like Kibera in Kenya, Ciudad Neza outside Mexico City, Orangi Town in Pakistan, and Mumbai’s Dharavi—have populations counted in millions and are in some ways cities within cities. The movement of 250 million rural Chinese into cities to take up jobs in the rapidly growing manufacturing sector between 1979 and 2010 was the single largest migration event in human history.
For Vere Gordon Childe, the “urban revolution” was the crucial second phase of the agricultural revolution. The urban phase, he argued, only ever came about once a critical threshold in agricultural productivity was crossed and farmers were able to generate consistently large enough surpluses to support bureaucrats, artists, politicians, and others that they were generous enough not to think of as “freeloaders.” Where energy was abundant, people, like masked weavers [birds that instinctually build and continuously rebuild nests], first used it to build great monolithic monuments like Göbekli Tepe or Stonehenge, and later proper towns and cities.
In much the same way that some scientists speculate that entropy meant that the appearance of life on earth was almost inevitable, so history suggests that the creation of cities and towns wherever people became sufficiently productive food producers was inevitable too. Like living organisms, cities are born, sustained, and grown by capturing energy.
The rapid expansion of the northern towns and cities that were to become the epicenter of Britain’s Industrial Revolution in the eighteenth century was not solely a response to labor demand, nor was it the result of optimistic country folk moving into the cities with ambitions. Rather, it was catalyzed by substantial and rapid improvements in agricultural productivity that were made possible by technological advances. Coupled with the consolidation of agricultural landholdings by wealthier farmers, this meant that there was simply no useful work for many among the fast-growing rural population to do in the countryside.
The Emergence Of Many “Jobs”, Beyond Survival
One in five people who lived in cities in the most productive ancient agricultural economies were pioneers of a whole new way of working. The bigger cities grew, the busier their citizens got. Energy went into sourcing the materials for building, maintaining, and renewing basic infrastructure, which resulted in the emergence of many new specialist trades, like carpentry, stonemasonry, architecture, engineering, hydrology, and sewerage. Lots of energy also went into building temples and sustaining holy orders, to flatter and appease demanding deities with sacrifices and tributes, as well as into meeting the entirely novel challenge of maintaining order among large assemblies of people whose ancestors for 300,000 years had lived in small mobile bands. This required bureaucrats, judges, soldiers, and those who specialized in keeping order and binding people together into urban communities with common values, beliefs, and goals. Cities lived or died on the basis of common rules of behavior and the ability of their citizens to bind themselves together with shared experiences, beliefs, and values.
And among the ranks of Roman service-sector personnel were lawyers, scribes, secretaries, accountants, chefs, administrators, advisers, teachers, prostitutes, poets, musicians, sculptors, painters, entertainers, and courtesans who—assuming they could secure the right patronage or were independently wealthy—could dedicate all their working lives to achieving mastery of their particular art.
The Start of Military / Law Enforcement
City dwellers’ work was determined by the demands of capturing and expending surplus energy, and one of the first things that surplus was used for was the development of professional standing armies capable of keeping the peace within the city walls and protecting energy resources or expanding access to them.
Inscriptions on tombstones and written records from Imperial Rome describe 268 different career paths that ancient Romans pursued.
Written Language – More Abstract “Work”
Like agriculture, writing systems were developed independently by unrelated populations in different parts of the world within a relatively short period of time. In the oldest phase, spanning 4,500 years and beginning possibly 10,000 years ago, transactions were accounted for using clay tokens representing units of goods. The next phase involved transforming these three-dimensional tokens into pictographs on clay tablets, again used for accounting. And the final phase, the precursor to alphabetic writing, began around 5,000 years ago and involved using pictographs to systematically represent spoken language.
Like any other complex skill acquired and mastered when young and cognitively plastic, writing clearly has some impact on shaping how our brains are organized and how we think and perceive the world.
There is no debate that the invention of writing led to a whole universe of new, previously unimaginable desk jobs and professions, from scribes to architects, many of which were high status not least because of the energy and effort that was invested in mastering literacy. “Put writing in your heart that you may protect yourself from hard labor of any kind,” an Egyptian father famously said to his son as he dispatched him to school in the third millennium BC, adding that “the scribe is released from manual tasks” and that it is “he who commands.” It is clear that literacy fundamentally transformed the nature and exercise of power and commerce.
Doctors exist because we like to live and because we dislike pain; artists and entertainers exist to bring us pleasure; hairstylists exist because some of us like to look good or need a sympathetic ear to listen; DJs exist because we like to dance; and bureaucrats exist because even the most passionate anarchists want the buses to run on schedule.
In much the same way that masked weavers and bowerbirds use their surplus energy to build elaborate and often unnecessary structures, so humans, when gifted sustained energy surpluses, have always directed that energy into something purposeful. From this perspective, the emergence of many ancient service-sector professions was simply a result of the fact that wherever and whenever there has been a large, sustained energy surplus, people (and other organisms) have found creative ways to put it to work.
Industrial Boom and Infinite Desire (Hedonic Treadmill)
City people are constantly confronted by others who have much more (and better) stuff than they do. For as long as people have congregated in cities, their ambitions have been molded by a different kind of scarcity from that which shapes those of subsistence farmers, a form of scarcity articulated in the language of aspiration, jealousy, and desire rather than of absolute need. And for most, this kind of relative scarcity is the spur to work long hours, to climb the social ladder, and to keep up with the Joneses.
John Maynard Keynes argued that the economic problem had two distinct components: “absolute needs,” like food, water, warmth, comfort, companionship, and safety; and “relative needs,” which he believed truly were infinite, because as soon as we met any one of them it would quickly be replaced by another, probably more ambitious, one.
The idea that inequality is natural and inevitable is invoked as often in the teachings of Vedic, Confucian, Islamic, and European classical philosophy as it is in the rhetoric of many politicians. For almost as long as people have lived in cities and recorded their thoughts in writing, there have been those who, like Aristotle, have insisted that inequality is an inescapable fact of life. Many historians have argued that even if inequality is not a brute fact of human nature, then along with zoonotic diseases, despotism, and war, it was probably a direct and immediate consequence of our embrace of agriculture. They reason that as soon as people had big surpluses to hoard, exchange, or distribute, the more miserable angels of our nature took over.
Some 4,500 years ago, Uruk, like New York, London, or Shanghai today, was anything but egalitarian. Merchants and moneymen were able to leverage their control over the supply and distribution of surpluses to achieve a status comparable to that of nobles and clergy. Citizens of Uruk fell into five distinct social classes: 1) royalty; 2) priests; 3) soldiers, merchants, accountants, architects, astrologers, and teachers; 4) the working class; and 5) slaves.
In places like Uruk, becoming a wealthy merchant was almost certainly the only path that ordinary people could follow to bridge the chasm that separated them from nobility. Accumulating wealth offered the opportunity of upward mobility for those who worked the hardest, were luckiest, and were the most cunning.
Agricultural Productivity and Population Boom
The proportion of people employed in agriculture is usually a pretty good inverse measure of a country’s wealth: those with the highest proportion of farmers are typically among the poorest. All ten countries where over three-quarters of the workforce still describe themselves as farmers are in sub-Saharan Africa. By contrast, less than 2 percent of the United States workforce farms, yet its agricultural industry routinely produces such huge surpluses that close to 660 pounds of food per person is wasted in the pipeline between field and plate every year.
This is the norm in most industrialized countries where agriculture has dramatically increased productivity while simultaneously vastly reducing the dependency on human labor.
Most important among these were the adoption of the highly efficient Dutch plow, which turned the sod better than its predecessors could, and could be pulled by a single draft animal; the intensive use of both natural and artificial fertilizers; a greater focus on selective breeding; and more sophisticated crop rotation systems. Between 1550 and 1850, net yields in wheat and oats per acre farmed in Britain nearly quadrupled. This increase in productivity catalyzed a surge in population growth. In 1750, the population of Great Britain was around 5.7 million people. But thanks to the surge in agricultural productivity it tripled to 16.6 million by 1850, and by 1871, double that again. And where roughly half of Britain’s workforce were farmers in 1650, by 1850 that had dropped to one in five.
Emergence of Luxuries
The East India Company was the largest manufacturer and exporter of goods anywhere in the world. Its relatively cheap chintz, cotton, and calico textiles fed a consumer revolution among the well-to-do in urban Europe.
Caribbean slaves spent their days hacking through fields of sugar cane and stoking the fires needed to transform the raw cane into molasses, sugar, and rum. Sugar products soon became by far the most important of all of colonial Britain’s food imports from the New World. By the dawn of the twentieth century, per capita sugar consumption in the United Kingdom was a tooth-rotting quarter of a pound every day.
Fossil Fuel, Engines, Factories
Factories, barges, railways, and ships were powered by coal. Coal, however, was not always easy to find. It was also hard, often dangerous work to mine. It was only after steam engines came into widespread use that coal and other fossil fuels became an important energy source.
Hero of Alexandria, an engineer in Roman Egypt, built a simple spinning steam engine he called an aeolipile in the first century. English military engineer Thomas Savery filed a patent in 1698 for “a new invention for raising of water and occasioning motion to all sorts of mill work by the impellent force of fire” – the first serious use of steam. The most important design was unveiled in 1712 by Thomas Newcomen.
The construction of hundreds of large steam-powered textile mills and factories between 1760 and 1840 created thousands of new jobs for migrants to Britain’s cities and towns.
The Consequences of Automation, and the Luddites
The early years of the Industrial Revolution were marked by the mass culling of a whole range of well-established and sometimes even ancient professions, from weavers to farriers, while creating a handful of opportunities for a new class of workers comprising aspirant engineers, scientists, and designers.
Ned Ludd was a troublesome apprentice in a cotton mill who, one day in 1779, according to legend, grabbed a mallet and pounded two stocking frames into matchsticks in a fit of anger. After this incident, it became customary for anyone who accidentally damaged any machinery in a mill or factory in the course of their work to proclaim their innocence and deadpan that “Ned Ludd did it.”
Brutal Work Hours / Quality of Life
Richard Arkwright, the inventor of the spinning frame—a machine for spinning thread—established a series of mills across the north of England between 1771 and 1792. He was one of the principal targets of the Luddite Rebellion, and is now often thought of as “the inventor of the factory system.” Those who worked in his factories were expected to perform six thirteen-hour shifts over the course of a week, and any who showed up late were docked two days’ pay. He did allow employees a week of annual vacation (unpaid) on the condition that they did not leave town while taking it.
Those in the cities labored longer hours, ate poorly, breathed air fouled with smog, drank dirty water, and endured diseases like tuberculosis—which accounted for up to one-third of all recorded deaths in the UK between 1800 and 1850.
When factories needed small bodies to work in cramped spaces or nimble fingers to fix fiddly parts on big machines, there were plenty of children who could be drafted in, as often as not from local orphanages. Children were such compliant and versatile laborers that by the turn of the nineteenth century, close to half of all Britain’s factory workers were under the age of fourteen.
Rising Tides of “Wealth” and Mass Consumption
By the 1850s a proportion of wealth began to trickle down to those working on the factory floors in the form of improved wages and better housing. This process was led by several very wealthy factory owners in an early incarnation of what now would be labeled “corporate social responsibility.” This period marked the beginning of many people viewing the work they did exclusively as a means to purchase more stuff, so closing the loop of production and consumption that now sustains so much of our contemporary economy.
Over the course of the seventeenth and eighteenth centuries, an increase in artisanal manufacturing and the import of exotic novelties like linens, porcelain, ivory, ostrich feathers, spices, and sugar from the colonies sparked a “consumer revolution.” As more and more people became dependent on cash wages, consumption became more influential in shaping the aspirations of what would later be referred to as the working classes.
Many of the new luxury items that fueled Europe’s consumer revolution were things that were useful regardless of status; many others, however, appealed exclusively to the pursuit of status. People wanted such items for no reason other than the wish to emulate others who had them.
Elites Perpetually Distinguish, Lower Classes Perpetually Chase
Urban dwellers, even in ancient cities, often dressed to impress. After all, among the crowds in a busy urban plaza it is impossible to tell a noble from a commoner if they both happen to be wearing identical outfits. The tendency among lower classes and castes in cities the world over to emulate the wealthy causes much fretting and tutting among elites determined to maintain the optics of rank. Some urban elites, like the extravagantly bewigged and sequinned courtiers who strutted about the gardens in the Palace of Versailles during the reign of the Sun King, Louis XIV, achieved this by adopting insanely elaborate and expensive fashions that the poor could never hope to copy.
Britain’s cities began to swell over the course of the seventeenth and eighteenth centuries, and aspirant families sought to emulate the wealthier classes within the home too. Homewares in particular emerged as important signifiers of status, especially among people living in the rows and rows of undifferentiated houses that were built to accommodate urban migrants. Unsurprisingly, it did not take long for ambitious entrepreneurs to begin to explore opportunities to mass-produce things like affordable porcelain and ceramic homewares, mirrors, combs, books, clocks, carpets, and all sorts of different kinds of furniture. The desire of poorer people in cities across Europe to consume what were once luxuries enjoyed only by the very rich was just as influential in shaping the history of work as the invention of technologies to exploit the energy in fossil fuels.
In 1887, Emile Durkheim was in no doubt that new fashions were often quickly embraced by the poorer and more marginal, who hoped to emulate the rich and powerful. He was also in no doubt that fashions were, by their nature, ephemeral. “Once a fashion has been adopted by everyone,” he noted, “it loses all its value.”
Industrial Anxiety, Anger, Frustration, Anomie
Emile Durkheim believed simple societies operated like rudimentary machines with lots of easily interchangeable parts, while complex societies functioned more like living bodies, made up of many highly specialized organs that could not be substituted for one another. Chiefs and shamans in simple societies could simultaneously be foragers, hunters, farmers, and builders, but in complex societies lawyers could not moonlight as surgeons any more than admirals could moonlight as architects. Durkheim also believed that people in primitive societies typically had a far stronger sense of community and belonging than people in more complex urban ones, and to this extent were happier and more sure of themselves. If everyone in a primitive society performed interchangeable roles, he reasoned, then they would be bound into a kind of “mechanical solidarity.” He contrasted this with life in modern urban societies, where people performed many, often very different roles and so developed very different perspectives on the world. This, he insisted, not only made it harder to bind people together but also induced a potentially fatal and always debilitating social disease that he dubbed “anomie.”
He was particularly intrigued by the fact that, almost paradoxically, the increase in prosperity that accompanied industrialization in France had resulted in more suicides and greater social stress. This led him to conclude that it was the changes associated with urbanization and industrial development that were a major driver of anomie.
He insisted that anomie was characterized by what he called the “malady of infinite aspiration,” a condition arising when there are “no limits to men’s aspirations” because they “no longer know what is possible and what is not, what is just and what is unjust, which claims and expectations are legitimate and which are immoderate.”
As energy-capture rates have surged, new technologies have come online and our cities have continued to swell, constant and unpredictable change has become the new normal everywhere, and anomie looks increasingly like the permanent condition of the modern age.
Taylorism: Efficiency, Efficiency, Efficiency
By the time of his death in 1915, Frederick Winslow Taylor was eulogized by titans of industry like Henry Ford as the “father of the efficiency movement,” and declared by management consultants to be the “Newton or Archimedes of the science of work.”
Innovator vs Mule
At Midvale Steel Works, Taylor began to conduct experiments with his stopwatch, carefully observing and timing different tasks to see whether he could shave a few seconds off various critical processes, and redesign job roles to ensure that laborers would find it difficult to waste effort. The same freedom that Taylor was granted to conduct his efficiency experiments at Midvale would be denied to other similarly innovative and ambitious individuals in workplaces that adopted his scientific management technique. Instead, they’d be shackled to rigid, target-driven, repetitive work regimes where innovation was prohibited and the most important role of managers was to ensure that workers performed as they were instructed to.
“It is only through enforced standardization of methods, enforced adoption of the best implements and working conditions, and enforced cooperation that this faster work can be assured,” he explained in Scientific Management. “Taylorism,” as it came to be called, was adopted in many workplaces, but never more famously than at the Ford Motor Company. In 1903, Henry Ford hired Taylor to assist him in developing a new production process for the now iconic Model T. The new process cut the production time of a single Model T from twelve hours to ninety-three minutes, and with that cut its price from $825 to $575.
Taylor thought that the reason most people took jobs and went to work was, fundamentally, the financial rewards and the products they might purchase with them. One problem was that the right person for most of the non-managerial jobs Taylor designed was someone with limited imagination, boundless patience, and a willingness to obediently do the same repetitive tasks day in and day out.
The problem with Taylorism as Samuel Gompers saw it was not the profits it generated for factory owners, but the fact that it robbed workers of the right to find meaning and satisfaction in the work they did by transforming them into “high speed automatic machines”.
Bigger Than Taylor
Efficiency had been in the air ever since the very first stirrings of the Industrial Revolution—Adam Smith had already outlined the basic principles of the efficiency movement in his Wealth of Nations—and by the nineteenth century factory owners everywhere understood the correspondence between productivity, efficiency, and profit, even if they hadn’t yet worked out the best means to achieve it. Working hours for manual laborers in particular were declining rapidly as productivity went up. Taylor’s genius was simply that he was the first to approach the problem systematically.
Emergence of Work-Life Balance, Leisure
John Lubbock was the driving force behind Parliament’s adoption of the Bank Holiday Act of 1871. “Saint Lubbock,” as he was affectionately known in the 1870s, was an early and enthusiastic advocate of maintaining a good work–life balance. “Work is a necessity of existence,” he explained, but “Rest is not idleness,” because “to lie sometimes on the grass under trees on a summer’s day, listening to the murmur of the water, or watching the clouds float across the sky, is by no means a waste of time.”
Prior to this, the only substantive regulations dealing with workers’ rights were those in the Factory Act of 1833, which limited the workweek for women and for children under the age of eighteen to sixty hours per week, but imposed no restrictions on the number of hours that men might be required to work. After the Bank Holiday Act had been passed in 1871, it would take another 128 years and the implementation of the European Union’s Working Time Directive in the late 1990s before any restrictions on male working hours would enter Britain’s statute books. Even so, by 1870, the workweek for most men and women employed in many factories had already declined from around seventy-eight hours per week to around sixty, based on six ten-hour shifts.
Lubbock’s most important achievements were only possible because he was wealthy enough to afford to do exactly what he wanted to.
In 1888 came the first successful legal strike in British history, when the “matchgirls” working for one of Britain’s largest match producers, Bryant and May, took to the streets to protest their toxic working conditions and demand an end to fourteen-hour shifts. Despite unions progressively growing in power and influence, working hours remained high, and most people worked a six-day, fifty-six-hour week until after the First World War came to an end in 1918.
The 40-Hour (Almost 30-Hour) Workweek
With Henry Ford—who by 1918 employed close to 200,000 people in his American factories, and nearly as many again at his factories in European capitals, Canada, South Africa, Australia, Asia, and Latin America—leading the way, the forty-hour week, based on five eight-hour shifts and weekends off, became the norm in most big manufacturing industries. The Great Depression put further downward pressure on working hours as companies cut production. This process spurred an embryonic “shorter hours movement,” and very nearly persuaded the Roosevelt administration to introduce the thirty-hour workweek into law in the form of the Black-Connery 30-Hours Bill, which sailed through the Senate in 1933 with a fifty-three to thirty majority. But President Roosevelt got cold feet, the bill was pulled at the last minute, and as the worst of the Depression passed, hours crept steadily upward again. Between 1930 and 1980 the average workweek in the United States remained fairly consistently between thirty-seven and thirty-nine hours per week. This was two or three hours shorter than in almost every other industrialized country. But in the last decades of the twentieth century, hours started to creep slowly upward again.
Revisiting Keynes’s Prophecy
In 2007, Yale economist Fabrizio Zilibotti revisited Keynes’s predictions. He calculated that, based on growth rates, a fourfold increase in living standards had already occurred by 1980, and that, assuming growth trends continued, by 2030 we would witness a “17 fold increase in the standard of living, amounting to more than double Keynes’s upper bound.”
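As a sanity check on Zilibotti’s figure, here is my own back-of-the-envelope arithmetic (an illustration, not from the book): a fourfold rise over the fifty years from 1930 to 1980, extrapolated at the same compound rate over the full century, lands very close to his 17-fold estimate.

```python
# Back-of-the-envelope check on Zilibotti's extrapolation (my illustration,
# not the book's). A fourfold rise in living standards over 1930-1980 implies
# an annual compound growth rate r satisfying (1 + r) ** 50 == 4.
r = 4 ** (1 / 50) - 1                 # ~2.8% per year
print(f"implied annual growth rate: {r:.2%}")

# The same rate sustained over the full century 1930-2030 compounds to
# (1 + r) ** 100 == 4 ** 2 == 16 -- roughly the 17-fold figure, which
# assumes marginally faster growth after 1980.
fold_by_2030 = (1 + r) ** 100
print(f"fold increase by 2030: {fold_by_2030:.0f}")
```

Either way, the extrapolation comfortably exceeds Keynes’s upper bound of an eightfold improvement.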
But working hours have not declined as Keynes predicted. Indeed, despite labor productivity in industrialized nations having risen roughly four- or five-fold since the end of the Second World War, average weekly working hours everywhere have continued to gravitate toward an average of just under forty hours per week, and then remain stubbornly stuck there.
In the utopian future John Maynard Keynes imagined, everybody’s basic needs were easily met, and inequality had become an irrelevance. Only the foolish did more work than they needed to. Almost like a foraging society, his utopia was a place where anyone who pursued wealth for wealth’s sake invited ridicule rather than praise. “The love of money as a possession—as distinguished from the love of money as a means to the enjoyments and realities of life—will be recognized for what it is, a somewhat disgusting morbidity, one of those semi-criminal, semi-pathological propensities which one hands over with a shudder to the specialists in mental disease,” he explained. “I see us free, therefore, to return to some of the most sure and certain principles of religion and traditional virtue—that avarice is a vice, that the exaction of usury is a misdemeanor, and the love of money is detestable.”
Kellogg’s Brief 30-Hour Workweek
Will Kellogg revolutionized food production in the United States. A serial innovator, he experimented with and applied all the latest trends in management, production, and marketing, including Taylorism. By the 1920s, his company and its principal product were household names in the United States, and it would not take long before the company expanded internationally.
When the Great Depression struck in 1929, Kellogg did something else that was unusual. He cut full-time working hours at his factories from an already reasonable forty hours a week to a comfortable thirty hours a week, based on five six-hour shifts. By doing this, he was able to create an entire shift’s worth of new full-time jobs in a period when up to a quarter of Americans were unemployed. It seemed a sensible thing to do for other reasons too. By the 1930s, American workers were already lobbying for shorter working hours after companies like Henry Ford’s had successfully introduced weekends and five-day weeks with no noticeable dip in productivity. In 1935 Kellogg boasted in a newspaper article that “we can [now] afford to pay as much for six hours as we formerly paid for eight.”
Until the 1950s, the thirty-hour week remained the norm at Kellogg’s factories. Then, somewhat to the surprise of management, three-quarters of Kellogg’s factory staff voted in favor of returning to eight-hour shifts and a forty-hour week. Some of the workers explained that they wished to return to an eight-hour day because the six-hour shifts meant they spent too much time getting under the feet of irritable spouses back at home. But most were clear: they wanted to work longer hours to take home more money, to purchase more or better versions of the endless procession of constantly upgraded consumer products coming on to the market during America’s affluent postwar era.
The Boom of Convenience, Squandered Affluence, Artificial Scarcity
In the late 1940s and early 1950s, war-weary Americans set about building Chevrolet Bel-Airs instead of tanks, converting their accumulated munitions piles into nitrogen-based fertilizers, and repurposing their radar technology into microwave ovens. This nourished a newly reconfigured American dream set against a background of ice cream in the home freezer, TV dinners, and fast-food-fueled annual interstate vacations.
This prosperity convinced John Kenneth Galbraith, the Canadian-born Professor of Economics at Harvard, that advanced economies like the United States’ were already sufficiently productive to meet the basic material needs of all their citizens and hence that the economic problem as defined by John Maynard Keynes had, more or less, been solved.
Nevertheless, he reckoned that the United States was not making particularly good use of its wealth. “No problem has been more puzzling to thoughtful people than why, in a troubled world, we make such poor use of our affluence,” he wrote.
One of the main reasons that Galbraith took this view was post-war Americans’ seemingly limitless appetite for purchasing things they didn’t need. Galbraith believed that by the 1950s most Americans’ material desires were as manufactured as the products they purchased to satisfy them. Because most people’s basic economic needs were now easily met, he argued, producers and advertisers conspired to invent new artificial needs to keep the hamster wheel of production and consumption rolling rather than investing in public services. Real scarcity, in other words, was a thing of the past.
Manufactured Desire – The Advertising Industry
The manufacture of desire is at least as old as the first cities. In ancient metropolises, advertising took many forms familiar to us now, from the seductive pornographic tableaus that decorated the walls of brothels in Pompeii to elegantly printed handbills and flyers emblazoned with cute logos and snappy slogans distributed by craftspeople in Song Dynasty China. But until recently advertising was something most people did for themselves. That all changed with mass-circulation newspapers.
In the United States, the birth of advertising as a revenue-generating industry in its own right is now often credited to none other than Benjamin Franklin. In 1729, after purchasing the Pennsylvania Gazette, Franklin struggled to turn a profit through sales alone, and wondered whether he might defray the costs by selling space in the paper to local traders and manufacturers wanting to drum up new business. Franklin prominently advertised one of his own inventions, the Franklin Stove, to see if that would help. Doing so won him a double victory. Sales of the Franklin Stove surged, and other tradesmen soon took notice and purchased advertising space in the Pennsylvania Gazette.
By the 1930s, advertisers began to focus more and more on catching readers’ eyes with snappy slogans in different fonts, and on adding pictures. Advertising was as important to marquee brands like Kellogg’s and Ford as any part of their operations. As Henry Ford famously commented, “Stopping advertising to save money is like stopping your watch to save time.”
Advertisers realized the unprecedented power of television to pump messages directly into people’s homes and workplaces. It was just over a decade since the agency N. W. Ayer had come up with what is now widely regarded as the most influential advertising tagline in American history, “a diamond is forever.”
Consumption Serves as a Pacifier for Inequality
Galbraith thought consumption made people worry less about inequality because, as long as they were able to purchase new consumer products once in a while, they felt that they were upwardly mobile and so closing the gap between themselves and others. “It has become evident to conservatives and liberals alike,” he noted drily, “that increasing aggregate output is an alternative to redistribution or even to the reduction of inequality.”
Great Decoupling, Rise of Inequality, Pressure on Work Hours, Degrading Trust
For much of the twentieth century, there was a relatively stable relationship between labor productivity and wages in the United States and other industrialized countries. This meant that as the economy grew and labor output increased, the amount of money people took home in their paychecks grew at a similar rate.
In 1980, that relationship broke down. In the “Great Decoupling,” productivity, output, and gross domestic product all continued to grow, but wage growth for all but the highest paid stalled.
The Great Decoupling killed off any lingering downward pressure on the length of the workweek. Most people simply couldn’t afford to maintain their lifestyles by working fewer hours.
Some economists even dispute that the Great Decoupling happened. They argue that the stark graphs showing productivity diverging from median wages are inaccurate because they don’t account for the rising incidental benefits paid to U.S. employees. For many others, though, the Great Decoupling was the first clear evidence that technological expansion was cannibalizing the workforce and concentrating wealth in fewer hands.
In 1965, chief executives in the top 350 U.S. firms took home roughly twenty times the pay of an “average worker.” By 1980, CEOs in the same top bracket of firms took home thirty times the annual salary of an average worker, and by 2015, that number had surged to just shy of three hundred times.
Artificial “Talent War”
The global consultancy firm McKinsey & Company sparked the hysteria in 1998, when it introduced the word “talent” into the ever-growing lexicon of corporate speak by headlining one of its Quarterly briefings to clients and potential clients with it. “There is a war on talent, and it will intensify,” proclaimed one piece. “All are vulnerable,” warned another. The difference between good and bad companies, the argument went, was not the processes they followed or how efficient they were, but the clever people steering those businesses. To future historians, the “war for talent” may appear to be one of the most elaborate corporate conspiracies of all time.
In a 2002 issue of the New Yorker, Malcolm Gladwell delivered an eviscerating critique of the phenomenon in a piece titled “The Talent Myth.” He took the view that the whole thing had been kicked off by overpaid McKinsey executives buying into the myth of their own brilliance. He also implicated McKinsey and their talent mindset in creating the toxic culture that brought down one of their favored clients, Enron—which had filed for bankruptcy in 2001.
When stock markets collapsed in 2008, it seemed for a brief moment that the inflated salaries and stupendous bonuses were a bubble due to burst – that the public would lose faith in the brilliance of “top talent” once the financial crisis revealed that their Midas touch only ever produced mountains of fool’s gold. But the talent narrative was so deeply embedded that, even as institutions started retrenching staff and closing operations to cut costs, many simultaneously dipped into their meager cash reserves to allocate large retention bonuses.
Deepening Institutional Mistrust
The 2008 crash did precipitate a sharp decline in public confidence in economists. If the so-called experts hadn’t seen the crisis coming, then there was good cause to question their expertise. The problem was that because economics had masqueraded as a science for so long, people quite reasonably began to treat expertise in general with more skepticism, even in far more solidly grounded sciences like physics and medicine. As a result, among the more unexpected casualties of the financial crisis was the once near-universal confidence in people like climate scientists warning of the dangers of anthropogenic climate change and epidemiologists trying to explain the benefits of immunization.
The Confusion of Wealth and Work
The gap between reality and perception is particularly extreme in the United States where material inequality is the most acute it has been for half a century. There, surveys revealed that even after the crash most laypeople underestimated the pay ratio between bosses and unskilled workers by more than a factor of ten. The enduring public illusion of greater material equality in places like the United States and the United Kingdom is in part a testament to the perseverance of the idea that there is a clear, even meritocratic, correspondence between wealth and hard work. Thus, while those who are very wealthy like to believe that they are worthy of the financial rewards they have accrued, many poorer people don’t want to mess with the dream that they too might achieve such riches if only they work hard enough.
*Willfully* Working To Death
Following an investigation by Japan’s Ministry of Labor, the official cause of death of Miwa Sado, a young journalist at the broadcaster NHK, was changed to “karoshi”: death by overwork. In the month preceding her death, Sado had clocked an exhausting 159 hours of official overtime. On top of her regular hours, that was equivalent to working two full eight-hour shifts every weekday over a four-week period. At the end of the year, the Ministry of Labor certified that 190 deaths occurred over the course of 2013 as a result of either karoshi (illness) or karo jisatsu (suicide) from working too hard.
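To make that figure concrete, here is my own quick arithmetic (an illustration, not from the book): 159 hours of monthly overtime works out to roughly a second full shift every single weekday.

```python
# Back-of-the-envelope check on the karoshi figure (my illustration, not Suzman's).
overtime_hours = 159            # official overtime in the month before Sado's death
weekdays_in_month = 5 * 4       # four working weeks of five weekdays

# ~7.95 extra hours per weekday: effectively a second eight-hour shift on top
# of the regular one, i.e., two full shifts every working day.
extra_per_weekday = overtime_hours / weekdays_in_month
print(f"extra hours per weekday: {extra_per_weekday:.2f}")
```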
Despite a well-funded government campaign in Japan to persuade people to go on holiday once in a while, since the turn of the millennium most Japanese workers still take fewer than half the total days of fully paid leave offered them.
China’s growth has been catalyzed by a disciplined and affordable labor force that has hoovered up manufacturing operations from businesses across the globe, and transformed China into the world’s largest producer and exporter of manufactured goods. In 2016, CCTV, the state broadcaster, which usually only resorts to hyperbole when it has good news to share, announced that more than half a million Chinese citizens die from overworking every year.
This is especially so for those working in China’s frenetic high-technology sector, led by companies like Baidu, Alibaba, Tencent, and Huawei. They now order their working lives according to the mantra “996.” The two 9s refer to the requirements to put in twelve-hour days, from 9 a.m. to 9 p.m., and the 6 refers to the six days of the week that employees with ambitions to get anywhere are expected to be at their workstations.
What makes the individual stories of karoshi and karo jisatsu so unsettling is that what drove the likes of Miwa Sado to lose or take their lives was not the risk of hardship or poverty, but their own ambitions, refracted through the expectations of their employers.
This notion of one’s “own ambitions” is a major aspect of my 2020 essay, here.
The CEO of France Telecom was forced to step down and several senior managers were put on trial charged with “moral harassment,” as a consequence of the toxic working culture they instilled at the company and that prosecutors insisted contributed to thirty-five suicides among staff members over the course of 2008 and 2009.
First published in 1971, The Confessions of a Workaholic is now out of print and its advice is largely forgotten, but Pastor Wayne Oates’s neologism “workaholic” was instantly ushered into our everyday vocabulary.
Too Tired for Quality Leisure
Taking into account time spent getting to and from the workplace and doing essential household activities like shopping, housework, and childcare, working a standard forty-hour week does not leave a great deal of time for leisure. Unsurprisingly, most people in full-time employment use the bulk of their pure leisure time for restful, passive activities like watching TV.
Poor-quality leisure time is a major aspect of my 2020 essay, here.
Our Deeper Ties to Work
Contrary to the narrative that the marketplace is a hotbed of kill-or-be-killed competition, for much of history people in similar trades usually cooperated, collaborated, and supported one another. Many of us not only spend our working lives in the company of colleagues, but also a fair portion of our lives outside of the workplace in their company too.
When Emile Durkheim contemplated possible solutions to the problem of anomie, he recognized that relationships forged in the workplace might help build the “collective consciousness” that once bound people into small, well-integrated village communities.
The Romans’ artisan collegia were not just trade organizations lobbying on behalf of their members’ interests. They played a vital role in establishing civic identities for the humiliores—the lower classes—based on work, and then integrating them into the larger hierarchies that bound Roman society together. In many respects, the collegia operated like autonomous villages within the city. Each had its own customs, rituals, modes of dress, and festivals, and its own patrons, magistrates, and general assemblies modeled on the Roman Senate.
Most modern city dwellers still tend to embed themselves into surprisingly small and often diffuse social networks, which become their individual communities.
Customs and Values
The work we do often becomes a social focal point, which in turn shapes our ambitions, values, and political affiliations. It is no coincidence that when we first test the waters with strangers at social gatherings in cities, we tend to ask them about the work they do. Nearly one in three Americans enters into at least one long-term sexual relationship with someone they meet through work, and a further share meet their spouses there.
Our views of the world converge with those of our colleagues, as our bonds to them are strengthened in the course of pursuing shared goals and celebrating shared achievements.
Purpose, Meaning, “Bullshit” Jobs
In 2013, David Graeber argued that many corporate lawyers, public relations executives, health and academic administrators, and financial service providers hold what he called “bullshit jobs,” which he defined as forms “of employment that is so completely pointless, unnecessary, or pernicious that even the employee cannot justify its existence.”
Workplace surveys consistently find that most people are dissatisfied with the work they do, which suggests that finding purpose in it is often just a coping mechanism—a characteristic of a species whose evolutionary history has been shaped so profoundly by its need for purpose and meaning.
The tendency for organizational bureaucracies to balloon is now sometimes referred to as Parkinson’s Law, after Cyril Northcote Parkinson, who proposed it in a tongue-in-cheek article he published in The Economist in 1955. According to Parkinson’s Law, for bureaucracies to stay alive and grow, they must continuously harvest energy, in the form of cash, and do work even if, like energetic masked weavers, the work serves no more purpose than expending energy.
The most recent iteration of Gallup’s annual State of the Global Workplace report reveals that only a small minority of people find their work meaningful or interesting.
An Oxford study by Frey and Osborne concluded that not only were robots already queuing at the factory gates, but that they had fixed their beady little robot-eyes on nearly half of all existing jobs in the United States. Based on a survey of 702 different professions, they reckoned that 47 percent of current jobs in the United States were at “high risk” of being automated out of existence by as early as 2030. A flood of similar studies followed. Governments, multilateral organizations, think tanks, gilded corporate clubs like the World Economic Forum, and, inevitably, the big management consultancy firms all got in on the act. While each deployed slightly different methodologies, their findings all added layers of detail to Frey and Osborne’s gloomy assessment.
Who will benefit from automation and how?
At first, from the late 1980s through to the early 2000s, the widespread adoption of increasingly affordable digital technologies helped drive substantial reductions in inequality between countries. It did this in particular by helping poorer countries to compete for and then capture a growing proportion of the global manufacturing industry. Now increased automation looks likely to halt or even reverse the trend. By taking labor ever more out of the equation, automation removes any advantage countries with lower wage demands might have, because the costs of technology, unlike labor, are pretty much the same everywhere.
Automation is likely to entrench further structural inequality between and within countries. It will do this firstly by diminishing opportunities for unskilled and semi-skilled people to find decent employment, while simultaneously inflating incomes of those few who continue to manage what are largely automated businesses. As importantly, it will increase returns on capital rather than labor, so expanding the wealth of those who have cash invested in these businesses.
This means straightforwardly that automation will generate further wealth for the already wealthy, while further disadvantaging those who do not have the means to purchase stakes in companies and so free-ride off the work done by automata.
Replacing Scarcity Economics With Abundance Economics
John Maynard Keynes believed the transition to near full automation signaled not just the end of scarcity but of all the social, political, and cultural institutions, norms, values, attitudes, and ambitions that had congealed around what once seemed the eternal challenge of solving the economic problem. He was, in other words, calling time on the economics of scarcity, demanding its replacement with a new economics of abundance.
Nearly thirty years later, John Kenneth Galbraith made a similar argument when he insisted that the economics of scarcity was sustained by desires manufactured by wily advertisers. Galbraith was also of the view that the transition to an economics of abundance would be organic and shaped by individuals relinquishing the pursuit of wealth in favor of worthier work. He also believed that this transition was already happening in post-war America and that at its vanguard was what he called the “New Class”—those who chose their employment not for the money but rather the other rewards it yielded, among them pleasure, satisfaction, and prestige.
Were Keynes still alive today, he may well conclude that he just got the timing wrong and that the “growing pains” of his utopia were indicative of a far more persistent, but ultimately curable, condition. Alternatively, he may conclude that his optimism was unfounded and that our desire to keep solving the economic problem was so strong that even if our basic needs were met, we would continue to create often pointless replacements.
The Ju/’hoansi and other Kalahari foragers are the descendants of a single population group who have lived continuously since the first emergence of modern Homo sapiens possibly as long as 300,000 years ago. If the ultimate measure of sustainability is endurance over time, then hunting and gathering is by far the most sustainable economic approach developed in all of human history. They remind us that our contemporary attitudes to work are not only the progeny of the transition to farming and our migration into cities, but also that the key to living well depends on moderating our personal material aspirations.
There has been a resurgence of interest in models of organizing our future based on dogma or idyllic fantasies of the past. And while these have little in common with the visions of more technically minded utopians, they are no less influential in shaping the opinions and attitudes among a significant proportion of the global population. The recent rise in many countries of the toxic nationalism that the architects of the United Nations hoped would be banished after the horrors of the Second World War is a reflection of this.
The purpose of this book is somewhat less prescriptive. One aim is to reveal how our relationship to work—in the broadest sense—is more fundamental than that imagined by the likes of Keynes. The relationship between energy, life, and work is part of a common bond we have with all other living organisms, and at the same time our purposefulness, our infinite skillfulness, and ability to find satisfaction in even the mundane are part of an evolutionary legacy honed since the very first stirrings of life on earth. The principal purpose, however, has been to loosen the claw-like grasp that scarcity economics has held over our working lives, and to diminish our corresponding and unsustainable preoccupation with economic growth. For recognizing that many of the core assumptions underwriting our economic institutions are an artifact of the agricultural revolution, amplified by our migration into cities, frees us to imagine a whole range of new, more sustainable possible futures for ourselves, and to rise to the challenge of harnessing our restless energy, purposefulness, and creativity to shaping our destiny.
It’s one thing to read this article. It’s another thing entirely to read the full book. Finally, it’s another thing to read the full book, take notes, and spend weeks reviewing and reflecting until epiphanies stay with you – and consequently, materially shape your life for the better.
In support of that last and most powerful outcome, I highly recommend this book to anyone who works (which is effectively everyone). It tells its story in great detail, backed by scrupulous references.
Carefully understanding the history of human civilization and work is the only way to understand it as a means, not an end, and shape society accordingly.