Fifty years ago, Italian business leaders in the Club of Rome gave a jolt to the world in their path-breaking report
Limits to Growth. That thought leadership continues today as
Italian business leaders launch Regeneration 2030, a powerful call for
more holistic, ethical, and sustainable business practices to help the
world achieve the Sustainable Development Goals
(SDGs) and the Paris Climate Agreement.
The 50-year journey from Limits to Growth
to Regeneration 2030 shows how far we have come in understanding the
critical challenges facing humanity, but also how far we still have to
go to meet those challenges.
The half-century since Limits to Growth also defines my own
intellectual journey, since I began university studies at Harvard
University exactly 50 years ago as well. One of the first books that I
was assigned in my introductory economics course was
Limits to Growth. The book made a deep and lasting impression
on me. Here for the first time was a mathematical simulation of the
world economy and nature viewed holistically, using the new system
dynamics modeling then being developed at the Massachusetts
Institute of Technology (MIT).
Limits to Growth warned that compound economic growth was on a
path to overshoot the Earth’s finite resources, leading to a potential
catastrophe in the 21st century. My professor huffily
dismissed the book and its dire warning. The book,
the professor told us, had three marks against it. First, it was
written by engineers rather than economists. Second, it did not understand
the wonders of a self-correcting market system. Third, it was written
at MIT, not at Harvard! Even at the time, I was
not so sure about this easy dismissal of the book’s crucial warning.
Fifty years later, and after countless international meetings,
conferences, treaties, thousands of weighty research studies, and most
importantly, after another half-century of our actual experience on the
planet, we can say the following. First, the growing
world economy is indeed overshooting the Earth’s finite resources.
Scientists now speak of the global economy exceeding the Earth’s
“planetary boundaries.” Second, the violation of these planetary
boundaries threatens the Earth’s physical systems and therefore
humanity itself.
Specifically, humanity is warming the climate;
destroying the habitat of millions of other species; and polluting the
air, freshwater systems, soils, and oceans. Third, the market economy by
itself will not stop this destruction.
Many of
the most dangerous actions – such as emitting climate-changing
greenhouse gases, destroying native forests, and adding chemical
nutrients to the rivers and estuaries – do not come with market signals
attached. Earth is currently treated as a free dumping
ground for many horrendously destructive practices.
Twenty years after Limits to Growth, in 1992, the world’s
governments assembled at the Rio de Janeiro Earth Summit to adopt
several environmental treaties, including the UN Framework Convention on
Climate Change (UNFCCC) and the Convention on Biological
Diversity.
Twenty years later, in 2012, the same governments
re-assembled in Rio to discuss the fact that the environmental treaties
were not working properly. Earth, they acknowledged, was in growing
danger. At that 2012 summit they committed to establish
Sustainable Development Goals (SDGs) to guide humanity to safety. In
2015, all 193 UN member states adopted the SDGs and a few weeks later
signed the Paris Climate Agreement to implement the 1992 climate treaty.
In short, we have gone a half-century from the first warnings to today.
We have adopted many treaties and many global goals, but in practice,
have still not changed course. The Earth continues to warm, indeed at
an accelerating rate. The Earth’s average
temperature is now 1.2°C warmer than in the pre-industrial period
(dated as 1880-1920), and is higher than at any time during the past
10,000 years of civilization.
Warming has accelerated to more than
0.3°C per decade, meaning that in the next decade we
will very possibly overshoot the 1.5°C warming limit that the world
agreed to in Paris.
A key insight for our future is that we now understand the difference
between mere “economic growth” and real economic progress. Economic
growth focuses on raising traditional measures of national income, and
is merely doing more of what we are already doing:
more pollution, more greenhouse gas emissions, more destruction of the
forests.
True economic progress aims to raise the wellbeing of
humanity, by ending poverty, achieving a fairer and more just economy,
ensuring quality education for all children, preventing
new disease outbreaks, and increasing living standards through
sustainable technologies and business practices. True economic progress
aims to transform our societies and technologies to raise human
wellbeing.
Regeneration 2030 is a powerful business initiative led by Italian
business leaders committed to real transformation. Regeneration aims to
learn from nature itself, by creating a more circular economy that
eliminates wastes and pollution by recycling, reusing,
and regenerating natural resources. Of course, an economy can’t be
entirely circular – it needs energy from the outside (otherwise
violating the laws of thermodynamics).
But rather than the energy
coming from digging up and burning fossil fuels, the energy
of the future should come from the sun (including solar power, wind,
hydroelectric, and sustainable bioenergy) and from other safe
technologies. Even safe man-made fusion energy may be within technical
and economic reach in a few decades.
For my part, I am also trying to help regenerate economics, so that it
becomes a new and more holistic academic discipline of sustainable development.
Just as business needs to be more holistic and aligned with the SDGs,
economics as an intellectual discipline needs
to recognize that the market economy must be embedded within an ethical
framework, and that politics must aim for the common good. Scientific
disciplines must work together, joining forces across the natural
sciences, policy sciences, human sciences, and
the arts.
Pope Francis has spurred the call for such a new and
holistic economics by encouraging young people to adopt a new “Economy
of Francesco,” inspired by the love of nature and humanity of St.
Francis of Assisi.
Sustainable Development, Regenerative Economy, and the Economy of
Francesco are, at the core, a new way of harnessing our know-how, 21st
century technologies, and ethics, to promote human wellbeing. The
first principle is the common good – and that
means that we must start with peace and cooperation. Ending the war in
Ukraine at the negotiating table without further delay, and finding
global common purpose between the West and East, is a good place for us
to begin anew.
Image above: This Hubble Space Telescope image by NASA of the cluster Westerlund 2 and its surroundings was released in 2015 to celebrate Hubble's 25th year in orbit. From http://time.com/3833015/hubble-telescope-photo/.
Humanity has a lot of problems these days. Climate change, increasing economic inequality, crashing biodiversity, political polarization, and a global debt bubble are just a few of our worries.
None of these trends can continue indefinitely without leading to a serious failure of our civilization’s ability to maintain itself. Taken together, these metastasizing problems suggest we are headed toward some kind of historic discontinuity.
Serious discontinuities tend to disrupt the timelines of all complex societies (another name for civilizations—that is, societies with cities, writing, money, and full-time division of labor).
The ancient Roman, Egyptian, and Mayan civilizations all collapsed. Archaeologists, historians, and systems thinkers have spent decades seeking an explanation for this pattern of failure—a general unified theory of civilizational collapse, if you will.
One of the most promising concepts that could serve as the basis for such a theory comes from resilience science, a branch of ecology (the study of the relationship between organisms and their environments).
Why Civilizations Collapse: The Adaptive Cycle
Ecosystems have been observed almost universally to repeatedly pass through four phases of the adaptive cycle: exploitation, conservation, release, and reorganization. Imagine, for example, a Ponderosa pine forest.
Following a disturbance such as a fire (in which stored carbon is released into the environment), hardy and adaptable “pioneer” species of plants and small animals fill in open niches and reproduce rapidly.
This reorganization phase of the cycle soon transitions to an exploitation phase, in which those species that can take advantage of relationships with other species start to dominate. These relationships make the system more stable, but at the expense of diversity.
During the conservation phase, resources like nutrients, water, and sunlight are so taken up by the dominant species that the system as a whole eventually loses its flexibility to deal with changing conditions.
These trends lead to a point where the system is susceptible to a crash—a release phase. Many trees die, dispersing their nutrients, opening the forest canopy to let more light in, and providing habitat for shrubs and small animals. The cycle starts over.
Civilizations do roughly the same thing. In their early days, complex societies are populated with generalist pioneers (people who do lots of things reasonably well) living in an environment with abundant resources ready to be exploited. These people develop tools to enable them to exploit their resources more effectively.
Division of labor and trade with increasingly distant regions also aids in more thorough resource exploitation. Trading and administrative centers, i.e., cities, appear and grow. Money is increasingly used to facilitate trade, while debt enables a transfer of consumption from the future to the present. Specialists in violence, armed with improved weaponry, conquer surrounding peoples.
Complexity (more kinds of tools, more social classes, more specialization) solves problems and enables accumulation of wealth, leading to a conservation phase during which an empire is built and great achievements are made in the arts and sciences.
However, as time goes on, the costs of complexity accumulate and the resilience of the society declines. Tax burdens become unbearable, natural resources become depleted, environments become polluted, and conquered peoples become restless.
At its height, each civilization appears stable and invincible. Yet it is just at this moment of triumph that it is vulnerable to external enemies and internal discord. Debt can no longer be repaid. Conquered peoples revolt. A natural disaster breaks open the façade of stability and control.
Collapse often comes swiftly, leaving ruin in its wake.
But at least some of the components that made the civilization great (including tools and elements of practical knowledge) persist, and the natural environment has opportunity to regenerate and recover, eventually enabling reorganization and a new exploitation phase—that is, the rise of yet another civilization.
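For readers who like to see a model rather than read about one, here is a minimal sketch of the adaptive cycle as a toy simulation. The phase names, and the idea that resilience falls as a system becomes more tightly interconnected, come from the resilience-science description above; the single "connectedness" variable, the inverse resilience rule, and every number are illustrative assumptions, not measurements.

```python
# A toy sketch of the adaptive cycle: "connectedness" accumulates during
# growth, resilience (assumed here to be its inverse) declines, and a random
# disturbance eventually triggers a release followed by reorganization.
# All values are illustrative assumptions.
import random

def simulate(years: int = 60, seed: int = 1) -> None:
    random.seed(seed)
    connectedness = 0.1          # how tightly resources are locked up in the system
    for year in range(years):
        resilience = 1.0 - connectedness       # assumed inverse relationship
        disturbance = random.random() * 0.6    # fire, drought, invasion...
        if disturbance > resilience:
            print(f"year {year:2d}: RELEASE (disturbance {disturbance:.2f} "
                  f"> resilience {resilience:.2f}) -> reorganization")
            connectedness = 0.1                # nutrients dispersed, niches reopen
        else:
            phase = "exploitation" if connectedness < 0.5 else "conservation"
            print(f"year {year:2d}: {phase:12s} resilience {resilience:.2f}")
            connectedness = min(0.95, connectedness + 0.05)  # slow locking-up

simulate()
```

Run with different seeds, the toy system still traces the same loop described above: growth, rigidity, release, and reorganization.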
Energy Is Everything
Global industrial civilization shows significant signs of being in its conservation phase. Our accomplishments are mind-boggling, but our systems are overstretched, and problems (including climate change, inequality, and political dysfunction) are accumulating and worsening.
However, our civilization is different from any of its predecessors. Unlike the ancient Romans, Greeks, Egyptians, Shang Dynasty Chinese, Incas, Aztecs, and Mayans, we have built a civilization that is global in scope.
We have invented modes of transportation and communication previously unimaginable. Thanks to advances in public health and agriculture, the total human population has grown to many times its size when Roman armies marched across North Africa, Europe, and Britain. Have we perhaps outgrown the adaptive cycle and escaped natural checks to perpetual expansion?
In order to answer the question, we must first inquire why modern civilization has been so successful. The rise of technology, including advances in metallurgy and engineering, certainly played a part. These provided better ways of obtaining and harnessing energy.
But it’s the rapid shift in qualities and quantities of energy available to us that really made the difference.
Previously, people derived their energy from annual plant growth (food and firewood), and manipulated their environment using human and animal muscle power. These energy sources were inherently limited. But, starting in the 19th century, new technologies enabled us to access and harness the energy of fossil fuels. And fossil fuels—coal, oil, and natural gas—were able to provide energy in amounts far surpassing previous energy sources.
Energy is everything. All terrestrial ecosystems and all human societies are essentially machines for using (and dissipating) solar energy that has been collected and concentrated through photosynthesis. We like to think that money makes the world go ’round, but it is actually energy that enables us to do anything at all—from merely getting up in the morning to launching a space station. And having lots of energy available cheaply can enable us to do a great deal.
Fossil fuels represent tens of millions of years’ worth of stored ancient sunlight. They are energy-dense, portable, and storable sources of power. Accessing them changed nearly everything about human existence.
They were uniquely transformative in that they enabled higher rates of harvesting and using all other resources—via tractors, bulldozers, powered mining equipment, chainsaws, motorized fishing trawlers, and more.
Take just one example. In all previous agrarian civilizations, roughly three-quarters of the population had to farm in order to supply a food surplus to support the other 25 percent—who lived as aristocrats, traders, soldiers, artisans, and so on. Fossil fuels enabled the industrialization and automation of agriculture, as well as longer-distance distribution chains.
Today only one or two percent of the U.S. population need to farm full-time in order to supply everyone else with food. The industrialization of food systems has freed up nearly all of the former peasant class to move to cities and take up jobs in manufacturing, marketing, finance, advertising, management, sales, and so on.
Thus urbanization and the dramatic expansion of the middle class during the 20th century were almost entirely attributable to fossil fuels.
But fossil fuels have been a bargain with the devil: these are depleting, non-renewable resources, and burning them produces carbon dioxide and other greenhouse gases, changing the climate and the chemistry of the world’s oceans.
These are not small problems. Climate change by itself is far and away the most serious pollution dilemma any human society has ever faced, and could lead to crashing ecosystems, failing food systems, and widespread forced human migration.
Denial comes in shades, some of them quite benign. Many thoughtful and informed people acknowledge the threats of climate change, species extinctions, soil depletion, and so on, and insist that we can overcome these threats if we just try harder. They are often on the right track when they propose changes.
Elect different, more responsible politicians. Donate to environmental nonprofit organizations. Drive an electric car.
Put solar panels on our roofs. Start solar co-ops or regional non-profit utility companies that aim to source all electricity from renewable sources. Eat organic food. Shop at local farmers markets.
These are all actions that move society in the right direction (that is, away from the brink of failure)—but in small increments. Perhaps people can be motivated to undertake such efforts through the belief that a smooth transition and a happy future are possible, and that renewable energy will create plentiful jobs and lead to a perpetually growing green economy.
There is no point in discouraging such beliefs and their related actions; quite the contrary: they should, if anything, be encouraged. Such practical efforts, however motivated or rationalized, could help moderate collapse, even if they can’t prevent it (a point we’ll return to below). But an element of denial persists nonetheless—denial, that is, of the reality that the overall trajectory of modern industrial society is beyond our control, and that it leads inexorably toward overshoot and collapse.
What to Do?
All of the above may help us better understand why the world seems to be running off the rails. But the implications are horrific. If all this is true, then we now face more-or-less inevitable economic, social, political, and ecological calamity. And since industrial civilization is now global, and human population levels are multiples higher than in any previous century, this calamity could occur on a scale never seen before.
Although no one can possibly predict at this point just how complete and awful collapse might actually be, even human extinction is conceivable (though no one can say with any confidence that it is likely, much less inevitable).
This is more than a fragile human psyche can bear. One’s own mortality is hard enough to contemplate. A school of psychology (“terror management theory”) proposes that many of our cultural institutions and practices (religion, values of national identity) exist at least in part to help us deal with the intolerable knowledge of our inevitable personal demise.
How much harder must it be to acknowledge signs of the imminent passing of one’s entire way of life, and the extreme disruption of familiar ecosystems? It is therefore no wonder that so many of us opt for denial and distraction.
There’s no question that collapse is a scary word.
When we hear it, we tend to think immediately of images from movies like Mad Max and The Road. We assume collapse means a sudden and complete dissolution of everything meaningful. Our reasoning shuts down. But this is just when we need it most.
In reality, there are degrees of collapse, and history shows that the process has usually taken decades and sometimes centuries to unfold, often in stair-steps punctuated by periods of partial recovery. Further, it may be possible to intervene in collapse to improve outcomes—for ourselves, our communities, our species, and thousands of other species.
After the collapse of the Roman Empire, medieval Irish monks may have “saved civilization” by memorizing and transcribing ancient texts. Could we, with planning and motivation, do as much and more?
Many of the things we could do toward this end are already being done in order to avert climate change and other converging crises.
Again, people who voluntarily reduce energy usage, eat locally grown organic food, make the effort to get to know their neighbors, get off the consumer treadmill, reduce their debt, help protect local biodiversity by planting species that feed or shelter native pollinators, use biochar in their gardens, support political candidates who prioritize addressing the sustainability crisis, and contribute to environmental, population, and human rights organizations are all helping moderate the impending collapse and ensure that there will be more survivors. We could do more.
Acting together, we could start to re-green the planet; begin to incorporate captured carbon not only in soils, but in nearly everything we make, including concrete, paper, and plastics; and design a new economic system based on mutual aid rather than competition, debt, and perpetual growth. All of these efforts make sense with or without the knowledge that civilization is nearing its sell-by date.
How we describe the goals of these efforts—whether as ways of improving people’s lives, as ways to save the planet, as fulfilling the evolutionary potential of our species, as contributing to a general spiritual awakening, or as ways of moderating an inevitable civilizational crash—is relatively unimportant.
However, the Big Picture (an understanding of the adaptive cycle, the role of energy, and our overshoot predicament) adds both a sense of urgency, and also a new set of priorities that are currently being neglected.
For example, when civilizations collapse, culturally significant knowledge is typically lost. It’s probably inevitable that we will lose a great deal of our shared knowledge during the coming centuries. Much of this information is trivial anyway (will our distant descendants really suffer from not having the ability to watch archived episodes of Let’s Make a Deal or Storage Wars?).
Yet people across the globe now use fragile storage media—computer and server hard drives—to store everything from music to books to instruction manuals. In the event that the world’s electricity grids could no longer be maintained, we would miss more than comfort and convenience; we could lose science, higher mathematics, and history.
It’s not only the dominant industrial culture that is vulnerable to information loss. Indigenous cultures that have survived for millennia are being rapidly eroded by the forces of globalization, resulting in the extinction of region-specific knowledge that could help future humans live sustainably.
Upon whom does the responsibility fall to curate, safeguard, and reproduce all this knowledge, if not those who understand its peril?
Act Where You Are: Community Resilience
We at Post Carbon Institute (PCI) have been aware of the Big Picture since the founding of the organization 15 years ago. We’ve been privileged to meet, and draw upon the insights of, some of the pioneering ecologists of the 1960s, ’70s, and ’80s who laid the basis of our current understanding of resilience science, systems thinking, climate change, resource depletion, and much more. And we’ve strived to convey that understanding to a younger generation of thinkers and activists.
Throughout this time, we have continually grappled with the question, “What plan for action makes the most sense in the context of the Big Picture, given our meager organizational resources?”
After protracted discussion, we’ve hit upon a four-fold strategy.

Encourage resilience building at the community level.
Resilience is the capacity of a system to encounter disruption and still maintain its basic structure and functions. During the conservation phase, a system’s resilience is typically at its lowest point in the entire adaptive cycle. If it is possible at this point to build resilience into human social and ecological systems, then the approaching release phase of the cycle may be more moderate and less intense.
Why undertake resilience building in communities, rather than attempting to do so at the national or international level? It’s because the community is the most available and effective level of scale at which to intervene in human systems.
National action is difficult these days, and not only in the United States: discussions about nearly everything quickly become politicized, polarized, and contested. It’s at the community level where we most directly interact with the people and institutions that make up our society. It’s where we’re most affected by the decisions society makes: what jobs are available to us, what infrastructure is available for our use, and what policies exist that limit or empower us.
And critically, it’s where the majority of us who do not wield major political or economic power can most directly affect society, as voters, neighbors, entrepreneurs, volunteers, shoppers, activists, and elected officials.
PCI has supported Transition Initiatives since its inception as one useful, locally replicable, and adaptable model for community resilience building.
Leave good ideas lying around.
Naomi Klein, in her book The Shock Doctrine, quotes economist Milton Friedman, who wrote:
“Only a crisis—actual or perceived—produces real change. When that crisis occurs, the actions that are taken depend on the ideas that are lying around. That, I believe, is our basic function: to develop alternatives to existing policies, to keep them alive and available until the politically impossible becomes the politically inevitable.”
Friedman and other neoliberal economists have used this “shock doctrine” for decades to undermine regional economies, national governments, and indigenous cultures in order to further the project of corporate-led economic globalization. Klein’s point is that the key to taking advantage of crises is having effective system-changing plans waiting in the wings for the ripe moment.
And that’s a strategy that makes sense as society as a whole teeters on the brink of an immensely disruptive shift.
What ideas and skills need to be lying around as industrial civilization crumbles?
One collection of ideas and skills that’s already handily packaged and awaiting adoption is permaculture—a set of design tools for living created by ecologists back in the 1970s who understood that industrial civilization would eventually reach its limits. Another set consists of consensus group decision-making skills. The list could go on at some length.
Target innovators and early adopters.
Back in the 1960s, Everett Rogers, a professor of communications, contributed the theory of the Diffusion of Innovations, which describes how, why, and at what rate new ideas, social innovations, and technology spread throughout culture.
The key to the theory is his identification of different types of individuals in the population, in terms of how they relate to the development and adoption of something new: innovators, early adopters, early majority, late majority, and laggards.
Innovators are important, but the success of their efforts depends on diffusion of the innovation among early adopters, who tend to be few in number but exceptionally influential in the general population.
At PCI, we have decided to focus our communications on early adopters.
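Rogers defined these categories statistically, by how far an individual's adoption time falls from the mean of a roughly bell-shaped adoption curve: about 2.5 percent innovators, 13.5 percent early adopters, 34 percent each for the early and late majorities, and 16 percent laggards. The short sketch below uses entirely made-up adoption times to show how that classification works; it is an illustration of Rogers' convention, not anything from PCI's own work.

```python
# Classify simulated adopters into Rogers' categories by standard deviations
# from the mean adoption time. The adoption times are made-up data.
import random
import statistics

def categorize(adoption_time: float, mean: float, sd: float) -> str:
    z = (adoption_time - mean) / sd
    if z < -2:
        return "innovator"        # earliest ~2.5%
    if z < -1:
        return "early adopter"    # next ~13.5%
    if z < 0:
        return "early majority"   # ~34%
    if z < 1:
        return "late majority"    # ~34%
    return "laggard"              # last ~16%

random.seed(42)
times = [random.gauss(36, 12) for _ in range(10_000)]   # months until adoption
mean, sd = statistics.mean(times), statistics.stdev(times)

counts = {}
for t in times:
    label = categorize(t, mean, sd)
    counts[label] = counts.get(label, 0) + 1

for label, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"{label:15s} {n / len(times):5.1%}")
```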
Help people grasp the Big Picture.
Discussions about the vulnerability of civilization to collapse are not for everyone. Some of us are too psychologically fragile. All of us need a break occasionally, and time to feel and process the emotions that contemplating the Big Picture inevitably evokes.
But for those able to take in the information and still function, the Big Picture offers helpful perspective. It confirms what many of us already intuitively know. And it provides a context for strategic action.
Pro-Social, Nonpartisan
I’m frequently asked if I have hope for the future. My usual reply is along these lines: hope is not just an expectation of better times ahead; it is an active attitude, a determination to achieve the best possible outcome regardless of the challenges one is facing. PCI Fellow David Orr summed this up best when he wrote, “Hope is a verb with its sleeves rolled up.”
However, if that’s as far as the discussion goes, merely redefining “hope” may seem facile and unsatisfying. The questioner wants and needs reasonable grounds for believing that an outcome is possible that is something other than horrific. There is indeed evidence along these lines, and it should not be ignored.
Steven Pinker, in his book The Better Angels of Our Nature, argues that we humans are becoming more peaceful and cooperative. Now, it could be argued that any decline in violence during the past few decades can be seen as yet another indication that civilization is in a conservation phase of the adaptive cycle: we have attained a balance of power, facilitated by the wealth flowing ultimately from fossil fuels; perhaps violence is simply being held in abeyance until the dam breaks and we head into the release phase of the cycle. Nevertheless, evolution is real, and for humans it occurs more rapidly via culture than through genes. It is entirely possible, therefore, that we humans are rapidly evolving to live more peacefully in larger groups.
Earlier I explained how the findings of neuroscience help us understand why so many of us turn to denial and distraction in the face of terrible threats to civilization’s survival. Neuroscience also offers good news: it teaches us that cooperative impulses are rooted deep in our evolutionary past, just like competitive ones.
Self-restraint and empathy for others are partly learned behaviors, acquired and developed in the same way as our capacity for language. We inherit both selfishness and the capacity for altruism, but culture generally nudges us more in the direction of the latter, as parents are traditionally encouraged to teach their children to share and not to be wasteful or arrogant.
Disaster research informs us that, in the early phases of crisis, people typically respond with extraordinary degrees of cooperation and self-sacrifice (I witnessed this in the immediate aftermath of wildfires in my community of Santa Rosa, California). But if privation persists, they may turn toward blame and competition for scarce resources.
All of this suggests that the one thing that is most likely to influence how our communities get through the coming meta-crisis is the quality of relationships among members. A great deal depends on whether we exhibit pro-social attitudes and responses, while discouraging blame and panic. Those of us working to build community resilience need to avoid partisan frames and loaded words, and appeal to shared values. Everyone must understand that we’re all in this together.
The Big Picture can help here, if it aids people in grasping that the collapse of civilization is not any one group’s fault. It is only by pulling together that we can hope to salvage and protect what is most intrinsically valuable about our world, and perhaps even improve lives over the long term.
Hard times are in store. But that doesn’t mean there’s nothing we can do. Each day of relative normalcy that remains is an occasion for thankfulness and an opportunity for action.
Over the next few minutes I hope to share with you a little of what I’ve learned about the likely trajectory of industrial society for the remainder of this century, and some speculations about the possible role of music and related arts within that trajectory.
Perhaps the best way to introduce the ideas and information I want to share is to tell you some of my personal story.
I grew up in the Midwestern states in the 1950s and ’60s, where my interests swung between the sciences (my father was an industrial chemist) and the arts: I loved drawing and painting, and at age 11 fell in love with classical music.
I demanded that my parents get me a violin, and fortunately when they did they also paid for lessons with the concertmaster of the local symphony—a gentleman named Louis Riemer, who had studied briefly with Leopold Auer at Juilliard.
Mr. Riemer gave me a good technical foundation on the instrument, for which I will always be grateful. But, just as I was graduating high school and heading for college, the Summer of Love and the Vietnam War overtook America. Suddenly playing Haydn quartets seemed less interesting.
At the University of Iowa I continued with music lessons and played in the orchestra, but spent increasing amounts of time attending protests, experimenting with psychedelic drugs, and listening to the Grateful Dead.
I taught myself to play the guitar and spent the next seven years professionally playing electric guitar and electric violin in rock bands. But something else happened right after college that would eventually send me down an entirely different path: I started reading environmental literature.
Probably the most influential book I came across at the time was The Limits to Growth.
A team of young experts in a new field called system dynamics, working at MIT, had used a computer to model the likely interactions between Earth’s resources, human population, pollution levels, food production, and other basic factors of the economy.
They found that, in their models and simulations, global growth in population and industrial output could be maintained for only a few decades, no matter how they jiggered the software or the input data.
Doubling Earth’s resources would put off the inevitable peak and decline by only a few years. The only way to generate a scenario without a crash was to model policies to end population growth and dramatically cut the rates at which we’re consuming resources.
In other words, the only way to avoid the collapse of civilization was to voluntarily scale back just about everything we’re doing that entails interaction with the physical world around us.
At the time, the Limits to Growth authors were optimistic that, once policy makers understood the alternatives and the consequences, they would choose to restrict population and consumption.
However, the notion that economic growth might fairly soon crash against the planet’s limits proved extremely unwelcome to economists and politicians, who had come to count on the endless growth of the economy to provide jobs for workers, profits for investors, and increasing tax revenues for governments.
Articles appeared in The New York Times, Newsweek, and other prominent publications purporting to debunk the idea of natural limits.
Ronald Reagan would soon insist that “There are no such things as limits to growth, because there are no limits to the human capacity for intelligence, imagination, and wonder.” That’s an inspiring sentiment. But, of course, the MIT scientists hadn’t been modeling intelligence, imagination, or wonder.
They were looking at mineral resources, soil fertility, and the capacity of the atmosphere and oceans to absorb wastes and pollution.
Imagination and wonder are terrific, but by themselves they don’t increase the size of the world’s forest cover or the number of wild fish in the oceans. In reality, the pushback against the MIT study was all smoke and mirrors.
An abundance of subsequent research supported the Limits to Growth scenario studies. The computer software used in 1972 was primitive by current standards, but it has been upgraded regularly since then. The data have also evolved in the intervening decades.
Today you can supply upgraded software with the very latest figures on population, resources, food production, industrial output, and climate change, and essentially the same scenarios will tumble onto your computer screen.
The “standard run” scenario, in which policy makers continue to seek as much growth as possible, always shows a peak and decline in world industrial output around the end of the first quarter of this century, followed by declining food production, then declining population.
And here we are, rapidly approaching the end of the first quarter of the century.
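World3 itself is a large model, but the overall shape of that "standard run" can be illustrated with a much smaller stock-and-flow toy. The sketch below is emphatically not the MIT model: its three stocks, coupling rules, and coefficients are invented solely to show how output fed by a depleting resource grows, peaks, and then declines.

```python
# A deliberately tiny stock-and-flow sketch in the spirit of system dynamics.
# It is NOT World3: the stocks, coupling rules, and every coefficient are
# illustrative assumptions chosen only to produce growth, overshoot, and decline.

def run(steps: int = 200):
    resource = 1000.0     # nonrenewable resource stock (arbitrary units)
    industry = 1.0        # industrial capital / output
    population = 1.0      # relative population

    history = []
    for t in range(steps):
        # Extraction yields less as the resource depletes (low-hanging-fruit rule).
        efficiency = resource / 1000.0
        output = industry * efficiency

        # Output grows capital and supports population; depreciation and a
        # consumption constraint pull them back down.
        industry += 0.05 * output - 0.03 * industry
        population += 0.02 * population * min(1.0, output / population) - 0.01 * population
        resource = max(0.0, resource - output)

        history.append((t, round(resource, 1), round(industry, 2), round(population, 2)))

    peak = max(history, key=lambda row: row[2])
    print(f"industrial output peaks around step {peak[0]}, then declines")
    for row in history[::25]:
        print("t={:3d}  resource={:7.1f}  industry={:6.2f}  population={:6.2f}".format(*row))

run()
```

However the coefficients are tweaked, coupling growth to a depleting stock gives the same qualitative curve: expansion, a peak, and a long decline.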
Five years after the publication of The Limits to Growth, I was experiencing my own limits—in terms of success in the commercial rock music scene. In retrospect, that was a very good thing.
Making music is often wonderful, but the music business often isn’t. With my interests straying toward other subjects, I started writing essays as a way of making sense of the world. My stuff started getting published, and soon I was making my living with words.
In effect, I was chronicling the early phase of society’s collision with natural limits as it was happening. Here’s the current scorecard: We’re now losing 25 billion tons of topsoil a year due to industrial agriculture.
At the same time, we’re adding 80 million new humans each year on a net basis, with our population growing by about a billion every 12 years.
Meanwhile, the planet is reeling from human-forced global warming: glaciers and permafrost are melting, the seas are rising, and the pace is accelerating.
Global wildlife populations have declined nearly 60 percent since the 1970s, and species are going extinct at 1,000 times the normal background rate.
Healthy coral reefs could be completely gone by 2050, and by then oceans may be almost completely free of fish due to climate change, overfishing, pollution, and habitat loss.
Over the years, I have written several books about fossil fuel depletion, co-authored a lengthy report on the unsustainability of our current food system, and researched and discussed climate change and other pollution issues.
I even produced a book in 2011 titled The End of Growth, which explains in some detail how we are living out the “standard run” scenario from 1972.
Along the way, I’ve tried to satisfy my own curiosity with regard to the question, How and why have humans gotten themselves into this mess? Finding answers required that I delve into history and anthropology.
It turns out that, while we humans have been expanding our range and altering our environments for millennia, our efforts got turbocharged starting in the nineteenth century.
The main driver was cheap, concentrated sources of energy in the forms of coal, oil, and natural gas—fossil fuels. These were a one-time-only gift from nature, and they changed everything.
Energy is essential to everything we do, and with cheap, abundant, concentrated energy a lot became possible that was previously unimaginable.
We used newly invented technologies to channel this sudden abundance of energy toward projects that everyone agreed were beneficial—growing more food, extracting more raw materials, manufacturing more products, transporting ourselves and our goods faster and over further distances, defeating diseases with modern medicine, entertaining ourselves, and protecting ourselves with advanced weaponry.
We used some of our fossil fuels to make electricity, an extremely versatile energy carrier that, among many other things, enabled music to be amplified, recorded, and reproduced on an assortment of media. In short, fossil fuels increased our power over the world around us, and the power of some of us over others.
But our increasing reliance on fossil fuels was in two respects a bargain with the devil.
First, extracting, transporting, and burning these fuels polluted air and water, and caused a subtle but gradually accelerating change in the chemistry of the planetary atmosphere and the world’s oceans.
Second, fossil fuels are finite, nonrenewable, and depleting resources that we exploit using the low-hanging fruit principle.
That means that as we extract and burn them, each new increment entails higher monetary and energy costs, as well as greater environmental risk.
Basing our entire economy on the ever-increasing rate at which we burn a finite fuel supply is the very definition of stupid. And yet we do this with brilliant technical efficiency.
Fossil fuels made us a more successful species, able to increase our numbers and average per-capita consumption, and powerful enough to steal rapidly increasing amounts of ecological space away from other creatures.
This success has had serious side effects, including the fouling of air and water, the decline and extinction of a rapidly growing list of other creatures, and the increasing lethality of warfare.
Fossil fuels made rapid economic growth possible, yet the expansion of Earth’s carrying capacity for humans, based on fossil fuels, must inevitably prove to be as temporary as those fuels themselves.
Like rapidly proliferating bacteria in a Petri dish, we are destined to consume our nutrients and face the consequences.
In 1997, I was invited to help design, and teach in, one of the first college programs on sustainability. Ten years later, I joined the environmental nonprofit think tank Post Carbon Institute as Senior Fellow, a position I am happy to fill currently.
Throughout all these years there was always music. I played wedding and orchestra gigs, and enjoyed concerts and reading sessions with string quartets and string trios, and duos with guitar or piano.
Today, I still spend two hours a day practicing—you know the drill: an hour of scales, arpeggios, and etudes, followed by an hour or so of repertoire—doing my best to hone my modest technique and learn new music. It’s nearly always the highlight of my day.
How do these two activities—writing about our environmental crisis and playing music—fit together?
And more deeply, what role might music and the arts generally play as part of our human response to climate change and ecological overshoot?
In the 1997 film “Titanic,” Wallace Hartley, the violinist and leader of the band on the ill-fated ship, turns to his band mates as the water rises around him and says: “Gentlemen, it has been a privilege playing with you tonight.”
Is the only contribution we musicians can make at this moment in history to bravely go down with the ship, lifting the spirits of other passengers?
I think we can do quite a bit better. What I mean by that will take a while to unpack, and will require a little meander.
We might start by asking, What makes a culture worth sustaining?
One answer that comes to mind is, beauty—from the spare, honest beauty of a Zen temple or a shakuhachi flute, to the over-the-top ornate beauty of an Italian Renaissance cathedral or a Puccini opera. Aesthetics are a product of time and place.
But the human response to beauty, and the urge to create it, are instinctive and transcend humanity itself.
We know this because other animals are also obsessed with beauty. During the 1940s, English musicologist Len Howard devoted herself to studying the music of wild birds. According to Theodore Barber’s account of her work (in his marvelous book, The Human Nature of Birds),
She became personally acquainted with many and knew some for their entire lives. . . . Her intimate study of bird songs led to . . . surprising conclusions:
Birds, like humans, enjoy their songs. They take pleasure in singing, and they enjoy hearing even their territorial rivals sing.
Birds not only convey messages and express feelings and emotions in their songs, but at times they sing simply because they are happy.
[Birds of the same species] can be reliably identified by their unique variations of the species’ song. In fact, conspecific birds apparently differ in musical talent as much as humans. This unexpected variability is due to the individual bird’s interpretation of the theme, his technical ability in executing it, his “style” of delivery, and the quality or timbre of his voice. Some very poor singers are found in every songbird species. . . . There are also very superior musicians among songbirds. For instance, over a period of a few days, a talented blackbird creatively and spontaneously composed the opening phrase of the Rondo in Beethoven’s violin concerto. (He had not previously heard it.) During the remainder of the season he varied the interpretation of the phrase; “the pace was quickened toward the end . . . a rubato effect that added brilliance to the performance.”
Of course, it’s a long way from a bird’s song to a performance of Mahler’s “Resurrection” symphony; the latter is a lot more complicated and expensive to produce, and requires a lot of cooperation.
Music and the other arts came to be developed to extremes of complexity largely as a result of the process of professionalization—which again can only be understood in terms of anthropology and history.
Hunter-gatherers had music, but it was relatively simple—as simple and beautiful in its way as a birdsong. With more intensive means of food production—farming—we were able to produce food surpluses that could be stored.
That enabled the construction of cities and full-time division of labor. Homo sapiens has been around for about 350,000 years, but farming is a comparatively recent development, starting only about 10,000 years ago. It was a fateful shift.
For the first time in the human story, we see writing, money, and far more sophisticated weapons and other tools. We also see full-time artists and musicians.
Each of these developments, and each of these technologies, changed us. For example, Marshall McLuhan and others have pointed out that the use of writing, and especially alphabetic writing, tended to nudge our thought processes in certain directions. As the classicist Eric Havelock once put it:
It is only as language is written down that it becomes possible to think about it. The acoustic medium, being incapable of visualization, did not achieve recognition as a phenomenon wholly separable from the person who used it.
But in the alphabetized document the medium became objectified. There it was, reproduced perfectly in the alphabet . . . no longer a function of “me” the speaker but a document with an independent existence.
The earliest important document in alphabetic script was the Bible—The Book. And to this day millions of people regard that document with awe as an almost animate source of absolute wisdom and authority.
Johann Sebastian Bach was himself devoted to the Good Book, and he lived not far from the birthplace of the printing press, an invention that further intensified the psychological impact of the written word by emphasizing (through its movable type) the interchangeability of alphabetic characters, and by enabling the majority of the population to own and read printed Bibles.
The printing press also set inventors to contemplating the usefulness of interchangeable parts, thus helping seed the industrial revolution.
If the writing of words made human thinking more rational and sequential, the writing of music had an analogous effect.
Rather than being memorized, tunes could be jotted down and read later, perhaps by someone else who had never heard the tune before. Tunes could become more complicated, yet still be “remembered” on paper. Tunes could take on an existence of their own; they could be bought and sold.
Every new technological advantage implies the potential loss of some former ability. Writing, as Plato noted, saps the memory. Similarly, reliance on musical notation does little to foster the ability to improvise.
Everyone who has spent much time around a professional orchestra knows that most classical string players are spectacular sight-readers but utterly inept improvisers (though that’s changing). How many times have I been asked to “Play us a tune,” only to hear myself reply ineptly, “But I don’t have any music with me”?
And so progress is usually a tradeoff. And like biological evolution, it is only temporarily directional. Evolution doesn’t have a final goal in mind; it’s just an endless process of adaptation.
Often it leads to dead ends. All species eventually go extinct, and, sometimes, vast numbers of species go extinct all at once. Similarly, cultural evolution appears to proceed in cycles: over the past ten thousand years, roughly 24 civilizations have arisen, but they have all tended to go through a process of expansion and then collapse.
With our linguistic brains, we tend to assign cosmic meanings to these gains, and often-rapid losses, of complexity. But in the end, it’s not about smiling or angry gods; it’s not about human ingenuity or collective moral decay; it’s about environmental carrying capacity.
In his theory of culture, anthropologist Marvin Harris located the arts in what he called the superstructure of society, together with religions and ideologies. In Harris’s formulation, the superstructure and structure (politics, economic system) of society primarily tend to respond to changes in infrastructure, which is the interface between society and nature, the means of production and reproduction.
With one type of infrastructure (hunting and gathering), we get a consistent set of tools, religious practices, and ways of organizing society, across the globe.
With another type of infrastructure (early forms of agriculture) we see the rise of kingdoms, the appearance of sky-god religions, writing, and so on—whether in India, China, Central America, or Mesopotamia.
Harris’s view would have been that the industrial revolution and overwhelming societal changes that flowed from it—the growth of the middle class, credit, advertising, mass marketing, propaganda, mass political movements—didn’t happen primarily because of literary, musical, or artistic efforts; they occurred largely because we discovered rich new energy sources.
Abstract expressionism didn’t drive the social, cultural, and psychological changes of the twentieth century; rather, the art of Pollock, Kline, and de Kooning emerged in response to the development of photography and psychoanalysis, and to the social and personal alienation brought about by industrialism.
With color photographic reproductions cheaply available everywhere, representational art came to seem hokey and pointless. Instead of painting people and nature, the artist’s job was now to portray the interior of the psyche. Similarly, electronic music—including amplified rock music—followed upon the electrification of society; it didn’t inspire it.
Material conditions change; then consciousness changes; and new art forms follow to express changing consciousness. Sometimes the artist appears as a revolutionary or a social critic—think Woody Guthrie, Rage Against the Machine, or Geto Boyz. Other times, the artist is little more than a commercial or political tool.
In either case, the artist’s efforts help shape the terms by which society adapts consciousness to its infrastructural regime. The artist does modify culture, but cannot do so in a vacuum.
Where there are grounds for a revolutionary movement, the artist can help give it identity and cohesion.
On the other hand, employed by society’s elites, the artist can forge images that galvanize enthusiastic cooperation—whether in support of a political candidate, or in service to the projects of selling more breakfast cereal or waging a war.
The enormous complexity of modern industrial civilization theoretically offers a far wider scope for creativity than was the case in previous societies: every industrial artifact—from the paper clip to the computer mouse to the laser scanner in the grocery store to the handle on a refrigerator—has to be designed. We in the modern industrial world are thus surrounded by art to a degree unparalleled in any earlier society.
City dwellers must exert effort—sometimes, considerable effort—to see a surface not designed by another human, or to hear a sound not generated by humans or their machines, including music playback machines.
In addition, the population densities that are afforded by the modern city, and thus the opportunities for interaction among artists, permit an extraordinary level of development of technique. There are more piano virtuosi alive today, playing at a higher level of technical perfection, than at any other time in history.
The same with nearly every other medium: there are more highly skilled sculptors, painters, calligraphers, ballroom dancers, or whatever, than ever before.
But we pay a cumulative price for this artistic bonanza. By confining ourselves within a human-designed—and thus human-centered—universe, we cut ourselves off from the true source of art—which is nature.
Technical perfection and media sophistication cannot replace naturalness of gesture. We stumble from the movie theater, sated and numbed. We get into the car, cue up some music, and drive home.
We turn on the television and glance at it occasionally as we devour a logo-emblazoned deli sandwich from the refrigerator. The semblance of life grows ever more convincing as the reality of life disappears in a forest clear-cut somewhere beyond view from the highway.
However, as I tried to convey a few minutes ago, the current environment for the arts—urban industrial society—is basically unsustainable.
Which brings us to the subject of our future. Society a few decades from now will operate very differently from how it does now, or it won’t be operating at all. At the base of this shift will be our energy regime: society will have to move away from fossil fuels this century to avert catastrophic climate change. And if it doesn’t, fossil fuels will move away from us as a result of depletion.
One way or another, our societal infrastructure will shift. This will probably be as profound a historic rupture as the industrial revolution itself, maybe comparable to the agricultural revolution 10,000 years ago.
It’s tempting to think that we can just unplug coal power plants, plug in solar panels, and continue living essentially the same as we do now. But this is wrong in two ways.
First, it’s important to understand the fundamental differences between intermittent renewable energy sources like solar and wind, and depleting but available-on-demand fossil fuels. I recently co-authored a study, with David Fridley of Lawrence Berkeley National Laboratory, titled Our Renewable Future, in which we examined how energy usage will need to change to accommodate these new energy sources.
We concluded that energy usage in highly industrialized nations like the United States will have to decline significantly, and whole sectors—transportation, manufacturing, and agriculture—will need to be transformed to run on electricity rather than gaseous or liquid fuels.
Our existing systems were built to fit the strengths of our incumbent energy sources; nearly everything will require rethinking to take advantage of the inherent qualities of solar and wind power. It would make sense, for example, to decentralize systems, to make them more distributed and localized, and to use energy when it’s available, rather than expecting to use it 24/7.
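As a concrete illustration of "using energy when it's available," here is a small sketch that schedules a few flexible loads into the sunniest hours of a day instead of running them around the clock. The 24-hour solar profile and the load sizes are made-up numbers for illustration, not figures from Our Renewable Future.

```python
# A toy "use energy when it's available" scheduler: flexible loads are given
# the sunniest hours of a single day. The solar profile and loads are
# illustrative assumptions.

# Assumed relative solar availability for each hour of one day (0 = night).
solar = [0, 0, 0, 0, 0, 1, 3, 5, 7, 8, 9, 10, 10, 9, 8, 6, 4, 2, 1, 0, 0, 0, 0, 0]

flexible_loads = {"water heating": 4, "EV charging": 6, "irrigation pumping": 3}  # hours needed

# Greedy schedule: give each flexible load the sunniest hours still unclaimed.
hours_by_sun = sorted(range(24), key=lambda h: solar[h], reverse=True)
schedule, taken = {}, set()
for load, hours_needed in flexible_loads.items():
    chosen = [h for h in hours_by_sun if h not in taken][:hours_needed]
    taken.update(chosen)
    schedule[load] = sorted(chosen)

for load, hours in schedule.items():
    print(f"{load:18s} -> hours {hours}")
```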
But there’s another reason it would be wrong to think we can keep living essentially as we do now during and after the energy transition: our ecological crisis is not all about climate change. If climate change were the sum total of our environmental challenge, then all we’d need to do is get rid of carbon emissions and we’d be good to go.
Don’t get me wrong: climate change is by far the worst pollution dilemma humans have ever faced, and if we don’t deal with it all of Earth’s creatures are in for one hell of a ride. Yet in addition to climate change we also face mass species extinctions due to habitat loss, along with the depletion of soil, water, and minerals.
Our population continues to grow even as habitat and resources disappear.
We need a more comprehensive way of framing the ecological crisis; I prefer to speak of overshoot, a term familiar to population ecologists.
Due to a temporary energy subsidy, we have grown our population and consumption beyond levels that can be sustained long-term, and we are eroding Earth’s capacity to support future generations. The only way to deal with overshoot is to dial back the whole human enterprise.
One way or another, whether as a result of adaptation or collapse, we can look forward to a future characterized by lower overall rates of consumption of energy and materials. That raises the question of equity. Will a few luxuriate in abundance while multitudes starve? That’s a recipe for revolutions, coups, and the rise of dictators.
Or will we learn to share both resources and scarcity while choosing to reduce our population to a sustainable size?
Our future will also hold less complexity. That’s because societal complexity requires energy. So if less energy is available, that will inevitably translate to less globalization and more localized, smaller-scale economies.
Our future will feature a less-stable climate. We will need more resilience—more adaptability, as well as redundancy in critical systems. We will need to learn how to fit into nature’s cycles rather than imagining that we can dominate our planet and move on to other planets once we’ve chewed our way through this one.
If, rather than simply collapsing, society adapts by becoming less centralized, more localized; if population and consumption (especially in wealthy countries) shrink rather than continually growing, then how will artists be affected by this extraordinary transformation?
How could they help lead it?
Perhaps the obvious answer is to produce sustainability-themed operas, motion pictures, concerti, country-and-western songs, string quartets, and computer game soundtracks. However, I think we could also be more—um, creative in our thinking.
First, I think we need to be honest with ourselves. The next years and decades will be filled with challenges of all kinds—foreseeable and unforeseeable. It will be a turbulent time and may not provide a stable platform for a tranquil, uninterrupted career in a symphony orchestra or even a touring rock band. It’s hard enough to be a successful musician in the world as it is, but someone’s about to move the goalposts, deflate the football, and rewrite the rules of the game.
That doesn’t mean that making music isn’t worth the effort. It just means it will be important to avoid tunnel vision, and to pay attention to what’s happening in society as a whole so as to be able to adapt quickly and be in position to take advantage of opportunities.
I’d like to suggest three broad projects for musicians and other artists for the remainder of this century:
Preserve our culture’s greatest achievements. Musicians tend to assume that the works of Bach, Mozart, Ellington, and other great composers constitute a common heritage that will last for the ages. It’s sobering to reflect on how much was lost of ancient Egyptian, Greek, and Roman culture when those civilizations fell. Sheet music printed on acid-laced paper will disintegrate over time; so will magnetic tape, CDs, and computer hard drives. Music cannot survive if it isn’t continually refreshed in live performance. If we really love this music, it’s up to us to carry it forward—to play it and to teach the needed and satisfying skills of music performance to younger generations.
Help society adapt. As societies change, it is up to artists to reflect people’s feelings and experiences back to them, transformed into art that’s inspiring and healing. Think of how Beethoven helped reflect the beginnings of modern democracy, the Romantic Movement in poetry and philosophy, and the nascent industrial revolution—in music that shattered the aristocratic formalism of previous generations. Or recall how Shostakovich translated the horrific and protracted siege of Stalingrad into his tragic yet also hopeful Eighth Symphony. Now think ahead. We have embarked on a century in which all the systems we have built since the start of the industrial revolution—our food system, our transport systems, our energy system, our building systems, our financial system, and possibly our political and governance systems as well—will prove unsustainable. At the same time, the natural world will be shifting around us in unprecedented ways. Everything will be up for change, redesign, and negotiation. This may turn out to be the great fulcrum of history. Artists will have the opportunity and duty to translate the resulting tumultuous human experience into words, images, and music that help people not just to mentally understand, but to viscerally come to grips with events. And society will need the service of artists as never before as we re-weave the fabric of local community.
Do what artists always do, what even the birds do: celebrate life’s beauty. Our charge is to do this well, in fact better than ever. Life is precious, and our planet is precious. As Joni Mitchell put it,
Don’t it always seem to go
That you don’t know what you’ve got ’til it’s gone?
They paved paradise
Put up a parking lot
Perhaps the most important job of the artist, after all, is to remind us that we’re already in paradise.
[IB Publisher's note: Have you noticed that humans have all but disappeared from our interactions with corporations - be it the phone company, Amazon, Google, Apple, Microsoft and even local, state and federal government? Should children be trained to be polite to Apple's Siri or Amazon's Alexa? What's next? Obedience?]
Futurists warning about the threats of AI are looking in the wrong place. Humanity is already facing an existential threat from an artificial intelligence we created hundreds of years ago. It’s called the Corporation.
Some of the leading thinkers of our time are unleashing a stream of warnings about the threat of artificial intelligence taking over from humans. Earlier this month, Stephen Hawking predicted it could be “the worst event in the history of our civilization” unless we find a way to control its development.
Billionaire Elon Musk has formed a company to try to keep humans one step ahead of what he sees as an existential AI threat.
The scenario that terrifies them is that, in spite of the best intentions, we might create a force more powerful than all of humanity with a value system that doesn’t necessarily incorporate human welfare.
Once it reaches a critical mass, this force could take over the world, control human activity, and essentially suck all life out of the earth while it optimizes for its own ends.
Prominent futurist Nick Bostrom gives an example of a superintelligence designed with the goal of manufacturing paperclips that transforms the entire earth into one gigantic paperclip manufacturing facility.
These futurists are right to voice their concerns, but they’re missing the fact that humans have already created a force that is well on its way to devouring both humanity and the earth in just the way they fear. It’s called the Corporation.
Corporations “enthroned”
When corporations were first formed back in the seventeenth century, their inventors—just like modern software engineers—acted with what they believed were good intentions. The first corporate charters were simply designed to limit an investor’s liability to the amount of their investment, thus encouraging them to finance risky expeditions to India and Southeast Asia.
However, an unintended consequence soon emerged, known as moral hazard: with the potential upside greater than the downside, reckless behavior ensued, leading to a series of spectacular frauds and a market crash that resulted in corporations being temporarily banned in England in 1720.
Thomas Jefferson and other leaders of the United States, aware of the English experience, were deeply suspicious of corporations, giving them limited charters with tightly constrained powers.
However, during the turmoil of the Civil War, industrialists took advantage of the disarray, leveraging widespread political corruption to expand their influence.
Shortly before his death, Abraham Lincoln lamented what he saw happening with a resounding prophecy: “Corporations have been enthroned … An era of corruption in high places will follow… until wealth is aggregated in a few hands … and the Republic is destroyed.”
Corporations took full advantage of their new-found dominance, influencing state legislatures to issue charters in perpetuity giving them the right to do anything not explicitly prohibited by law.
The tipping point in their path to domination came in 1886 when the Supreme Court designated corporations as “persons” entitled to the protections of the Fourteenth Amendment, which had been passed to give equal rights to former slaves after the Civil War.
Since then, corporate dominance has only been further enhanced by law, culminating in the notorious Citizens United case of 2010, which lifted restrictions on political spending by corporations in elections.
Sociopaths with global reach
Corporations, just like a potential runaway AI, have no intrinsic interest in human welfare. They are legal constructions: abstract entities designed with the ultimate goal of maximizing financial returns for their investors above all else.
If corporations were in fact real persons, they would be sociopaths, completely lacking the capacity for empathy that is a crucial element of normal human behavior.
Unlike humans, however, corporations are theoretically immortal, cannot be put in prison, and the larger multinationals are not constrained by the laws of any individual country.
With the incalculable advantage of their superhuman powers, corporations have literally taken over the world. They have grown so massive that an astonishing sixty-nine of the largest hundred economies in the world are not nation states but corporate entities.
Corporations have been able to use their transnational powers to dictate their own terms to virtually any country in the world.
As a result of decades of globalization, corporations can exploit the free movement of capital to build factories in nations with the weakest labor unions, or locate polluting plants in countries with lax environmental laws, basing their decisions solely on maximizing returns for their shareholders. Governments compete with each other to make their nations the most attractive for corporate investment.
Corporations wield their vast powers to control the minds of consumers, enthralling them into a state of perpetual consumption.
In the early twentieth century, Edward Bernays, a mastermind of corporate empowerment, boldly stated his game plan as “the conscious and intelligent manipulation of the organized habits and opinions of the masses.” He declared ominously that “those who manipulate this unseen mechanism of society constitute an invisible government that is the true ruling power of this country.”
The sinister words of Wayne Chilicki, chief executive of General Mills, show how Bernays’ vision has been perpetuated: “When it comes to targeting kid consumers, we at General Mills… believe in getting them early and having them for life.”
The result of this corporate takeover of humanity is a world careening out of control, where nature is mercilessly ransacked to extract the raw materials required to increase shareholder value in a vortex of perpetual economic growth, without regard to the quality of human life and with no concern for the welfare of future generations.
Corporate takeover of global governance
Instead of being pilloried for their vast destruction, those who dedicate themselves to their corporate overlords are richly rewarded and elevated to positions of even greater power and prestige.
ExxonMobil, for example, has been exposed as having lied shamelessly about climate change, knowing for decades about its consequences and yet deliberately concealing the facts, thus condemning present and future generations to havoc.
Instead of facing jail time, the CEO during much of this period, Rex Tillerson, is now the U.S. Secretary of State, overseeing the global relationships of the most powerful country in the world.
In fact, the current U.S. cabinet represents the most complete takeover yet of the U.S. government by corporations, with nearly 70% of top administration jobs filled by corporate executives.
In the words of Robert Weissman, president of Public Citizen, “In the Trump administration, auto industry lobbyists are setting transportation policy, Boeing has a top perch at the Department of Defense, Wall Street is in control of financial policy and regulatory agencies, and corporate defense lawyers staff the key positions in the Justice Department.”
Corporations are inserting themselves into international agreements, so they can further their interests even more effectively. At the 2015 World Economic Forum in Davos, a new Global Redesign Initiative set out an agenda for multinational corporations to engage directly in global governance.
The UN’s Sustainable Development Goals, proudly announced in 2015 as a vision for reducing poverty, adopted this approach by inviting corporations to a seat at the table to shape UN policy, while calling for further globalization.
Fossil fuel companies have infiltrated the annual global COP meetings on climate change, ensuring they can compromise any actions that might hurt them, even as the world faces the threat of climate catastrophe.
The takeover of global governance by multinational corporations has permitted them to undermine human welfare everywhere in the pursuit of profit.
Nestlé remorselessly buys control of rural communities’ groundwater reservoirs to sell as bottled water, leaving those communities to foot the bill for environmental cleanup, with the result that in countries such as Colombia sugary bottled drinks are frequently cheaper than plain water.
As a result of the chemicals sold by global agribusiness companies such as Cargill and Monsanto, it’s been estimated by UN officials that the world’s topsoil can only support about sixty more years of harvests.
In these cases, and countless others like them, humans and the earth alike are mere fodder for the insatiable appetite of an amoral, inhuman intelligence run amok.
There is an alternative
The corporate takeover of humanity is so all-encompassing that it’s difficult to visualize any other possible global system. Alternatives do, however, exist. Around the world, worker-owned cooperatives have demonstrated that they can be as effective as corporations—or more so—without pursuing shareholder wealth as their primary consideration.
The Mondragon cooperative in Spain, with revenues exceeding €12 billion, shows how this form of organization can efficiently scale.
There are also structural changes that can be made to corporations to realign their value system with human welfare. Corporate charters can be amended to optimize for a triple bottom line of social, environmental, and financial outcomes (the so-called “triple Ps” of people, planet, and profit).
A “beneficial” or B-Corp certification, which holds companies to social and environmental performance standards, is becoming more widely adopted and is now held by over 2,000 corporations in over fifty countries around the world.
Ultimately, if we are to stop this force from completely taking over humanity, these alternative approaches need to be codified into our national and international governance. Imagine a world where corporate charters were only granted if they adopted a triple bottom line, and where shareholder lawsuits threatened every time a company broke one of its own social and environmental standards.
Until that happens, it may be that the “worst event in the history of our civilization” is not the future development of modern AI, but the decision by a group of 17th century politicians to unleash the power of the Corporation on an unsuspecting humanity.
A curious thing about Homo sapiens is that we are clever enough to document — in exquisite detail — various trends that portend the collapse of modern civilization, yet not nearly smart enough to extricate ourselves from our self-induced predicament.
This was underscored once again in October when scientists reported that flying insect populations in Germany have declined by an alarming 75 per cent in the past three decades accompanied, in the past dozen years, by a 15 per cent drop in bird populations.
Trends are similar in other parts of Europe where data are available. Even in Canada, everything from casual windshield “surveys” to formal scientific assessments show a drop in insect numbers.
Meanwhile, domestic populations of many insect-eating birds are in freefall.
Ontario has lost half its whip-poor-wills in the past 20 years; across the nation, such species as nighthawks, swallows, martins and fly-catchers are down by up to 75 per cent; Greater Vancouver’s barn and bank swallows have plummeted by 98 per cent since 1970. Heard much about these things in the mainstream news?
Too bad. Biodiversity loss may turn out to be the sleeper issue of the century. It is caused by many individual but interacting factors — habitat loss, climate change, intensive pesticide use and various forms of industrial pollution, for example, suppress both insect and bird populations.
But the overall driver is what an ecologist might call the “competitive displacement” of non-human life by the inexorable growth of the human enterprise.
On a finite planet where millions of species share the same space and depend on the same finite products of photosynthesis, the continuous expansion of one species necessarily drives the contraction and extinction of others.
Politicians take note — there is always a conflict between human population/economic expansion and “protection of the environment.”
Remember the 40 to 60 million bison that used to roam the great plains of North America?
They — along with the millions of deer, pronghorns, wolves and lesser beasts that once animated prairie ecosystems — have been “competitively displaced,” their habitats taken over by a much greater biomass of humans, cattle, pigs and sheep.
And not just North Americans — Great Plains sunshine also supports millions of other people-with-livestock around the world who depend, in part, on North American grain, oil-seed, pulse and meat exports.
Competitive displacement has been going on for a long time. Scientists estimate that at the dawn of agriculture 10,000 years ago, H. sapiens comprised less than one per cent of the total weight of mammals on the planet. There were probably only two to four million people on Earth at the time.
Since then, humans have grown to represent 35 per cent of a much larger total biomass; toss in domestic pets and livestock, and human domination of the world’s mammalian biomass rises to 98.5 per cent!
One needs look no further to explain why wildlife populations globally have plunged by nearly 60 per cent in the past half century.
Wild tigers have been driven from 93 per cent of their historic range and are down to fewer than 4,000 individuals globally; the population of African elephants has imploded by as much as 95 per cent to only 500,000 today; poaching drove black rhino numbers from an already much reduced 70,000 in 1960 to only 2,500 individuals in the early 1990s. (With intense conservation effort, they have since rebounded to about 5,000).
And those who think Canada is still a mostly pristine and under-populated wilderness should think again — half the wildlife species regularly monitored in this country are in decline, with an average population drop of 83 per cent since 1970.
Did I mention that B.C.’s southern resident killer whale population is down to only 76 animals? That’s in part because human fishers have displaced the orcas from their favoured food, Chinook salmon, even as we simultaneously displace the salmon from their spawning streams through hydro dams, pollution and urbanization.
The story is similar for familiar species everywhere and likely worse for non-charismatic fauna. Scientists estimate that the “modern” species extinction rate is 1,000 to as much as 10,000 times the natural background rate.
The global economy is busily converting living nature into human bodies and domestic livestock largely unnoticed by our increasingly urban populations. Urbanization distances people psychologically as well as spatially from the ecosystems that support them.
The human band-wagon may really have started rolling 10 millennia ago but the past two centuries of exponential growth have greatly accelerated the pace of change. It took all of human history — let’s say 200,000 years — for our population to reach one billion in the early 1800s, but only 200 years, 1/1000th as much time, to hit today’s 7.6 billion!
Meanwhile, material demand on the planet has ballooned even more — global GDP has increased by over 100-fold since 1800; average per capita incomes by a factor of 13 (rising to 25-fold in the richest countries).
Consumption has exploded accordingly — half the fossil fuels and many other resources ever used by humans have been consumed in just the past 40 years.
(See graphs in Steffen, W. et al. 2015. The Trajectory of the Anthropocene: The Great Acceleration. The Anthropocene Review 2(1): 81–98.)
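To make the scale of that acceleration concrete, here is a minimal back-of-the-envelope sketch, using only the figures quoted above (roughly three million people at the dawn of agriculture, one billion by the early 1800s, 7.6 billion today). The continuous-growth simplification and the three-million midpoint are my own illustrative assumptions, not figures from the Steffen et al. paper.

```python
# Back-of-the-envelope check of the growth arithmetic above, using only
# figures quoted in the text: roughly 3 million people ~10,000 years ago,
# 1 billion in the early 1800s, 7.6 billion today. A crude sketch with a
# continuous-growth simplification, not a demographic model.
import math

def avg_annual_growth(pop_start, pop_end, years):
    """Average continuous (exponential) growth rate over the period."""
    return math.log(pop_end / pop_start) / years

pre_industrial = avg_annual_growth(3e6, 1e9, 10_000)   # dawn of agriculture to ~1800
industrial     = avg_annual_growth(1e9, 7.6e9, 200)    # ~1800 to today

print(f"pre-industrial average: {pre_industrial:.2%} per year")       # about 0.06%
print(f"industrial-era average: {industrial:.2%} per year")           # about 1.0%
print(f"acceleration factor:    {industrial / pre_industrial:.0f}x")  # roughly 17x
```

Even on this crude reckoning, the average growth rate of the industrial era comes out roughly seventeen times the pre-industrial average — the “Great Acceleration” in a single ratio.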
Why does any of this matter, even to those who don’t really give a damn about nature per se? Apart from the moral stain associated with extinguishing thousands of other life-forms, there are purely selfish reasons to be concerned.
For example, depending on climate zone, 78 per cent to 94 per cent of flowering plants, including many human food species, are pollinated by insects, birds and even bats. (Bats — also in trouble in many places — are the major or exclusive pollinators of 500 species in at least 67 families of plants.)
As much as 35 per cent of the world’s crop production is more or less dependent on animal pollination, which ensures or increases the production of 87 leading food crops worldwide.
But there is a deeper reason to fear the depletion and depopulation of nature. Absent life, planet Earth is just an inconsequential wet rock with a poisonous atmosphere revolving pointlessly around an ordinary star on the outer fringes of an undistinguished galaxy.
It is life itself, beginning with countless species of microbes, that gradually created the “environment” suitable for life on Earth as we know it.
Biological processes are responsible for the life-friendly chemical balance of the oceans; photosynthetic bacteria and green plants have stocked and maintain Earth’s atmosphere with the oxygen necessary for the evolution of animals.
The same photosynthesis gradually extracted billions of tons of carbon from the atmosphere, storing it in chalk, limestone and fossil fuel deposits, so that Earth’s average temperature (currently about 15º C) has remained for geological ages in the narrow range that makes water-based life possible, even as the sun has been warming (i.e., stable climate is partially a biological phenomenon). Meanwhile, countless species of bacteria, fungi and a veritable menagerie of micro-fauna continuously regenerate the soils that grow our food.
Unfortunately, agriculture depletes those soils even faster than they can regenerate — by some accounts we have only a little over a half-century’s worth of arable soils left.
In short, H. sapiens depends utterly on a rich diversity of life-forms to provide various life-support functions essential to the existence and continued survival of human civilization.
With an unprecedented human-induced great global die-off well under way, what are the chances the functional integrity of the ecosphere will survive the next doubling of material consumption that everyone expects before mid-century?
Here’s the thing: climate change is not the only shadow darkening humanity’s doorstep. While you wouldn’t know it from the mainstream media, biodiversity loss arguably poses an equivalent existential threat to civilized existence.
While we’re at it, let’s toss soil/landscape degradation, potential food or energy shortages and other resource limits into the mix.
And if you think we’ll probably be able to “handle” four out of five such environmental problems, it doesn’t matter.
The relevant version of Liebig’s Law states that any complex system dependent on several essential inputs can be taken down by that single factor in least supply (and we haven’t yet touched upon the additional risks posed by the geopolitical turmoil that would inevitably follow ecological destabilization).
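For readers who want the logic spelled out, here is a minimal sketch of that version of Liebig’s Law: the system’s effective capacity is set by whichever essential input is in shortest supply, so “handling” four problems out of five buys nothing. The input names and fractions below are hypothetical illustrations of my own, not measurements from the text.

```python
# A minimal sketch of Liebig's Law of the minimum as stated above: a system
# that depends on several essential inputs is limited by the single input in
# least supply, however abundant the others may be. Names and numbers below
# are hypothetical, for illustration only.

def effective_capacity(inputs: dict) -> float:
    """Return the fraction of its needs the system can actually meet,
    which is set by the scarcest essential input (1.0 = fully supplied)."""
    return min(inputs.values())

# Hypothetical supply levels, each expressed as a fraction of what is needed.
supplies = {
    "stable_climate": 0.90,
    "pollination":    0.80,
    "fresh_water":    0.95,
    "energy":         0.85,
    "topsoil":        0.40,   # the factor in least supply
}

print(effective_capacity(supplies))
# -> 0.4: fixing four of the five factors leaves the outcome unchanged.
```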
There are many policy options, from simple full-cost pricing and consumption taxes; through population initiatives and comprehensive planning for a steady-state economy; to general education for voluntary (and beneficial) lifestyle changes, all of which would enhance global society’s prospects for long-term survival.
Unique human qualities, from high intelligence (e.g., reasoning from the evidence), through the capacity to plan ahead, to moral consciousness, may well be equal to the task but lie dormant — there is little hint of political willingness to acknowledge the problem, let alone elaborate genuine solutions (which the Paris climate accord is not).
Bottom line? The world seems in denial of looming disaster; the “C” word remains unvoiced. Governments everywhere dismissed the 1992 scientists’ Warning to Humanity that “...a great change in our stewardship of the Earth and the life on it is required, if vast human misery is to be avoided” and will similarly ignore the scientists’ “second notice.” (Published on Nov. 13, this warning states that most negative trends identified 25 years earlier “are getting far worse.”)
Despite cascading evidence and detailed analysis to the contrary, the world community trumpets “growth-is-us” as its contemporary holy grail.
Even the United Nations’ Sustainable Development Goals are fixed on economic expansion as the only hammer for every problematic nail. Meanwhile, greenhouse gases are at an all-time high, marine dead-zones proliferate, tropical forests fall and extinctions accelerate.
Just what is going on here? The full explanation of this potentially fatal human enigma is no doubt complicated, but Herman Melville summed it up well enough in Moby Dick: “There is no folly of the beasts of the earth which is not infinitely outdone by the madness of men.”
Image above: The Andromeda galaxy behind a silhouette of mountains. From original article.
Back in the 1950s, sociologist C. Wright Mills wrote cogently about what he called “crackpot realism”—the use of rational, scientific, utilitarian means to pursue irrational, unscientific, or floridly delusional goals. It was a massive feature of American life in Mills’ time, and if anything, it’s become more common since then.
Since it plays a central role in the corner of contemporary culture I want to discuss this week, I want to put a few moments into discussing where crackpot realism comes from, and how it wriggles its way into the apple barrel of modern life and rots the apples from skin to core.
Let’s start with the concept of the division of labor.
One of the great distinctions between a modern industrial society and other modes of human social organization is that in the former, very few activities are taken from beginning to end by the same person.
A woman in a hunter-gatherer community, as she is getting ready for the autumn tuber-digging season, chooses a piece of wood, cuts it, shapes it into a digging stick, carefully hardens the business end in hot coals, and then puts it to work getting tubers out of the ground.
Once she carries the tubers back to camp, what’s more, she’s far more likely than not to take part in cleaning them, roasting them, and sharing them out to the members of the band.
A woman in a modern industrial society who wants to have potatoes for dinner, by contrast, may do no more of the total labor involved in that process than sticking a package in the microwave.
Even if she has potatoes growing in a container garden out back, say, and serves up potatoes she grew, harvested, and cooked herself, odds are she didn’t make the gardening tools, the cookware, or the stove she uses.
That’s division of labor: the social process by which most members of an industrial society specialize in one or another narrow economic niche, and use the money they earn from their work in that niche to buy the products of other economic niches.
Let’s say it up front: there are huge advantages to the division of labor. It’s more efficient in almost every sense, whether you’re measuring efficiency in terms of output per person per hour, skill level per dollar invested in education, or what have you.
What’s more, when it’s combined with a social structure that isn’t too rigidly deterministic, it’s at least possible for people to find their way to occupational specialties for which they’re actually suited, and in which they will be more productive than otherwise.
Yet it bears recalling that every good thing has its downsides, especially when it’s pushed to extremes, and the division of labor is no exception.
Crackpot realism is one of the downsides of the division of labor. It emerges reliably whenever two conditions are in effect.
The first condition is that the task of choosing goals for an activity is assigned to one group of people and the task of finding means to achieve those goals is left to a different group of people.
The second condition is that the first group needs to be enough higher in social status than the second group that members of the first group need pay no attention to the concerns of the second group.
Consider, as an example, the plight of a team of engineers tasked with designing a flying car. People have been trying to do this for more than a century now, and the results are in: it’s a really dumb idea.
It so happens that a great many of the engineering features that make a good car make a bad aircraft, and vice versa; for instance, an auto engine needs to be optimized for torque rather than speed, while an aircraft engine needs to be optimized for speed rather than torque.
Thus every flying car ever built—and there have been plenty of them—performed just as poorly as a car as it did as a plane, and cost so much that for the same price you could buy a good car, a good airplane, and enough fuel to keep both of them running for a good long time.
Engineers know this.
Still, if you’re an engineer and you’ve been hired by some clueless tech-industry godzillionaire who wants a flying car, you probably don’t have the option of telling your employer the truth about his pet project—that is, that no matter how much of his money he plows into the project, he’s going to get a clunker of a vehicle that won’t be any good at either of its two incompatible roles—because he’ll simply fire you and hire someone who will tell him what he wants to hear.
Nor do you have the option of sitting him down and getting him to face what’s behind his own unexamined desires and expectations, so that he might notice that his fixation on having a flying car is an emotionally charged hangover from age eight, when he daydreamed about having one to help him cope with the miserable, bully-ridden public school system in which he was trapped for so many wretched years.
So you devote your working hours to finding the most rational, scientific, and utilitarian means to accomplish a pointless, useless, and self-defeating end. That’s crackpot realism.
You can make a great party game out of identifying crackpot realism—try it sometime—but I’ll leave that to my more enterprising readers.
What I want to talk about right now is one of the most glaring examples of crackpot realism in contemporary industrial society. Yes, we’re going to talk about space travel again.
No question, a fantastic amount of scientific, technological, and engineering brilliance went into the quest to insert a handful of human beings for a little while into the lethal environment of deep space and bring them back alive.
Visit one of the handful of places on the planet where you can get a sense of the sheer scale of a Saturn V rocket, and the raw immensity of the effort that put a small number of human bootprints on the Moon is hard to miss. What’s much easier to miss is the whopping irrationality of the project itself.
(I probably need to insert a parenthetical note here. Every time I blog about the space program, I can count on fielding at least one comment from some troll who insists that the Moon landings never happened.
It so happens that I’ve known quite a few people who worked on the Apollo project; some of them have told me their stories and shown me memorabilia from what was one of the proudest times of their lives; and given a choice between believing them, and believing some troll who uses a pseudonym to hide his identity but can’t hide his ignorance of basic historical and scientific facts, well, let’s just say the troll isn’t going to come in first place. Nor is his comment going to go anywhere but the trash. ‘Nuf said.)
Outer space simply isn’t an environment where human beings can survive for long.
It’s a near-perfect vacuum at a temperature a few degrees above absolute zero; it’s full of hard radiation streaming out from the huge unshielded fusion reactor at the center of our solar system; it’s also got chunks of rock, lots of them, whizzing through it at better than rifle-bullet speeds; and the human body is the product of two billion years of evolutionary adaptation to environments that have the gravity, atmospheric pressure, temperature ranges, and other features that are found on the Earth’s surface and, as far as we know, nowhere else in the universe.
A simple thought experiment will show how irrational the dream of human expansion into space really is.
Consider the harshest natural environments on this planet—the stark summits of the Himalayas; the middle of the East Antarctic ice sheet in winter; the bleak Takla Makan desert of central Asia, the place caravans go to die; the bottom of the Marianas Trench, where the water pressure will reduce a human body to paste in seconds.
Nowhere in the solar system, or on any of the exoplanets yet discovered by astronomers, is there a place that’s even as well suited to human life as the places I’ve just named.
Logically speaking, before we try to settle the distant, airless, radiation-blasted deserts of Mars or the Moon, wouldn’t it make sense first to build cities on the Antarctic ice or in the lightless depths of the ocean?
With one exception, in fact, every one of the arguments that has been trotted out to try to justify the settlement of Mars can be applied with even more force to the project of settling Antarctica.
In both cases, you’ve got a great deal of empty real estate amply stocked with mineral wealth, right? Antarctica, though, has a much more comfortable climate than Mars, not to mention abundant supplies of water and a breathable atmosphere, both of which Mars lacks.
Furthermore, it costs a lot less to get your colonists to Antarctica, they won’t face lethal irradiation on the way there, and there’s at least a chance that you can rescue them if things go very wrong.
If in fact it made any kind of sense to settle Mars, the case for settling Antarctica would be far stronger.
So where are the grand plans, lavishly funded by clueless tech-industry godzillionaires, to settle Antarctica? Their absence shows the one hard fact about settling outer space that next to nobody is willing to think about: it simply doesn’t make sense.
The immense financial and emotional investments we’ve made in the notion of settling human beings on other planets or in outer space itself would be Exhibit A in a museum of crackpot realism.
This is where the one exception I mentioned above comes in—the one argument for settling Mars that can’t also be made for settling Antarctica. This is the argument that a Martian colony is an insurance policy for our species.
If something goes really wrong on Earth, the claim goes, and human beings die out here, having a settlement on Mars gives our species a shot at survival.
Inevitably, given the present tenor of popular culture, you can expect to hear this sort of logic backed up by embarrassingly bad arguments. I’m thinking, for example, of a rant by science promoter Neil deGrasse Tyson, who likes to insist that dinosaurs are extinct today because they didn’t have a space program.
We’ll assume charitably that Tyson spent long nights stargazing in his teen years, and so tended to doze off during his high school biology classes; no doubt that’s why he missed three very obvious facts about dinosaurs.
The first is that they were the dominant life forms on land for well over a hundred million years, which is a good bit longer than our species shows any likelihood of being able to hang on; the second is that the vast majority of dinosaur species went extinct for ordinary reasons—there were only a very modest number of dinosaur species around when the Chicxulub meteorite came screaming down out of space to end the Cretaceous Period; and the third is that dinosaurs aren’t extinct—we call them birds nowadays, and in terms of number of species, rates of speciation, and other standard measures of evolutionary vigor, they’re doing quite a bit better than mammals just now.
Set aside the bad logic and the sloppy paleontology, though, and the argument just named casts a ruthlessly clear light on certain otherwise obscure factors in our contemporary mindset.
The notion that space travel gets its value as a way to avoid human extinction goes back a long ways. I recall a book by Italian journalist Oriana Fallaci, compiling her interviews with leading figures in the space program during its glory days; she titled it If the Sun Dies, after the passionate comment along these lines by one of her interviewees.
Behind this, in turn, lies one of the profound and usually unmentioned fears that shapes the modern mind: the terror of deep time.
There’s a profound irony in the fact that the geologists who first began to figure out the true age of the Earth lived in western Europe in the early nineteenth century, when most people believed that the world was only some six thousand years old.
There have been plenty of cultures in recorded history that had a vision of time expansive enough to fit the facts of geological history, but the cultures of western Europe and its diaspora in the Americas and Australasia were not among them.
Wedded to literalist interpretations of the Book of Genesis, and more broadly to a set of beliefs that assigned unique importance to human beings, the people who faced the first dim adumbrations of the vastness of Earth’s long history were utterly unprepared for the shock, and even less ready to have the first unnerving guesses that the Earth might be millions of years old replaced by increasingly precise measurements that gave its age in the billions of years, and that of the universe at close to fourteen billion.
The brutal nature of the shock that resulted shouldn’t be underestimated.
A society that had come to think of humanity as creation’s darlings, dwelling in a universe with a human timescale, found itself slammed facefirst into an unwanted encounter with the vast immensities of past and future time. That encounter had a great many awkward moments.
The self-defeating fixation of evangelical Christians on young-Earth creationism can be seen in part as an attempt to back away from the unwelcome vista of deep time; so is the insistence, as common outside Christian churches as within them, that the world really will end sometime very soon and spare us the stress of having to deal with the immensity of the future.
For that matter, I’m not sure how many of my readers know how stunningly unwelcome the concept of extinction was when it was first proposed: if the universe was created for the benefit of human beings, as a great many people seriously argued in those days, how could there have been so many thousands of species that lived and died long ages before the first human being walked the planet?
Worse, the suspicion began to spread that the future waiting for humanity might not be an endless progression toward bigger and better things, as believers in progress insisted, or the end of the world followed by an eternity of bliss for the winning team, as believers in Christianity insisted, but extinction: the same fate as all those vanished species whose bones kept surfacing in geological deposits.
It’s in the nineteenth century that the first stories of human extinction appear on the far end of late Romanticism, just as the same era saw the first tales that imagined the history of modern civilization ending in decline and fall.
People read The Last Man and After London for the same rush of fascinated horror that they got from Frankenstein and Dracula, and with the same comfortable disbelief once the last page turned—but the same scientific advances that made the two latter books increasingly less believable made tales of humanity’s twilight increasingly more so.
It became fashionable in many circles to dismiss such ideas as mere misanthropy, and that charge still gets flung at anyone who questions current notions of humanity’s supposed future in space. It’s a curious fact that I tend to field such comments from science fiction writers, more than from anyone else just now.
A few years ago, when I sketched out a fictive history of the next ten billion years that included human extinction millions of years from now, SF writer David Brin took time out of his busy schedule to denounce it as “an infuriating paean to despair.” Last month’s post on the worlds that never were, similarly, fielded a spluttering denunciation by S.M. Stirling.
It was mostly a forgettable rehash of the standard arguments for an interstellar future—arguments, by the way, that could be used equally well to justify continued faith in perpetual motion—but the point I want to raise here is that Stirling’s sole reaction to Aurora, Kim Stanley Robinson’s brilliant fictional critique of the interstellar-travel mythos, was to claim dismissively that Robinson must have suffered an attack of misanthropy.
Some of my readers may remember Veruca Salt, the archetypal spoiled brat in Willy Wonka and the Chocolate Factory.
When her father didn’t give her whatever she happened to want, her typical response was to shriek, “You don’t love me!” I think of that whenever somebody trots out the accusation of misanthropy in response to any realistic treatment of the limits that will shape the human future.
It’s not misanthropy to point out that humanity isn’t going to outlast the sun or leap breezily from star to star; it’s simple realism, just as reminding someone that they will inevitably die is an expression not of hatred but of common sense.
You, dear reader, will die someday. So will I, and so will every other human being.
That fact doesn’t make our lives meaningless; quite the contrary, it’s when we come to grips with the fact of our own mortality that we have our best shot at achieving not only basic maturity, but that condition of reflective attention to meaning that goes by the name of wisdom.
In exactly the same way, recognizing that humanity will not last forever—that the same Earth that existed and flourished long before our species came on the scene will exist and flourish long after our species is gone—might just provide enough of a boost of wisdom to help us back away from at least some of the more obviously pigheaded ways we’re damaging the biosphere of the only planet on which we can actually live.
There’s something else to be found in the acceptance of our collective mortality, though, and I’m considering exploring it in detail over the months ahead.
Grasp the fact that our species is a temporary yet integral part of the whole system we call the biosphere of the Earth, and it becomes a good deal easier to see that we are part of a story that didn’t begin with us, won’t end with us, and doesn’t happen to assign us an overwhelmingly important role.
Traumatic though this may be for the Veruca Saltish end of humanity, with its distinctly overinflated sense of importance, there’s much to be gained by ditching the tantrums, coming to terms with our decidedly modest place in the cosmos, and coming to understand the story in which we play our small but significant part.