American Delusionalism

SUBHEAD: Endless recycling of failed predictions blinds us to the unwelcome future closing in.

By John Michael Greer on 19 March 2014 for The Archdruid Report -
(http://thearchdruidreport.blogspot.com/2014/03/american-delusionalism-or-why-history.html)


Image above: Detail of painting "Life After People", by James Sparkes, 2012.  From (http://www.behance.net/gallery/Life-After-People/3842583).

One of the things that reliably irritates a certain fraction of this blog’s readers, as I’ve had occasion to comment before, is my habit of using history as a touchstone that can be used to test claims about the future. No matter what the context, no matter how wearily familiar the process under discussion might be, it’s a safe bet that the moment I start talking about historical parallels, somebody or other is going to pop up and insist that it really is different this time.

In a trivial sense, of course, that claim is correct. The tech stock bubble that popped in 2000, the real estate bubble that popped in 2008, and the fracking bubble that’s showing every sign of popping in the uncomfortably near future are all different from each other, and from every other bubble and bust in the history of speculative markets, all the way back to the Dutch tulip mania of 1637.

It’s quite true that tech stocks aren’t tulips, and bundled loans backed up by dubious no-doc mortgages aren’t the same as bundled loans backed up by dubious shale leases—well, not exactly the same—but in practice, the many differences of detail are irrelevant compared to the one crucial identity. 

Tulips, tech stocks, and bundled loans, along with South Sea Company shares in 1720, investment trusts in 1929, and all the other speculative vehicles in all the other speculative bubbles of the last five centuries, different as they are, all follow the identical trajectory: up with the rocket, down with the stick.

That is to say, those who insist that it’s different this time are right where it doesn’t matter and wrong where it counts. I’ve come to think of the words “it’s different this time,” in fact, as the nearest thing history has to the warning siren and flashing red light that tells you that something is about to go very, very wrong. When people start saying it, especially when plenty of people with plenty of access to the media start saying it, it’s time to dive for the floor, cover your head with your arms, and wait for the blast to hit.

With that in mind, I’d like to talk a bit about the recent media flurry around the phrase “American exceptionalism,” which has become something of a shibboleth among pseudoconservative talking heads in recent months. Pseudoconservatives?

Well, yes; actual conservatives, motivated by the long and by no means undistinguished tradition of conservative thinking launched by Edmund Burke in the late 18th century, are interested in, ahem, conserving things, and conservatives who actually conserve are about as rare these days as liberals who actually liberate.

Certainly you won’t find many of either among the strident voices insisting just now that the last scraps of America’s democracy at home and reputation abroad ought to be sacrificed in the service of their squeaky-voiced machismo.

As far as I know, the phrase “American exceptionalism” was originally coined by none other than Josef Stalin—evidence, if any more were needed, that American pseudoconservatives these days, having no ideas of their own, have simply borrowed those of their erstwhile Communist bogeyman and stood them on their heads with a Miltonic “Evil, be thou my good.” 

Stalin meant by it the opinion of many Communists in his time that the United States, unlike the industrial nations of Europe, wasn’t yet ripe for the triumphant proletarian revolution predicted (inaccurately) by Marx’s secular theology.

Devout Marxist that he was, Stalin rejected this claim with some heat, denouncing it in so many words as “this heresy of American exceptionalism,” and insisting (also inaccurately) that America would get its proletarian revolution on schedule.

While Stalin may have invented the phrase, the perception that he thus labeled had considerably older roots. In a previous time, though, that perception took a rather different tone than it does today.

A great many of the leaders and thinkers of the United States in its early years, and no small number of the foreign observers who watched the American experiment in those days, thought and hoped that the newly founded republic might be able to avoid making the familiar mistakes that had brought so much misery onto the empires of the Old World.

Later on, during and immediately after the great debates over American empire at the end of the 19th century, a great many Americans and foreign observers still thought and hoped that the republic might come to its senses in time and back away from the same mistakes that doomed those Old World empires to the misery just mentioned.

These days, by contrast, the phrase “American exceptionalism” seems to stand for the conviction that America can and should make every one of those same mistakes, right down to the fine details, and will still somehow be spared the logically inevitable consequences.

The current blind faith in American exceptionalism, in other words, is simply another way of saying “it’s different this time.”  Those who insist that God is on America’s side when America isn’t exactly returning the favor, like those who have less blatantly theological reasons for their belief that this nation’s excrement emits no noticeable odor, are for all practical purposes demanding that America must not, under any circumstances, draw any benefit from the painfully learnt lessons of history.

I suggest that a better name for the belief in question might be "American delusionalism"; it’s hard to see how this bizarre act of faith can do anything other than help drive the American experiment toward a miserable end, but then that’s just one more irony in the fire.

The same conviction that the past has nothing to teach the present is just as common elsewhere in contemporary culture. I’m thinking here, among other things, of the ongoing drumbeat of claims that our species will inevitably be extinct by 2030. As I noted in a previous post here, this is yet another expression of the same dubious logic that generated the 2012 delusion, but much of the rhetoric that surrounds it starts from the insistence that nothing like the current round of greenhouse gas-driven climate change has ever happened before.

That insistence bespeaks an embarrassing lack of knowledge about paleoclimatology. Vast quantities of greenhouse gases being dumped into the atmosphere over a century or two? Check; the usual culprit is vulcanism, specifically the kind of flood-basalt eruption that opens a crack in the earth many miles in length and turns an area the size of a European nation into a lake of lava. The most recent of those, a smallish one, erupted between roughly 17 and 6 million years ago in the Columbia River basin of eastern Washington and Oregon states.

Further back, in the Toarcian, Aptian, and Cenomanian-Turonian ages of the Mesozoic, that same process on a much larger scale boosted atmospheric CO2 levels to three times the present figure and triggered what paleoclimatologists call "super-greenhouse events." Did those cause the extinction of all life on earth? Not hardly; as far as the paleontological evidence shows, they didn’t even slow the brontosaurs down.

Oceanic acidification leading to the collapse of calcium-shelled plankton populations? Check; those three super-greenhouse events, along with a great many less drastic climate spikes, did that.

The ocean also contains very large numbers of single-celled organisms that don’t have calcium shells, such as blue-green algae, which aren’t particularly sensitive to shifts in the pH level of seawater; when such shifts happen, these other organisms expand to fill the empty niches, and everybody further up the food chain gets used to a change in diet.

When the acidification goes away, whatever species of calcium-shelled plankton have managed to survive elbow their way back into their former niches and undergo a burst of evolutionary radiation; this makes life easy for geologists today, who can figure out the age of any rock laid down in an ancient ocean by checking the remains of foraminifers and other calcium-loving plankton against a chart of what existed when.

Sudden climate change recently enough to be experienced by human beings? Check; most people have heard of the end of the last ice age, though you have to read the technical literature or one of a very few popular treatments to get some idea of just how drastically the climate changed, or how fast.

The old saw about a slow, gradual warming over millennia got chucked into the dumpster decades ago, when ice cores from Greenland upset that particular theory. The ratio between different isotopes of oxygen in the ice laid down in different years provides a sensitive measure of the average global temperature at sea level during those same years. According to that measure, at the end of the Younger Dryas period about 11,800 years ago, global temperatures shot up by 20° F. in less than a decade.
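
A technical aside for the curious: the convention is to express that isotope ratio as a deviation, in parts per thousand, from a laboratory standard, written δ¹⁸O, and then to convert it to temperature by way of an empirical calibration. The definition below is standard; the calibration slope of roughly 0.7‰ per degree Celsius is the classic figure for polar precipitation and is quoted here purely as an illustration, since the actual slope varies by site and era.

\[
\delta^{18}\mathrm{O} = \left(\frac{({}^{18}\mathrm{O}/{}^{16}\mathrm{O})_{\mathrm{sample}}}{({}^{18}\mathrm{O}/{}^{16}\mathrm{O})_{\mathrm{standard}}} - 1\right) \times 1000\,\text{‰},
\qquad
\Delta T \approx \frac{\Delta(\delta^{18}\mathrm{O})}{0.7\,\text{‰}/{}^{\circ}\mathrm{C}}
\]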

Now of course that didn’t mean that temperatures shot up that far evenly, all over the world.  What seems to have happened is that the tropics barely warmed at all, the southern end of the planet warmed mildly, and the northern end experienced a drastic heat wave that tipped the great continental ice sheets of the era into rapid collapse and sent sea levels soaring upwards.

Those of my readers who have been paying attention to recent scientific publications about Greenland and the Arctic Ocean now have very good reason to worry, because the current round of climate change has most strongly affected the northern end of the planet, too, and scientists have begun to notice historically unprecedented changes in the Greenland ice cap. In an upcoming post I plan on discussing at some length what those particular historical parallels promise for our future, and it’s not pretty.

Oh, and the aftermath of the post-Younger Dryas temperature spike was a period several thousand years long when global temperatures were considerably higher than they are today. The Holocene Hypsithermal, as it’s called, saw global temperatures peak around 7° F. higher than they are today—about the level, that is, that’s already baked into the cake as a result of anthropogenic emissions of greenhouse gases. It was not a particularly pleasant time.

Most of western North America was desert, baked to a crackly crunch by drought conditions that make today’s dry years look soggy; much of what’s now, at least in theory, the eastern woodland biome was dryland prairie, while both coasts got rapidly rising seas with a side order of frequent big tsunamis—again, we’ll talk about those in the upcoming post just mentioned. Still, you’ll notice that our species survived the experience.


As those droughts and tsunamis might suggest, the lessons taught by history don’t necessarily amount to "everything will be just fine." The weird inability of the contemporary imagination to find any middle ground between business as usual and sudden total annihilation has its usual effect here, hiding the actual risks of anthropogenic climate change behind a facade of apocalyptic fantasies.

Here again, the question "what happened the last time this occurred?" is the most accessible way to avoid that trap, and the insistence that it’s different this time and the evidence of the past can’t be applied to the present and future puts that safeguard out of reach.

For a third example, consider the latest round of claims that a sudden financial collapse driven by current debt loads will crash the global economy once and for all.

That sudden collapse has been predicted year after weary year for decades now—do any of my readers, I wonder, remember Dr. Ravi Batra’s The Great Depression of 1990?—and its repeated failure to show up and perform as predicted seems only to strengthen the conviction on the part of believers that this year, like some financial equivalent of the Great Pumpkin, the long-delayed crash will finally put in its appearance and bring the global economy crashing down.

I’m far from sure that they’re right about the imminence of a crash; the economy of high finance these days is so heavily manipulated, and so thoroughly detached from the real economy where real goods and services have to be produced using real energy and resources, that it’s occurred to me more than once that the stock market and the other organs of the financial sphere might keep chugging away in a state of blissful disconnection from the rest of existence for a very long time to come.

Still, let’s grant for the moment that the absurd buildup of unpayable debt in the United States and other industrial nations will in fact become the driving force behind a credit collapse, in which drastic deleveraging will erase trillions of dollars in notional wealth. Would such a crash succeed, as a great many people are claiming just now, in bringing the global economy to a sudden and permanent stop?

Here again, the lessons of history provide a clear and straightforward answer to that question, and it’s not one that supports the partisans of the fast-crash theory. Massive credit collapses that erase very large sums of notional wealth and impact the global economy are hardly a new phenomenon, after all.

One example—the credit collapse of 1930-1932—is still just within living memory; the financial crises of 1873 and 1893 are well documented, and there are dozens of other examples of nations and whole continents hammered by credit collapses and other forms of drastic economic crisis. Those crises have had plenty of consequences, but one thing that has never happened as a result of any of them is the sort of self-feeding, irrevocable plunge into the abyss that current fast-crash theories require.

The reason for this is that credit is merely one way by which a society manages the distribution of goods and services. That’s all it is. Energy, raw materials, and labor are the factors that have to be present in order to produce goods and services.  Credit simply regulates who gets how much of each of these things, and there have been plenty of societies that have handled that same task without making use of a credit system at all.

A credit collapse, in turn, doesn’t make the energy, raw materials, and labor vanish into some fiscal equivalent of a black hole; they’re all still there, in whatever quantities they were before the credit collapse, and all that’s needed is some new way to allocate them to the production of goods and services.

This, in turn, governments promptly provide. In 1933, for example, faced with the most severe credit collapse in American history, Franklin Roosevelt temporarily nationalized the entire US banking system, seized nearly all the privately held gold in the country, unilaterally changed the national debt from "payable in gold" to "payable in Federal Reserve notes" (which amounted to a technical default), and launched a flurry of other emergency measures.  The credit collapse came to a screeching halt, famously, in less than a hundred days.

Other nations facing the same crisis took equally drastic measures, with similar results. While that history has apparently been forgotten across large sections of the peak oil blogosphere, it’s a safe bet that none of it has been forgotten in the corridors of power in Washington DC and elsewhere in the world.

More generally, governments have an extremely broad range of powers that can be used, and have been used, in extreme financial emergencies to stop a credit or currency collapse from terminating the real economy. Faced with a severe crisis, governments can slap on wage and price controls, freeze currency exchanges, impose rationing, raise trade barriers, default on their debts, nationalize whole industries, issue new currencies, allocate goods and services by fiat, and impose martial law to make sure the new economic rules are followed to the letter, if necessary, at gunpoint.

Again, these aren’t theoretical possibilities; every one of them has actually been used by more than one government faced by a major economic crisis in the last century and a half. Given that track record, it requires a breathtaking leap of faith to assume that if the next round of deleveraging spirals out of control, politicians around the world will simply sit on their hands, saying "Whatever shall we do?" in plaintive voices, while civilization crashes to ruin around them.

What makes that leap of faith all the more curious is that in the runup to the economic crisis of 2008-9, the same claims of imminent, unstoppable financial apocalypse we’re hearing today were being made—in some cases, by the same people who are making them today. (I treasure a comment I fielded from a popular peak oil blogger at the height of the 2009 crisis, who insisted that the fast crash was upon us and that my predictions about the future were therefore all wrong.)

Their logic was flawed then, and it’s just as flawed now, because it dismisses the lessons of history as irrelevant and therefore fails to take into account how the events under discussion play out in the real world.

That’s the problem with the insistence that this time it really is different: it disables the most effective protection we’ve got against the habit of thought that cognitive psychologists call "confirmation bias," the tendency to look for evidence that supports one’s pet theory rather than seeking the evidence that might call it into question. The scientific method itself, in the final analysis, is simply a collection of useful gimmicks that help you sidestep confirmation bias.

That’s why competent scientists, when they come up with a hypothesis to explain something in nature, promptly sit down and try to think up as many ways as possible to disprove the hypothesis.  Those potentials for disproof are the raw materials from which experiments are designed, and only if the hypothesis survives all experimental attempts to disprove it does it take its first step toward scientific respectability.

It’s not exactly easy to run controlled double-blind experiments on entire societies, but historical comparison offers the same sort of counterweight to confirmation bias. Any present or future set of events, however unique it may be in terms of the fine details, has points of similarity with events in the past, and those points of similarity allow the past events to be taken as a guide to the present and future.

This works best if you’ve got a series of past events, as different from each other as any one of them is from the present or future situation you’re trying to predict; if you can find common patterns in the whole range of past parallels, it’s usually a safe bet that the same pattern will recur again.

Any time you approach a present or future event, then, you have two choices: you can look for the features that event has in common with other events, despite the differences of detail, or you can focus on the differences and ignore the common features. The first of those choices, it’s worth noting, allows you to consider both the similarities and the differences.

Once you’ve got the common pattern, it then becomes possible to modify it as needed to take into account the special characteristics of the situation you’re trying to understand or predict: to notice, for example, that the dark age that will follow our civilization will have to contend with nuclear and chemical pollution on top of the more ordinary consequences of decline and fall.

If you start from the assumption that the event you’re trying to predict is unlike anything that’s ever happened before, though, you’ve thrown out your chance of perceiving the common pattern. What happens instead, with monotonous regularity, is that pop-culture narratives such as the sudden overnight collapse beloved of Hollywood screenplay writers smuggle themselves into the picture, and cement themselves in place with the help of confirmation bias.

The result is the endless recycling of repeatedly failed predictions that plays so central a role in the collective imagination of our time, and has helped so many people blind themselves to the unwelcome future closing in on us.
