
Stop and Assess

SUBHEAD: Let’s pause to make an assessment of where we stand as Winter finally coils into Spring.

By James Kunstler on 23 April 2018 for Kunstler.com -
(http://kunstler.com/clusterfuck-nation/stop-and-assess/)


Image above: A man waits at dawn, after sleeping in his car, to see a free ‘mobile doctor’ in Olean, New York. Photo by Spencer Platt. From (https://www.theguardian.com/inequality/2017/jun/20/is-the-american-dream-really-dead).

America has become Alzheimer Nation. Nothing is remembered for more than a few minutes. The news media, which used to function as a sort of collective brain, is a memory hole that events are shoved down and extinguished in.

An attack in Syria, you ask? What was that about? Facebook stole your…what? Four lives snuffed out in a… a what? Something about waffles? Trump said… what?

Let’s pause today and make an assessment of where things stand in this country as Winter finally coils into Spring.

As you might expect, a nation overrun with lawyers has litigated itself into a cul-de-sac of charges, arrests, suits, countersuits, and allegations that will rack up billable hours until the Rockies tumble.

The best outcome may be that half the lawyers in this land will put the other half in jail, and then, finally, there will be space for the rest of us to re-connect with reality.

What does that reality consist of?

Troublingly, an economy that can’t go on as we would like it to: a machine that spews out ever more stuff for ever more people. We really have reached limits for an industrial economy based on cheap, potent energy supplies. The energy, oil especially, isn’t cheap anymore.

The fantasy that we can easily replace it with wind turbines, solar panels, and as-yet-unseen science projects is going to leave a lot of people not just disappointed but bereft, floundering, and probably dead, unless we make some pretty severe readjustments in daily life.

We’ve been papering this problem over by borrowing so much money from the future to cover costs today that eventually it will lose its meaning as money — that is, faith that it is worth anything. That’s what happens when money is just a representation of debt that can’t be paid back.

This habit of heedless borrowing has enabled the country to pretend that it is functioning effectively. Lately, this game of pretend has sent the financial corps into a rapture of jubilation.

The market speed bumps of February are behind us and the road ahead looks like the highway to Vegas at dawn on a summer’s day.

Tesla is the perfect metaphor for where the US economy is at: a company stuffed with debt plus government subsidies, unable to deliver the wished-for miracle product — affordable electric cars — whirling around the drain into bankruptcy.

Tesla has been feeding one of the chief fantasies of the day: that we can banish climate problems caused by excessive CO2, while giving a new lease on life to the (actually) futureless suburban living arrangement in which we foolishly invested so much of our earlier capital. In other words, pounding sand down a rat hole.

Because none of that is going to happen.

The true message of income inequality is that the nation as a whole is becoming incrementally impoverished and eventually even the massive “wealth” of the one-percenters will prove to be fictitious, as the things it is represented in — stocks, bonds, currencies, Manhattan apartments — hemorrhage their supposed value.

The very wealthy will be a lot less wealthy while everybody else is in a life-and-death struggle to remain fed, housed, and warm. And, of course, that only increases the chance that some violent social revolution will take away even that remaining residue of wealth, and destroy the people who held it.

What lies ahead is contraction. Of everything. Activity, population. The industrial economy is not going to be replaced by a super high tech utopia, because that wished-for utopia needs an industrial economy underneath to support it. This is true, by the way, for all the other “advanced” nations.

China has a few more years of dependable oil supply left and then they will discover that they can no longer manufacture solar panels or perhaps not even run the magnificent electronic surveillance system they are so artfully building. Their political system will prove to be at least as fragile as our own.

The time may even come when the young people, of the USA especially, have to put aside their boundary-smashing frolics of the day and adjust the precooked expectations they’ve been handed to the actual contraction at hand, and what it means for making a life under severely different conditions. It means: better learn how to do something really practical, and not necessarily high tech.

Better figure out a part of the country that will be safe to live in. Better plan on hunkering down there when the people stuck in the less favorable places make a real mess of things.

.

Obscuring Nature

SUBHEAD: How cleanliness and energy efficiency are damaging our relationship with nature.

By Gunnar Rundgren on 16 April 2018 for Garden Earth -
(http://gardenearth.blogspot.co.uk/2018/04/how-cleanliness-and-efficiency-obscure.html)


Image above: A techno-optimist trying to grow "Green" food under artificial light in PVC irrigation tubes. From (http://www.thecoolist.com/geeky-gardening-how-to-grow-vegetables-with-green-technology/).

Alienation
Instead of retreating into urban eco-sanctuaries and buying industrial fare in hygienic and eco-friendly packaging, people need to grow, tend to animals, muck, dig, cook and bake. Only then can we expect people to become ecologically literate and realise that we are part of nature.

After the discovery of "germs" and their role in disease, humans waged a two-century war on bacteria. It is only in the last decades that we have started to realize that we are totally dependent on them.

There are so many of them inside our body and on our skin that one could almost claim that we are an agglomeration of germs. While we still know that there are bad ones we should avoid, we are also aware that some dirt is beneficial. Something similar needs to happen with efficiency.

The realization that there are fairly hard physical limits to our civilization, sometimes called Planetary Boundaries, has made efficiency the buzzword of the day.

Efficiency
Of course this is hardly new; scarcity was the rule for most of human existence, and efficient use of resources was part of the daily struggle. When fossil fuels were systematically put into our service, a period of assumed limitless growth and limitless waste followed.

For a long period, efficiency was defined mainly in relation to the use of labour, and the silver bullet of enterprise was to substitute natural resources for labour. Which meant more use of energy; more use of minerals, water, rocks and sand; more of everything – but labour.

Now, there are growing insights that natural resources are not as abundant and limitless as we believed, and that there are also limits at the receiving end. We can’t just pump our waste into the natural pools, be it the oceans or the atmosphere.

It is therefore quite natural, and good, that we look for more efficient ways of using resources. But in my view the solutions are often misguided.

Technology
Farming is perhaps the best example of this. Nowadays we are told that we should grow plants, or farm fish, indoors under artificial light to save water and land.

And the most used argument in favour of a vegan lifestyle is that there is less need for land to grow plants than to raise animals.

Lab meats are said to solve our craving for meat in a better way. Efficient use of land is also a major argument for the use of chemical fertilizers, pesticides and GMOs.

Most urban dwellers have no idea of how food is grown and how animals and plants interact in natural systems and they therefore easily buy into a narrative that goes like this:
“Humans are squeezing out other species, razing the rainforest to feed cattle or grow oil palm, and cutting down mangroves to farm shrimp.

Agriculture destroys the water and the atmosphere; pesticides kill; it even destroys its own foundation, the soil. Most agricultural land is used to feed cattle, which are also the most harmful for the climate.”
While there is some merit in all this (with the exception of the blame on grazing cattle), the solution that has gained traction is to withdraw humanity into sustainable cities where food is grown within the city walls. In this way we can leave the rest of nature to all the other creatures in God’s garden.

Overall, the alleged efficiency of most of these systems is an illusion, because land and resources are mostly used to the same extent as before – but somewhere else. See the example further down.

What worries me a lot more than the miscalculations, however, is the view of our relationship with nature that is reflected in this narrative.

It is the idea that we can save both ourselves and nature by retreating from nature, limiting our interaction with it to watching Animal Planet and going whale watching or gorilla spotting on eco-tourist trips.

For sure, those creatures need all the nature reserves we have created, and we need to expand them in parts of the world, in particular in coastal areas. But, as with germs, I am afraid we take this too far.

Many advocate artificial production systems in much the same way that sterility was promoted as an ideal of hygiene.

But distancing people further from germs mostly makes them much less able to strike a balanced view on the merits of washing their hands or throwing away leftovers.

Dirt on Hands
In a similar way, I think that instead of withdrawing into urban eco-sanctuaries people need to immerse themselves in nature and dirt.

They need to grow, tend to animals, muck, dig, cook and bake rather than buying industrial fare in hygienic eco-friendly packaging.

Only then can we expect people to appreciate the real work, the resources needed, the interaction between humans, animals and plants.

Only then can we expect people to become ecologically literate and realise that we are part of nature.

Resource-Saving Myths

The most flagrant myth is that vertical indoor farms powered by LED lights save space. When you point out that they require a lot of energy, you are told that the energy can come from solar panels, fully renewable and benign.

We can leave aside the discussion of exactly how benign solar panels are when it comes to resources.

We can also leave aside the question of how to store solar energy across the seasons in the northern parts of the globe, and just focus on the area used.

Do indoor farms really save space?

Let’s envision a house with a vertical farm in the basement and let us put solar panels on top of the building. The roof is hit by sunlight with an intensity of some 1000 W per square meter.

Our solar panels are very efficient and convert 15% of that to electricity, which gives us 150 watts per square meter. The basement is lit by efficient LED lights.

If we want to grow lettuce we will need about 250 watts per square meter for 12 hours per day. Assuming very small losses in transmission and in the lights, we can grow 0.6 square meters of lettuce for each square meter of roof area.

Each layer of plants in the vertical farm thus needs a much bigger area of solar panels to produce the electricity needed. And this is only growing lettuce. If we were to grow tomatoes, grain, potatoes or cabbage we would need much higher light intensity.

These calculations are in reality extremely optimistic. Of course, in the winter where I live there is almost no solar energy produced at all. To produce food in winter we would need solar panel areas perhaps 25 times as big as each layer in our indoor farm!

So for a farm with 10 layers we would need 250 times the area somewhere else, outside of the sustainable city’s walls.
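
For readers who want to check the arithmetic, here is a minimal sketch in Python of the same back-of-the-envelope calculation. The inputs (1,000 W per square meter of sunlight, 15% panel efficiency, 250 W per square meter of LED light for lettuce) are the figures given above; the winter factor of 25 and the 10 layers are the rough assumptions from the preceding paragraphs, not measured values.

# Back-of-the-envelope check of the vertical-farm numbers above.
# All inputs are the article's own optimistic assumptions.
SOLAR_INTENSITY_W_M2 = 1000   # sunlight hitting the roof, W per m^2
PANEL_EFFICIENCY = 0.15       # very good panels: 15% to electricity
LETTUCE_LIGHT_W_M2 = 250      # LED light lettuce needs, W per m^2

panel_output = SOLAR_INTENSITY_W_M2 * PANEL_EFFICIENCY   # 150 W/m^2
# Ignoring transmission and LED losses, each m^2 of roof can light:
lettuce_per_roof_m2 = panel_output / LETTUCE_LIGHT_W_M2  # 0.6 m^2
print(f"Lettuce grown per m^2 of roof: {lettuce_per_roof_m2:.1f} m^2")

# In a northern winter the panels deliver only a small fraction of
# this, so each growing layer needs roughly 25x its own area in
# panels (the author's estimate); a 10-layer farm then needs 250x.
WINTER_PANEL_FACTOR = 25      # panel m^2 per m^2 of crops, in winter
LAYERS = 10
print(f"Winter panel area for a {LAYERS}-layer farm: "
      f"{LAYERS * WINTER_PANEL_FACTOR}x its footprint")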

These are back-of-the-envelope calculations, an art which seems long forgotten. You can read more here.

.

NTHE is a four-letter word

SUBHEAD: Near Term Human Extinction may be coming after a few of us get through the eco-bottleneck.

By Albert Bates on 25 March 2018 for The Great Change -
(http://peaksurfer.blogspot.com/2018/03/nthe-is-four-letter-word.html)


Image above: Illustration of a cityscape post near term human extinction. From (http://thefallingdarkness.com/near-term-human-extinction-a-conversation-with-guy-mcpherson/).

"Collective neurosis can be attributed to a concatenation of causes — diet, electrosmog, epigenetic triggering by microplastics in our toothpaste — take your choice."

We are not talking about climate deniers now, who have their own brand of insanity, but we keep hearing the same mantra chanted by otherwise respectable scientists and policymakers that, “climate change may be catastrophic but it won’t be the end of us.”

We hear that so often we almost never challenge it, not wishing to divert an otherwise productive conversation into what we know to be a blind alley. Nonetheless, we think the statement is at best deluded and at worst just a milder form of denialism. It is not science. It is faith. It is also human neurophysiology.
Brain imaging research has shown that a major neural region associated with cognitive flexibility is the prefrontal cortex — specifically two areas known as the dorsolateral prefrontal cortex (dlPFC) and the ventromedial prefrontal cortex (vmPFC). Additionally, the vmPFC was of interest to the researchers because past studies have revealed its connection to fundamentalist-type beliefs. For example, one study showed individuals with vmPFC lesions rated radical political statements as more moderate than people with normal brains, while another showed a direct connection between vmPFC damage and religious fundamentalism. For these reasons, in the present study, researchers looked at patients with lesions in both the vmPFC and the dlPFC, and searched for correlations between damage in these areas and responses to religious fundamentalism questionnaires.
Bobby Azarian, Raw Story, March 14, 2018

In the quote above, Azarian is referring to a study published a year ago in Neuropsychologia that connected cognitive flexibility with the ventromedial prefrontal cortex and found that damage to that part of the brain hinders adaptive or flexible behavior, locking out world views that run contrary to some preconception. The study correlated such lesions in brain-damaged veterans with religious fundamentalism.

The preconception most often grasped by NTHE (Near Term Human Extinction) deniers is the notion that “humans survived far worse cataclysms to arrive at their present condition” — the Toba event 70,000 years ago, for instance, when the human population was reduced to perhaps 10,000–30,000 individuals — “and we invariably rebound.”

The example most often cited is the 2005 Rutgers mtDNA study showing all pre-1492 native populations of the Americas — well over 1 billion by some estimates — having descended from 70 or fewer individuals who crossed the land bridge between Asia and North America.

This is a variant of the techno-cornucopianism of Bill Gates or Elon Musk, but in their cases — building new desert cities in Arizona or seed colonies on Mars — what is being externalized, absent a cold-fusion Spindletop, is biophysical economics.

We have previously reviewed the hypothesis of Danny Brower and Ajit Varki that an evolutionary leap allowed Homo to access higher consciousness by hard-wiring a neural pathway for denying reality.

Arguably that same pathway induces otherwise rational-seeming people to allow for the possibility of catastrophic climate change (already well underway) while denying the possibility of it leading to near-term human extinction (NTHE).

In our view, this colors the debate over what we should be doing by reducing the urgency.

Ironically there may have been human genotypes that suppressed their denial gene better than ours does. One of the effects of genetic bottlenecks is that selected genes (such as those offering a more balanced use of denial) fail to be passed along to succeeding populations.

Our personal view is that while NTHE can yet be avoided, the time for action grows short, and as we walk out onto the razor’s edge and grow more desperate we will likely make many foolish mistakes, any one of which could trigger NTHE.

Appointing John Bolton the National Security Advisor, for instance. In 2016 USAnians fed up with the tweedledee-tweedledum two-party system opted to just hurl a hand grenade into the White House and stand back.

If one grenade was not enough, we still have President Bannon to look forward to in 2020 or 2024 if Cambridge Analytica can keep up with the AI revolution with respect to Big Data.

Collective neurosis can be attributed to a concatenation of causes — diet, electrosmog, epigenetic triggering by microplastics in our toothpaste — take your choice. Visionary forebears who saw these bottlenecks coming — Garrett Hardin, R. Buckminster Fuller, M. King Hubbert — all argued that the best antidote was better public education.

But at least in the US, public education was hijacked in the ‘90s by the vmPFC-lesioned hordes of Zombie Fundamentalists before being handed over to Betsy DeVos for the final coup de grâce.

Whatever long wave or ergot diet ushered humanity into the Dark Ages seems to be replaying now, and it could hardly arrive at a worse time from the standpoint of the organized climate solutioneering required to avert Anthropogenic NTHE.

We need to be in top form to survive this next bottleneck. We’d do better without the denial. Too bad climate scientists can’t afford to hire Cambridge Analytica themselves.

.

The Terror of Deep Time

SUBHEAD: It's vital that humans understand the story in which we play our small but significant part.

By John Michael Greer on 21 September 2017 in Resilience -
(http://www.resilience.org/stories/2017-09-21/terror-deep-time/)


Image above: The Andromeda galaxy behind a silhouette of mountains. From original article.

Back in the 1950s, sociologist C. Wright Mills wrote cogently about what he called “crackpot realism”—the use of rational, scientific, utilitarian means to pursue irrational, unscientific, or floridly delusional goals. It was a massive feature of American life in Mills’ time, and if anything, it’s become more common since then.

Since it plays a central role in the corner of contemporary culture I want to discuss this week, I want to put a few moments into discussing where crackpot realism comes from, and how it wriggles its way into the apple barrel of modern life and rots the apples from skin to core.

Let’s start with the concept of the division of labor.

One of the great distinctions between a modern industrial society and other modes of human social organization is that in the former, very few activities are taken from beginning to end by the same person.

A woman in a hunter-gatherer community, as she is getting ready for the autumn tuber-digging season, chooses a piece of wood, cuts it, shapes it into a digging stick, carefully hardens the business end in hot coals, and then puts it to work getting tubers out of the ground.

Once she carries the tubers back to camp, what’s more, she’s far more likely than not to take part in cleaning them, roasting them, and sharing them out to the members of the band.

A woman in a modern industrial society who wants to have potatoes for dinner, by contrast, may do no more of the total labor involved in that process than sticking a package in the microwave.

Even if she has potatoes growing in a container garden out back, say, and serves up potatoes she grew, harvested, and cooked herself, odds are she didn’t make the gardening tools, the cookware, or the stove she uses.

That’s division of labor: the social process by which most members of an industrial society specialize in one or another narrow economic niche, and use the money they earn from their work in that niche to buy the products of other economic niches.

Let’s say it up front: there are huge advantages to the division of labor. It’s more efficient in almost every sense, whether you’re measuring efficiency in terms of output per person per hour, skill level per dollar invested in education, or what have you.

What’s more, when it’s combined with a social structure that isn’t too rigidly deterministic, it’s at least possible for people to find their way to occupational specialties for which they’re actually suited, and in which they will be more productive than otherwise.

Yet it bears recalling that every good thing has its downsides, especially when it’s pushed to extremes, and the division of labor is no exception.

Crackpot realism is one of the downsides of the division of labor. It emerges reliably whenever two conditions are in effect.

The first condition is that the task of choosing goals for an activity is assigned to one group of people and the task of finding means to achieve those goals is left to a different group of people.

The second condition is that the first group stands so much higher in social status than the second that its members need pay no attention to the concerns of the second group.

Consider, as an example, the plight of a team of engineers tasked with designing a flying car. People have been trying to do this for more than a century now, and the results are in: it’s a really dumb idea.

It so happens that a great many of the engineering features that make a good car make a bad aircraft, and vice versa; for instance, an auto engine needs to be optimized for torque rather than speed, while an aircraft engine needs to be optimized for speed rather than torque.

Thus every flying car ever built—and there have been plenty of them—performed just as poorly as a car as it did as a plane, and cost so much that for the same price you could buy a good car, a good airplane, and enough fuel to keep both of them running for a good long time.

Engineers know this.

Still, if you’re an engineer and you’ve been hired by some clueless tech-industry godzillionaire who wants a flying car, you probably don’t have the option of telling your employer the truth about his pet project—that is, that no matter how much of his money he plows into the project, he’s going to get a clunker of a vehicle that won’t be any good at either of its two incompatible roles—because he’ll simply fire you and hire someone who will tell him what he wants to hear.

Nor do you have the option of sitting him down and getting him to face what’s behind his own unexamined desires and expectations, so that he might notice that his fixation on having a flying car is an emotionally charged hangover from age eight, when he daydreamed about having one to help him cope with the miserable, bully-ridden public school system in which he was trapped for so many wretched years.

So you devote your working hours to finding the most rational, scientific, and utilitarian means to accomplish a pointless, useless, and self-defeating end. That’s crackpot realism.

You can make a great party game out of identifying crackpot realism—try it sometime—but I’ll leave that to my more enterprising readers.

What I want to talk about right now is one of the most glaring examples of crackpot realism in contemporary industrial society. Yes, we’re going to talk about space travel again.

No question, a fantastic amount of scientific, technological, and engineering brilliance went into the quest to insert a handful of human beings for a little while into the lethal environment of deep space and bring them back alive.

Visit one of the handful of places on the planet where you can get a sense of the sheer scale of a Saturn V rocket, and the raw immensity of the effort that put a small number of human bootprints on the Moon is hard to miss. What’s much easier to miss is the whopping irrationality of the project itself.

(I probably need to insert a parenthetical note here. Every time I blog about the space program, I can count on fielding at least one comment from some troll who insists that the Moon landings never happened.

It so happens that I’ve known quite a few people who worked on the Apollo project; some of them have told me their stories and shown me memorabilia from what was one of the proudest times of their lives; and given a choice between believing them, and believing some troll who uses a pseudonym to hide his identity but can’t hide his ignorance of basic historical and scientific facts, well, let’s just say the troll isn’t going to come in first place. Nor is his comment going to go anywhere but the trash. ‘Nuf said.)

Outer space simply isn’t an environment where human beings can survive for long.

It’s near-perfect vacuum at a temperature a few degrees above absolute zero; it’s full of hard radiation streaming out from the huge unshielded fusion reactor at the center of our solar system; it’s also got chunks of rock, lots of them, whizzing through it at better than rifle-bullet speeds; and the human body is the product of two billion years of evolutionary adaptation to environments that have the gravity, atmospheric pressure, temperature ranges, and other features that are found on the Earth’s surface and, as far as we know, nowhere else in the universe.

A simple thought experiment will show how irrational the dream of human expansion into space really is.

Consider the harshest natural environments on this planet—the stark summits of the Himalayas; the middle of the East Antarctic ice sheet in winter; the bleak Takla Makan desert of central Asia, the place caravans go to die; the bottom of the Marianas Trench, where the water pressure will reduce a human body to paste in seconds.

Nowhere in the solar system, or on any of the exoplanets yet discovered by astronomers, is there a place that’s even as well suited to human life as the places I’ve just named.

Logically speaking, before we try to settle the distant, airless, radiation-blasted deserts of Mars or the Moon, wouldn’t it make sense first to build cities on the Antarctic ice or in the lightless depths of the ocean?

With one exception, in fact, every one of the arguments that has been trotted out to try to justify the settlement of Mars can be applied with even more force to the project of settling Antarctica.

In both cases, you’ve got a great deal of empty real estate amply stocked with mineral wealth, right? Antarctica, though, has a much more comfortable climate than Mars, not to mention abundant supplies of water and a breathable atmosphere, both of which Mars lacks.

Furthermore, it costs a lot less to get your colonists to Antarctica, they won’t face lethal irradiation on the way there, and there’s at least a chance that you can rescue them if things go very wrong.

If in fact it made any kind of sense to settle Mars, the case for settling Antarctica would be far stronger.

So where are the grand plans, lavishly funded by clueless tech-industry godzillionaires, to settle Antarctica? Their absence shows the one hard fact about settling outer space that next to nobody is willing to think about: it simply doesn’t make sense.

The immense financial and emotional investments we’ve made in the notion of settling human beings on other planets or in outer space itself would be Exhibit A in a museum of crackpot realism.

This is where the one exception I mentioned above comes in—the one argument for settling Mars that can’t also be made for settling Antarctica. This is the argument that a Martian colony is an insurance policy for our species.

If something goes really wrong on Earth, the claim goes, and human beings die out here, having a settlement on Mars gives our species a shot at survival.

Inevitably, given the present tenor of popular culture, you can expect to hear this sort of logic backed up by embarrassingly bad arguments. I’m thinking, for example, of a rant by science promoter Neil deGrasse Tyson, who likes to insist that dinosaurs are extinct today because they didn’t have a space program.

We’ll assume charitably that Tyson spent long nights stargazing in his teen years, and so tended to doze off during his high school biology classes; no doubt that’s why he missed three very obvious facts about dinosaurs.

The first is that they were the dominant life forms on land for well over a hundred million years, which is a good bit longer than our species shows any likelihood of being able to hang on; the second is that the vast majority of dinosaur species went extinct for ordinary reasons—there were only a very modest number of dinosaur species around when the Chicxulub meteorite came screaming down out of space to end the Cretaceous Period; and the third is that dinosaurs aren’t extinct—we call them birds nowadays, and in terms of number of species, rates of speciation, and other standard measures of evolutionary vigor, they’re doing quite a bit better than mammals just now.

Set aside the bad logic and the sloppy paleontology, though, and the argument just named casts a ruthlessly clear light on certain otherwise obscure factors in our contemporary mindset.

The notion that space travel gets its value as a way to avoid human extinction goes back a long ways. I recall a book by Italian journalist Oriana Fallaci, compiling her interviews with leading figures in the space program during its glory days; she titled it If The Sun Dies, after the passionate comment along these lines by one of her interviewees.

Behind this, in turn, lies one of the profound and usually unmentioned fears that shapes the modern mind: the terror of deep time.

There’s a profound irony in the fact that the geologists who first began to figure out the true age of the Earth lived in western Europe in the early nineteenth century, when most people believed that the world was only some six thousand years old.

There have been plenty of cultures in recorded history that had a vision of time expansive enough to fit the facts of geological history, but the cultures of western Europe and its diaspora in the Americas and Australasia were not among them.

Wedded to literalist interpretations of the Book of Genesis, and more broadly to a set of beliefs that assigned unique importance to human beings, the people who faced the first dim adumbrations of the vastness of Earth’s long history were utterly unprepared for the shock, and even less ready to have the first unnerving guesses that the Earth might be millions of years old replaced by increasingly precise measurements that gave its age in the billions of years, and that of the universe in the tens of billions.

The brutal nature of the shock that resulted shouldn’t be underestimated.

A society that had come to think of humanity as creation’s darlings, dwelling in a universe with a human timescale, found itself slammed facefirst into an unwanted encounter with the vast immensities of past and future time. That encounter had a great many awkward moments.

The self-defeating fixation of evangelical Christians on young-Earth creationism can be seen in part as an attempt to back away from the unwelcome vista of deep time; so is the insistence, as common outside Christian churches as within them, that the world really will end sometime very soon and spare us the stress of having to deal with the immensity of the future.

For that matter, I’m not sure how many of my readers know how stunningly unwelcome the concept of extinction was when it was first proposed: if the universe was created for the benefit of human beings, as a great many people seriously argued in those days, how could there have been so many thousands of species that lived and died long ages before the first human being walked the planet?

Worse, the suspicion began to spread that the future waiting for humanity might not be an endless progression toward bigger and better things, as believers in progress insisted, or the end of the world followed by an eternity of bliss for the winning team, as believers in Christianity insisted, but extinction: the same fate as all those vanished species whose bones kept surfacing in geological deposits.

It’s in the nineteenth century that the first stories of human extinction appear on the far end of late Romanticism, just as the same era saw the first tales that imagined the history of modern civilization ending in decline and fall.

People read The Black Cloud and After London for the same rush of fascinated horror that they got from Frankenstein and Dracula, and with the same comfortable disbelief once the last page turned—but the same scientific advances that made the two latter books increasingly less believable made tales of humanity’s twilight increasingly more so.

It became fashionable in many circles to dismiss such ideas as mere misanthropy, and that charge still gets flung at anyone who questions current notions of humanity’s supposed future in space. It’s a curious fact that I tend to field such comments from science fiction writers, more than from anyone else just now.

A few years ago, when I sketched out a fictive history of the next ten billion years that included human extinction millions of years from now, SF writer David Brin took time out of his busy schedule to denounce it as “an infuriating paean to despair.” Last month’s post on the worlds that never were, similarly, fielded a spluttering denunciation by S.M. Stirling.

It was mostly a forgettable rehash of the standard arguments for an interstellar future—arguments, by the way, that could be used equally well to justify continued faith in perpetual motion—but the point I want to raise here is that Stirling’s sole reaction to Aurora, Kim Stanley Robinson’s brilliant fictional critique of the interstellar-travel mythos, was to claim dismissively that Robinson must have suffered an attack of misanthropy.

Some of my readers may remember Veruca Salt, the archetypal spoiled brat in Willy Wonka and the Chocolate Factory.

When her father didn’t give her whatever she happened to want, her typical response was to shriek, “You don’t love me!” I think of that whenever somebody trots out the accusation of misanthropy in response to any realistic treatment of the limits that will shape the human future.

It’s not misanthropy to point out that humanity isn’t going to outlast the sun or leap breezily from star to star; it’s simple realism, just as reminding someone that they will inevitably die is an expression not of hatred but of common sense.

You, dear reader, will die someday. So will I, and so will every other human being.

That fact doesn’t make our lives meaningless; quite the contrary, it’s when we come to grips with the fact of our own mortality that we have our best shot at achieving not only basic maturity, but that condition of reflective attention to meaning that goes by the name of wisdom.

In exactly the same way, recognizing that humanity will not last forever—that the same Earth that existed and flourished long before our species came on the scene will exist and flourish long after our species is gone—might just provide enough of a boost of wisdom to help us back away from at least some of the more obviously pigheaded ways we’re damaging the biosphere of the only planet on which we can actually live.

There’s something else to be found in the acceptance of our collective mortality, though, and I’m considering exploring it in detail over the months ahead.

Grasp the fact that our species is a temporary yet integral part of the whole system we call the biosphere of the Earth, and it becomes a good deal easier to see that we are part of a story that didn’t begin with us, won’t end with us, and doesn’t happen to assign us an overwhelmingly important role.

Traumatic though this may be for the Veruca Saltish end of humanity, with its distinctly overinflated sense of importance, there’s much to be gained by ditching the tantrums, coming to terms with our decidedly modest place in the cosmos, and coming to understand the story in which we play our small but significant part.

.

How America Lost Its Mind

SUBHEAD: Our post-truth moment is the sum of mind-sets that have always made America "exceptional".

By Kurt Andersen on 24 August 2017 for The Atlantic -
(https://www.theatlantic.com/magazine/archive/2017/09/how-america-lost-its-mind/534231/)


Image above: Illustration of American fantasies over the decades by R. Kikuo Johnson. From original article.

“You are entitled to your own opinion,
but you are not entitled to your own facts.”
— Daniel Patrick Moynihan

“We risk being the first people in history to have been
able to make their illusions so vivid, so persuasive,
so ‘realistic’ that they can live in them.”
— Daniel J. Boorstin

When did America become untethered from reality? I first noticed our national lurch toward fantasy in 2004, after President George W. Bush’s political mastermind, Karl Rove, came up with the remarkable phrase reality-based community.

People in “the reality-based community,” he told a reporter, “believe that solutions emerge from your judicious study of discernible reality … That’s not the way the world really works anymore.”

A year later, The Colbert Report went on the air. In the first few minutes of the first episode, Stephen Colbert, playing his right-wing-populist commentator character, performed a feature called “The Word.” His first selection: truthiness.

“Now, I’m sure some of the ‘word police,’ the ‘wordinistas’ over at Webster’s, are gonna say, ‘Hey, that’s not a word!’ Well, anybody who knows me knows that I’m no fan of dictionaries or reference books. They’re elitist.

Constantly telling us what is or isn’t true. Or what did or didn’t happen. Who’s Britannica to tell me the Panama Canal was finished in 1914?

If I wanna say it happened in 1941, that’s my right. I don’t trust books—they’re all fact, no heart … Face it, folks, we are a divided nation … divided between those who think with their head and those who know with their heart … Because that’s where the truth comes from, ladies and gentlemen—the gut.”

Whoa, yes, I thought: exactly. America had changed since I was young, when truthiness and reality-based community wouldn’t have made any sense as jokes. For all the fun, and all the many salutary effects of the 1960s—the main decade of my childhood—I saw that those years had also been the big-bang moment for truthiness.

And if the ’60s amounted to a national nervous breakdown, we are probably mistaken to consider ourselves over it.

Each of us is on a spectrum somewhere between the poles of rational and irrational. We all have hunches we can’t prove and superstitions that make no sense. Some of my best friends are very religious, and others believe in dubious conspiracy theories.

What’s problematic is going overboard—letting the subjective entirely override the objective; thinking and acting as if opinions and feelings are just as true as facts.

The American experiment, the original embodiment of the great Enlightenment idea of intellectual freedom, whereby every individual is welcome to believe anything she wishes, has metastasized out of control.

From the start, our ultra-individualism was attached to epic dreams, sometimes epic fantasies—every American one of God’s chosen people building a custom-made utopia, all of us free to reinvent ourselves by imagination and will.

In America nowadays, those more exciting parts of the Enlightenment idea have swamped the sober, rational, empirical parts.

Little by little for centuries, then more and more and faster and faster during the past half century, we Americans have given ourselves over to all kinds of magical thinking, anything-goes relativism, and belief in fanciful explanation—small and large fantasies that console or thrill or terrify us.

And most of us haven’t realized how far-reaching our strange new normal has become.

Much more than the other billion or so people in the developed world, we Americans believe—really believe—in the supernatural and the miraculous, in Satan on Earth, in reports of recent trips to and from heaven, and in a story of life’s instantaneous creation several thousand years ago.

We believe that the government and its co-conspirators are hiding all sorts of monstrous and shocking truths from us, concerning assassinations, extraterrestrials, the genesis of AIDS, the 9/11 attacks, the dangers of vaccines, and so much more.

And this was all true before we became familiar with the terms post-factual and post-truth, before we elected a president with an astoundingly open mind about conspiracy theories, what’s true and what’s false, the nature of reality.

We have passed through the looking glass and down the rabbit hole. America has mutated into Fantasyland.

How widespread is this promiscuous devotion to the untrue? How many Americans now inhabit alternate realities? Any given survey of beliefs is only a sketch of what people in general really think.

But reams of survey research from the past 20 years reveal a rough, useful census of American credulity and delusion. By my reckoning, the solidly reality-based are a minority, maybe a third of us but almost certainly fewer than half.

Only a third of us, for instance, don’t believe that the tale of creation in Genesis is the word of God. Only a third strongly disbelieve in telepathy and ghosts. Two-thirds of Americans believe that “angels and demons are active in the world.”

More than half say they’re absolutely certain heaven exists, and just as many are sure of the existence of a personal God—not a vague force or universal spirit or higher power, but some guy. A third of us believe not only that global warming is no big deal but that it’s a hoax perpetrated by scientists, the government, and journalists.

A third believe that our earliest ancestors were humans just like us; that the government has, in league with the pharmaceutical industry, hidden evidence of natural cancer cures; that extraterrestrials have visited or are visiting Earth.

Almost a quarter believe that vaccines cause autism, and that Donald Trump won the popular vote in 2016.

A quarter believe that our previous president maybe or definitely was (or is?) the anti-Christ.

According to a survey by Public Policy Polling, 15 percent believe that the “media or the government adds secret mind-controlling technology to television broadcast signals,” and another 15 percent think that’s possible. A quarter of Americans believe in witches.

 Remarkably, the same fraction, or maybe less, believes that the Bible consists mainly of legends and fables—the same proportion that believes U.S. officials were complicit in the 9/11 attacks.

When I say that a third believe X and a quarter believe Y, it’s important to understand that those are different thirds and quarters of the population.

Of course, various fantasy constituencies overlap and feed one another—for instance, belief in extraterrestrial visitation and abduction can lead to belief in vast government cover-ups, which can lead to belief in still more wide-ranging plots and cabals, which can jibe with a belief in an impending Armageddon.

Why are we like this?

The short answer is because we’re Americans—because being American means we can believe anything we want; that our beliefs are equal or superior to anyone else’s, experts be damned.

Once people commit to that approach, the world turns inside out, and no cause-and-effect connection is fixed. The credible becomes incredible and the incredible credible.

The word mainstream has recently become a pejorative, shorthand for bias, lies, oppression by the elites. Yet the institutions and forces that once kept us from indulging the flagrantly untrue or absurd—media, academia, government, corporate America, professional associations, respectable opinion in the aggregate—have enabled and encouraged every species of fantasy over the past few decades.

 A senior physician at one of America’s most prestigious university hospitals promotes “miracle cures” on his daily TV show. Cable channels air documentaries treating mermaids, monsters, ghosts, and angels as real.

When a political-science professor attacks the idea “that there is some ‘public’ that shares a notion of reality, a concept of reason, and a set of criteria by which claims to reason and rationality are judged,” colleagues just nod and grant tenure.

The old fringes have been folded into the new center. The irrational has become respectable and often unstoppable.

Our whole social environment and each of its overlapping parts—cultural, religious, political, intellectual, psychological—have become conducive to spectacular fallacy and truthiness and make-believe. There are many slippery slopes, leading in various directions to other exciting nonsense.

During the past several decades, those naturally slippery slopes have been turned into a colossal and permanent complex of interconnected, crisscrossing bobsled tracks, which Donald Trump slid down right into the White House.

American moxie has always come in two types. We have our wilder, faster, looser side: We’re overexcited gamblers with a weakness for stories too good to be true.

But we also have the virtues embodied by the Puritans and their secular descendants: steadiness, hard work, frugality, sobriety, and common sense.

A propensity to dream impossible dreams is like other powerful tendencies—okay when kept in check. For most of our history, the impulses existed in a rough balance, a dynamic equilibrium between fantasy and reality, mania and moderation, credulity and skepticism.

The great unbalancing and descent into full Fantasyland was the product of two momentous changes. The first was a profound shift in thinking that swelled up in the ’60s; since then, Americans have had a new rule written into their mental operating systems: Do your own thing, find your own reality, it’s all relative.

The second change was the onset of the new era of information. Digital technology empowers real-seeming fictions of the ideological and religious and scientific kinds. Among the web’s 1 billion sites, believers in anything and everything can find thousands of fellow fantasists, with collages of facts and “facts” to support them.

Before the internet, crackpots were mostly isolated, and surely had a harder time remaining convinced of their alternate realities. Now their devoutly believed opinions are all over the airwaves and the web, just like actual news. Now all of the fantasies look real.

Today, each of us is freer than ever to custom-make reality, to believe whatever and pretend to be whoever we wish. Which makes all the lines between actual and fictional blur and disappear more easily.

Truth in general becomes flexible, personal, subjective. And we like this new ultra-freedom, insist on it, even as we fear and loathe the ways so many of our wrongheaded fellow Americans use it.

Treating real life as fantasy and vice versa, and taking preposterous ideas seriously, is not unique to Americans.

But we are the global crucible and epicenter. We invented the fantasy-industrial complex; almost nowhere outside poor or otherwise miserable countries are flamboyant supernatural beliefs so central to the identities of so many people.

This is American exceptionalism in the 21st century. The country has always been a one-of-a-kind place. But our singularity is different now.

We’re still rich and free, still more influential and powerful than any other nation, practically a synonym for developed country. But our drift toward credulity, toward doing our own thing, toward denying facts and having an altogether uncertain grip on reality, has overwhelmed our other exceptional national traits and turned us into a less developed country.

People see our shocking Trump moment—this post-truth, “alternative facts” moment—as some inexplicable and crazy new American phenomenon. But what’s happening is just the ultimate extrapolation and expression of mind-sets that have made America exceptional for its entire history.

America was created by true believers and passionate dreamers, and by hucksters and their suckers, which made America successful—but also by a people uniquely susceptible to fantasy, as epitomized by everything from Salem’s hunting witches to Joseph Smith’s creating Mormonism, from P. T. Barnum to speaking in tongues, from Hollywood to Scientology to conspiracy theories, from Walt Disney to Billy Graham to Ronald Reagan to Oprah Winfrey to Trump.

In other words: Mix epic individualism with extreme religion; mix show business with everything else; let all that ferment for a few centuries; then run it through the anything-goes ’60s and the internet age. The result is the America we inhabit today, with reality and fantasy weirdly and dangerously blurred and commingled.

The 1960s and the Beginning of the End of Reason


Image above: Illustration of American 1960s counterculture mixed with the nutcakes of today, by R. Kikuo Johnson. From original article.

I don't regret or disapprove of many of the ways the ’60s permanently reordered American society and culture. It’s just that along with the familiar benefits, there have been unreckoned costs.

In 1962, people started referring to “hippies,” the Beatles had their first hit, Ken Kesey published One Flew Over the Cuckoo’s Nest, and the Harvard psychology lecturer Timothy Leary was handing out psilocybin and LSD to grad students.

And three hours south of San Francisco, on the heavenly stretch of coastal cliffs known as Big Sur, a pair of young Stanford psychology graduates founded a school and think tank they named after a small American Indian tribe that had lived on the grounds long before. “In 1968,” one of its founding figures recalled four decades later,
Esalen was the center of the cyclone of the youth rebellion. It was one of the central places, like Mecca for the Islamic culture. Esalen was a pilgrimage center for hundreds and thousands of youth interested in some sense of transcendence, breakthrough consciousness, LSD, the sexual revolution, encounter, being sensitive, finding your body, yoga—all of these things were at first filtered into the culture through Esalen. By 1966, ’67, and ’68, Esalen was making a world impact.
This is not overstatement. Essentially everything that became known as New Age was invented, developed, or popularized at the Esalen Institute. Esalen is a mother church of a new American religion for people who think they don’t like churches or religions but who still want to believe in the supernatural.

The institute wholly reinvented psychology, medicine, and philosophy, driven by a suspicion of science and reason and an embrace of magical thinking (also: massage, hot baths, sex, and sex in hot baths). It was a headquarters for a new religion of no religion, and for “science” containing next to no science.

The idea was to be radically tolerant of therapeutic approaches and understandings of reality, especially if they came from Asian traditions or from American Indian or other shamanistic traditions. Invisible energies, past lives, astral projection, whatever—the more exotic and wondrous and unfalsifiable, the better.

Not long before Esalen was founded, one of its co-founders, Dick Price, had suffered a mental breakdown and been involuntarily committed to a private psychiatric hospital for a year.

His new institute embraced the radical notion that psychosis and other mental illnesses were labels imposed by the straight world on eccentrics and visionaries, that they were primarily tools of coercion and control. This was the big idea behind One Flew Over the Cuckoo’s Nest, of course.

And within the psychiatric profession itself this idea had two influential proponents, who each published unorthodox manifestos at the beginning of the decade—R. D. Laing (The Divided Self) and Thomas Szasz (The Myth of Mental Illness).

“Madness,” Laing wrote when Esalen was new, “is potentially liberation and renewal.” Esalen’s founders were big Laing fans, and the institute became a hotbed for the idea that insanity was just an alternative way of perceiving reality.

These influential critiques helped make popular and respectable the idea that much of science is a sinister scheme concocted by a despotic conspiracy to oppress people.

Mental illness, both Szasz and Laing said, is “a theory not a fact.” This is now the universal bottom-line argument for anyone—from creationists to climate-change deniers to anti-vaccine hysterics—who prefers to disregard science in favor of his own beliefs.

You know how young people always think the universe revolves around them, as if they’re the only ones who really get it?

And how before their frontal lobes, the neural seat of reason and rationality, are fully wired, they can be especially prone to fantasy?

In the ’60s, the universe cooperated: It did seem to revolve around young people, affirming their adolescent self-regard, making their fantasies of importance feel real and their fantasies of instant transformation and revolution feel plausible.

Practically overnight, America turned its full attention to the young and everything they believed and imagined and wished.

If 1962 was when the decade really got going, 1969 was the year the new doctrines and their gravity were definitively cataloged by the grown-ups. Reason and rationality were over.

The countercultural effusions were freaking out the old guard, including religious people who couldn’t quite see that yet another Great Awakening was under way in America, heaving up a new religion of believers who “have no option but to follow the road until they reach the Holy City … that lies beyond the technocracy … the New Jerusalem.”

That line is from The Making of a Counter Culture: Reflections on the Technocratic Society and Its Youthful Opposition, published three weeks after Woodstock, in the summer of 1969. Its author was Theodore Roszak, age 35, a Bay Area professor who thereby coined the word counterculture.

Roszak spends 270 pages glorying in the younger generation’s “brave” rejection of expertise and “all that our culture values as ‘reason’ and ‘reality.’ ” (Note the scare quotes.)

So-called experts, after all, are “on the payroll of the state and/or corporate structure.” A chapter called “The Myth of Objective Consciousness” argues that science is really just a state religion.

To create “a new culture in which the non-intellective capacities … become the arbiters of the good [and] the true,” he writes, “nothing less is required than the subversion of the scientific world view, with its entrenched commitment to an egocentric and cerebral mode of consciousness.” He welcomes the “radical rejection of science and technological values.”

Earlier that summer, a University of Chicago sociologist (and Catholic priest) named Andrew Greeley had alerted readers of The New York Times Magazine that beyond the familiar signifiers of youthful rebellion (long hair, sex, drugs, music, protests), the truly shocking change on campuses was the rise of anti-rationalism and a return of the sacred—“mysticism and magic,” the occult, séances, cults based on the book of Revelation.

When he’d chalked a statistical table on a classroom blackboard, one of his students had reacted with horror: “Mr. Greeley, I think you’re an empiricist.”

As 1969 turned to 1970, a 41-year-old Yale Law School professor was finishing his book about the new youth counterculture. Charles Reich was a former Supreme Court clerk now tenured at one of ultra-rationalism’s American headquarters.

But hanging with the young people had led him to a midlife epiphany and apostasy. In 1966, he had started teaching an undergraduate seminar called “The Individual in America,” for which he assigned fiction by Kesey and Norman Mailer. He decided to spend the next summer, the Summer of Love, in Berkeley. On the road back to New Haven, he had his Pauline conversion to the kids’ values.

His class at Yale became hugely popular; at its peak, 600 students were enrolled. In 1970, The Greening of America became The New York Times’ best-selling book (as well as a much-read 70-page New Yorker excerpt), and remained on the list for most of a year.

At 16, I bought and read one of the 2 million copies sold. Rereading it today and recalling how much I loved it was a stark reminder of the follies of youth. Reich was shamelessly, uncritically swooning for kids like me.  

The Greening of America may have been the mainstream’s single greatest act of pandering to the vanity and self-righteousness of the new youth. Its underlying theoretical scheme was simple and perfectly pitched to flatter young readers:

There are three types of American “consciousness,” each of which “makes up an individual’s perception of reality … his ‘head,’ his way of life.”  

Consciousness I people were old-fashioned, self-reliant individualists rendered obsolete by the new “Corporate State”—essentially, your grandparents.  

Consciousness IIs were the fearful and conformist organization men and women whose rationalism was a tyrannizing trap laid by the Corporate State—your parents.

And then there was Consciousness III, which had “made its first appearance among the youth of America,” “spreading rapidly among wider and wider segments of youth, and by degrees to older people.”

If you opposed the Vietnam War and dressed down and smoked pot, you were almost certainly a III. Simply by being young and casual and undisciplined, you were ushering in a new utopia.

Reich praises the “gaiety and humor” of the new Consciousness III wardrobe, but his book is absolutely humorless—because it’s a response to “this moment of utmost sterility, darkest night and most extreme peril.”

Conspiracism was flourishing, and Reich bought in. Now that “the Corporate State has added depersonalization and repression” to its other injustices, “it has threatened to destroy all meaning and suck all joy from life.” Reich’s magical thinking mainly concerned how the revolution would turn out.

“The American Corporate State,” having produced this new generation of longhaired hyperindividualists who insist on trusting their gut and finding their own truth, “is now accomplishing what no revolutionaries could accomplish by themselves. The machine has begun to destroy itself.” Once everyone wears Levi’s and gets high, the old ways “will simply be swept away in the flood.”

The inevitable/imminent happy-cataclysm part of the dream didn’t happen, of course. The machine did not destroy itself. But Reich was half-right. An epochal change in American thinking was under way and “not, as far as anybody knows, reversible … There is no returning to an earlier consciousness.” His wishful error was believing that once the tidal surge of new sensibility brought down the flood walls, the waters would flow in only one direction, carving out a peaceful, cooperative, groovy new continental utopia, hearts and minds changed like his, all of America Berkeleyized and Vermontified.

Instead, Consciousness III was just one early iteration of the anything-goes, post-reason, post-factual America enabled by the tsunami.

Reich’s faith was the converse of the Enlightenment rationalists’ hopeful fallacy 200 years earlier.

Granted complete freedom of thought, Thomas Jefferson and company assumed, most people would follow the path of reason.

Wasn’t it pretty to think so.

I remember when fantastical beliefs went fully mainstream, in the 1970s.

My irreligious mother bought and read The Secret Life of Plants, a big best seller arguing that plants were sentient and would “be the bridesmaids at a marriage of physics and metaphysics.” The amazing truth about plants, the book claimed, had been suppressed by the FDA and agribusiness.

My mom didn’t believe in the conspiracy, but she did start talking to her ficuses as if they were pets.

In a review, The New York Times registered the book as another data point in how “the incredible is losing its pariah status.”

Indeed, mainstream publishers and media organizations were falling over themselves to promote and sell fantasies as nonfiction.

In 1975 came a sensational autobiography by the young spoon bender and mind reader Uri Geller as well as Life After Life, by Raymond Moody, a philosophy Ph.D. who presented the anecdotes of several dozen people who’d nearly died as evidence of an afterlife.

The book sold many millions of copies; before long the International Association for Near-Death Studies formed and held its first conference, at Yale.

During the ’60s, large swaths of academia made a turn away from reason and rationalism as they’d been understood.

Many of the pioneers were thoughtful, their work fine antidotes to postwar complacency. The problem was the nature and extent of their influence at that particular time, when all premises and paradigms seemed up for grabs.

That is, they inspired half-baked and perverse followers in the academy, whose arguments filtered out into the world at large:

All approximations of truth, science as much as any fable or religion, are mere stories devised to serve people’s needs or interests. Reality itself is a purely social construction, a tableau of useful or wishful myths that members of a society or tribe have been persuaded to believe.

The borders between fiction and nonfiction are permeable, maybe nonexistent. The delusions of the insane, superstitions, and magical thinking? Any of those may be as legitimate as the supposed truths contrived by Western reason and science. The takeaway: Believe whatever you want, because pretty much everything is equally true and false.

These ideas percolated across multiple academic fields. In 1965, the French philosopher Michel Foucault’s Madness and Civilization was published in America, echoing Laing’s skepticism of the concept of mental illness; by the 1970s, he was arguing that rationality itself is a coercive “regime of truth”—oppression by other means. Foucault’s suspicion of reason became deeply and widely embedded in American academia.

When I first read that, at age 18, I loved the quotation marks. If reality is simply the result of rules written by the powers that be, then isn’t everyone able—no, isn’t everyone obliged—to construct their own reality? The book was timed perfectly to become a foundational text in academia and beyond.

A more extreme academic evangelist for the idea of all truths being equal was a UC Berkeley philosophy professor named Paul Feyerabend. His best-known book, published in 1975, was Against Method: Outline of an Anarchistic Theory of Knowledge.

“Rationalism,” it declared, “is a secularized form of the belief in the power of the word of God,” and science a “particular superstition.”

In a later edition of the book, published when creationists were passing laws to teach Genesis in public-school biology classes, Feyerabend came out in favor of the practice, comparing creationists to Galileo. Science, he insisted, is just another form of belief.

“Only one principle,” he wrote, “can be defended under all circumstances and in all stages of human development. It is the principle: anything goes.”

Over in anthropology, where the exotic magical beliefs of traditional cultures were a main subject, the new paradigm took over completely—don’t judge, don’t disbelieve, don’t point your professorial finger. This was understandable, given the times: colonialism ending, genocide of American Indians confessed, U.S. wars in the developing world.

Who were we to roll our eyes or deny what these people believed? In the ’60s, anthropology decided that oracles, diviners, incantations, and magical objects should be not just respected, but considered equivalent to reason and science.

If all understandings of reality are socially constructed, those of Kalabari tribesmen in Nigeria are no more arbitrary or faith-based than those of college professors.

In 1968, a UC Davis psychologist named Charles Tart conducted an experiment in which, he wrote, “a young woman who frequently had spontaneous out-of-body experiences”—didn’t “claim to have” them but “had” them—spent four nights sleeping in a lab, hooked up to an EEG machine.

Her assigned task was to send her mind or soul out of her body while she was asleep and read a five-digit number Tart had written on a piece of paper placed on a shelf above the bed. He reported that she succeeded.

Other scientists considered the experiments and the results bogus, but Tart proceeded to devote his academic career to proving that attempts at objectivity are a sham and magic is real. In an extraordinary paper published in 1972 in Science, he complained about the scientific establishment’s “almost total rejection of the knowledge gained” while high or tripping.

He didn’t just want science to take seriously “experiences of ecstasy, mystical union, other ‘dimensions,’ rapture, beauty, space-and-time transcendence.” He was explicitly dedicated to going there. A “perfectly scientific theory may be based on data that have no physical existence,” he insisted.

The rules of the scientific method had to be revised. To work as a psychologist in the new era, Tart argued, a researcher should be in the altered state of consciousness he’s studying, high or delusional “at the time of data collection” or during “data reduction and theorizing.”

Tart’s new mode of research, he admitted, posed problems of “consensual validation,” given that “only observers in the same [altered state] are able to communicate adequately with one another.”

Tart popularized the term consensus reality for what you or I would simply call reality, and around 1970 that became a permanent interdisciplinary term of art in academia. Later he abandoned the pretense of neutrality and started calling it the consensus trance—people committed to reason and rationality were the deluded dupes, not he and his tribe.

Even the social critic Paul Goodman, beloved by young leftists in the ’60s, was flabbergasted by his own students by 1969. “There was no knowledge,” he wrote, “only the sociology of knowledge. They had so well learned that … research is subsidized and conducted for the benefit of the ruling class that they did not believe there was such a thing as simple truth.”

Ever since, the American right has insistently decried the spread of relativism, the idea that nothing is any more correct or true than anything else.

Conservatives hated how relativism undercut various venerable and comfortable ruling ideas—certain notions of entitlement (according to race and gender) and aesthetic beauty and metaphysical and moral certainty. 
 
Yet once the intellectual mainstream thoroughly accepted that there are many equally valid realities and truths, once the idea of gates and gatekeeping was discredited not just on campuses but throughout the culture, all American barbarians could have their claims taken seriously.

Conservatives are correct that the anything-goes relativism of college campuses wasn’t sequestered there, but when it flowed out across America it helped enable extreme Christianities and lunacies on the right—gun-rights hysteria, black-helicopter conspiracism, climate-change denial, and more.

The term useful idiot was originally deployed to accuse liberals of serving the interests of true believers further on the left. In this instance, however, postmodern intellectuals—post-positivists, poststructuralists, social constructivists, post-empiricists, epistemic relativists, cognitive relativists, descriptive relativists—turned out to be useful idiots most consequentially for the American right.

“Reality has a well-known liberal bias,” Stephen Colbert once said, in character, mocking the beliefs-trump-facts impulse of today’s right. Neither side has noticed, but large factions of the elite left and the populist right have been on the same team.

[IB Publisher's note: If you've read to here, you are only about 40% of the way through Kurt Andersen's essay. To read his take on the '70s, the '80s, and beyond, go to (https://www.theatlantic.com/magazine/archive/2017/09/how-america-lost-its-mind/534231/) and search for:

"Conspiracy and Paranoia in the 1970s"

Enjoy!]

.

The Big Contraction

SUBHEAD: An interview with James Howard Kunstler on the nature of our unraveling future.

Interview by Erico Tavares on 30 March 2017 for LinkedIn -
(https://www.linkedin.com/pulse/big-contraction-interview-james-howard-kunstler-erico-matias-tavares)


Image above: A family "camping out" with their 1958 Chevy Brookwood station wagon. Back when the car was new they were not "homeless", they were just "roughing it". From (http://www.ultraswank.net/advertising/classic-hand-drawn-car-ads-from-the-us/).

E Tavares: Thank you for being with us today. You have been writing about worsening societal issues, what you call “entropy in action”, for many years. Broadly speaking, why do you think the US is in so much trouble?

JH Kunstler: We’ve been sowing the seeds for our predicament since the end of World War II. You might even call this process “The Victory Disease.” In practical terms it represents sets of poor decisions with accelerating bad consequences.

For instance, the collective decision to suburbanize the nation. This was not a conspiracy. It was consistent with my new theory of history, which is: things happen because they seem like a good idea at the time.

In 1952 we had plenty of oil and the ability to make a lot of cars, which were fun, fun, fun! And we turned our war production expertise into the mass production of single family houses built on cheap land outside the cities. But the result now is that we’re stuck in a living arrangement with no future, the greatest misallocation of resources in the history of the world.

Another bad choice was to offshore most of our industry. Seemed like a good idea at the time; now you have a citizenry broadly impoverished, immiserated, and politically inflamed.

Of course, one must also consider the possibility that industrial society was a historic interlude with a beginning, middle, and end, and that we are closer to the end of the story than the middle.

It was, after all, a pure product of the fossil fuel bonanza, which is also coming to an end (with no plausible replacement in view). I don’t view all this as the end of the world, or of civilization, per se, but we’re certainly in for a big re-set of the terms for remaining civilized.

I’ve tried to outline where this is all going in my four-book series of the “World Made By Hand” novels, set in the near future. If we’re lucky, we can fall back to sets of less complex social and economic arrangements, but it’s unclear whether we will land back in something like the mid-nineteenth century, or go full-bore medieval, or worse. One thing we can be sure of: the situation we face is one of comprehensive discontinuity — a lot of things just stop, beginning with financial arrangements and long-distance supply lines of resources and finished goods.

Then it depends whether we can respond by reorganizing life locally in this nation at a finer scale — if it even remains a unified nation. Anyway, implicit in this kind of discontinuity is the possibility for disorder. We don’t know how that will go, and how we come through it depends on the degree of disorder.

ET: Fair points, but one remarkable feature of Western civilization has been its resilience. In less than a decade the US has been able not only to reverse the historical decline in domestic crude oil production but also come up with natural gas as an expansive new source of energy. It now exports both of these commodities. Ditto for food production, where it can afford the luxury of using 40% of its corn production as car fuel. Doesn’t all this contradict what you had postulated in “The Long Emergency” back in 2005?

JHK: We flatter ourselves a bit to harp on our “resilience.” More realistically, history is an emergent process and societies are emergent phenomena which necessarily respond to the circumstances that reality presents. Sh*t happens and sh*t unhappens and then re-happens differently. The oil situation is grossly misunderstood by the public, including you, as implicit in the question you have just put to me.

We are not exporting any meaningful quantities of oil or natural gas. In fact we’re still importing nearly 8 million barrels of oil a day.

The shale oil “miracle” has largely been a manifestation of low-interest lending into an industry that can’t pay back its loans, even as it produces like mad at a loss. You can look at it as a simple equation: oil over $75 a barrel crushes economies and oil under $75 a barrel crushes oil companies. To date, American oil companies have not made a red cent off the shale oil “miracle.”

It seemed like a good idea at the time, and it kept a lot of people busy for a while, but it was essentially a stunt that is not paying for itself and it has a short horizon. The public only sees lower gasoline prices at the pump; they have no idea how low prices are wrecking the oil industry. The result of all this will be an incrementally smaller global oil industry and fewer customers for its products — without anything to replace it.

The crux of the matter is the falling Energy Return on Investment (EROI). In the 1950s you got 100 barrels of Texas crude for every equivalent barrel of energy you sank into the project. That’s 100 to 1. Shale oil gives you about 5 to 1.

Tar sands are a little worse. The worldwide average EROI these days is 17 to 1 (including Arabian oil, deep water, etc.). We can’t run all the systems of our “advanced” society at those ratios, and that is why we have been running up the debt so dramatically — borrowing from the future to cover the cost of living as we do.

And that is exactly why we are heading into a financial clusterfuck as it becomes increasingly evident that the debt will never be paid back. This will wreck the banking system, and that will force everything else to change, including the dynamic of how we produce and distribute food. So, no, none of what I am saying here contradicts my 2005 book, “The Long Emergency”, though it has played out with some strange twists in the story.
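[IB Publisher's note: To make the EROI arithmetic concrete, here is a minimal sketch in Python. Only the 100:1, 17:1, and 5:1 ratios come from the interview; the net-energy framing and all names in the code are our illustrative additions, not Kunstler's.]

```python
# A sketch of the EROI arithmetic (illustrative; only the ratios are Kunstler's).
def net_energy_fraction(eroi: float) -> float:
    """Share of gross energy output left after paying the energy cost of extraction.
    EROI = energy returned / energy invested, so extraction consumes 1/EROI of output."""
    return 1.0 - 1.0 / eroi

# Ratios quoted in the interview.
sources = {"1950s Texas crude": 100.0, "world average today": 17.0, "shale oil": 5.0}

for name, ratio in sources.items():
    print(f"{name}: EROI {ratio:.0f}:1 -> {net_energy_fraction(ratio):.0%} net energy")

# Output:
#   1950s Texas crude: EROI 100:1 -> 99% net energy
#   world average today: EROI 17:1 -> 94% net energy
#   shale oil: EROI 5:1 -> 80% net energy
# The decline is nonlinear: each step down in EROI eats a rapidly growing share
# of the surplus energy that the rest of the economy runs on.
```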


Image above: A 1958 Chevy Brookwood station wagon as it appears today. With that big cargo bay and isolated location it could easily be used by homeless persons roughing it. From (http://usedfromusa.com/chevrolet-ads/145244-1958-chevrolet-brookwood-station-wagon.html).

ET: Another theme you talked about in that book is that in order to cope with looming energy and food crises Americans would have to eventually live in smaller-scale, localized and semi-agrarian communities. All part of a process you call the Big Contraction. 

However, a McKinsey Global Institute research paper from 2012 predicts quite the opposite for most of the world in the coming years. Indeed, growing urbanization has been one of the major trends so far in the 21st century. What do they see here that you don’t?

JHK: McKinsey’s prescience may be on a level with the CIA’s, which failed to foresee the 1991 collapse of the Soviet Union. Everybody and his mother is predicting that our cities will only get bigger and bigger. I will impudently state that they are all mistaken.

Our cities have attained a scale which cannot be sustained, given the capital and resource scarcities we face immediately ahead. This is what they don’t see: the fragility of the fossil fuel supply system (and everything that depends on it) and its relation to money and capital formation.

McKinsey and its compadres are dumb extrapolationists — they look at what’s been going on and they say we’ll only get more of it in a bigger package. These people are the “intellectual-yet-idiots” that Nassim Nicholas Taleb identifies so shrewdly.

For one thing, the successful places in the years ahead will be those with a meaningful relationship to food production. I believe the action in the US will shift to the now-derelict small towns and small cities, especially places around the extensive inland waterway system and the Great Lakes.

The giant metroplexes, so-called, will contract, probably in a messy way that includes great losses of notional real estate value and battles between various ethnic groups as to who gets to inhabit the districts with remaining value (e.g. close to the waterfront).

This contraction has already occurred in many cities of the heartland — Detroit, St. Louis, Milwaukee, Cleveland, etc. In contrast, booming New York, Boston, San Francisco and Dallas are purely products of the financialization of economy, and disorder in the banking system will hit them very hard. The suburbs around these places are next to go. Their destiny is either slums, salvage, or ruins.

ET: One aspect that we find fascinatingly provocative in your work is your description of modern urban landscapes, and how instead of being welcoming social spaces they now cause anxiety, even repulsion. What have modern architects missed in relation to their predecessors? Is that in any way related to the cultural revolution of the 1960s, which profoundly impacted much of the Western world?

JHK: The architects are a dysfunctional subculture in themselves. Suffice it to say they are handmaidens of the corporate racketeers and victims of a particularly virulent form of techno-narcissism that infects our culture of wishful thinking and solipsism.

But the condition of the landscape is a product of much more than architects. The suburban project comes to us courtesy of banking, the automobile and trucking interests, national chain retail, municipal planning officials (who know nothing of urban design), traffic engineers, and many other ultra-specialists who populate this matrix of racketeering.

They have produced an everyday environment that is positively punishing to human neurology. It makes people sad, lonely, confused, angry, anxious, and despondent. They didn’t do it on purpose. It was just the blowback from their methods, customs, and practices. The zoning ordinances crafted and refined over a hundred years now mandate a suburban sprawl outcome in most American places.

Look, life is tragic. As I began to say in this interview, societies can make some pretty poor choices. Our choice to live in a drive-in utopia was a terrible blunder and now we’re stuck with the consequences. Notice that the outcome on the European landscape is still rather different. They will have plenty of problems in The Long Emergency, but at least they did not destroy their old city centres, and when the time for contraction comes, as it will, they have something of great value to contract back into.

ET: You talk about a “population overshoot” relating to demographic explosions in Africa and the Middle East that you claim cannot be sustained by the existing resources of those regions. Why do you say that?

JHK: Much of this region is desert wasteland. The populations of the “nations” in it (many boundaries drawn arbitrarily by the victors of World War One) have exploded numerically. The region can’t feed or water itself, nor employ its exploded population. Its growth is purely a product of fossil fuel pseudo-prosperity; it has gone on this way for less than a century, and soon it will be over.

For the moment these populations (especially the young men) are exploding in political violence. Categorically, “normal” life will not continue as it has. We’re already seeing the gross disintegration of whole societies. It will accelerate.


James Howard Kunstler's thinking gained prominence after the publication of his book The Geography of Nowhere (1994), a history of American suburbia and urban development “because [he] believe[s] a lot of people share [his] feelings about the tragic landscape of highway strips, parking lots, housing tracts, mega-malls, junked cities and ravaged countryside that makes up the everyday environment where most Americans live and work.” This was followed by The Long Emergency (2005) and most recently Too Much Magic (2012), both non-fiction books. Starting with World Made by Hand in 2008, he has also written a series of novels set in a post-oil American future.
.

Nuclear Industry Bankruptcy

SUBHEAD: Westinghouse Chapter 11 filing a defining moment in the retreat of the nuclear power industry.

By Nika Knight on 29 March 2017 for Common Dreams -
(http://www.commondreams.org/news/2017/03/29/nuclear-power-suffers-major-blow-westinghouse-bankruptcy)


Image above: A Westinghouse nuclear plant under construction near Waynesboro, Georgia. IB Publisher's note: This nuclear power plant faces a bleak future if completed. Recent history demonstrates that this part of Georgia is experiencing "weather whiplash", including extreme drought. This is not a good prognosis for the plant's long-range ability to cool nuclear fuel. From original article.

Major nuclear power company Westinghouse, a U.S. subsidiary of Japan's Toshiba, filed for Chapter 11 bankruptcy on Wednesday in a massive blow to the industry.

The filing marked "a defining moment in the decades-long downward spiral of the global nuclear power industry," wrote Greenpeace Japan in a statement.

"Toshiba/Westinghouse is responsible for building more nuclear reactors worldwide than any other entity," the group observed. "With the financial meltdown of Westinghouse, Toshiba also recently announced its plans to withdraw from foreign construction projects—a move that has far-reaching implications outside Japan and the U.S., such as the construction of three reactors in the U.K. at Moorside."

"We have all but completely pulled out of the nuclear business overseas," Toshiba president Satoshi Tsunakawa said at a news conference, according to the New York Times.

The Times further reports:
The filing comes as the company's corporate parent, Toshiba of Japan, scrambles to stanch huge losses stemming from Westinghouse's troubled nuclear construction projects in the American South. Now, the future of those projects, which once seemed to be on the leading edge of a renaissance for nuclear energy, is in doubt.

"This is a fairly big and consequential deal," said Richard Nephew, a senior research scholar at the Center on Global Energy Policy at Columbia University. "You've had some power companies and big utilities run into financial trouble, but this kind of thing hasn't happened."
"Toshiba/Westinghouse find themselves a victim of their own hubris and a nuclear industry where financial prudence was never a strong point," Greenpeace Germany added in a brief (pdf).

The bankruptcy underscores the global meltdown of the nuclear power industry, argued Greenpeace Japan energy campaigner Ai Kashiwagi.

 "If we look at how nuclear stacks up against renewables, it's clearly in freefall," Kashiwagi said. "An estimated 147 gigawatts  of renewable power was added in 2015, compared to just 11 gigawatts for nuclear power in the same year."

"For too long the nuclear industry has locked away huge amounts of capital at the expense of developing increasingly affordable renewable energy and updating energy grids," Kashiwagi added.

"The future of energy in Japan and globally will be renewables and it's time governments get on board."

See also:
Ea O Ka Aina: Fukushima worse than ever 2/5/17

.

Retail zombies haunt the malls

SUBHEAD: The long list of retailers not accepting bankruptcy is holding the rest of the industry back.

By Miriam Gottfried on 15 February 2017 for Wall Street Journal -
(https://www.wsj.com/articles/retail-zombies-haunt-industry-1487152981)


Image above: Teen apparel retailer Wet Seal is closing all of its stores. The retail sector would be better off if some others did the same. Photo by Justin Sullivan. From original article.

Let them die.

Investors are normally a ruthless bunch, but some are keeping alive a range of battered retailers, making things worse for the already struggling industry.

More retailers are teetering on the edge of bankruptcy than at any point since the recession. Moody’s rates the debt of 19 retailers, or 13.5% of the retailers it covers, as “speculative, of poor standing and subject to very high credit risk” or worse.

That is up from only 5.6% of the ratings agency’s retail portfolio at the end of 2011 and compares with 16% in 2009 in the middle of the financial crisis.
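[IB Publisher's note: As a back-of-envelope check on those figures, 19 distressed names at 13.5% of coverage implies Moody's rates roughly 141 retailers in all. A minimal sketch of that arithmetic, which is our addition, not the Journal's:]

```python
# Back-of-envelope check on the Moody's figures quoted above
# (our arithmetic, not the Journal's).
distressed_count = 19
distressed_share = 0.135            # 13.5% of the retailers Moody's rates

portfolio_size = distressed_count / distressed_share
print(f"Implied rated retail portfolio: ~{portfolio_size:.0f} companies")  # ~141
```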

As dismal as things are among stores fighting e-commerce competition and endless price pressure, not that many have gone bankrupt. American Apparel, Limited Stores, Wet Seal and Sports Authority have so far been the exception rather than the rule.

The roster of the living dead is mostly made up of household names, including the publicly held companies Sears Holdings, Fairway Group Holdings and Bon-Ton Stores, and the private-equity-owned David’s Bridal, TOMS Shoes, True Religion Apparel, Nine West Holdings, Payless ShoeSource, Gymboree, Claire’s Stores and J. Crew, along with its parent company, Chinos Intermediate Holdings.

The future doesn’t look any brighter. A Republican proposal to tax imports by making them nondeductible expenses while exempting exports could further burden these companies.

To buy themselves time, some of the companies have done distressed-debt exchanges, in which bondholders agree to take a haircut, and other more creative arrangements. In September, Claire’s said it swapped $574 million of debt for new term loans that mature in 2021.

And in December, J. Crew moved $250 million worth of intellectual property to a Cayman Islands subsidiary with the aim of borrowing against the assets and using the proceeds to buy back some of its debt. Gymboree, which has a $769 million secured term loan due February 2018, could end up looking for a similar out.
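[IB Publisher's note: A minimal sketch of the haircut arithmetic in such a distressed-debt exchange. Only the $574 million face value comes from the article; the value assumed for the new term loans below is purely hypothetical.]

```python
# Hypothetical illustration of a distressed-debt exchange haircut.
# Only the $574M face value of the tendered Claire's bonds is from the article;
# the value of the new term loans is an assumed figure for demonstration.
old_face_value = 574_000_000   # debt handed in (from the article)
new_loan_value = 400_000_000   # assumed value of the new 2021 term loans

haircut = 1 - new_loan_value / old_face_value
print(f"Implied bondholder haircut: {haircut:.1%}")  # 30.3% under these assumptions
```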

But investors may just be prolonging the inevitable. “What is the end game?” asks Moody’s retail analyst Charles O’Shea.

The problem is investors don’t want to believe the end is near. Instead, bondholders are clinging to the idea that at least part of the dire situation is temporary—the result of bad weather, a dip in tourism or fluctuations in oil and gas prices—as opposed to a secular decline.

By allowing the most troubled retailers to live on, investors are contributing to the glut of bricks-and-mortar stores that has been weighing on retail margins, leading to store closures even at healthier retailers such as Macy’s.

A rise in bankruptcies wouldn’t be painless for the survivors. There would be inventory liquidations and even more vacancies at malls. Still, stronger retailers’ best hope for survival may be putting the zombie retailers to rest.

.