
Covid-19 Best Case Scenario

SUBHEAD: Imagining the aftermath of the pandemic and monetary collapse as a time of rebirth.

By Atossa Araxia Abrahamian on 20 March 2020 for The Nation -
(https://www.thenation.com/article/world/coronavirus-future-fiction/)

   
Image above: A view of optimism from an Islamic perspective. From (https://aboutislam.net/spirituality/be-optimistic-about-the-future-trust-in-allahs-plan/).

The virus made itself known in the Chinese city of Wuhan in December in the form of a respiratory illness not unlike pneumonia.

At first, no one knew quite where the disease had come from, but it seemed to touch workers at a wet market where exotic live animals were sold.

Before long, a 61-year-old man with preexisting health conditions died. He’d been a regular at the market, so they blamed the bats, then the pangolins, then the shoppers who procured these delicacies, and finally, just China.

Within weeks, the region was on lockdown, and flights were canceled.

But it was too late for containment. The virus had taken up residence in lungs and on fingertips, clothing and cardboard. Deterred only by soap and water, it traveled far and wide: to South Korea and Thailand, to Seattle and London.

One case was detected in the prophetically named French ski town of Contamines; a large outbreak occurred in Milan before spreading thick and fast throughout Italy. Hospital wards filled.

People panic-shopped for hand sanitizer and, bafflingly, toilet paper. On January 30, the World Health Organization declared the virus a public health emergency; six weeks later, it deemed the crisis a full-blown pandemic.

As winter gave way to spring, the virus crept into schools, cafés, subway cars, and nursing homes. Universities closed dorms and moved to conduct classes online. Remote work protocols were adopted. Service work dried up, dealing cab drivers and waiters and aestheticians an economic blow.

Children were told to stay home from school; parents were not told what to do with their children. But we are social creatures, unfit for long periods of solitude. When large gatherings were shut down, phone calls went unscreened and were even answered. People checked in. People cared.

As the plague spread, the human cost was staggering. Tens of thousands died. Millions more were sickened. It hit the elderly the hardest, as well as those with underlying conditions. The funeral industry boomed, as did the appeal of apocalyptic cults and slickly branded start-up religions.

Fortunately, children were mostly spared, and communities came together to make sure they caught up on their schoolwork in the absence of classrooms, courses, and teachers.

Facing anger, outrage, and grief from their citizens, governments realized that those who could not do their jobs remotely—not to mention those whose work had dried up—would be destitute if they did not receive significant aid.

So that’s what workers received: help, in the form of cash, food, and services. Means-testing went out the window. Work requirements were a joke. Debt payments and water bills and evictions were suspended, then canceled altogether.

Central banks enacted radical measures to stimulate the economy. There were no interest rates left to cut, so lending turned into giving.

No one asked where all the money was coming from, because everyone understood that this was where it had always come from. Some states actually ended up saving money: the happy result of all wars’ being put on hold thanks to a unanimous resolution in the UN Security Council.

Iran reached a détente with Israel after medical researchers banded together to develop a treatment that saved the lives of millions, including former prime minister Benjamin Netanyahu. The treatment prevented him from infecting his cellmates in his supermax prison; he ended up succumbing to a stroke.

All but a tiny number of inmates in the United States were released. State funerals for politicians who said they could pray their way out of becoming sick were broadcast online, but attended by no one.

Military contractors started churning out medical supplies; soldiers mobilized to build homes and hospitals; unemployed workers pledged to build small-scale local green infrastructure. Austerity became a distant nightmare of the past.

With the airline industry in shambles and industrial activity at a virtual standstill, carbon emissions dropped dramatically. Demand for oil dried up, too.

Endangered species, unaffected by the virus, began to proliferate. Bats were studied and revered for their immunity to this virus, and many others. Pangolins were never seen at the dinner table again.

Because of stringent precautionary measures and warmer temperatures, the virus did not hit African states as hard as Western ones—a small mercy that nonetheless pushed countries there to establish a continental health system, with the help of the World Health Organization and an interest-free grant from the International Monetary Fund, which changed its mission statement entirely.

Instead of lending to economically ailing nations, it would pool funding and make debt-free development grants, reasoning it was the only way to avoid a market crash.

Refugees living in camps—in South America, Lebanon, Greece, and beyond—were rehoused in decent accommodations to cut back on the risk of spreading the infection. They helped with relief efforts, which earned them the admiration of locals and helped them integrate into their new homes.

Under the crushing weight of an overburdened health care system, countries began recognizing each other’s medical licenses, easing visa restrictions so that doctors and nurses from less affected regions could emigrate, and offering high-quality health care to everyone, no questions asked.

People necessarily crossed fewer borders, but when they did, they were greeted with open arms.

The TSA stopped banning liquids on flights, beginning with 12-ounce containers of hand sanitizer. Scientists worked around the clock to develop vaccines; philanthropists poured money into the initiative, even though they would no longer receive tax breaks for their efforts.

As their daily lives were upended, reorganized, and reimagined by the demands of the pandemic and the community, workers around the world adjusted to their new rhythms.

In China, where the crisis began, months of lockdown gave way to blue skies and clean breezes. The smog had cleared—a result of massive factory shutdowns. The sun shone brighter. It was easier to breathe.

Young people wondered, Why couldn’t the air be so clean every day? Why did they have to choose?

Farmers even found their livestock thriving, and their crops growing better—a consequence of cleaner soil and water, as well as regulation by health authorities to prevent immunocompromization and animal-borne infections.

As the months passed, office workers began to question the way they had been living before the virus. They missed human contact, but not their commutes.

They wanted to see their colleagues, though they were relieved to shed the artifice of the nine-to-five, the endless meetings, the pretending to be busy at all hours of the day, the sad desk lunches and minute-counting.

They worked when they needed to, and stopped when it was over. They spent more time with their families and made bad music and bad art.

See also:
World After Covid-19 Pandemic 3/20/20
Island Breath: Is Corona Virus a Bioweapon 2/20/20

Living in a Hopper

SUBHEAD: Edward Hopper's painting "Western Motel" has been built and occupied in a museum.

By David Pescovitz on 22 November 2019 for Boing Boing -
(https://boingboing.net/2019/11/22/sleeping-inside-one-of-edward.html)


Image above: Painting "Western Motel" by Edward Hopper 1957. From original article. Click to enlarge.

As part of the Virginia Museum of Fine Arts' "Edward Hopper and the American Hotel" exhibition, the curators have created a brilliant installation and visitor experience that's seemingly made for Instagram.

They built a physical version of Hopper's painting "Western Motel" (1957), shown above, and offered overnight stays inside the artwork. The overnight packages sold out very quickly. The New York Times' Margot Boyer-Dry was one of the first guests:

Every detail here was inspired by Edward Hopper’s 1957 painting “Western Motel,” which has been brought to vibrant, three-dimensional life. The only thing missing is the mysterious woman whose burgundy dress matches the bedspread. But that’s where the museum guest comes in.

I was the second person to stay in the museum’s Hopper hotel room, essentially becoming its subject for a night. (Before it sold out through February, the room cost anywhere from $150 a night to $500 for a package, including dinner, mini golf and a tour with the curator.)

My time there was short — a standard stay runs from 9 p.m. to 8 a.m. — and awkward. I had traveled all day to reach Richmond, and these pristinely basic quarters were the main event. Ultimately, it reminded me of every other hotel room I’ve ever stayed in...

Ellen Chapman, a Richmond resident who stayed the night before I did, was more focused on the novelty of an art overnight. “I’ve always had that childhood fantasy of spending the night in a museum,” she said. “The remarkable part for me was waking up, drinking my coffee and looking at this amazing exhibit right next to me.”

What's Better than Seeing a Hopper Painting?
(https://www.nytimes.com/2019/11/21/arts/design/edward-hopper-virginia-museum.html)

By Margot Boyer-Dry on 21 November 2019 for The New York Times


Image above: Museum visitor viewing the "Western Motel" installation that is rented out as a hotel room within the Virginia Museum of Fine Arts. From original article. Click to enlarge.

Behind a pane of glass at the Virginia Museum of Fine Arts, a wooden bed frame anchors a sparsely decorated motel room. Vintage suitcases have been arranged at the foot of the bed, and light streams in diagonally through a window, just beyond which a green Buick is visible, parked in the foreground of a mesa landscape.

It looks like the setting of a painting, and it is. Every detail here was inspired by Edward Hopper’s 1957 painting “Western Motel,” which has been brought to vibrant, three-dimensional life. The only thing missing is the mysterious woman whose burgundy dress matches the bedspread. But that’s where the museum guest comes in.

I was the second person to stay in the museum’s Hopper hotel room, essentially becoming its subject for a night. (Before it sold out through February, the room cost anywhere from $150 a night to $500 for a package, including dinner, mini golf and a tour with the curator.)

My time there was short — a standard stay runs from 9 p.m. to 8 a.m. — and awkward. I had traveled all day to reach Richmond, and these pristinely basic quarters were the main event. Ultimately, it reminded me of every other hotel room I’ve ever stayed in.

The “Hopper Hotel Experience” is the flashy centerpiece of “Edward Hopper and the American Hotel,” an exhibition featuring about 60 of the artist’s hospitality-themed works, including paintings, sketches and early-career cover illustrations for the trade magazine Hotel Management.

Also on view are 35 works by other American artists exploring travel in America across time and medium, from Robert Salmon’s 1830 painting “Dismal Swamp Canal” to a 2009 photograph by Susan Worsham titled “Marine, Hotel Near Airport, Richmond, VA.”

Leo G. Mazow, the show’s curator, said he intends the Hopper room to do more than just generate buzz. “So many people say, ‘Well, Hopper’s about alienation.’” But for Mr. Mazow, Hopper’s themes of “transience and transportation yield a particular type of detachment,” which the hotel experience explores.

Hopper’s painting career coincided with the period when automobile production and expanding highway infrastructure made travel possible for a broader range of Americans.

A lifelong New Yorker, Hopper took several extended road trips with his wife, Jo, during which he painted common elements of American life: hotels, motels and guesthouses; lighthouses; restaurants; city streets and interiors. His quietly dramatic depictions of those spaces and the people in them came to define an American aesthetic.


Image above: Ellen Chapman, a resident of Richmond, Va., inside the Hopper room at the museum. She said her stay fulfilled a childhood fantasy. From original article. Click to enlarge.

How America Lost Its Mind

SUBHEAD: Our post-truth moment is the sum of mind-sets that have always made America "exceptional".

By Kurt Andersen on 24 August 2017 for The Atlantic -
(https://www.theatlantic.com/magazine/archive/2017/09/how-america-lost-its-mind/534231/)


Image above: Illustration of American fantasies over the decades by R. Kikuo Johnson. From original article.

“You are entitled to your own opinion,
but you are not entitled to your own facts.”
— Daniel Patrick Moynihan

“We risk being the first people in history to have been
able to make their illusions so vivid, so persuasive,
so ‘realistic’ that they can live in them.”
— Daniel J. Boorstin, The Image

When did America become untethered from reality? I first noticed our national lurch toward fantasy in 2004, after President George W. Bush’s political mastermind, Karl Rove, came up with the remarkable phrase reality-based community.

People in “the reality-based community,” he told a reporter, “believe that solutions emerge from your judicious study of discernible reality … That’s not the way the world really works anymore.”

A year later, The Colbert Report went on the air. In the first few minutes of the first episode, Stephen Colbert, playing his right-wing-populist commentator character, performed a feature called “The Word.” His first selection: truthiness.

“Now, I’m sure some of the ‘word police,’ the ‘wordinistas’ over at Webster’s, are gonna say, ‘Hey, that’s not a word!’ Well, anybody who knows me knows that I’m no fan of dictionaries or reference books. They’re elitist.

Constantly telling us what is or isn’t true. Or what did or didn’t happen. Who’s Britannica to tell me the Panama Canal was finished in 1914?

If I wanna say it happened in 1941, that’s my right. I don’t trust books—they’re all fact, no heart … Face it, folks, we are a divided nation … divided between those who think with their head and those who know with their heart … Because that’s where the truth comes from, ladies and gentlemen—the gut.”

Whoa, yes, I thought: exactly. America had changed since I was young, when truthiness and reality-based community wouldn’t have made any sense as jokes. For all the fun, and all the many salutary effects of the 1960s—the main decade of my childhood—I saw that those years had also been the big-bang moment for truthiness.

And if the ’60s amounted to a national nervous breakdown, we are probably mistaken to consider ourselves over it.

Each of us is on a spectrum somewhere between the poles of rational and irrational. We all have hunches we can’t prove and superstitions that make no sense. Some of my best friends are very religious, and others believe in dubious conspiracy theories.

What’s problematic is going overboard—letting the subjective entirely override the objective; thinking and acting as if opinions and feelings are just as true as facts.

The American experiment, the original embodiment of the great Enlightenment idea of intellectual freedom, whereby every individual is welcome to believe anything she wishes, has metastasized out of control.

From the start, our ultra-individualism was attached to epic dreams, sometimes epic fantasies—every American one of God’s chosen people building a custom-made utopia, all of us free to reinvent ourselves by imagination and will.

In America nowadays, those more exciting parts of the Enlightenment idea have swamped the sober, rational, empirical parts.

Little by little for centuries, then more and more and faster and faster during the past half century, we Americans have given ourselves over to all kinds of magical thinking, anything-goes relativism, and belief in fanciful explanation—small and large fantasies that console or thrill or terrify us.

And most of us haven’t realized how far-reaching our strange new normal has become.

Much more than the other billion or so people in the developed world, we Americans believe—really believe—in the supernatural and the miraculous, in Satan on Earth, in reports of recent trips to and from heaven, and in a story of life’s instantaneous creation several thousand years ago.

We believe that the government and its co-conspirators are hiding all sorts of monstrous and shocking truths from us, concerning assassinations, extraterrestrials, the genesis of AIDS, the 9/11 attacks, the dangers of vaccines, and so much more.

And this was all true before we became familiar with the terms post-factual and post-truth, before we elected a president with an astoundingly open mind about conspiracy theories, what’s true and what’s false, the nature of reality.

We have passed through the looking glass and down the rabbit hole. America has mutated into Fantasyland.

How widespread is this promiscuous devotion to the untrue? How many Americans now inhabit alternate realities? Any given survey of beliefs is only a sketch of what people in general really think.

But reams of survey research from the past 20 years reveal a rough, useful census of American credulity and delusion. By my reckoning, the solidly reality-based are a minority, maybe a third of us but almost certainly fewer than half.

Only a third of us, for instance, don’t believe that the tale of creation in Genesis is the word of God. Only a third strongly disbelieve in telepathy and ghosts. Two-thirds of Americans believe that “angels and demons are active in the world.”

More than half say they’re absolutely certain heaven exists, and just as many are sure of the existence of a personal God—not a vague force or universal spirit or higher power, but some guy. A third of us believe not only that global warming is no big deal but that it’s a hoax perpetrated by scientists, the government, and journalists.

A third believe that our earliest ancestors were humans just like us; that the government has, in league with the pharmaceutical industry, hidden evidence of natural cancer cures; that extraterrestrials have visited or are visiting Earth.

Almost a quarter believe that vaccines cause autism, and that Donald Trump won the popular vote in 2016.

A quarter believe that our previous president maybe or definitely was (or is?) the anti-Christ.

According to a survey by Public Policy Polling, 15 percent believe that the “media or the government adds secret mind-controlling technology to television broadcast signals,” and another 15 percent think that’s possible. A quarter of Americans believe in witches.

 Remarkably, the same fraction, or maybe less, believes that the Bible consists mainly of legends and fables—the same proportion that believes U.S. officials were complicit in the 9/11 attacks.

When I say that a third believe X and a quarter believe Y, it’s important to understand that those are different thirds and quarters of the population.

Of course, various fantasy constituencies overlap and feed one another—for instance, belief in extraterrestrial visitation and abduction can lead to belief in vast government cover-ups, which can lead to belief in still more wide-ranging plots and cabals, which can jibe with a belief in an impending Armageddon.

Why are we like this?

The short answer is because we’re Americans—because being American means we can believe anything we want; that our beliefs are equal or superior to anyone else’s, experts be damned.

Once people commit to that approach, the world turns inside out, and no cause-and-effect connection is fixed. The credible becomes incredible and the incredible credible.

The word mainstream has recently become a pejorative, shorthand for bias, lies, oppression by the elites. Yet the institutions and forces that once kept us from indulging the flagrantly untrue or absurd—media, academia, government, corporate America, professional associations, respectable opinion in the aggregate—have enabled and encouraged every species of fantasy over the past few decades.

 A senior physician at one of America’s most prestigious university hospitals promotes “miracle cures” on his daily TV show. Cable channels air documentaries treating mermaids, monsters, ghosts, and angels as real.

When a political-science professor attacks the idea “that there is some ‘public’ that shares a notion of reality, a concept of reason, and a set of criteria by which claims to reason and rationality are judged,” colleagues just nod and grant tenure.

The old fringes have been folded into the new center. The irrational has become respectable and often unstoppable.

Our whole social environment and each of its overlapping parts—cultural, religious, political, intellectual, psychological—have become conducive to spectacular fallacy and truthiness and make-believe. There are many slippery slopes, leading in various directions to other exciting nonsense.

During the past several decades, those naturally slippery slopes have been turned into a colossal and permanent complex of interconnected, crisscrossing bobsled tracks, which Donald Trump slid down right into the White House.

American moxie has always come in two types. We have our wilder, faster, looser side: We’re overexcited gamblers with a weakness for stories too good to be true.

But we also have the virtues embodied by the Puritans and their secular descendants: steadiness, hard work, frugality, sobriety, and common sense.

A propensity to dream impossible dreams is like other powerful tendencies—okay when kept in check. For most of our history, the impulses existed in a rough balance, a dynamic equilibrium between fantasy and reality, mania and moderation, credulity and skepticism.

The great unbalancing and descent into full Fantasyland was the product of two momentous changes. The first was a profound shift in thinking that swelled up in the ’60s; since then, Americans have had a new rule written into their mental operating systems: Do your own thing, find your own reality, it’s all relative.

The second change was the onset of the new era of information. Digital technology empowers real-seeming fictions of the ideological and religious and scientific kinds. Among the web’s 1 billion sites, believers in anything and everything can find thousands of fellow fantasists, with collages of facts and “facts” to support them.

Before the internet, crackpots were mostly isolated, and surely had a harder time remaining convinced of their alternate realities. Now their devoutly believed opinions are all over the airwaves and the web, just like actual news. Now all of the fantasies look real.

Today, each of us is freer than ever to custom-make reality, to believe whatever and pretend to be whoever we wish. Which makes all the lines between actual and fictional blur and disappear more easily.

Truth in general becomes flexible, personal, subjective. And we like this new ultra-freedom, insist on it, even as we fear and loathe the ways so many of our wrongheaded fellow Americans use it.

Treating real life as fantasy and vice versa, and taking preposterous ideas seriously, is not unique to Americans.

But we are the global crucible and epicenter. We invented the fantasy-industrial complex; almost nowhere outside poor or otherwise miserable countries are flamboyant supernatural beliefs so central to the identities of so many people.

This is American exceptionalism in the 21st century. The country has always been a one-of-a-kind place. But our singularity is different now.

We’re still rich and free, still more influential and powerful than any other nation, practically a synonym for developed country. But our drift toward credulity, toward doing our own thing, toward denying facts and having an altogether uncertain grip on reality, has overwhelmed our other exceptional national traits and turned us into a less developed country.

People see our shocking Trump moment—this post-truth, “alternative facts” moment—as some inexplicable and crazy new American phenomenon. But what’s happening is just the ultimate extrapolation and expression of mind-sets that have made America exceptional for its entire history.

America was created by true believers and passionate dreamers, and by hucksters and their suckers, which made America successful—but also by a people uniquely susceptible to fantasy, as epitomized by everything from Salem’s hunting witches to Joseph Smith’s creating Mormonism, from P. T. Barnum to speaking in tongues, from Hollywood to Scientology to conspiracy theories, from Walt Disney to Billy Graham to Ronald Reagan to Oprah Winfrey to Trump.

In other words: Mix epic individualism with extreme religion; mix show business with everything else; let all that ferment for a few centuries; then run it through the anything-goes ’60s and the internet age. The result is the America we inhabit today, with reality and fantasy weirdly and dangerously blurred and commingled.

The 1960s and the Beginning of the End of Reason


Image above: Illustration of American 1960s counterculture mixed with the nutcakes of today, by R. Kikuo Johnson. From original article.

I don't regret or disapprove of many of the ways the ’60s permanently reordered American society and culture. It’s just that along with the familiar benefits, there have been unreckoned costs.

In 1962, people started referring to “hippies,” the Beatles had their first hit, Ken Kesey published One Flew Over the Cuckoo’s Nest, and the Harvard psychology lecturer Timothy Leary was handing out psilocybin and LSD to grad students.

And three hours south of San Francisco, on the heavenly stretch of coastal cliffs known as Big Sur, a pair of young Stanford psychology graduates founded a school and think tank they named after a small American Indian tribe that had lived on the grounds long before. “In 1968,” one of its founding figures recalled four decades later,

Esalen was the center of the cyclone of the youth rebellion. It was one of the central places, like Mecca for the Islamic culture. Esalen was a pilgrimage center for hundreds and thousands of youth interested in some sense of transcendence, breakthrough consciousness, LSD, the sexual revolution, encounter, being sensitive, finding your body, yoga—all of these things were at first filtered into the culture through Esalen. By 1966, ’67, and ’68, Esalen was making a world impact.

This is not overstatement. Essentially everything that became known as New Age was invented, developed, or popularized at the Esalen Institute. Esalen is a mother church of a new American religion for people who think they don’t like churches or religions but who still want to believe in the supernatural.

The institute wholly reinvented psychology, medicine, and philosophy, driven by a suspicion of science and reason and an embrace of magical thinking (also: massage, hot baths, sex, and sex in hot baths). It was a headquarters for a new religion of no religion, and for “science” containing next to no science.

The idea was to be radically tolerant of therapeutic approaches and understandings of reality, especially if they came from Asian traditions or from American Indian or other shamanistic traditions. Invisible energies, past lives, astral projection, whatever—the more exotic and wondrous and unfalsifiable, the better.

Not long before Esalen was founded, one of its co-founders, Dick Price, had suffered a mental breakdown and been involuntarily committed to a private psychiatric hospital for a year.

His new institute embraced the radical notion that psychosis and other mental illnesses were labels imposed by the straight world on eccentrics and visionaries, that they were primarily tools of coercion and control. This was the big idea behind One Flew Over the Cuckoo’s Nest, of course.

And within the psychiatric profession itself this idea had two influential proponents, who each published unorthodox manifestos at the beginning of the decade—R. D. Laing (The Divided Self) and Thomas Szasz (The Myth of Mental Illness).

“Madness,” Laing wrote when Esalen was new, “is potentially liberation and renewal.” Esalen’s founders were big Laing fans, and the institute became a hotbed for the idea that insanity was just an alternative way of perceiving reality.

These influential critiques helped make popular and respectable the idea that much of science is a sinister scheme concocted by a despotic conspiracy to oppress people.

Mental illness, both Szasz and Laing said, is “a theory not a fact.” This is now the universal bottom-line argument for anyone—from creationists to climate-change deniers to anti-vaccine hysterics—who prefers to disregard science in favor of his own beliefs.

You know how young people always think the universe revolves around them, as if they’re the only ones who really get it?

And how before their frontal lobes, the neural seat of reason and rationality, are fully wired, they can be especially prone to fantasy?

In the ’60s, the universe cooperated: It did seem to revolve around young people, affirming their adolescent self-regard, making their fantasies of importance feel real and their fantasies of instant transformation and revolution feel plausible.

Practically overnight, America turned its full attention to the young and everything they believed and imagined and wished.

If 1962 was when the decade really got going, 1969 was the year the new doctrines and their gravity were definitively cataloged by the grown-ups. Reason and rationality were over.

The countercultural effusions were freaking out the old guard, including religious people who couldn’t quite see that yet another Great Awakening was under way in America, heaving up a new religion of believers who “have no option but to follow the road until they reach the Holy City … that lies beyond the technocracy … the New Jerusalem.”

That line is from The Making of a Counter Culture: Reflections on the Technocratic Society and Its Youthful Opposition, published three weeks after Woodstock, in the summer of 1969. Its author was Theodore Roszak, age 35, a Bay Area professor who thereby coined the word counterculture.

Roszak spends 270 pages glorying in the younger generation’s “brave” rejection of expertise and “all that our culture values as ‘reason’ and ‘reality.’ ” (Note the scare quotes.)

So-called experts, after all, are “on the payroll of the state and/or corporate structure.” A chapter called “The Myth of Objective Consciousness” argues that science is really just a state religion.

To create “a new culture in which the non-intellective capacities … become the arbiters of the good [and] the true,” he writes, “nothing less is required than the subversion of the scientific world view, with its entrenched commitment to an egocentric and cerebral mode of consciousness.” He welcomes the “radical rejection of science and technological values.”

Earlier that summer, a University of Chicago sociologist (and Catholic priest) named Andrew Greeley had alerted readers of The New York Times Magazine that beyond the familiar signifiers of youthful rebellion (long hair, sex, drugs, music, protests), the truly shocking change on campuses was the rise of anti-rationalism and a return of the sacred—“mysticism and magic,” the occult, séances, cults based on the book of Revelation.

When he’d chalked a statistical table on a classroom blackboard, one of his students had reacted with horror: “Mr. Greeley, I think you’re an empiricist.”

As 1969 turned to 1970, a 41-year-old Yale Law School professor was finishing his book about the new youth counterculture. Charles Reich was a former Supreme Court clerk now tenured at one of ultra-rationalism’s American headquarters.

But hanging with the young people had led him to a midlife epiphany and apostasy. In 1966, he had started teaching an undergraduate seminar called “The Individual in America,” for which he assigned fiction by Kesey and Norman Mailer. He decided to spend the next summer, the Summer of Love, in Berkeley. On the road back to New Haven, he had his Pauline conversion to the kids’ values.

His class at Yale became hugely popular; at its peak, 600 students were enrolled. In 1970, The Greening of America became The New York Times’ best-selling book (as well as a much-read 70-page New Yorker excerpt), and remained on the list for most of a year.

At 16, I bought and read one of the 2 million copies sold. Rereading it today and recalling how much I loved it was a stark reminder of the follies of youth. Reich was shamelessly, uncritically swooning for kids like me.  

The Greening of America may have been the mainstream’s single greatest act of pandering to the vanity and self-righteousness of the new youth. Its underlying theoretical scheme was simple and perfectly pitched to flatter young readers:

There are three types of American “consciousness,” each of which “makes up an individual’s perception of reality … his ‘head,’ his way of life.”  

Consciousness I people were old-fashioned, self-reliant individualists rendered obsolete by the new “Corporate State”—essentially, your grandparents.  

Consciousness IIs were the fearful and conformist organization men and women whose rationalism was a tyrannizing trap laid by the Corporate State—your parents.

And then there was Consciousness III, which had “made its first appearance among the youth of America,” “spreading rapidly among wider and wider segments of youth, and by degrees to older people.”

If you opposed the Vietnam War and dressed down and smoked pot, you were almost certainly a III. Simply by being young and casual and undisciplined, you were ushering in a new utopia.

Reich praises the “gaiety and humor” of the new Consciousness III wardrobe, but his book is absolutely humorless—because it’s a response to “this moment of utmost sterility, darkest night and most extreme peril.”

Conspiracism was flourishing, and Reich bought in. Now that “the Corporate State has added depersonalization and repression” to its other injustices, “it has threatened to destroy all meaning and suck all joy from life.” Reich’s magical thinking mainly concerned how the revolution would turn out.

“The American Corporate State,” having produced this new generation of longhaired hyperindividualists who insist on trusting their gut and finding their own truth, “is now accomplishing what no revolutionaries could accomplish by themselves.

The machine has begun to destroy itself.” Once everyone wears Levi’s and gets high, the old ways “will simply be swept away in the flood.”

The inevitable/imminent happy-cataclysm part of the dream didn’t happen, of course. The machine did not destroy itself. But Reich was half-right. An epochal change in American thinking was under way and “not, as far as anybody knows, reversible …

There is no returning to an earlier consciousness.” His wishful error was believing that once the tidal surge of new sensibility brought down the flood walls, the waters would flow in only one direction, carving out a peaceful, cooperative, groovy new continental utopia, hearts and minds changed like his, all of America Berkeleyized and Vermontified.

Instead, Consciousness III was just one early iteration of the anything-goes, post-reason, post-factual America enabled by the tsunami.

Reich’s faith was the converse of the Enlightenment rationalists’ hopeful fallacy 200 years earlier.

Granted complete freedom of thought, Thomas Jefferson and company assumed, most people would follow the path of reason.

Wasn’t it pretty to think so.

I remember when fantastical beliefs went fully mainstream, in the 1970s.

My irreligious mother bought and read The Secret Life of Plants, a big best seller arguing that plants were sentient and would “be the bridesmaids at a marriage of physics and metaphysics.” The amazing truth about plants, the book claimed, had been suppressed by the FDA and agribusiness.

My mom didn’t believe in the conspiracy, but she did start talking to her ficuses as if they were pets.

In a review, The New York Times registered the book as another data point in how “the incredible is losing its pariah status.”

Indeed, mainstream publishers and media organizations were falling over themselves to promote and sell fantasies as nonfiction.

In 1975 came a sensational autobiography by the young spoon bender and mind reader Uri Geller as well as Life After Life, by Raymond Moody, a philosophy Ph.D. who presented the anecdotes of several dozen people who’d nearly died as evidence of an afterlife.

The book sold many millions of copies; before long the International Association for Near Death Studies formed and held its first conference, at Yale.

During the ’60s, large swaths of academia made a turn away from reason and rationalism as they’d been understood.

Many of the pioneers were thoughtful, their work fine antidotes to postwar complacency. The problem was the nature and extent of their influence at that particular time, when all premises and paradigms seemed up for grabs.

That is, they inspired half-baked and perverse followers in the academy, whose arguments filtered out into the world at large:

All approximations of truth, science as much as any fable or religion, are mere stories devised to serve people’s needs or interests. Reality itself is a purely social construction, a tableau of useful or wishful myths that members of a society or tribe have been persuaded to believe.

The borders between fiction and nonfiction are permeable, maybe nonexistent. The delusions of the insane, superstitions, and magical thinking?

Any of those may be as legitimate as the supposed truths contrived by Western reason and science. The takeaway: Believe whatever you want, because pretty much everything is equally true and false.

These ideas percolated across multiple academic fields. In 1965, the French philosopher Michel Foucault published Madness and Civilization in America, echoing Laing’s skepticism of the concept of mental illness; by the 1970s, he was arguing that rationality itself is a coercive “regime of truth”—oppression by other means. Foucault’s suspicion of reason became deeply and widely embedded in American academia.

When I first read that, at age 18, I loved the quotation marks. If reality is simply the result of rules written by the powers that be, then isn’t everyone able—no, isn’t everyone obliged—to construct their own reality? The book was timed perfectly to become a foundational text in academia and beyond.

A more extreme academic evangelist for the idea of all truths being equal was a UC Berkeley philosophy professor named Paul Feyerabend. His best-known book, published in 1975, was Against Method: Outline of an Anarchistic Theory of Knowledge.

“Rationalism,” it declared, “is a secularized form of the belief in the power of the word of God,” and science a “particular superstition.”

In a later edition of the book, published when creationists were passing laws to teach Genesis in public-school biology classes, Feyerabend came out in favor of the practice, comparing creationists to Galileo. Science, he insisted, is just another form of belief.

“Only one principle,” he wrote, “can be defended under all circumstances and in all stages of human development. It is the principle: anything goes.”

Over in anthropology, where the exotic magical beliefs of traditional cultures were a main subject, the new paradigm took over completely—don’t judge, don’t disbelieve, don’t point your professorial finger. This was understandable, given the times: colonialism ending, genocide of American Indians confessed, U.S. wars in the developing world.

Who were we to roll our eyes or deny what these people believed? In the ’60s, anthropology decided that oracles, diviners, incantations, and magical objects should be not just respected, but considered equivalent to reason and science.

If all understandings of reality are socially constructed, those of Kalabari tribesmen in Nigeria are no more arbitrary or faith-based than those of college professors.

In 1968, a UC Davis psychologist named Charles Tart conducted an experiment in which, he wrote, “a young woman who frequently had spontaneous out-of-body experiences”—didn’t “claim to have” them but “had” them—spent four nights sleeping in a lab, hooked up to an EEG machine.

Her assigned task was to send her mind or soul out of her body while she was asleep and read a five-digit number Tart had written on a piece of paper placed on a shelf above the bed. He reported that she succeeded.

Other scientists considered the experiments and the results bogus, but Tart proceeded to devote his academic career to proving that attempts at objectivity are a sham and magic is real. In an extraordinary paper published in 1972 in Science, he complained about the scientific establishment’s “almost total rejection of the knowledge gained” while high or tripping.

He didn’t just want science to take seriously “experiences of ecstasy, mystical union, other ‘dimensions,’ rapture, beauty, space-and-time transcendence.” He was explicitly dedicated to going there. A “perfectly scientific theory may be based on data that have no physical existence,” he insisted.

The rules of the scientific method had to be revised. To work as a psychologist in the new era, Tart argued, a researcher should be in the altered state of consciousness he’s studying, high or delusional “at the time of data collection” or during “data reduction and theorizing.”

Tart’s new mode of research, he admitted, posed problems of “consensual validation,” given that “only observers in the same [altered state] are able to communicate adequately with one another.”

Tart popularized the term consensus reality for what you or I would simply call reality, and around 1970 that became a permanent interdisciplinary term of art in academia. Later he abandoned the pretense of neutrality and started calling it the consensus trance—people committed to reason and rationality were the deluded dupes, not he and his tribe.

Even the social critic Paul Goodman, beloved by young leftists in the ’60s, was flabbergasted by his own students by 1969. “There was no knowledge,” he wrote, “only the sociology of knowledge. They had so well learned that … research is subsidized and conducted for the benefit of the ruling class that they did not believe there was such a thing as simple truth.”

Ever since, the American right has insistently decried the spread of relativism, the idea that nothing is any more correct or true than anything else.

Conservatives hated how relativism undercut various venerable and comfortable ruling ideas—certain notions of entitlement (according to race and gender) and aesthetic beauty and metaphysical and moral certainty. 
 
Yet once the intellectual mainstream thoroughly accepted that there are many equally valid realities and truths, once the idea of gates and gatekeeping was discredited not just on campuses but throughout the culture, all American barbarians could have their claims taken seriously.

Conservatives are correct that the anything-goes relativism of college campuses wasn’t sequestered there, but when it flowed out across America it helped enable extreme Christianities and lunacies on the right—gun-rights hysteria, black-helicopter conspiracism, climate-change denial, and more.

The term useful idiot was originally deployed to accuse liberals of serving the interests of true believers further on the left. In this instance, however, postmodern intellectuals—post-positivists, poststructuralists, social constructivists, post-empiricists, epistemic relativists, cognitive relativists, descriptive relativists—turned out to be useful idiots most consequentially for the American right.

“Reality has a well-known liberal bias,” Stephen Colbert once said, in character, mocking the beliefs-trump-facts impulse of today’s right. Neither side has noticed, but large factions of the elite left and the populist right have been on the same team.

[IB Publisher's note: If you've read to here you are only part way through - about 40%. To read Kurt Andersen's take on the '70s and '80s and beyond, go to (https://www.theatlantic.com/magazine/archive/2017/09/how-america-lost-its-mind/534231/) and search for:

"Conspiracy and Paranoia in the 1970s"

Enjoy!]


Refusing the Call

SUBHEAD: Updating the parable of the Hobbit and the Lord of the Rings to our predicament.

By John Michael Greer on 23 April 2014 for the Archdruid Report -
(http://thearchdruidreport.blogspot.com/2014/04/refusing-call-tale-rewritten.html)


Image above: Gandalf tries to entice Frodo Baggins to join an adventure. Still frame from the movie "The Hobbit: An Unexpected Journey".  From (http://inthenameofageek.blogspot.com/2012/12/the-hobbit-unexpected-journey-review.html).

I have been wondering for some time now how to talk about the weirdly autumnal note that sounds so often and so clearly in America these days.

Through the babble and clatter, the seven or eight television screens yelling from the walls of every restaurant you pass and all the rest of it, there comes a tone and a mood that reminds me of wind among bare branches and dry leaves crackling underfoot.

It's as though even the people who insist most loudly that it’s all onward and upward from here don’t believe it any more, and those for whom the old optimism stopped being more than a soothing shibboleth a long time ago are hunching their shoulders, shutting their eyes tight, and hoping that things can still hold together for just a little while longer.

It’s not just that American politicians and pundits are insisting at the top of their lungs that the United States can threaten Russia with natural gas surpluses that don’t exist, though that’s admittedly a very bad sign all by itself.

It’s that this orgy of self-congratulatory nonsense appears in the news right next to reports that oil and gas companies are slashing their investments in the fracking technology and shale leases that were supposed to produce those imaginary surpluses, having lost a great deal of money pursuing the shale oil mirage, while Russia and Iran pursue a trade deal that will make US sanctions against Iran all but irrelevant, and China is quietly making arrangements to conduct its trade with Europe in yuan rather than dollars.

Strong nations in control of their own destinies, it’s fair to note, don’t respond to challenges on this scale by plunging their heads quite so enthusiastically into the sands of self-deception.

To shift temporal metaphors a bit, the long day of national delusion that dawned back in 1980, when Ronald Reagan famously and fatuously proclaimed “it’s morning in America,” is drawing on rapidly toward dusk, and most Americans are hopelessly unprepared for the coming of night.

They’re unprepared in practical terms, that is, for an era in which the five per cent of us who live in the United States will no longer dispose of a quarter of the world’s energy supply and a third of its raw materials and industrial products, and in which what currently counts as a normal American lifestyle will soon be no more than a fading memory for the vast majority.

They’re just as unprepared, though, for the psychological and emotional costs of that shattering transformation—not least because the change isn’t being imposed on them at random by an indifferent universe, but comes as the inevitable consequence of their own collective choices in decades not that long past.

The hard fact that most people in this country are trying not to remember is this: in the years right after Reagan’s election, a vast number of Americans enthusiastically turned their backs on the promising steps toward sustainability that had been taken in the previous decade, abandoned the ideals they’d been praising to the skies up to that time, and cashed in their grandchildrens’ future so that they didn’t have to give up the extravagance and waste that defined their familiar and comfortable lifestyles.

As a direct result, the nonrenewable resources that might have supported the transition to a sustainable future went instead to fuel one last orgy of wretched excess. Now, though, the party is over, the bill is due, and the consequences of that disastrous decision have become a massive though almost wholly unmentionable factor in our nation’s culture and collective psychology.

A great many of the more disturbing features of contemporary American life, I’m convinced, can’t be understood unless America’s thirty-year vacation from reality is taken into account. A sixth of the US population is currently on antidepressant medications, and since maybe half of Americans can’t afford to get medication at all, the total number of Americans who are clinically depressed is likely a good deal higher than prescription figures suggest.

The sort of bizarre delusions that used to count as evidence of serious mental illness—baroque conspiracy theories thickly frosted with shrill claims of persecution, fantasies of imminent mass death as punishment for humanity’s sins, and so on—have become part of the common currency of American folk belief.

For that matter, what does our pop culture’s frankly necrophiliac obsession with vampires amount to but an attempt, thinly veiled in the most transparent of symbolism, to insist that it really is okay to victimize future generations for centuries down the line in order to prolong one’s own existence?

Myths and legends such as this can be remarkably subtle barometers of the collective psyche. The transformation that turned the vampire from just another spooky Eastern European folktale into a massive pop culture presence in industrial North America has quite a bit to say about the unspoken ideas and emotions moving through the crawlspaces of our collective life.

In the same way, it’s anything but an accident that the myth of the heroic quest has become so pervasive a presence in the modern industrial world that Joseph Campbell could simply label it “the monomyth,” the basic form of myth as such.

In any sense other than a wholly parochial one, of course, he was quite wrong—the wild diversity of the world’s mythic stories can’t be forced into any one narrative pattern—but if we look only at popular culture in the modern industrial world, he’s almost right.

The story of the callow nobody who answers the call to adventure, goes off into the unknown, accomplishes some grand task, and returns transformed, to transform his surroundings in turn, is firmly welded into place in the imagination of our age.

You’ll find it at the center of J.R.R. Tolkien’s great works of fantasy, in the most forgettable products of the modern entertainment industry, and everything in between and all around.

Yet there’s a curious blind spot in all this: we hear plenty about those who answer the call to adventure, and nothing at all about those who refuse it. Those latter don’t offer much of a plot engine for an adventure story, granted, but such a tale could make for a gripping psychological study—and one that has some uncomfortably familiar features.

With that in mind, with an apology in the direction of Tolkien’s ghost, and with another to those of my readers who aren’t lifelong Tolkien buffs with a head full of Middle-earth trivia—yes, I used to sign school yearbooks in fluent Elvish—

I’d like to suggest a brief visit to an alternate Middle-earth: one in which Frodo Baggins, facing the final crisis of the Third Age and the need to leave behind everything he knew and loved in order to take the Ring to Mount Doom, crumpled instead, with a cry of “I can’t, Gandalf, I just can’t.” Perhaps you’ll join me in a quiet corner of The Green Dragon, the best inn in Bywater, take a mug of ale from the buxom hobbit barmaid, and talk about old Frodo, who lived until recently just up the road and across the bridge in Hobbiton.

You’ve heard about the magic ring he had, the one that he inherited from his uncle Bilbo, the one that Gandalf the wizard wanted him to go off and destroy? That was thirty years ago, and most folk in the Shire have heard rumors about it by now.

Yes, it’s quite true; Frodo was supposed to leave the Shire and go off on an adventure, as Bilbo did before him, and couldn’t bring himself to do it. He had plenty of reasons to stay home, to be sure. He was tolerably well off and quite comfortable, all his friends and connections were here, and the journey would have been difficult and dangerous.

Nor was there any certainty of success—quite the contrary, it’s entirely possible that he might have perished somewhere in the wild lands, or been caught by the Dark Lord’s servants, or what have you.

So he refused, and when Gandalf tried to talk to him about it, he threw the old wizard out of Bag End and slammed the round green door in his face. Have you ever seen someone in a fight who knows that he’s in the wrong, and knows that everyone else knows it, and that knowledge just makes him even more angry and stubborn? That was Frodo just then.

Friends of mine watched the whole thing, or as much of it as could be seen from the garden outside, and it was not a pleasant spectacle.

It’s what happened thereafter, though, that bears recalling. I’m quite sure that if Frodo had shown the least sign of leaving the Shire and going on the quest, Sauron would have sent Black Riders after him in a fine hurry, and there’s no telling what else might have come boiling up out of Mordor.

It’s by no means impossible that the Dark Lord might have panicked, and launched a hasty, ill-advised assault on Gondor right away.

For all I know, that may have been what Gandalf had in mind, tricking the Dark Lord into overreacting before he’d gathered his full strength, and before Gondor and Rohan had been thoroughly weakened from within.

Still, once Sauron’s spies brought him word that Frodo had refused to embark on the quest, the Dark Lord knew that he had a good deal less to fear, and that he could afford to take his time.

Ever since then, there have been plenty of servants of Mordor in and around the Shire, and a Black Rider or two keeping watch nearby, but nothing obvious or direct, nothing that might rouse whatever courage Frodo might have had left or convince him that he had to flee for his life.

Sauron was willing to be patient—patient and cruel. I’m quite sure he knew perfectly well what the rest of Frodo’s life would be like.

So Gandalf went away, and Frodo stayed in Bag End, and for years thereafter it seemed as though the whole business had been no more than a mistake. The news that came up the Greenway from the southern lands was no worse than before; Gondor still stood firm, and though there was said to be some kind of trouble in Rohan, well, that was only to be expected now and then.

Frodo even took to joking about how gullible he’d been to believe all those alarmist claims that Gandalf had made. Sauron was still safely cooped up in Mordor, and all seemed right with Middle-earth.

Of course part of that was simply that Frodo had gotten even wealthier and more comfortable than he’d been before. He patched up his relationship with the Sackville-Bagginses, and he invested a good deal of his money in Sandyman’s mill in Hobbiton, which paid off handsomely.

He no longer spent time with many of his younger friends by then, partly because they had their own opinions about what he should have done, and partly because he had business connections with some of the wealthiest hobbits in the Shire, and wanted to build on those.

He no longer took long walks around the Shire, as he’d done before, and he gave up visiting elves and dwarves when he stopped speaking to Gandalf.

But of course the rumors and news from the southern lands slowly but surely turned for the worse, as the Dark Lord gathered his power and tightened his grip on the western lands a little at a time. I recall when Rohan fell to Saruman’s goblin armies.

That was a shock for a great many folk, here in the Shire and elsewhere. Soon thereafter, though, Frodo was claiming that after all, Saruman wasn’t Sauron, and Rohan wasn’t that important, and for all anyone knew, the wizard and the Dark Lord might well end up at each other’s throats and spare the rest of us.

Still, it was around that time that Frodo stopped joking about Gandalf’s warnings, and got angry if anyone mentioned them in his hearing. It was around that same time, too, that he started insisting loudly and often that someone would surely stop Sauron.

One day it was the elves: after all, they had three rings of power, and could surely overwhelm the forces of Mordor if they chose to. Another day, the dwarves would do it, or Saruman, or the men of Gondor, or the Valar in the uttermost West. There were so many alternatives! His friends very quickly learned to nod and agree with him, for he would lose his temper and start shouting at them if they disagreed or even asked questions.

When Lorien was destroyed, that was another shock. It was after that, as I recall, that Frodo started hinting darkly that the elves didn’t seem to be doing anything with their three rings of power to stop Sauron, and maybe they weren’t as opposed to him as they claimed. He came up with any number of theories about this or that elvish conspiracy.

The first troubles were starting to affect the Shire by then, of course, and his investments were beginning to lose money; it was probably inevitable that he would start claiming that the conspiracy was aimed in part against hobbits, against the Shire, or against him in particular—especially the last. They wanted his ring, of course. That played a larger and larger role in his talk as the years passed.

I don’t recall hearing of any particular change in his thinking when word came that Minas Tirith had been taken by the Dark Lord’s armies, but it wasn’t much later that a great many elves came hurrying along the East Road through the Shire, and a few months after that, word came that Rivendell had fallen.

That was not merely a shock, but a blow; Frodo had grown up hearing his uncle’s stories about Rivendell and the elves and half-elves who lived there. There was a time after that news came that some of us briefly wondered if old Frodo might actually find it in himself to do the thing he’d refused to do all those years before.

But of course he did nothing of the kind, not even when the troubles here in the Shire began to bite more and more deeply, when goblins started raiding the borders of the North Farthing and the Buckland had to be abandoned to the Old Forest. No, he started insisting to anyone who would listen that Middle-earth was doomed, that there was no hope left in elves or dying Númenor, that Sauron’s final victory would surely come before—oh, I forget what the date was; it was some year or other not too far from now.

He spent hours reading through books of lore, making long lists of reasons why the Dark Lord’s triumph was surely at hand. Why did he do that? Why, for the same reason that drove him to each of his other excuses in turn: to prove to himself that his decision to refuse the quest hadn’t been the terrible mistake he knew perfectly well it had been.

And then, of course, the Ring betrayed him, as it betrayed Gollum and Isildur before him. He came home late at night, after drinking himself half under the table at the Ivy Bush, and discovered that the Ring was nowhere to be found.

After searching Bag End in a frantic state, he ran out the door and down the road toward Bywater shouting “My precious! My precious!” He was weeping and running blindly in the night, and when he got to the bridge he stumbled; over he went into the water, and that was the end of him. They found his body in a weir downstream the next morning.

The worst of it is that right up to the end, right up to the hour the Ring left him, he still could have embarked on the quest. It would have been a different journey, and quite possibly a harder one. With Rivendell gone, he would have had to go west rather than east, across the Far Downs to Cirdan at the Grey Havens, where you’ll find most of the high-elves who still remain in Middle-earth.

From there, with such companions as might have joined him, he would have had to go north and then eastward through Arnor, past the ruins of Annuminas and Lake Evendim, to the dales of the Misty Mountains, and then across by one of the northern passes: a hard and risky journey, but by no means impossible, for with no more need to hinder travel between Rivendell and Lorien, the Dark Lord’s watch on the mountains has grown slack.

Beyond the mountains, the wood-elves still dwell in the northern reaches of Mirkwood, along with refugees from Lorien and the last of the Beornings. He could have gotten shelter and help there, and boats to travel down the River Running into the heart of Wilderland. From there his way would have led by foot to the poorly guarded northern borders of Mordor—when has Sauron ever had to face a threat from that quarter?

So you see that it could have been done. It could still be done, if someone were willing to do it. Even though so much of what could have been saved thirty years ago has been lost, even though Minas Tirith, Edoras, Lorien and Rivendell have fallen and the line of the kings of Gondor is no more, it would still be worth doing; there would still be many things that could be saved.

Nor would such a journey have to be made alone. Though Aragorn son of Arathorn was slain in the last defense of Rivendell, there are still Rangers to be found in Cirdan’s realm and the old lands of Arnor; there are elf-warriors who hope to avenge the blood shed at Rivendell, and dwarves from the Blue Mountains who have their own ancient grudges against the Dark Lord.

The last free Rohirrim retreated to Minhiriath after Éomer fell at Helm’s Deep, and still war against King Grima, while Gondor west of the river Gilrain clings to a tenuous independence and would rise up against Sauron at need. Would those and the elves of Lindon be enough? No one can say; there are no certainties in this business, except for the one Frodo chose—the certainty that doing nothing will guarantee Sauron’s victory.

And there might even still be a wizard to join such a quest. In fact, there would certainly be one—the very last of them, as far as I know. Gandalf perished when Lorien fell, I am sorry to say, and as for Saruman, the last anyone saw of him, he was screaming in terror as two Ringwraiths dragged him through the door of the Dark Tower; his double-dealing was never likely to bring him to a good end.

The chief of the Ringwraiths rules in Isengard now. Still, there was a third in these western lands: fool and bird-tamer, Saruman called him, having never quite managed to notice that knowledge of the ways of nature and the friendship of birds and beasts might have considerable value in the last need of Middle-earth. Radagast is his name; yes, that would be me.

Why am I telling you all this? Well, you are old Frodo’s youngest cousin, are you not? Very nearly the only one of his relatives with enough of the wild Tookish blood in you to matter, or so I am told. It was just a month ago that you and two of your friends were walking in the woods, and you spoke with quite a bit of anger about how the older generation of hobbits had decided to huddle in their holes until the darkness falls—those were your very words, I believe.

How did I know that? Why, a little bird told me—a wren, to be precise, a very clever and helpful little fellow, who runs errands for me from time to time when I visit this part of Middle-earth. If you meant what you said then, there is still hope.

And the Ring? No, it was not lost, or not for long. It slipped from its chain and fell from old Frodo’s pocket as he stumbled home that last night, and a field mouse spotted it. I had briefed all the animals and birds around Hobbiton, of course, and so she knew what to do; she dragged the Ring into thick grass, and when dawn came, caught the attention of a jay, who took it and hid it high up in a tree. I had to trade quite a collection of sparkling things for it!

But here it is, in this envelope, waiting for someone to take up the quest that Frodo refused.

The choice is yours, my dear hobbit. What will you do?


Elysium ad Nauseum

SUBHEAD: Elysium is everything that sucks about Hollywood feature films these days: violent, stupid, cynical, and unoriginal.

By Colin Berry on 6 August 2013 for Boing Boing -
(http://boingboing.net/2013/08/06/elysium-everything-that-sucks.html)


Image above: The Las Vegas Review Journal critic wrote "Elysium good summer sci-fi despite heavy political overtones". From (http://www.reviewjournal.com/columns-blogs/entertainment/movies/elysium-good-summer-sci-fi-despite-heavy-political-overtones).

[Author's note: what you’re about to read contains spoilers, but only if you haven’t seen a blockbuster since Nixon was president.]

[IB Publisher's note: Elyseium: The place at the ends of the earth to which certain favored heroes were conveyed by the gods after death.]

The fact that movie ticket sales have remained essentially stagnant for the past 10 years isn’t news to anyone. But if you want to know why, look no further than the new sci-fi thriller Elysium, opening this week. Written and directed by Neill Blomkamp, the South African-Canadian director whose last film, District 9, came out in 2009, Elysium is much bigger and much more expensive — just under $100 million — and could serve as a poster child for everything that’s wrong with modern movies.

Set in 2154, Elysium tells the story of Max (Matt Damon), a lovable ex-con scraping out life on a ruined planet Earth, where crime, disease, and poverty are the norm for its (mostly) brown citizens. Nineteen miles above Earth, however, hangs Elysium, the ultimate gated community: a jewel-like space station inhabited only by super-rich, (mostly) white folks.

Through a series of unfortunate events, Max gets deathly sick and needs to get up to Elysium, where health problems are healed instantaneously with what looks like a magic tanning bed. (Each Elysian home has one.) To achieve this, he makes a Faustian deal to kidnap John Carlyle (William Fichtner), a rich tycoon, and steal his brain data, the ramifications of which could change civilization. Meanwhile, Max is also being hunted by Kruger (Sharlto Copley), a loathsome government sleeper agent and pawn of Elysium’s wicked overlord, Secretary Delacourt (Jodie Foster), who, besides French, speaks in halting English villain-ese. With me so far?

Woven into this adventure is Max’s friendship (and possible romance) with Frey (Alice Braga), whom he promised, years ago — and it’s in flashback, so you know it’s gonna happen — he’d bring to Elysium.

If all this sounds interesting, it’s not. From the first minute to the last, Blomkamp’s film is entirely predictable, lining up cinematic tropes — The childhood love interest! The charged object! The super-evil villains! The doomed best friend! — like dominoes to topple one by one.

The writing is dreadful. Characters are either completely good or completely evil; nuance in Blomkamp’s world is as rare as an ex-con on Elysium. The dialogue is pure torture: the scenes with Foster, one of her generation’s finest actors, are in particular the stuff of comic-book cliché. Got a favorite phrase from any major thriller or action flick? It’s in there. Hoping a likable character might do something mean, or one of the villains might reveal a little humanity? Think again, fellow movie-goer!

The responsibility for this lies directly with Blomkamp, who wrote the film and should know better. District 9, as I recall, had one or two pieces of moving dialogue, a storyline I couldn’t always predict, and an emotional spectrum beyond merely black or white.

What District 9 and Elysium do share, however, is carnage, and although I’m not in the film’s target demographic, I will say this (because someone has to, repeatedly): movie violence is a spreading disease on culture. For all its purported good intentions — Blomkamp claims that (besides blowing things up) he’s interested in “serious topics” like universal health care and wealth discrepancy — Elysium is so thickly mired in gunfire, torture, and bloodshed that its social issues are forgotten, scattered like spent shells on a dusty dystopian floor.

These kinds of films aren’t entertaining anymore; they’re offensive. Yet they sit side by side among countless clones, bullets in the chamber, cogs in the wheels of the monstrous Movie Marketing machine — operating within the supercolossal Entertainment Industry — which cares only about money and formula and self-protection. No wonder the industry is dying. For our part, as moviegoers, we are partly culpable for legitimizing these projects, in which innovation, risk, and surprise are next to nil, by continuing to buy their tickets.

Elysium-wise, is any of it worth watching? Some. The CGI for the space station itself, inspired by the Stanford Torus, is pretty cool. Wagner Moura, as the steampunk entrepreneur Spider, does a good job balancing manic and plain old crazy. But that’s really it — the film squarely fails the Bechdel test, and Copley’s Kruger is a disaster of agitated overacting.

I caught Elysium at a press screening in Los Angeles, where writers and critics like me filed in, picked up our free popcorn and parking passes, and sat down to watch. During the film, two young men in the audience chuckled a couple of times, after particularly explosive barrages of gun violence, but other than that, nobody seemed to react. As the credits began to roll, everybody got up and left. No applause, no discussion — nothing. We’d seen this movie a million times before.

The next day, on the top floor of a posh Beverly Hills hotel, we were allowed a rapid-fire roundtable interview with the director himself. Blomkamp seemed like a nice guy: amiable enough, responding to our questions with candor and what appeared to be not a lot of ego. But when I asked him why, for a hundred million bucks, he had chosen to take so few chances, he seemed caught a little off guard.

“My whole goal was big-scale cinema and archetypal storytelling,” he explained. “It will probably be the most expensive film I’ll ever make, but I think it’ll be something I’ll eternally be proud of. My next film will be far lower budget than this; the one after that may end my career. But to have a bunch of low-budget, super edgy films and not have something cinematic? I wouldn’t be happy not to have that in my body of work. Elysium really is the film I wanted to make.”

Elysium billboards and bus benches featuring Matt Damon’s shaved head are up all over L.A. right now, and all over the world. They’ll be up for a couple of weeks, before midnight crews pull them down and replace them with the next project, the next blockbuster, the next 90-minute exercise in toppled tropes. It’s what passes for movies now. While independent films struggle to get produced — brief meteors in the night sky — the movie industry hangs above us, Elysium-like, a bloated superstructure reserved only for the richest and lorded over by villains. Only rarely does a young director make it up there, slipping past the sentries and successfully landing his (or, even more rarely, her) scrappy ship on its surface without getting deported or shot down.

Maybe Blomkamp will be happy up there. Maybe he’ll make a difference in Hollywood. Maybe his next project will push against the formula, against the community he’s recently been welcomed into. We’ll see.

The MPAA estimates roughly 600 movies will be released in 2013. At last week’s screening, while I sat eating GMO popcorn and sipping high-fructose lemonade, watching a $100 million, cliché-riddled exercise that glorified guns and explosive violence, I thought to myself: We are doomed.

• Colin Berry is an author, editor, content consultant, speaker, and award-winning design journalist.
