Oil / Economy like Flour / Bakery

SUBHEAD: How is an Oil Shortage Like a Missing Cup of Flour? The economy cannot afford high-priced oil. By Gail Tverberg on 2 February 2011 in Our Finite World - (http://ourfiniteworld.com/2011/02/02/how-is-an-oil-shortage-like-a-missing-cup-of-flou/#more-945) Image above: Ingredients for gooey butter cake. From (http://thepioneerwoman.com/tasty-kitchen-blog/2011/01/step-by-step-gooey-butter-cake).

If I bake a batch of cookies and the recipe calls for two cups of flour, but I have only one, it is pretty clear that I can’t bake a full batch of cookies. All I can make is half a batch. Half of the sugar, half of the eggs, and half of the shortening that I originally planned to use will be left over.

Liebig’s Law of the Minimum applies in situations like this. In agriculture, it says that growth is controlled not by total resources available, but by the one in scarcest supply. If a baker does not have enough of one necessary ingredient, he will have to make a smaller batch. I wonder if it isn’t a little like this with oil and the economy.
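The limiting-ingredient arithmetic behind Liebig’s Law can be sketched in a few lines of code. This is only an illustration of the analogy, not anything from the original article; the ingredient names and quantities are hypothetical.

```python
def max_batch_fraction(required, available):
    """Return the largest fraction of the recipe that can be made.

    The batch size is limited by whichever ingredient is in scarcest
    supply relative to what the recipe requires -- Liebig's Law of
    the Minimum applied to baking.
    """
    return min(available[name] / amount for name, amount in required.items())


# Hypothetical recipe and pantry: two cups of flour needed, only one on hand.
recipe = {"flour_cups": 2, "sugar_cups": 1, "eggs": 2}
pantry = {"flour_cups": 1, "sugar_cups": 1, "eggs": 2}

fraction = max_batch_fraction(recipe, pantry)
print(fraction)  # 0.5 -- only half a batch, because flour is the scarcest ingredient
```

The point of the sketch is that total supplies don't matter: the pantry has a full complement of sugar and eggs, yet the single missing cup of flour caps the batch at one half, just as the article argues a shortfall of affordable oil can cap activity across an economy.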

Oil seems to me to be a necessary “ingredient” in our economy. If for some reason oil is not available (perhaps because the buyer cannot afford it), then to some extent other “ingredients” in the economy, like human labor and new houses and stores in shopping malls, are less-needed as well. That is why as oil consumption decreases, there are so many lay-offs, and the effect multiplies and affects all areas of the economy, even housing prices and demand for business property.

If the worldwide oil price is on the high side (like it is now), customers are faced with a choice–should they buy the full amount of high-priced oil, or should they cut back in some way? For example, a state transportation department might find that asphalt (an oil product) is high priced. They might decide to buy less and fix fewer roads. If they do this, they won’t need as many workers to spread the asphalt, so they may lay off some workers. With less demand, refineries that make the asphalt won’t need to process as much oil, so some of the older refineries can be closed, and their workers laid off.

The laid off workers will have less money to spend, so they will cut back–go out to restaurants less, take fewer trips, and wait longer between haircuts. And of course, there will be little need to build new refineries, or to buy new trucks for spreading asphalt, so these changes will impact workers in the construction business and in the manufacturing of trucks. A laid off worker may miss mortgage payments, and this will trickle through the economy in other ways. Housing prices may drop from lack of demand because some workers have lost their jobs, and because foreclosed houses are on the market at low prices.

In some cases, there may be the possibility of substitution–in this example, switching to concrete or gravel roads instead. But even in this case, there might be layoffs–less need for refineries, for example. Also, spreading gravel might take fewer workers. Concrete roads might last longer, and therefore affect employment in years to come.

Let’s take another example. If oil prices rise, airlines will need to raise their prices to cover the cost of fuel. Because of higher prices, businesses can be expected to cut back on travel, and less-wealthy vacation travelers may stay home. The reduction in travel can be expected to lead to layoffs in the airline industry. There will be less demand for new airplanes (unless an inventor can truly figure out a way to make a more fuel-efficient airplane!), and less demand for workers who build the airplanes. Fewer travelers will pass through the airport, so airport restaurants and shops are likely to lay off workers.

As a third example, if oil prices rise, grocery stores will raise the price of the food they sell because oil is used in food production and transport, and stores will need to pass the higher costs through to the customer. While customers are likely to “trade down” to the less-expensive items offered, in total, they are still likely to spend more on groceries than in the past. To compensate, customers can be expected to cut back on their discretionary expenditures elsewhere. A few may even miss mortgage payments.

How can this problem of layoffs, debt defaults, and falling housing prices be avoided when oil prices rise? I am not sure that it can be.

If a government has a huge amount of money for oil subsidies, perhaps it can subsidize oil prices, so the effect isn’t felt throughout the economy. Usually, it is only the oil exporters who can afford such subsidies.

Or a government can make a rule that companies can’t lay off workers, no matter how much demand drops. Unfortunately, such a rule is likely to result in many bankrupt companies. If they continue making goods few can afford, they will end up with a lot of excess inventory as well.

Or governments can try to cap oil prices. But now we are running short of oil that can be extracted from the ground at low cost, so capping prices has the perverse effect of reducing supply. Governments can also raise taxes on oil companies, but to some extent this also has the effect of reducing supply. The fields that had marginal profitability before the tax hike are likely to be closed.

If the government wants to keep employment up, somehow it needs to find less expensive alternatives to oil, so as to stop this vicious cycle of higher oil prices sending the economy into a tail-spin. Higher priced substitutes are not helpful–they just make the situation worse! This is why most of the alternatives now under consideration are dead ends, unless the costs can be brought way down, say to $50 or $60 a barrel. Even electric cars need to be inexpensive, to really help the economy.

Too many people don’t really understand where the economy is running into trouble, and are proposing solutions that can’t fix the problem. Our real problem is that the economy cannot afford high-priced oil; it is not that there is too little (high-priced) oil in the ground.

We have always assumed that we can have cheap and available ingredients for our societal “recipe” for how our current economy functions. Now this assumption is coming into question.

The Language of Tyranny

SUBHEAD: Once self-delusion no longer works it is the iron fist that speaks.

 By Chris Hedges on 6 February 2011 in Truthdig - 

Image above: A cover of George Orwell's "1984". From (http://greaterthanorequalto.net/blog/2009/08/1984).

Empires communicate in two languages. One language is expressed in imperatives. It is the language of command and force. This militarized language disdains human life and celebrates hypermasculinity. It demands. It makes no attempt to justify the flagrant theft of natural resources and wealth or the use of indiscriminate violence. When families are gunned down at a checkpoint in Iraq they are referred to as having been “lit up.” So it goes.

The other language of empire is softer. It employs the vocabulary of ideals and lofty goals and insists that the power of empire is noble and benevolent. The language of beneficence is used to speak to those outside the centers of death and pillage, those who have not yet been totally broken, those who still must be seduced to hand over power to predators.

The road traveled to total disempowerment, however, ends at the same place. It is the language used to get there that is different. This language of blind obedience and retribution is used by authority in our inner cities, from Detroit to Oakland, as well as our prison systems. It is a language Iraqis and Afghans know intimately.

But to the members of our dwindling middle class—as well as those in the working class who have yet to confront our new political and economic configuration—the powerful use phrases like the consent of the governed and democracy that help lull us into complacency. The longer we believe in the fiction that we are included in the corporate power structure, the more easily corporations pillage the country without the threat of rebellion.

Those who know the truth are crushed. Those who do not are lied to. Those who consume and perpetuate the lies—including the liberal institutions of the press, the church, education, culture, labor and the Democratic Party—abet our disempowerment. No system of total control, including corporate control, exhibits its extreme forms at the beginning. These forms expand as they fail to encounter resistance.

The tactic of speaking in two languages is as old as empire itself. The ancient Greeks and the Romans did it. So did the Spanish conquistadors, the Ottomans, the French and later the British. Those who inhabit exploited zones on the peripheries of empire see and hear the truth. But the cries of those who are exploited are ignored or demonized.

The rage they express does not resonate with those trapped in self-delusion, those who continue to trust in the ultimate goodness of empire. This is the truth articulated in Joseph Conrad’s “Heart of Darkness” and E.M. Forster’s “A Passage to India.” These writers understood that empire is about violence and theft. And the longer the theft continues, the more brutal empire becomes. The tyranny empire imposes on others it finally imposes on itself. The predatory forces unleashed by empire consume the host. Look around you.

The narratives we hear are those fabricated for us by the state, Hollywood and the press. These narratives are taught in our schools, preached in our pulpits and celebrated in war documentaries such as “Restrepo.” These narratives humanize and ennoble the enforcers of empire. The government, the military, the police and our intelligence agents are lionized. These control groups, we are assured, are the guardians of our virtues and our protectors. They produce our heroes.

And those who challenge this narrative—who denounce the lies—become the enemy. Those who administer empire—elected officials, corporate managers, generals and the celebrity courtiers who disseminate the propaganda—become very wealthy. They make immense fortunes whether they deliver the nightly news, sit on the boards of corporations, or rise, lavished with corporate endorsements, within the vast industry of spectacle and entertainment.

They all pay homage, even in moments defined as criticism, to the essential goodness of corporate power. They shut out all real debate. They ignore flagrant injustices and abuse. They peddle the illusions that keep us passive and amused. But as our society is reconfigured into an oligarchic system, with a permanent and vast underclass, along with a shrinking and unstable middle class, these illusions lose their power.

The language of pleasant deception must be replaced with the overt language of force. It is hard to continue to live in a state of self-delusion once unemployment benefits run out, once the only job available comes without benefits or a living wage, once the future no longer conforms to the happy talk that saturates our airwaves. At this point rage becomes the engine of response, and whoever can channel that rage inherits power.

The manipulation of that rage has become the newest task of the corporate propagandists, and the failure of the liberal class to defend core liberal values has left its members with nothing to contribute to the debate. The Belgian King Leopold, promising to abolish slavery and usher the Congolese into the “modern” era, was permitted by his European allies to form the Congo Free State in 1885. It was touted as a humanitarian gesture, as was the Spanish conquest of the Americas, as was our own occupation of Iraq. Leopold organized a ruthless force of native and foreign overseers—not unlike our own mercenary armies—to loot the Congo of ivory and rubber.

By the time the Belgian monarch was done, some 5 million to 8 million Congolese had been slaughtered. It was the largest act of genocide in the modern era until the Nazi Holocaust. Leopold, even in the midst of his rampage, was lionized in Europe for his virtue. He was loathed in the periphery—as we are in Iraq and Afghanistan—where the Congolese and others understood what he was about. But these voices, like the voices of those we oppress, were almost never heard. The Nazis, for whom the Holocaust was as much a campaign of plunder as it was a campaign to rid Europe of Jews, had two methods for greeting arrivals at their four extermination camps.

If the transports came from Western Europe, the savage Ukrainian and Lithuanian guards, with their whips, dogs and clubs, were kept out of sight. The wealthier European Jews were politely ushered into an elaborate ruse, including fake railway stations complete with flower beds, until once stripped naked they became incapable of resistance and could be herded in rows of five under whips into the gas chambers.

The Nazis knew that those who had not been broken, those who possessed a belief in their own personal empowerment, would fight back. When the transports came from the east, where Jews had long lived in fear, tremendous poverty and terror, there was no need for such theatrics. Mothers, fathers, the elderly and children, accustomed to overt repression and the language of command and retribution, were brutally driven from the transports by sadistic guards.

The object was to create mass hysteria. The fate of the two groups was the same. It was the tactic that differed. All centralized power, once restraints and regulations are abolished, once it is no longer accountable to citizens, knows no limit to internal and external plunder.

The corporate state, which has emasculated our government, is creating a new form of feudalism, a world of masters and serfs. It speaks to those who remain in a state of self-delusion in the comforting and familiar language of liberty, freedom, prosperity and electoral democracy. It speaks to the poor and the oppressed in the language of naked coercion. But, here too, all will end up in the same place.

Those trapped in the blighted inner cities that are our internal colonies or brutalized in our prison system, especially African-Americans, see what awaits us all. So do the inhabitants in southern West Virginia, where coal companies have turned hundreds of thousands of acres into uninhabitable and poisoned wastelands. Poverty, repression and despair in these peripheral parts of empire are as common as drug addiction and cancer. Iraqis, Afghans, Pakistanis and Palestinians can also tell us who we are.

They know that once self-delusion no longer works it is the iron fist that speaks. The solitary and courageous voices that rise up from these internal and external colonies of devastation are silenced or discredited by the courtiers who serve corporate power. And even those who do hear these voices of dissent often cannot handle the truth. They prefer the Potemkin facade. They recoil at the “negativity.”

Reality, especially when you grasp what corporations are doing in the name of profit to the planet’s ecosystem, is terrifying. All tyrannies come endowed with their own peculiarities. This makes it hard to say one form of totalitarianism is like another. There are always enough differences to make us unsure that history is repeating itself. The corporate state does not have a Politburo. It does not dress its Homeland Security agents in jackboots.

There is no raving dictator. American democracy—like the garishly painted train station at the Nazi extermination camp Treblinka—looks real even as the levers of power are in the hands of corporations.

But there is one aspect the corporate state shares with despotic regimes and the collapsed empires that have plagued human history. It too communicates in two distinct languages, that is until it does not have to, at which point it will be too late.


Pox Americana

SUBHEAD: Our unilateralists drove blithely through “The gates of hell”, imagining that they were the gates to paradise.  

By Tom Engelhardt on 7 February 2011 in TomDispatch - 

Image above: Poster promoting movie "Team America". From (http://www.allmoviephoto.com/photo/2004_team_america_wallpaper_001.html).

As we've watched the dramatic events in the Middle East, you would hardly know that we had a thing to do with them. Oh yes, in the name of its War on Terror, Washington had for years backed most of the thuggish governments now under siege or anxious that they may be next in line to hear from their people.

When it came to Egypt in particular, there was initially much polite (and hypocritical) discussion in the media about how our "interests" and our "values" were in conflict, about how far the U.S. should back off its support for the Mubarak regime, and about what a “tightrope” the Obama administration was walking. While the president and his officials flailed, the mildest of questions were raised about how much we should chide our erstwhile allies, or encourage the massed protestors, and about whether we should “take sides” (as though we hadn’t done so decisively over the last decades).

With popular cries for “democracy” and “freedom” sweeping through the Middle East, it’s curious to note that the Bush-era’s now-infamous “democracy agenda” has been nowhere in sight. In its brief and disastrous life, it was used as a battering ram for regimes Washington loathed and offered as a soft pillow of future possibility to those it loved.

Still, make no mistake, there’s a story in a Washington stunned and "blindsided," in an administration visibly toothless and in disarray as well as dismayed over the potential loss of its Egyptian ally, “the keystone of its Middle Eastern policy,” that’s so big it should knock your socks off. And make no mistake: part of the spectacle of the moment lies in watching that other great power of the Cold War era finally head ever so slowly and reluctantly for the exits.

You know the one I’m talking about. In 1991, when the Soviet Union disappeared and the United States found itself the last superpower standing, Washington mistook that for a victory most rare. In the years that followed, in a paroxysm of self-satisfaction and amid clouds of self-congratulation, its leaders would attempt nothing less than to establish a global Pax Americana. Their breathtaking ambitions would leave hubris in the shade.

The results, it's now clear, were no less breathtaking, even if disastrously so. Almost 20 years after the lesser superpower of the Cold War left the world stage, the “victor” is now lurching down the declinist slope, this time as the other defeated power of the Cold War era.

So don’t mark the end of the Cold War in 1991 as our conventional histories do. Mark it in the early days of 2011, and consider the events of this moment a symbolic goodbye-to-all-that for the planet’s “sole superpower.”

Abroads, Near and Far
The proximate cause of Washington’s defeat is a threatened collapse of its imperial position in a region that, ever since President Jimmy Carter proclaimed his Carter Doctrine in 1980, has been considered the crucible of global power, the place where, above all, the Great Game must be played out. Today, “people power” is shaking the “pillars” of the American position in the Middle East, while -- despite the staggering levels of military might the Pentagon still has embedded in the area -- the Obama administration has found itself standing by helplessly in grim confusion.

As a spectacle of imperial power on the decline, we haven’t seen anything like it since 1989 when the Berlin Wall came down. Then, too, people power stunned the world. It swept like lightning across the satellite states of Eastern Europe, those “pillars” of the old Soviet empire, most of which had (as in the Middle East today) seemed quiescent for years.

It was an invigorating time. After all, such moments often don’t come once in a life, let alone twice in 20 years. If you don’t happen to be in Washington, the present moment is proving no less remarkable, unpredictable, and earthshaking than its predecessor.

Make no mistake, either (though you wouldn’t guess it from recent reportage): these two moments of people power are inextricably linked. Think of it this way: as we witness the true denouement of the Cold War, it’s already clear that the "victor" in that titanic struggle, like the Soviet Union before it, mined its own positions and then was forced to watch with shock, awe, and dismay as those mines went off.

Among the most admirable aspects of the Soviet collapse was the decision of its remarkable leader, Mikhail Gorbachev, not to call the Red Army out of its barracks, as previous Soviet leaders had done in East Germany in 1953, Hungary in 1956, and Prague in 1968. Gorbachev’s conscious (and courageous) choice to let the empire collapse rather than employ violence to try to halt the course of events remains historically little short of unique.

Today, after almost two decades of exuberant imperial impunity, Washington finds itself in an uncomfortably unraveling situation. Think of it as a kind of slo-mo Gorbachev moment -- without a Gorbachev in sight.

What we’re dealing with here is, in a sense, the story of two “abroads.” In 1990, in the wake of a disastrous war in Afghanistan, in the midst of a people’s revolt, the Russians lost what they came to call their “near abroad,” the lands from Eastern Europe to Central Asia that had made up the Soviet Empire. The U.S., being the wealthier and stronger of the two Cold War superpowers, had something the Soviets never possessed. Call it a “far abroad.” Now, in the midst of another draining, disastrous Afghan war, in the face of another people’s revolt, a critical part of its far abroad is being shaken to its roots.

In the Middle East, the two pillars of American imperial power and control have long been Egypt and Saudi Arabia -- along, of course, with obdurate Israel and little Jordan. In previous eras, the chosen bulwarks of “stability” and “moderation,” terms much favored in Washington, had been the Shah of Iran in the 1960s and 1970s (and you remember his fate), and Saddam Hussein in the 1980s (and you remember his fate, too). In the larger region the Bush administration liked to call “the Greater Middle East” or “the arc of instability,” another key pillar has been Pakistan, a country now in destabilization mode under the pressure of a disastrous American war in Afghanistan.

And yet, without a Gorbachevian bone in its body, the Obama administration has still been hamstrung. While negotiating madly behind the scenes to retain power and influence in Egypt, it is not likely to call the troops out of the barracks. American military intervention remains essentially inconceivable. Don’t wait for Washington to send paratroopers to the Suez Canal as those fading imperial powers France and England tried to do in 1956. It won’t happen. Washington is too drained by years of war and economic bad times for that.

Facing genuine shock and awe (the people’s version), the Obama administration has been shaken. It has shown itself to be weak, visibly fearful, at a loss for what to do, and always several steps behind developing events. Count on one thing: its officials are already undoubtedly worried about a domestic political future in which the question (never good for Democrats) could be: Who lost the Middle East? In the meantime, their oh-so-solemn, carefully calibrated statements, still in command mode, couched in imperial-speak, and focused on what client states in the Middle East must do, might as well be spoken to the wind. Like the Cheshire Cat’s grin, only the rhetoric of the last decades seems to be left.

The question is: How did this happen? And the answer, in part, is: blame it on the way the Cold War officially ended, the mood of unparalleled hubris in which the United States emerged from it, and the unilateralist path its leaders chose in its wake.

Let’s do a little reviewing.

Second-Wave Unilateralism
When the Soviet Union dissolved, Washington was stunned -- the collapse was unexpected despite all the signs that something monumental was afoot -- and then thrilled. The Cold War was over and we had won. Our mighty adversary had disappeared from the face of the Earth.

It didn’t take long for terms like “sole superpower” and “hyperpower” to crop up, or for dreams of a global Pax Americana to take shape amid talk about how our power and glory would outshine even the Roman and British empires. The conclusion that victory -- as in World War II -- would have its benefits, that the world was now our oyster, led to two waves of American “unilateralism” or go-it-alone-ism that essentially drove the car of state directly toward the nearest cliff and helped prepare the way for the sudden eruption of people power in the Middle East.

The second of those waves began with the fateful post-9/11 decision of George W. Bush, Dick Cheney, Donald Rumsfeld, and company to “drain the global swamp” (as they put it within days of the attacks in New York and Washington). They would, that is, pursue al-Qaeda (and whomever else they decided to label an enemy) by full military means. That included the invasion of Afghanistan and the issuing of a with-us-or-against-us diktat to Pakistan, which reportedly included the threat to bomb that country “back to the Stone Age.”

It also involved a full-scale militarization, Pentagonization, and privatization of American foreign policy, and above all else, the crushing of Iraqi dictator Saddam Hussein and the occupation of his country. All that and more came to be associated with the term “unilateralism,” with the idea that U.S. military power was so overwhelming Washington could simply go it alone in the world with any “coalition of the billing” it might muster and still get exactly what it wanted.

That second wave of unilateralism, now largely relegated to the memory hole of history by the mainstream media, helped pave the way for the upheavals in Tunisia, Egypt, and possibly elsewhere. As a start, from Pakistan to North Africa, the Bush administration’s Global War on Terror, along with its support for thuggish rule in the name of fighting al-Qaeda, helped radicalize the region.

(Remember, for instance, that while Washington was pouring billions of dollars into the American-equipped Egyptian Army and the American-trained Egyptian officer corps, Bush administration officials were delighted to enlist the Mubarak regime as War on Terror warriors, using Egypt’s jails as places to torture terror suspects rendered off any streets anywhere.)

In the process, by sweeping an area from North Africa to the Chinese border that it dubbed the Greater Middle East into that War on Terror, the Bush administration undoubtedly gave the region a new-found sense of unity, a feeling that the fate of its disparate parts was somehow bound together.

In addition, Bush’s top officials, fundamentalists all when it came to U.S. military might and delusional fantasists when it came to what that military could accomplish, had immense power at their command: the power to destroy. They gave that power the snappy label “shock and awe,” and then used it to blow a hole in the heart of the Middle East by invading Iraq. In the process, they put that land, already on the ropes, onto life support.

It’s never really come off. In the wars, civil and guerrilla, set off by the American invasion and occupation, hundreds of thousands of Iraqis undoubtedly died and millions were sent into exile abroad or in their own land. Today, Iraq remains a barely breathing carcass of a nation, unable to deliver something as simple as electricity to its restive people or pump enough oil to pay for the disaster.

At the same time, the Bush administration sat on its hands while Israel had its way, taking Palestinian lands via its settlement policies and blowing its own hole in southern Lebanon with American backing (and weaponry) in the summer of 2006, and a smaller hole of utter devastation through Gaza in 2009. In other words, from Lebanon to Pakistan, the Greater Middle East was destabilized and radicalized.

The acts of Bush’s officials couldn’t have been rasher, or more destructive. They managed, for instance, to turn Afghanistan into the globe’s foremost narco-state, even as they gave new life to the Taliban -- no small miracle for a movement that, in 2001, had lost any vestige of popularity. Most crucial of all, they and the Obama administration after them spread the war irrevocably to populous, nuclear-armed Pakistan.

To their mad plans and projects, you can trace, at least in part, the rise to power of Hezbollah in Lebanon and Hamas in Gaza (the only significant result of Bush’s “democracy agenda,” since Iraq’s elections arrived, despite Bush administration opposition, due to the prestige of Ayatollah Ali Sistani). You can credit them with an Iran-allied Shiite government in Iraq and a resurgent Taliban in Afghanistan, as well as the growth of a version of the Taliban in the Pakistani tribal borderlands. You can also credit them with the disorganization and impoverishment of the region. In summary, when the Bush unilateralists took control of the car of state, they souped it up, armed it to the teeth, and sent it careening off to catastrophe.

How hollow the neocon quip of 2003 now rings: “Everyone wants to go to Baghdad. Real men want to go to Tehran.” But remember as well that, however much the Bush administration accomplished (in a manner of speaking), there was a wave of unilateralism, no less significant, that preceded it.

Our Financial Jihadis
Though we all know this first wave well, we don’t usually think of it as “unilateralist,” or in terms of the Middle East at all, or speak about it in the same breath with the Bush administration and its neocon supporters. I’m talking about the globalists, sometimes called the neoliberals, who were let loose to do their damnedest in the good times of the post-Cold-War Clinton years.

They, too, were dreamy about organizing the planet and about another kind of American power that was never going to end: economic power. (And, of course, they would be called back to power in Washington in the Obama years to run the U.S. economy into the ground yet again.) They believed deeply that we were the economic superpower of the ages, and they were eager to create their own version of a Pax Americana. Intent on homogenizing the world by bringing American economic power to bear on it, their version of shock-and-awe tactics involved calling in institutions like the International Monetary Fund to discipline developing countries into a profitable kind of poverty and misery.

In the end, as they gleefully sliced and diced subprime mortgages, they drove a different kind of hole through the world. They were financial jihadis with their own style of shock-and-awe tactics and they, too, proved deeply destructive, even if in a different way. The irony was that, in the economic meltdown of 2008, they finally took down the global economy they had helped “unify.” And that occurred just as the second wave of unilateralists were facing the endgame of their dreams of global domination. In the process, for instance, Egypt, the most populous of Arab countries, was economically neoliberalized and -- except for a small elite who made out like the bandits they were -- impoverished.

Talk about “creative destruction”! The two waves of American unilateralists nearly took down the planet. They let loose demons of every sort, even as they ensured that the world’s first experience of a sole superpower would prove short indeed. Heap onto the rubble they left behind the global disaster of rising prices for the basics -- food and fuel -- and you have a situation so combustible that no one should have been surprised when a Tunisian match lit it aflame.

That this moment began in the Greater Middle East should be no surprise either. That it might not end there should not be ruled out. This looks like, but may not be, an “Islamic” moment. If the second wave of American unilateralists ensured that this would start as a Middle Eastern phenomenon, conditions for people's-power movements exist elsewhere as well.

The Gates of Hell
Nobody today remembers how, in September 2004, Amr Musa, the head of the Arab League, described the post-invasion Iraqi situation. “The gates of hell,” he said, “are open in Iraq.” This was not the sort of language we were used to hearing in the U.S., no matter what you felt about the war. It read -- and probably still reads -- like an over-the-top metaphor, but it could as easily be taken as a realistic depiction of what happened not just in Iraq, but in the Greater Middle East and, to some extent, in the world.

Our unilateralists twice drove blithely through those gates, imagining that they were the gates to paradise. The results are now clear for all to see.

And don't forget, the gates of hell remain open. Keep your eyes on at least two places, starting with Saudi Arabia, about which practically no one is yet writing, though one of these days its situation could turn out to be shakier than now imagined. Certainly, whoever controls the Saudi stock market thought so, because as the situation grew more tumultuous in Egypt, Saudi stocks took a nosedive.

With Saudi Arabia, you couldn’t get more basic when it comes to U.S. policy or the fate of the planet, given the amount of oil still under its desert sands. And then don’t forget the potentially most frightening country of all, Pakistan, where the final gasp of America’s military unilateralists is still playing itself out as if on a reel of film that just won’t end.

Yes, the Obama administration may squeeze by in the region for a while. Perhaps the Egyptian high command -- half of which seems to have been in Washington at the moment the you-know-what hit the fan in their own country -- will take over and perhaps they will suppress people power again for a period. Who knows?

One thing is clear inside the gates of hell: whatever wild flowers or weeds turn out to be capable of growing in the soil tilled so assiduously by the victors of 1991, Pax Americana proved to be a Pox Americana for the region and the world.

• Tom Engelhardt, co-founder of the American Empire Project, runs the Nation Institute's TomDispatch.com. His latest book is The American Way of War: How Bush’s Wars Became Obama’s (Haymarket Books). You can catch him discussing war American-style and that book in a Timothy MacBain TomCast video by clicking here.

Gutting the Plastic Bag Ban

SUBHEAD: Rapozo to introduce amendment exempting all food service establishments from plastic ban bill.  

By Léo Azambuja on 1 February 2011 in The Garden Island

Image above: Councilman Mel Rapozo (l) and Dickie Chang (r) hard at council business. From TGI article.

More than a year apparently wasn’t enough time to catch an obvious mismatch: Paper and gravy. A new law banning checkout plastic bags went into effect three weeks ago, but many food service establishments are allegedly already complaining of food breaking through paper bags and possible contamination.

 “The brown paper bags were not designed for holding these food items,” said Councilman Dickie Chang, explaining that when food spills from a to-go container, it causes the bags to break. “It can’t even hold an apple.” Ordinance 885 was adopted in October 2009, and went into effect Jan. 11.

Councilman Mel Rapozo said he intends to address some of the issues by introducing an amendment to the bill. “It’s going to exempt the food service establishments from the bill,” Rapozo said.

Rapozo’s main concern was with food-safety issues, despite also pointing out that the bags rip shortly after contact with food. “I received a few calls from some of the restaurants that were concerned about food safety,” said Rapozo, noting that there’s a growing concern that food-borne bacteria could transfer to food from reusable bags that customers bring in. As a result, he said, the establishment could end up with the blame in a case of food-poisoning.

“The purpose really is the food-safety issue, not so much the paper issue. It’s the fact that they are concerned about the transfer of food-borne bacteria,” Rapozo said. Chang said that when the bill was crafted, the food-safety issue never made it to the discussion.

“If people don’t mention stuff to us, we don’t know,” said Chang, adding that he thinks the reason is that most people involved in the food-service industry “did not realize the bill pertained to them, that’s the kind of feedback that I’m getting from the public.” Councilman Tim Bynum, who co-authored the original bill with then-Councilwoman Lani Kawahara, said there were no food-service concerns when the bill was discussed.

 Bynum said he doesn’t feel there’s a need for an amendment, but he is in support of council members introducing amendments whenever they wish. “I’m willing to work into anything,” he said. Rapozo said the amendment will likely be introduced on the agenda of the Feb. 9 County Council meeting.

One Word Ben: Plastics

By Andy Parx on 2 February 2011 in Parx News Network - 

Back before late ’08, when the bottom fell out of the free-for-all, credit-spawned consumerism bubble, it was common to hear people bemoan the gobble, gobble, gobble of your typical overfed, too-much-stuff turkey-American. But in spite of the hope of many that perhaps the crash presaged a new era of right-sized consumption, we’ve gone right back to our old, traditional, grab-it-while-you-can rituals, like Coneheads demanding the restoration of our right to “consume mass quantities.” And when challenged to do the very least we can do- and we mean the very least- we act like whiny, weaned infants demanding the restoration of our endless supply of teat. So it shouldn’t be any surprise that, backed by a wave of sniveling, self-centered assholes, Councilperson Mel Rapozo is trying to make sure the baby has his bottle by reversing the ban on plastic grocery bags that went into effect last month. And make no mistake about it: this would be a reversal. According to Joan Conrow’s post yesterday, the bill would exempt “Food Service Establishment(s)” and defines them as  
any building, vehicle, place, or structure, or any room or division in a building, vehicle, place, or structure where food is prepared, served, or sold for immediate consumption on or in the vicinity of the premises; called for or taken out by customers; or prepared prior to being delivered to another location for consumption. This term includes, but not limited to restaurants; coffee shops; cafeterias; short-order cafes; luncheonettes; taverns; lunchrooms; places which manufacture wholesale or retail sandwiches or salads; institutions, both public and private; food carts; itinerant restaurants; industrial cafeterias; and catering establishments.  
That of course means that every supermarket with a “deli” on the island is exempt, as is any place that “prepares food” even if all you do is serve coffee. This is supposedly about “health issues” and maintaining “sanitary conditions” but it’s anything but. What kind of slobs are these people that they can’t make sure their takeout doesn’t spill all over the place without rewrapping it in a plastic grocery bag?

Don’t forget that those plastic bags that people use to wrap fruits and veges and meats are already exempt. But that’s not enough for your fat, pre-diabetic ass? You obviously need all that greasy, fat-laden gravy on your nutritionless white rice but are you such freakin’ pigs that you can’t get your slop from the store into your pie hole without spilling it all over your morbidly obese lap? All that plastic that the deli wraps your food in isn’t enough for ya, eh?

You need another bag to put it all in, in anticipation of the fact that you’re in such a rush to cram more garbage down your gullet that you can’t get out of the store without spilling it on your $500 designer jeans. Health? If you cared about health you wouldn’t be eating all that processed, pre-prepared crap; you’d go home and cook a real meal. Sanitation?

You mean after you’re careless enough to let your stuff spill out into the bag you’re so intent on getting every last drop into that gaping maw of yours that you’ve gotta lick the bag? Perhaps the most ridiculous aspect of this bellyaching gripe session that’s been going on since the ban is that it comes from people who claim to be nature lovers and even environmental activists.

If you love your plastic grocery bags so much why don’t you go live in the Texas-sized plastic bag gyre out in the middle of the Pacific? Perhaps you should go on a diet of the shearwaters, dolphins and turtles the bags kill. If you can’t live without your nasty plastic grocery bags maybe we should make ones big enough to wrap you in when we bury you in the ground... we wouldn’t want to spill you and make a mess on the way to the cemetery.

Good Evening Ladies and Germs  

By Andy Parx on 3 February 2011 in Parx News Network -  

A characteristic trait of the true babooze is the reluctance to let facts get in the way of a good babble. So now that we’ve dispensed with the preliminaries we can get down to the real idiocy behind Babooze-In-Chief Mel Rapozo’s demagoguing of the plastic grocery bag ban. Because had Rapozo actually tried to find out whether the claims that reusable bags carry pathogens that can cause disease are true, he would have found that they were “just baloney.” At least, that’s according to the respected independent publication Consumer Reports’ “Safety Blog.” Turns out that the media hysteria over bad bugs in reusable bags came from a study conducted with funding from- you’ve probably guessed already- the plastic bag industry. “Which is why,” said the article,
“we’re not so swayed by a recent report about reusable grocery bags and their potential to make you sick. The report came out of the University of Arizona, Tucson and Loma Linda University in California. Smack on page one is this note: “The authors would like to acknowledge and thank the American Chemistry Council for providing funding to support this study.” The American Chemistry Council is the trade group that advocates on behalf of plastic-bag manufacturers. Now why would the folks who make plastic grocery bags want to cast doubts on the safety of reusable grocery bags? Oh, right.”
After pointing out that the study was based on a grand total of 84 bags the article says that:
The researchers tested for pathogenic bacteria Salmonella and Listeria, but didn’t find any, nor did they find strains of E. coli that could make one sick. They only found bacteria that don’t normally cause disease, but do cause disease in people with weakened immune systems. Our food-safety experts were underwhelmed as well. “A person eating an average bag of salad greens gets more exposure to these bacteria than if they had licked the insides of the dirtiest bag from this study,” says Michael Hansen, senior staff scientist at Consumers Union. “These bacteria can be found lots of places, so no need to go overboard.” But Hansen notes that there are some reminders to take away from the study. It’s easy to spread bacteria from meat, fish, or poultry to other foods – in your kitchen or in your grocery bags. So we do think it’s wise to carry those items in disposable bags. Reusable bags are fine for most everything else, but it’s a good idea to wash them occasionally.
And of course the current ordinance on Kaua`i specifically exempts the bags used for vegetables and meats anyway. We’re not suggesting that campaign contributions from places like Safeway Inc., the Kauai Beverage & Ice Cream Co., Ltd., or Randall Francisco, the head of the Chamber of Commerce- which was the only entity that strenuously opposed the bill- influenced Rapozo’s decision to reverse the ban... but they couldn’t have hurt.

The fact is that the “amendment,” as currently written, would allow every single supermarket on the island to go back to those white plastic grocery bags when, first, the original bill already provided for bags for individual items like meats and produce, and second, if people use common sense and wash out their reusable bags when they spill stuff in them, there’s no health or sanitation issue.

Are we a bunch of baboozes who don’t have the smarts to know how to keep our food safe? Well, apparently it takes one to know one.


The Great Global Debt Prison

SUBHEAD: The banking elites haven’t just erected a prison, they’ve tossed us in Alcatraz!  

By Giordano Bruno on 4 February 2011 in Neithercorp Press -

Image above: Sign at historic debtors prison in Accomack County, Virginia. From (http://freedominourtime.blogspot.com/2010/06/amnesty-for-banksters-debtors-prison.html).
Tense and terrible times inevitably summon an odd coupling of two very different and difficult human conditions: honesty and brutality. Certain painful truths are revealed, and often, a palpable fury erupts. Being that times today are particularly tense, and on the verge of being spectacularly terrible, perhaps we should embrace both conditions in a constructive manner, and become brutally honest with ourselves. This begins by admitting to that which most ails us. It begins by admitting how far we have fallen…

Our economy, our culture, our entire world, is built upon debt. No one ever asked us if that’s how we wanted it, it is simply how the system was designed when we came into it. Many of us have lived our entire lives under the assumption that debt is a necessary function of daily commerce and a valuable driver of successful society.

Most households in America operate at a steep loss, trapped in constantly building cycles of liability and interest. There are even widely held schools of economic thought that are centered completely on the production and utilization of nothing but debt. Only recently have many people begun to ask themselves what the tangible benefits are (if any) in being dependent on debt based finance.

After careful examination, it becomes evident that debt does not fuel economy, it suffocates it. It does not nurture growth, it stunts and poisons it. Extreme debt is not a fundamental organ in a body of commerce; it is an aberration, a spreading cancer which disrupts the circulation of healthy trade. Debt is, in large part, unnecessary.

Of course, debt can be very useful if you are the controller or determining overseer of a system, especially if you wish to centralize and maintain power over that system. The tactical wielding of debt has been used by elites for centuries as a means to imprison the masses, or to create an atmosphere of endless dependency. Let’s take a look at what debt really is, and how it is being used against the average American today…

Understanding Debt
The Charles Dickens classic ‘Little Dorrit’ is commonly misinterpreted as a “love story”; however, the primary character in the book is not Little Dorrit, or the kindly Arthur Clennam, but the debt system of Britain itself, and its effects on every social class from the street beggar to the elitist socialite.

Dickens despised the idea of debt and debtors prisons, being that his father was thrown into one for a good portion of his life, forcing young Charles to work just to support his parents. Dickens understood well the evil intent behind the debt system, and railed against it often in his writings.

One figure in ‘Little Dorrit’ which fascinated me was the character of Mr. Merdle, a national banking superstar who dominates the investment world with the help of British treasury officials and various political deviants. Merdle is referred to by merchant circles as “the man of the age”, a financial marvel who seems to make fortunes in every endeavor he touches.

Little does anyone realize that Merdle is a fraud, a Ponzi scheme artist who takes money from unwary speculators and sinks it into increasingly tenuous investments. In order to continue hiding the fact that all his financial ventures are ending in ruin, he lures more and more depositors to pay off previous debts.

The problem is that Merdle is creating debt to chase debt. Eventually, his insolvency, and that of all those who trusted him, will catch up and overtake the lie he has carefully projected. All economic instability is invariably revealed, no matter how expertly it is hidden.
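The arithmetic behind “creating debt to chase debt” is worth spelling out. In a toy model (purely illustrative, not from the novel or any real data), a Ponzi operator who promises a fixed return must raise ever-larger new deposits each period just to honor earlier promises, so the required inflow grows exponentially until it outstrips any pool of new investors:

```python
# Toy Ponzi model: obligations compound at the promised return, and each
# period's promised payout must be covered entirely by new deposits.
def required_inflows(initial_deposit, promised_return, periods):
    """Return the new money needed each period to honor all prior promises."""
    obligations = initial_deposit            # total owed to investors so far
    inflows = []
    for _ in range(periods):
        due = obligations * promised_return  # return promised this period
        inflows.append(due)
        obligations += due                   # unpaid promises keep compounding
    return inflows

# A hypothetical scheme promising 20% per period on an initial $1 million:
needs = required_inflows(1_000_000, 0.20, 10)
# needs[0] is $200,000; within ten periods the scheme must raise more
# new money each period than the entire original deposit.
```

The point of the sketch is only that the required inflow is strictly increasing every period, which is why “all economic instability is invariably revealed, no matter how expertly it is hidden.”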

Mr. Merdle, in my mind, is an almost perfect literary representation of today’s private Federal Reserve and the global banking syndicates of JP Morgan, Goldman Sachs, Citigroup, etc. The Federal Reserve, with the help of politicians on both sides of the aisle, created a series of illusory incentives (through interest rate cuts) which allowed banks to begin lending almost unlimited fiat at rock bottom prices.

America was awash in credit, to the point that it was nearly impossible for the average person to avoid the temptation of borrowing. What we didn’t understand then, but are beginning to grasp now, is that credit derived from fiat is not “capital”, it is NOT wealth. Credit is the creation of an obligation, to be paid at a later date, if it is paid at all, and because there are no rules to tie the debt to any legitimate collateral (at least for banks), there is nothing to back the obligation if it falters. Therefore, fiat induced credit is not the creation of wealth (as Keynesians seem to believe), but the destruction of wealth!

Because of its lack of tangibility, debt can be packaged and repackaged into whatever form banks like. Derivatives are a perfect example of the phantom nature of debt; securities which have no real value whatsoever yet are rated and traded as if they are a solid commodity. This brand of commerce is, at its very root, a kind of fiscal time bomb. Just as in the literary world of ‘Little Dorrit’, the Ponzi scheme in our very literal world had to reach a tipping point, and in 2008, it did.

One glaring difference between our troubles and those of Dickens’ fiction is that Merdle actually feels guilt over what he has done (or he at least fears the justice that will be dealt him), causing him to commit suicide towards the end of the novel. In the real world, the Merdles of our era appear fully content to watch this country crumble due to their intrigues, and rarely suffer any consequences for what they pursue.

In fact, the modern banking elite are more liable to revel in the searing shockwave of a credit detonation, rather than feel any “remorse”. The point is, Dickens saw clearly over 150 years ago what many Americans today still do not; debt is an abstract idea, an absurd game which confuses and ensnares innocent people. Debt based systems con the citizenry into trading away their tangible wealth and labor for the promise of future settlements that will never come. Debt serves only to weaken the masses, and empower creditors.

The Consequences Of Debt
How has debt based economics served us so far?

The credit card debt of the average American household ranges from $8,000 to $15,000. Total household debt, including mortgage and home equity loans, has hit an average of 136% of annual household income.

Approximately 80% of mortgage loans issued to subprime borrowers over the past decade were Adjustable Rate Mortgages (ARMs), meaning the vast majority of those subprime mortgages have reset or are about to reset at much higher interest rates. There were approximately 1.4 million bankruptcy filings in 2009, and 1.5 million in 2010. One in every 45 homes in America received a foreclosure filing in 2010.
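To see why ARM resets hit borrowers so hard, consider the standard fixed-payment amortization formula, M = P·r / (1 − (1+r)^−n). The loan amount and rates below are hypothetical, chosen only to illustrate the scale of a reset, not taken from the statistics above:

```python
def monthly_payment(principal, annual_rate, months):
    """Standard amortized loan payment: M = P*r / (1 - (1+r)**-n)."""
    r = annual_rate / 12                     # monthly interest rate
    return principal * r / (1 - (1 + r) ** -months)

# Hypothetical $200,000 30-year mortgage: a 4% teaser rate resetting to 7%
before = monthly_payment(200_000, 0.04, 360)   # ~$955 per month
after = monthly_payment(200_000, 0.07, 360)    # ~$1,331 per month
```

A three-point reset raises the payment by roughly 40%, which is the kind of jump that pushes stretched households into delinquency and foreclosure.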

Keep in mind that in 2005, new government regulations were implemented making filing for bankruptcy much more difficult. In 2006, filings collapsed. Now, despite stringent obstacles, filings are up again over 100%.

The “official” national debt now stands at over $14 trillion, which is around 100% of U.S. GDP (with entitlement programs like social security included, this number is probably closer to 400% of GDP). The 100% mark is often cited as the breaking point for most countries struggling to sustain liabilities. Greece’s national debt stood at 108%–113% of GDP when it collapsed into austerity.

From 2004 to 2010 (a span of only six years), our national debt doubled. To put this in perspective, it took the U.S. over 200 years to reach its first trillion dollars of debt. Now, we are looking at the accumulation of at least a trillion every year. This is unsustainable.
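The doubling claim implies a steep compound growth rate. Doubling in six years requires annual growth of 2^(1/6) − 1, roughly 12% per year; the familiar rule-of-72 shortcut (72 / 6 = 12) gives the same ballpark. As a quick check of the arithmetic:

```python
# Implied compound annual growth rate for debt that doubles in 6 years
years = 6
implied_rate = 2 ** (1 / years) - 1   # ~0.122, i.e. about 12.2% per year

# Rule-of-72 approximation for comparison
approx_rate = 72 / years / 100        # 0.12
```

No economy growing at 2–3% a year can indefinitely service a liability compounding at four to six times that pace, which is the sense in which the trajectory is unsustainable.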

The much talked about debt ceiling has been raised six times in the past three years. This frequency is unprecedented. International ratings agencies are now openly suggesting an end to America’s AAA credit rating.

A credit rating downgrade would be devastating to what little foreign interest is left in U.S. Treasury bonds.

On the local front, cities and states are on the verge of folding due to the evaporation of municipal bond markets. Cities depend greatly on two sources of revenue in order to continue operations; property taxes, and municipal investment. Property taxes, obviously, are disappearing as property values continue to spiral downwards. This leaves only municipals, which have also unfortunately fallen off the map:

Wall Street analyst Meredith Whitney recently stated in an interview with 60 Minutes that she believed 50 to 100 American cities would default in the midst of a municipal crisis in 2011. She was promptly lambasted by the rest of the MSM for her prediction. In my opinion, she was rather minimalist in her estimates, especially if the Federal Reserve does not commit to another round of quantitative easing (QE3) for the states (Bernanke denies this policy would be enacted by the Fed, though, which means there is a good chance it will be).

To summarize, the U.S. is swimming in debt. Absolutely nothing has been changed for the better in terms of wealth destruction and liabilities since the credit crisis began, and the situation only looks more precarious with each passing quarter.

Where Is The Debt Roller Coaster Taking Us?
What is the most likely outcome of the conditions described above? The vital factor will be the continued Federal Reserve policy of fiat bailouts as a “counterbalance” to the evolving debt crisis.
As is clearly explored in the Dickens novel we discussed earlier, staving off the effects of debt by creating more debt is a temporary solution that only leads to greater calamity down the road. Anyone who believes that fiat inflation actually “cancels out” debt instability is going to find themselves sorely disappointed.

At bottom, government created stimulus is not a solution to corporate engineered debt burdens, but a reallocation of debt away from banks and into the laps of the American taxpayer. The Federal Reserve and our own Treasury have not paid off anything. They merely shifted the responsibility of payment away from the banks that created the problem, and handed that responsibility to us. On top of this, they have also set the dollar up for a crushing blow of devaluation. Here is where the prison bars enclose…

If our historic debt is not being diminished, but only moved around while it expands, then this means that eventually our credit worthiness will come into question. In fact, it already has. Foreign investment in long-term Treasuries has dwindled. Our own central bank is now the largest holder of U.S. debt, surpassing even China (Note: this news has so far been ignored by almost all mainstream outlets).


So, the question of debt default turns from theoretical to quite imperative. If the Federal Reserve continues buying our debt with fiat, it means that the effects of the debt will only be delayed, the dollar will be dropped as the world reserve currency, and hyperinflation is a certainty. If they do not continue buying, then our government defaults, the country’s financial infrastructure ceases to exist, the dollar loses its world reserve status, and hyperinflation is a certainty. The banking elites haven’t just erected a prison, they’ve tossed us in Alcatraz!

The battle over yet another increase of the debt ceiling has obscured the fact that the debt has already done all the damage it needs to do. Freezing the ceiling in place becomes a battle of principle, and an important one, but it would in no way stop the dysfunction and chaos to come. At best, it might shorten the duration of the disaster by a few years.

The important thing to remember is that government intervention will only incur greater loss. There is no easy way out, no magic shortcut, no last minute brilliant idea that will wrap up this mess. Years of hard work, determination, honesty, and sacrifice are ahead of us.

Inflation will be the buzzword of 2011. Endless debt facilitates endless Keynesian liquidity. Expect to see commodities double once again this year.

Household debt will probably level off through 2011, as more Americans abandon their credit habits and make more concerted efforts to save. In 2009, Visa lost 11% of its credit use, while MasterCard lost 22%. Over 8 million consumers have stopped using credit cards altogether since the end of 2009.

Bank lending is still tight as creditors raise the requirements necessary to receive FHA (Federal Housing Administration) mortgages. Will credit use and debt based consumption ever return to levels similar to 2006? Not a chance. One might predict then that savings will rise dramatically as credit use falls, but this too is unlikely. Why? Because over the next year Americans will be spending far more on essential goods due to inflation than they ever have before.

Whatever savings they would have accrued will be eaten up by the relentless spike in commodity prices. The term used for the combination of chronic debt, low job growth, and burgeoning inflation, is “stagflation”.
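The squeeze on savers under stagflation is easy to quantify: the inflation-adjusted (Fisher) return on savings is (1 + nominal) / (1 + inflation) − 1. The rates below are hypothetical, picked only to illustrate the mechanism:

```python
def real_return(nominal, inflation):
    """Inflation-adjusted (Fisher) return on savings."""
    return (1 + nominal) / (1 + inflation) - 1

# Hypothetical: 1% savings interest during 8% price inflation
r = real_return(0.01, 0.08)   # ~ -6.5% per year in purchasing power
```

In other words, even a household that dutifully saves loses several percent of its purchasing power each year whenever inflation outruns interest rates, which is exactly the trap described above.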

I honestly can’t think of a worse situation than being subject to exploding costs in light of a dilapidated standard of living. As Dickens points out plainly in ‘Little Dorrit’, how can a man be expected to settle his obligations when he is imprisoned for them?

Breaking The Cycle In The Midst Of Global Strife
Why after thirty years under the despotic rule of the Hosni Mubarak regime did the Egyptian people suddenly decide to revolt? Why now? The MSM will field a number of political tales, but the key to most popular uprisings, especially in the Middle East, has been the lack of necessities.

The last time Egypt saw an uprising of this magnitude was during the Bread Riots of 1977, when the IMF terminated state subsidies of basic foodstuffs. Is it any wonder that turmoil has developed so quickly in the region as grain prices double? This is the devastating power of debt, and the so called “solutions” which merely perpetuate debt.

Tunisia, Egypt, and Yemen are only the beginning. The sting of inflation will be unbearable as austerity measures take hold in Europe, and the potential for riots in Greece, Spain, Portugal, and Italy looms large. The most volatile environment on the planet to date, however, is the United States, which, as we have shown in previous articles, is being dismantled deliberately and viciously in preparation for IMF regulation and centralization. Today, the IMF is stalking Egypt, ready to pounce as the nation goes mad.

Tomorrow, it will be us. I will be very surprised if we are not hearing about IMF intervention in the U.S. economy and the dollar by the end of this year, offering more debt, and more unaccountable governance.

The secret to breaking the circle of debt is to adopt a policy of decentralization and self-sufficiency: to take back control of our local commerce and to establish micro-economies with self-contained methods of trade. Debt must be removed from the equation altogether, and systems protected by flexibility and redundancy must be applied.

Savings and meaningful production would have to take the place of endless spending and outsourcing. The claustrophobic nurse-maid philosophies of globalism would have to be cast aside and replaced with goals of independence and self reliance. By cutting our dependency on the corrupt establishment, we sever its ability to feed off of us. By building a better system, we make the faulty one obsolete. Whether or not we throw off the trappings of the debt machine is entirely up to us.

Two very important steps are required; the realization that debt is not the only way, and, the realization that debt is the worst way. Prosperity is not achieved at the expense of the future. The society that finally takes this fact to heart will accomplish incredible things indeed…

The Economics of Happiness

SUBHEAD: It’s a comfort to watch a film that presents the sentiment that a future with less oil is preferable to a future with lots of oil.

By Jennifer Prediger on 4 February 2011 for Grist - (http://www.grist.org/article/2011-02-04-a-new-documentary-about-the-real-wealth-of-nations-)

Image above: Image promoting a course on "The Economics of Happiness". From (http://www.treehugger.com/files/2009/11/economics-of-happiness-course-schumacher-college.php).

What if GDP stood for Great Domestic Pleasantness? How about an economy whose success is not determined by growth for growth’s sake? A new documentary, The Economics of Happiness, explores this rich territory. The film makes a connection between the economic crisis, the environment, and a “crisis of the human spirit”—the reality that even as our material wealth has increased, we have not gotten happier. According to a study cited by author and 350.org activist (and Grist Board member) Bill McKibben, people in the United States have actually become less happy since the 1950s.

The Economics of Happiness makes a well-reasoned case that the “consumer culture” we’re living in has broken down community and our connection to nature. The film also takes a look at the negative impacts of corporate globalization, arguing that the process focuses on profits rather than people. Ladakh, a region in northern India known as “Little Tibet,” serves as a case study in the ways globalization and industrialization are damaging cultures, livelihoods, and human connections. Once a place with zero unemployment, ample leisure time, natural resources, and a sense of general well-being, Ladakh has changed. The introduction of Western culture and values has created a sense of relative impoverishment. The introduction of subsidized food, fuel, and roads has undermined the local economy.

So what’s the solution? The filmmakers, Helena Norberg-Hodge, Steven Gorelick, and John Page, focus on systemic economic transformation. 
They show examples of initiatives around the globe where people are “rebuilding more democratic, human scale, ecological and local economies—the foundation of an ‘economics of happiness’.” Interviews with Bill McKibben, Vandana Shiva, Juliet Schor, and Samdhong Rinpoche—the prime minister of Tibet’s government in exile—make for a thought-provoking re-contextualization of globalization and the potential that lies in supporting local banking, food production, and commerce.

It’s a comfort to watch a film that presents the sentiment that a future with less oil is preferable to a future with lots of oil.

Video above: Trailer for "The Economics of Happiness". From (http://www.youtube.com/watch?feature=player_embedded&v=VkdnFYDbiBE).

See also: Ea O Ka Aina: Bhutan's Wealth of Happiness 9/7/10


Two Roads Diverged in a Wood

SUBHEAD: Industrial society has simultaneously chosen both the path of environmental collapse and the path of economic collapse.

By Guy McPherson on 3 February 2011 in Transition Voice - (http://transitionvoice.com/2011/02/two-roads-diverged)

Image above: A fork in the road. From (http://www.midlifesatrip.com/wp-content/uploads/2009/09/fork-in-the-road-3.jpg).

At this late juncture in the era of industry, it seems safe to assume we face one of two futures. If we continue to burn fossil fuels, we face imminent environmental collapse. If we cease burning fossil fuels, the industrial economy will collapse.

Industrial society frames these futures as a choice: your money or your life.

It tells you that without money life isn’t worth living. As should be clear by now, industrial society — or at least our industrial “leaders” — have not chosen door number one, environmental collapse, nor door number two, economic collapse. They have chosen both. At the same time!

Compassionately selfish

If you believe your life depends upon water coming out of the taps and food showing up at the grocery store, you’ll defend to the death the system that keeps water coming out of the taps and food showing up at the grocery store. But news flash: If you think your life depends on that system, you’re a very unusual person, especially historically. And you support an unusual culture marked by overwhelming collateral damage to simultaneously existing non-industrial cultures and non-human species.

And you’re sorely mistaken, besides.

The problem is environmental overshoot, as a handful of ecologists have been saying for decades, echoing Malthus. We’ve far exceeded the human carrying capacity of the planet. As a result, we threaten most of the species on Earth, including our own, with extinction by the end of this century. Currently, there’s not nearly enough food to feed every human on the planet, even at the expense of nearly every non-human species. Actually, tens of thousands of people have been starving to death every day for a few decades, but they’ve been beyond our imperial television screens. And now even industrialized societies are falling victim to escalating food prices and shocking food shortages.

A toxic brew

The root cause of the problem is complex, but it can be reduced to a few primary factors: agriculture and industrialization, the epitome of every civilization in the last thousand years, and their contribution to human population growth.

The genus Homo persisted on the planet for some 2 million years, and our own species has been around for at least 250,000 years, without exceeding carrying capacity. We actually lived without posing a threat to the persistence of other species. During those years — two million of them, in fact — humans had abundant spare time for socializing and art, and spent only a few hours each week hunting, gathering, and otherwise preparing to feed themselves (i.e., “working”).

Contrast those conditions with people today and how much time we spend working (and rarely enjoying that work, if talk around the water cooler is any indication).

Agriculture leads to food storage, which leads to empire, which produces slavery, oppression, and mass murder (all of which were essentially absent for the first couple million years of the human experience). Lives were relatively short, but happy by every measure we can find. In short, without agriculture there’s no environmental overshoot. The human population explosion is effect, not cause.

The industrial revolution exacerbated the problem to such an extent that we’ll never be able to recover without historic human suffering. We’re only beginning to witness the impacts of reduced energy supplies on the industrial economy, and on this trajectory, barring some change, we’ll be squarely back in the Stone Age, fully unprepared, within two decades at most.

At this point, our commitment to western culture (i.e., civilization) is so great that any attempt to power down will result in suffering and death of millions, probably billions. Nonetheless, it’s tragically the only way to allow our own species, and millions of others, to persist beyond century’s end and squeeze through the global-change bottleneck resulting from industrialization.

Every day in overshoot is another day to be reckoned with later, and therefore another few thousand people who must live and die in Hobbesian fashion.

There are no decent solutions.

A date with destiny

A collapse in the world’s industrial economy is producing the expected results, finally. Sadly, it’s too late to save thousands of species we’ve sent into the abyss. But perhaps there’s barely time to save a few remaining species, including our own.

If you care about other species and cultures, or even the continued persistence of our own species, then an impressive body of evidence suggests you support our imminent transition to the post-industrial Stone Age. Or whatever it looks like. Such a trip saves the maximum number of human lives, over the long term.

When you realize the (eco)systems in the real world actually produce your food and water, you’ll defend to the death the systems that produce your food and water. I’m in that camp. How about you?

What do you support? The industrial culture of death, which sanctions murderous actions every day? Or the culture of life?


You Must Watch the Empire Bowl

SUBHEAD: Our last super thing is the bread and circuses of this dying empire. Hail Superbowl XLV! By Robert Lipsyte on 30 January 2011 in TomDispatch - (http://www.tomdispatch.com/post/175348/tomgram%3A_robert_lipsyte%2C_the_empire_bowl_is_super%21/#more) Image above: Frame from TV animation of the Superbowl XLV logo. From (http://www.treehugger.com/files/2011/02/nfl-offset-15000-megawatss-super-bowl-xlv.php).

If you are still passionately following football or, worse, allowing your kid to play, you may just be an old-fashioned imperialist running dog. Not that all football fans are bloodthirsty hounds feeding off the crippled hindquarters of the dying animal of empire. Some are in a vain search for a crucible of manhood that no longer exists. Others are in pursuit of a ticket out of a dead-end life.

Whatever your reason, this is the Super Bowl to watch, even if you are among those who have made an effort to disregard the game since high school jocks shouldered you in the halls.

This is the Big One. Maybe the Last Big One. Never before have so many loose strands of an unraveling empire come together in a single event accessible to those who mourn or cheer America.

Let’s start with the conceit that this game is the only super thing we have left. Super power, super economy, super you-name-it… gone. You can beat the Bushes for that, but we’re all out of super -- except for the Super Bowl. That celebration of an all-American $9 billion industry (estimated because the National Football League has never opened its books), not to mention millions more in subsidiary and dependent businesses, offers us a national holiday that has arguably superseded Thanksgiving (thanks for what?) and Christmas (electronic excess and obsolescence).

Even little Everytrader has a shot here. Without insider connections, you undoubtedly have a far better shot at winning a football wager than gambling in the stock market.

The Big Four

Here are the four biggest reasons to watch this Super Bowl.

1. It’s Not Soccer

American exceptionalism is alive and thriving on Super Bowl Sunday. National Football League franchises are overwhelmingly owned, managed, and manned by American citizens. Neither immigration nor foreign capital has made a perceptible dent in the game. And you and I have proudly subsidized all this. American taxpayers have built many NFL stadiums. Most American universities, with their government grants, have sports schools attached; those multi-million-dollar athletic departments (despite claims, they are rarely profitable) train the players and one of academia’s latest revenue-producing innovations -- sports management departments -- train the front-office personnel.

American football is barely played outside the country. Call it a failure of colonialism (as baseball and basketball might), but it’s really a tribute to good old-fashioned protectionism. Those other major sports, even ice hockey, are increasingly being taken over by Latin American, Asian, or Eastern European guest workers. Pro football remains a native game.

The “futbol” that most of the rest of the world plays is a game that American male athletes and sports fans have never found compelling. Why? What’s not to like? The so-called “beautiful game” is exactly that, and the past several generations of American school-age girls and boys were lucky to have recreational soccer programs. But there was no room on the sports “shelf” for a game so poorly suited to commercial TV interruption and American domination.

(It’s not as if soccer is in any way effete. Its fans are famously thuggish. In fact, currently, the nationalistic Russian mobs who roam cities beating up people who do not look Slavic have taken to calling themselves “Soccer fans.”)

2. No Dogs Were Harmed in the Making of It

The controversy over allowing Michael Vick back into the select company of other NFL felons -- reportedly about one-fifth of the playing population -- faded after the Philadelphia Eagles quarterback showed contrition, spoke to schoolchildren, proved to be one of the most electrifying performers in the game, and then lost early in the play-offs, avoiding the embarrassment of PETA demonstrating at the Super Bowl.

At 30, Vick was clearly better than he had been before his 21-month imprisonment. He had added a previously missing work ethic and level of concentration. One wonders if the sharpening of Vick’s focus had to do with losing what might have been his primary outlet for sadism and violence: the brutal world of training fighting dogs and then killing the losers in often unspeakably cruel ways.

There is no question that violence stirs fan blood. Football players know this; they have been remarkably hostile to attempts to soften the mayhem, especially those ringing helmet-to-helmet shots, an offspring of the modern technique learned in PeeWee leagues of “putting a hat on him” (which means tackling headfirst rather than the more traditional style of wrapping one’s arms around the ball carrier’s legs and dragging him down).

Most pro football players seem to be on the side of the hats. A more careful game won’t be football anymore, they say. It won’t be the American game -- even for some of the doctors watching who treat the “epidemic of concussions blazing through schoolboy football.”

3. But No Chicks

The title of Mariah Burton Nelson’s 1994 book, The Stronger Women Get, The More Men Love Football, seems ever more prescient. The so-called feminization of America (really the slow movement toward equality) is reflected in most sports, many boardrooms, and the military. Resistance is stiff, from human resources violations to rape. Conservatives keen over the suffering of the average male. It’s tough when you suddenly have to compete against an expanding talent pool that includes women who are better than you. Mr. Average Mediocre can no longer count on his members-only credential to keep him in the game. Unless, of course, the game is football.

Football is the last estrogen-free zone. No wonder high school and college teams have such bloated rosters. (College teams routinely “dress” 85 men, compared to a pro team’s 53.) This gives more boys the chance to imagine themselves in the testosterone club, even if many of them hardly ever get into a game. Later, as jock alums, they will donate to alma mater and speak reverently of how old coach taught them to be men -- or at least not women.

Yes, there are girls playing in some youth and high school games, even in college, mostly as kickers. But the freakishness of it is still the story. The NFL is so relentlessly misogynistic that off-field incidents like those involving Brett Favre when he was a Jet and Super Bowl-bound Pittsburgh quarterback Ben Roethlisberger tend to be dismissed as boys-will-be-boys antics. Unfortunately, there’s a certain logic to this: since they began playing the game, they’ve been told they can be real men, not girls, not sissies -- if they submit to Coach, play hard, and play in pain. In return, their perks and entitlements will be those of conquering warriors.

4. The Faux Volunteer Army

If football really is the bread and circuses of this dying empire, the injuries suffered by the gladiators (disproportionately African-American) make the game more real, more urgent. And their willingness to take the risks absolves us from blame. After all, they volunteered. They really want to play this game, the media reminds us. These aggressive, competitive men have an intrinsic need to prove themselves to themselves, each other, and us. And where else, the media asks us, would they make so much money and find so much acclaim?

At Goldman Sachs? The Mayo Clinic? Skadden, Arps? No, no, these sturdy lads are often from the underclass and they have leveraged their skill and dedication into some college studies and a job in football. That many of these gladiators, clearly smart enough to absorb complicated game plans, feel that football is their only shot seems to be an indictment of American opportunity. What about all those high school and college football players who put all their chips in their hat and still didn’t make it to the pros?

Maybe some of them joined the National Guard.

It’s here, of course, that the entire metaphor may go offsides for you. Or at least become uncomfortable. Football -- Army? Gladiators -- mercenaries? What about all the strong young men and, increasingly, women who feel that their only shot at getting an education and a meaningful life is joining the military during wartime?

The author and journalist Richard Reeves made the connection neatly when he wrote:

“We have a volunteer army, the National Football League with guns, and we are the spectators.”

As spectators we rarely see the young people die in either volunteer legion. Restrictions during the Bush years on journalists filming combat deaths or even showing returning caskets kept the wars in Iraq and Afghanistan at a comfortable remove until they became distant and routine. Old news. Maybe even a little boring for people without loved ones on active duty.

On NFL broadcasts, players with broken bones and torn tissues are quickly carted off lest their teammates lose heart. For those of us watching on TV, the collisions seem almost like cartoon hits. How can those players just pop back up? Is it the pride, the adrenaline, that allows them to pretend they are made of steel? Of course, the real damage, the dementia brought on by head trauma, is years, even decades, away.

It’s hard to believe how recently the concussion discussion began in earnest, as if players hadn’t been hit in the head for more than a century. It was launched several years ago by the revelation that former pro football players were being diagnosed with dementia, and even dying from suspected long-term brain trauma, at disproportionate rates for their age. It was helped along by a number of workers’ compensation cases and the superb reporting of Alan Schwarz of the New York Times.

The concussion discussion has replaced steroids as the NFL health topic, although the issues are joined: larger players seem to be at greater risk for early death, and bulking up via steroids probably contributes to harder hits. The discussion has also raised the question of whether parents should allow their children to play the game -- years of small, unreported traumas to the head can’t be good for developing brains. It even occasioned a rare but telling ESPN column on abolition.

Lest you consider this enough piling on the all-American game, labor troubles loom with a lock-out possible in March. Because the main issue is money -- the teams want to share less revenue (currently 60%) with the players -- the media tends to characterize the conflict as “billionaires versus millionaires.” Actually, most owners are rich from other businesses and would not have been allowed into the NFL unless they were financially secure, while few players survive more than about three years in the league. The owners also want to increase production (adding two games to the regular season) without taking more responsibility for health-care costs.

If any of this sounds depressingly like real life, how could you not watch what might be the last Super Bowl, the endgame of empire, the two-minute warning before America finally beats itself?