Brazil Reduces Deforestation

SUBHEAD: The rate of deforestation has been reduced by over two-thirds from a decade ago.

By Doug Boucher 6 July 2014 for Solutions -
(http://www.thesolutionsjournal.com/node/237165)


Image above: A cattle herd wandering in the heart of Mato Grosso, Brazil. The 2009 beef moratorium created a supply chain through which ranchers were pressured to end deforestation. From (http://news.mongabay.com/2009/0909-amazon-cattle-ranching.html).

During the second half of the twentieth century, the deforestation of the tropics became a global concern. Young people everywhere learned at an early age that “saving the rainforest” was one of the most urgent needs of the planet. Yet, for decades, these worries had no real effect on the reality of tropical deforestation.

The lack of progress until the start of the twenty-first century is evident in this graph. Figure 1 shows global greenhouse gas emissions from land use change—which for the past half-century (since 1961) have come almost totally from tropical deforestation. Although there was a fair amount of year-to-year variation up to the year 2000, the trend is absolutely flat, with no overall decrease in deforestation despite global concern and efforts.

Then, in the early part of the twenty-first century, there was a dramatic change. The curve drops sharply, with emissions from land use change falling by over a third in barely a decade. After decades of fruitless efforts, there were clear signs of a major success within a few years.

It turns out that most of the decrease in deforestation has been in the Amazon, and mostly in one country—Brazil. Amazonia is the largest tropical forest in the world with about five million square kilometers.1,2 Furthermore, about 80 percent of the basin’s forest remains basically intact. Brazil contains about 60 percent of the entire Amazon forest,1 and at the peak of its deforestation in 2004, it was not only the largest tropical forest country, but also the leader in deforestation worldwide.

In this article, I describe the events and causes that underlie the rapid reduction in deforestation in Brazil. In a few short years, a large—indeed, historic—change has occurred. If Brazil’s success can be duplicated elsewhere, most of us living today could witness the end of thousands of years of deforestation in our lifetimes.

The annual data compiled from satellite imagery by Brazil’s National Institute of Space Research (INPE) clearly show the downward trend of deforestation in the Brazilian Amazon over the past nine years (2004 to 2013). There was substantial variation from year to year through the 1990s and early 2000s, partly related to changes in the economic factors driving deforestation (e.g., recessions, prices for commodities such as beef and soybeans, and the exchange rate of the Brazilian real relative to the dollar and the euro).4

Variation in weather from year to year also had an impact, since most deforestation occurs during the southern hemisphere dry season (roughly June to October), when forests can be most easily cut down and burned. But until 2004, there was no up or down trend visible at all beyond the annual fluctuations.

Since then, however, there has been enormous progress. The rate of deforestation has been reduced by over two-thirds from its average level in the decade from 1996 to 2005 (the period that Brazil uses as its baseline), and by nearly three-fourths from its high point in 2004. How has this been accomplished?

A New Political Dynamic

This progress reflects the growth of the environmental and social movements in Brazil in the last two decades, including the rise to power of the new Workers’ Party (PT) and its leader, Luiz Inácio Lula da Silva (known to Brazilians simply as “Lula”), who was elected President in 2002 on his fourth try. Lula’s government and its progress in reducing deforestation came out of a long history of engagement with social movements.

Based in the trade union and landless peasants’ movements, but also having ties to forest peoples’ organizations such as those of indigenous peoples and the rubber tappers union, the PT provided a model for a broad-based coalition that focused on social, economic, and environmental transformation—rather than just on taking power. The PT had significant experience in pressuring both businesses and governments, including Lula’s own government, after he was elected.

At least as important as Lula was Marina Silva, his first Minister of the Environment. Her activism aimed at curtailing the rate of forest clearing went back to her early experience in the Amazon state of Acre, working with Chico Mendes to organize the rubber tappers union and the state branch of the PT. As Minister, she was responsible for implementing the government’s actions to reduce deforestation, which often brought her into conflict with other Ministries’ plans for development and economic growth.

Initially, the policies of Lula’s government were aimed at achieving broad-based social and economic development, particularly for urban workers and the peasants and landless laborers in the rural sector. In the six years after Lula’s 2002 election—through social programs such as Fome Zero (Zero Hunger) and Bolsa Familia (Family Allowances)—Brazil reduced its poverty rate from over 34 percent to less than 23 percent, while 29 million citizens rose into the middle class.5 Hunger and malnutrition rates dropped substantially, and important advances were made in reducing economic inequality.6,7

During the early years of Lula’s administration, actions aimed at reducing deforestation emphasized the creation of protected areas and recognition of indigenous lands, as well as enforcement actions against illegal logging.8
 
But equally important was the change in the political dynamic. For decades, the issue of deforestation had been seen in Brazil in terms of national sovereignty—as something raised by foreign NGOs pressuring to save Amazon forests even at the cost of Brazilians’ right to economic development. It was often pointed out that those concerned with “saving the rainforest” came from countries that had themselves become rich by exploiting their own natural resources, including destroying most of their forests.

Yet, now they were lecturing tropical countries to avoid following this same course for the sake of the planet. Foreigners’ claims to speak for the natural world were counter-posed to Brazil’s right to decide how to use its own land.

This changed under the Lula government. Deforestation was recast as the wasteful exploitation of resources that rightfully belonged to all Brazilians—particularly to forest peoples such as indigenous groups and the rubber tappers—by powerful forces such as expanding soybean farmers and cattle ranchers. Initially separate, organizations representing forest peoples and urban environmentalists began working together, and joined in 2008 to found the Zero Deforestation campaign.

This movement—composed of a broad coalition of non-governmental organizations including environmental, indigenous, rubber-tapper, labor, human rights, and other groups—exerted strong pressure on the federal government. Although allied with international NGOs such as Greenpeace, Friends of the Earth, and World Wildlife Fund, they were fundamentally Brazilian in origin and in their sources of political power.

NGOs were to become the key actors in the widely publicized 2006 and 2009 exposés of the role that the soybean and beef industries had played in deforesting the Amazon, and in negotiating deforestation moratoria with those industries. The Zero Deforestation campaign proposed what became the Amazon Fund and its management by the Brazilian national development bank BNDES (Banco Nacional do Desenvolvimento), and members of the campaign now participate in the Fund’s steering committee as important stakeholders.

Marina Silva, as Minister of the Environment, led the effort to reduce deforestation from within the government, but was also willing to leave that government and join the social movement when it was necessary for the struggle against deforestation.

After several years in office, she resigned from Lula’s cabinet in protest against the inadequate pace of action on deforestation, and became the Green Party’s candidate for President in the race to succeed Lula in 2010. Quite unexpectedly, she won nearly 20 percent of the vote in the first round,9 showing the strength of the popular commitment to ending deforestation and exerting pressure on the PT’s candidate (Dilma Rousseff, Lula’s former Energy Minister and then Chief of Staff) to commit to continuing the progress that had been achieved. Indeed, Lula himself agreed before leaving office to move up the deadline for Brazil’s target of an 80 percent reduction in deforestation rates, from 2020 to 2016.

The Zero Deforestation Campaign, going further, pushed for an end to deforestation by 2015. Rousseff’s partial veto in 2012 of the amendments to the Forest Code that would have given amnesty for past illegal deforestation reflected the new political dynamic that emerged at least in part through Silva’s electoral success.

Many Actors, Many Solutions

The reduction in deforestation would not have happened without the new political dynamic, but it still required the work of many actors, including governments (both at the federal and state levels, and those of other countries such as Norway, Germany, and the U.K.), businesses, and NGOs.

Legal steps—including the enforcement of existing forest laws and prosecutions of actors in the soybean and beef supply chains who distributed products produced through deforestation—also played an important role. These prosecutions worked together with voluntary business commitments such as the soybean and beef moratoria, which were enforced using sophisticated satellite imagery.

Law Enforcement and the Soy and Beef Moratoria

The initial government actions, setting up reserves and increasing enforcement of environmental laws, were part of the historic Plan for the Prevention and Combating of Deforestation in the Amazon (PPCDAM), instituted by the Lula government in 2004.

Although not originally motivated by climate change, over time the PPCDAM grew and was transformed, and its efforts are now a key element of the National Climate Change Plan. Since emissions from deforestation represented the majority of Brazil’s global warming pollution in the 1990s, making actions against deforestation part of the Climate Plan was a logical step, but it also made it possible to connect the country’s actions with an emerging global concern.

These high-level actions were complemented by on-the-ground steps to strengthen enforcement of existing laws, for example against illegal logging. The data made available by the national space agency INPE on a monthly basis have made it possible to crack down quickly in areas of new deforestation identified through satellite monitoring programs.1

Steps taken include the closing of illegal sawmills and jailing of the perpetrators, including government officials who had been taking bribes to ignore illegal deforestation. Although such enforcement campaigns are often episodic and occur in response to media coverage—which in turn is often generated by new monthly data on deforestation or burning—they have had a cumulative effect of making deforestation a risky activity rather than accepted business as usual.

The 2006 release of Greenpeace’s report, Eating Up the Amazon, proved to be a key step in scaling up pressure.10 The report linked the soybean industry to deforestation, global warming, water pollution, and even the use of slave labor to clear land. It focused particularly on two multi-national companies: the giant grain trader and exporter Cargill and the world’s largest fast food chain, McDonald’s.

Action came within weeks. The two associations that together included nearly all soybean processors and exporters in Brazil—the Brazilian Association of Vegetable Oil Industries (ABIOVE) and the National Association of Cereal Exporters (ANEC)—announced a moratorium on deforestation. Their members would not buy any soybeans produced on Amazon farmland deforested after June 24, 2006. This soy moratorium was followed by a similar one involving beef in 2009, likewise provoked by two hard-hitting NGO reports, Amigos da Terra Amazônia Brasileira’s Time to Pay the Bill and Greenpeace’s Slaughtering the Amazon.11,12

The actions of the independent federal public prosecutors, particularly in the key states of Pará and Mato Grosso, have been an important link between these voluntary business actions and government enforcement.13 The moratorium commitments by exporters, soybean processors, slaughterhouses, and supermarkets that they would buy only deforestation-free soy and beef have been buttressed by strong threats from public prosecutors in those two states.

First in Pará and then in Mato Grosso, slaughterhouses have signed agreements under which ranchers were required to provide the GPS coordinates of their property boundaries to the slaughterhouses in order to sell their beef to them. This in turn makes it possible to use remote sensing data not only to detect deforestation, but to know on which ranch it is taking place and to take action against it.

The prosecutors’ warnings to supermarkets that they too would be held responsible for the sale of beef produced in violation of environmental laws, combined with the new ability to enforce them using GPS data, have effectively made the supply chain a part of the system through which ranchers are pressured, both economically and legally, to end deforestation.
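The traceability mechanism described here is, at its core, a spatial matching problem: deforestation alerts detected by satellite are overlaid on the property boundaries that ranchers have declared. The sketch below is a hypothetical illustration of that step using the geopandas library; the file names, column names, and the assumption that alerts arrive as point locations are illustrative, not details of the actual monitoring systems.

```python
# Hypothetical sketch: attribute satellite deforestation alerts to the ranch
# whose declared GPS boundary contains them. File names, columns, and data
# layout are assumptions for this example, not the real enforcement system.
import geopandas as gpd

# Ranch boundaries declared by suppliers (polygons) and deforestation alerts
# derived from satellite imagery (points), in the same coordinate system.
ranches = gpd.read_file("ranches.geojson")   # columns: ranch_id, geometry
alerts = gpd.read_file("alerts.geojson")     # columns: alert_date, geometry

# Spatial join: each alert is matched to the ranch polygon that contains it.
flagged = gpd.sjoin(alerts, ranches, how="inner", predicate="within")

# Ranches with any post-moratorium alert can then be excluded from purchases.
noncompliant = flagged["ranch_id"].unique()
print(f"{len(noncompliant)} ranches flagged for recent deforestation")
```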

The soy moratorium has been in place for six years, and there are now data to show just how successful it has been. Comparing satellite images showing deforestation with views of the same areas in subsequent years, Rudorff et al. found that by the 2009/2010 crop year, only 0.25 percent of the area deforested since the moratorium began had been planted with soybeans.14 These fields represented only 0.04 percent of the total soybean area in Brazil. The recent detailed examination by Macedo et al. of soybean production and deforestation in Mato Grosso, where the industry’s expansion has been concentrated, reinforces these conclusions and provides evidence that the link between soy and deforestation—strong until recently—has now been broken.1,15,16

Despite the rise of soy prices to record high levels since 2007, tropical forest clearing for soybeans has declined to very low levels in Mato Grosso. Furthermore, in the adjacent cerrado region, deforestation has also been substantially reduced.15

We do not yet have analogous data for the beef moratorium, which is more recent and involves a more complicated supply chain. However, there are initial signs of changes in the actions of ranchers in response to the moratorium, and within the next few years we should have clearer satellite evidence of how successful the beef moratorium has been.13

The soy and beef moratorium efforts show that the movement was based on a sophisticated understanding of the political economy and power dynamics of Amazonia, pressuring not only governments but also the industries that were the major drivers of deforestation. They used the threat of an international consumer boycott as a source of pressure on these industries, but the strategy emphasized pressure on businesses along the entirety of the deforestation-related supply chain, not just relying on individual consumers’ shopping decisions.

All the businesses involved—not just the farmers and ranchers producing soy and cattle, but also the banks financing them, the slaughterhouses buying their cattle, the exporters shipping their products overseas, and the intermediaries and supermarket chains distributing them domestically—were targets of the campaigning. The point of the campaigning was not to persuade individual consumers to change their behavior, but to force action by businesses that were critical links in the supply chain.


Image above: Aerial photo of an isolated deforested area in the Amazon forest. From original article.


Indigenous Lands and Protected Areas

Much of the success in reducing deforestation came from establishing—and effectively protecting—an extensive network of indigenous lands and protected areas across the Amazon.

Starting in 2002, more and more areas were brought under various state and federal classifications, so that now just over half of the forest in the Brazilian Amazon is protected in some form. Nearly half of this area is reserved for indigenous peoples, about a fifth is under strict protection, and about a fourth is designated for sustainable development.

Some of the protected areas follow the model of state and federal preserves in developed countries. However, the protection of indigenous peoples’ territories is distinctive and plays a critical role in conservation of the Amazon rainforest.

The collective tenure of these lands by indigenous peoples—legally confirmed and enforced by the Brazilian government—gives them the right to use the land for sustainable forest management and the exploitation of timber and non-timber forest resources. In practice, they have generally chosen to keep almost all of their lands in forest, and studies of Brazil’s reserves have found that they have reduced the rate of emissions from deforestation by about ten-fold compared to neighboring areas.8

Beyond the reserves’ effectiveness as environmental measures, they represent the tangible recognition of indigenous peoples’ rights that had been denied for many years.

Rapid Economic Growth

The traditional framing of environmental issues in Brazil, as in developed countries, has seen a conflict between economic development and environmental protection.

But Brazil’s reduction of deforestation by two-thirds occurred at the same time that it saw strong economic growth and significant advancement in social justice. The country’s GDP increased at a rapid rate during the 2000s, ranging from over 3 percent to over 7 percent annually for nearly a decade.19,20 The two industries previously most responsible for Amazon deforestation—beef and soy—both showed continued healthy growth at the national level, with production, exports, and the size of the cattle herd continuing to increase steadily even as deforestation dropped.17
 
While the first few years of decline were partly connected to the economic slowdown and lower commodity prices,1,16,18 more recent data show that the continuing reduction in deforestation rates has not been due to economic recession. Deforestation rates have continued to fall both before and after the recession of 2008-2009, and through the recent years of record agricultural prices.


State Actions

Under Brazil’s federal system, the states have been responsible for a substantial part of the country’s success in reducing deforestation. States share responsibility for land use policies with the national government in Brasília, and governors in states such as Pará have both taken action themselves and pushed the federal government for stronger anti-deforestation policies.15

Another example is the state of Amazonas, which is Brazil’s largest: as big as Alaska, and over twice the size of Texas.

Although its own reduction of the deforestation rate by 70 percent from 2002 to 2008 was from an initially low level, and thus not a major contribution to emissions reductions, its ability to maintain a rapid rate of economic growth while achieving this reduction—an increase of 65 percent in GDP in half a decade—showed how tropical forest regions can move to growth driven by urban sectors rather than by deforestation, and develop rapidly in the process. Amazonas has reduced deforestation to a very low level, with 98 percent of its forest still standing.

The Support of Norway

Brazil showed the seriousness of its commitment to combating deforestation by putting it into national legislation at the end of December 2009. The Climate Change Law enshrines the commitment to reduce overall emissions by between 36.1 percent and 38.9 percent, relative to business as usual, by the year 2020.21 This is equivalent to a 20 percent reduction from Brazil’s 2005 level.

An important indication of international support for this effort came at the 2007 United Nations Framework Convention on Climate Change (UNFCCC) meeting in Bali, Indonesia, where Norway’s Prime Minister Jens Stoltenberg announced Norway’s commitment to protecting tropical forests and offered $2.5 billion over the next five years to finance such programs around the world.

One of the most notable aspects of the pledge was the promise of up to $1 billion for Brazil’s Amazon Fund, to be used for what is called “results-based financing” or “pay for performance.” This means that the money will flow not on the basis of efforts, promises or attempts to reduce deforestation, but only as the goal of reducing deforestation is met.

The “REDD+” (Reducing Emissions from Deforestation and Forest Degradation) system that Norway negotiated with Brazil is simple and straightforward. The rate of compensation for reductions (the “carbon price”) is $5.00/ton of CO2. Each hectare of tropical forest is assumed to emit 100 tons of carbon when cleared, which is equivalent to 367 tons of CO2. The reduction in area deforested is calculated in comparison to the average from 1996 through 2005, which was 19,500 km2 per year in all of the Brazilian Amazon.
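As a rough check on these figures, the arithmetic of the payment formula can be worked through directly. The sketch below is only an illustration built from the numbers quoted in this paragraph; the 70 percent reduction used as an example is a round hypothetical, not an official figure.

```python
# Back-of-the-envelope arithmetic for the Norway-Brazil REDD+ formula, using
# only the figures quoted in the text above. The 70 percent reduction in the
# illustration is a hypothetical round number, not an official figure.
CARBON_PER_HA_TONS = 100        # tons of carbon assumed emitted per hectare cleared
CO2_PER_TON_CARBON = 44 / 12    # tons of CO2 per ton of carbon (~3.67)
PRICE_PER_TON_CO2 = 5.00        # US dollars per ton of CO2
BASELINE_KM2_PER_YEAR = 19_500  # average annual deforestation, 1996-2005

payment_per_ha = CARBON_PER_HA_TONS * CO2_PER_TON_CARBON * PRICE_PER_TON_CO2
print(f"Payment per avoided hectare: ${payment_per_ha:,.0f}")   # about $1,833

# A year in which deforestation runs 70 percent below the baseline:
avoided_ha = 0.70 * BASELINE_KM2_PER_YEAR * 100    # 1 km2 = 100 ha
implied_payment = avoided_ha * payment_per_ha
print(f"Implied compensation for that year: ${implied_payment / 1e9:.1f} billion")
```

Note that a single year of reductions on this scale would already be worth more than Norway's pledge of up to $1 billion described above, which presumably is why actual disbursements are bounded by the size of the pledge rather than by the formula alone.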

To date, over $670 million in compensation has been paid to Brazil under this agreement. As REDD+ is often identified—erroneously—with offset-based carbon-market financing, it is important to point out that the Norway-Brazil program—by far the world’s largest (it covers over one-fourth of the world’s tropical forests by itself) and the most successful—is strictly a non-market, non-offset program.

Norway does not get the right to emit a single ton more of CO2 in exchange for its funding of Brazil’s reductions in emissions. Although Germany, the U.K., and other donor nations, as well as international programs such as UN-REDD, have also contributed support, no other country has committed to funding for REDD+ at a level close to that coming from Norway.

The Norwegian contribution to REDD+ efforts worldwide in the initial period ($2.5 billion over five years) amounted to about $100 annually for each of its citizens. In comparison, the United States’ REDD+ pledge in Copenhagen ($1 billion over three years) added up to only about $1 annually for every American.



Image above: This map of the Amazon Basin (within black line) shows Indigenous Reserves (dark blue) and other protected areas (light blue). Deforested areas are shown in yellow. From original article. Source: Woods Hole Research Center.

What Has Been Accomplished

Undoubtedly there will be both ups and downs in this story in the years to come. Indeed, the Amazon deforestation rate rose in both 2008 and 2013, by substantial percentages if calculated only with respect to the previous year. But a glance at Figure 2 shows that despite these reversals, the overall trend since the mid-2000s is clearly downward. Compared to the average over the 1996-2005 baseline, Brazil’s Amazon deforestation had been reduced by 70 percent in 2013. This represents by far the largest success in solving the problem of tropical deforestation, and indeed has been an important contribution to the effort to slow climate change.22

It is particularly impressive in that it has occurred during a period in which the other Amazon countries have not shown any consistent decreases in deforestation.23

The credit for this progress should be shared by many actors. The Brazilian federal government under Lula and Dilma Rousseff has taken strong actions, as have the independent federal public prosecutors and state governments. Marina Silva, both as Minister and as a Presidential candidate, has been a transformative leader on the issue for many years.24

The voluntary moratorium adopted by the soy industry—and the NGO pressure that led to it—played important roles in reducing the pressure for deforestation by a commodity that was one of its major drivers. And the compensation provided by Norway, beyond its economic value, showed that the global community would support Brazil with concrete resources, not just rhetorically.
 
Ultimately, however, it is the change in the politics of the issue that has made progress possible, and for this, Brazilian civil society deserves most of the credit. The indigenous peoples, rubber tappers, labor organizers, environmentalists, and other members of the broad social movement to end deforestation made it possible, and ultimately necessary, for politicians and businesses to act. They have done a great service not only to their own country, but also to the climate and biodiversity of our entire planet.

Acknowledgements

This work was originally presented at the Yale School of Forestry and Environmental Studies in 2011, and much of its content was subsequently published in a different form in the journal Tropical Conservation Science in 2013.25 It includes analyses supported by funding from the Climate and Land Use Alliance and by subcontracts from the European Federation for Transport and Environment and the Environmental Investigation Agency.

I thank these organizations and my UCS colleagues, particularly Jordan Faires, Sarah Roquemore, Estrellita Fitzhugh, and the other members of the Tropical Forest and Climate Initiative, for their support and help. The detailed comments of the three reviewers were very helpful in the revision process, and I very much appreciate them.


References

1. Nepstad, D et al. The end of deforestation in the Brazilian Amazon. Science 326, 1350-1351 (2009).
2. Pan, Y et al. A large and persistent carbon sink in the world’s forests. Science 333, 988-993 (2011).
3. FAO-ITTO. The State of Forests in the Amazon Basin, Congo Basin and Southeast Asia. Food and Agriculture Organization [online] (Rome, Italy, 2011) (www.fao.org/docrep/014/i2247e/i2247e00.pdf).
4. Boucher, D, Elias, P, Lininger, K, May-Tobin, C, Roquemore, S & Saxon, E. The Root of the Problem: What’s Driving Tropical Deforestation Today? Union of Concerned Scientists [online] (Cambridge, MA, 2011) (www.ucsusa.org/whatsdrivingdeforestation).
5. Throssell, L. Lula’s legacy for Brazil’s next president. BBC News – Latin America & Caribbean [online] (September 30, 2010) (www.bbc.co.uk/news/world-latin-america-11414276?print=true).
6. Rocha, C. Developments in national policies for food and nutrition security in Brazil. Development Policy Review 27, 51–66 (2009).
7. Chappell, MJ and LaValle, LA. Food security and biodiversity: Can we have both? An agroecological analysis. Agriculture and Human Values 28, 3-26 (2010).
8. Ricketts, TH et al. Indigenous lands, protected areas, and slowing climate change. PLoS Biology 8, e1000331 (2010).
9. Phillips, T. Brazil election sees breakthrough for greens and environmental agenda. The Guardian [online] (October 4, 2010) (www.guardian.co.uk/world/2010/oct/04/brazil-election-breakthrough-greens).
10. Greenpeace International. Eating Up the Amazon. Greenpeace International [online] (April 2006) (www.greenpeace.org/forests).
11. Amigos da Terra – Amazônia Brasileira. A Hora da Conta - Time to Pay the Bill. Friends of the Earth-Brazilian Amazon [online] (São Paulo, Brazil, April 2009) (www.amazonia.org.br/guia/detalhes.cfm?id=313449&tipo=6&cat_id=85&subcat_...).
12. Greenpeace International. Slaughtering the Amazon. Greenpeace International [online] (June 2009) (www.greenpeace.org/international/en/publications/reports/slaughtering-th...).
13. Walker, N, Bramble, B & Patel, S. From major driver of deforestation and greenhouse gas emissions to forest guardians? New developments in Brazil’s Amazon cattle industry. National Wildlife Federation [online] (Washington, D.C., 2010) (www.nwf.org/Global-Warming/Policy-Solutions/~/media/4878226C49BF48EB9EB5...).
14. Rudorff, BFT et al. The soy moratorium in the Amazon biome monitored by remote sensing images. Remote Sensing 3, 185-202 (2011).
15. Macedo, MN et al. Decoupling of deforestation and soy production in the southern Amazon during the late 2000s. Proceedings of the National Academy of Sciences 109, 1341-1346 (2012).
16. Nepstad, DC, Stickler, CM & Almeida, OT. Globalization of the Amazon soy and beef industries: Opportunities for conservation. Conservation Biology 20, 1595–1603 (2006).
17. USDA Foreign Agricultural Service. Production, supply and distribution online. United States Department of Agriculture [online] (Washington, D.C., 2011) (www.fas.usda.gov/psdonline/psdDownload.aspx; dataset psd_oilseeds_csv.zip).
18. Soares-Filho, B et al. Role of Brazilian Amazon protected areas in climate change mitigation. Proceedings of the National Academy of Sciences 107, 10821-10826 (2010).
19. World Bank. Data – GDP growth (annual %) [online] (2011). data.worldbank.org/indicator/NY.GDP.MKTP.KD.ZG.
20. Organisation for Economic Cooperation and Development. Country Statistical Profile: Brazil. Organisation for Economic Cooperation and Development [online] (2010) (www.oecd-ilibrary.org/economics/country-statistical-profile-brazil_csp-b...).
21. Law No. 12.187 of 29 December 2009, Diário Oficial da União No. 248-A, Seção 1, 109–110 (Government of Brazil, Brasília, 2009).
22. Observatorio do Clima. Sistema de Estimativa de Emissões de Gases de Efeito Estufa (SEEG) – Mudança de Uso da Terra. Observatorio do Clima [online] (2013) (seeg.observatoriodoclima.eco.br/index.php/emissions/index/sector/Mudan%25C3%25A7a%2Bde%2BUso%2Bda%2BTerra).
23. Boucher, D. Three datasets agree: Amazon deforestation has been reduced. The Equation blog (Union of Concerned Scientists) [online] (2013) (blog.ucsusa.org/three-datasets-agree-amazon-deforestation-has-been-reduced).
24. Held, D, Roger, C & Nag, E-M. Climate Governance in the Developing World (Polity Press, Cambridge, UK and Cambridge, MA, 2013).
25. Boucher, D, Roquemore S. & Fitzhugh, E. Brazil’s success in reducing deforestation. Tropical Conservation Science 6, 426-445 (2013).






.

In a Handful of Dust

SUBHEAD: Watching the descent of industrial civilization over the next few centuries into a deindustrial dark age.

By John Michael Greer on 2 July 2014 for Archdruid Report -
(http://thearchdruidreport.blogspot.com/2014/07/in-handful-of-dust.html)


Image above: Poster by Clive Upton of the assassination of Archduke Ferdinand and his wife in 1914 in Sarajevo. From (http://www.allposters.com/-sp/The-Assassination-of-Archduke-Franz-Ferdinand-Posters_i7683664_.htm).

All things considered, it’s a good time to think about how much we can know about the future in advance. A hundred years ago last Saturday, as all my European readers know and a few of my American readers might have heard, a young Bosnian man named Gavrilo Princip lunged out of a crowd in Sarajevo and emptied a pistol into the Archduke Franz Ferdinand and his wife Sophie, who were touring that corner of the ramshackle Austro-Hungarian empire they were expected to inherit in due time.

Over the summer months that followed, as a direct result of those gunshots, most of the nations of Europe went to war with one another, and the shockwaves set in motion by that war brought a global order centuries old crashing down.

In one sense, none of this was a surprise. Perceptive observers of the European scene had been aware for decades of the likelihood of a head-on crash between the rising power of Germany and the aging and increasingly fragile British Empire.

The decade and a half before war actually broke out had seen an increasingly frantic scramble for military alliances that united longtime rivals Britain and France in a political marriage of convenience with the Russian Empire, in the hope of containing Germany’s growing economic and military might.

Every major power poured much of its wealth into armaments, sparking an arms race so rapid that the most powerful warship on the planet in 1906, Britain’s mighty HMS Dreadnought, was hopelessly obsolete when war broke out eight years later.

Inquiring minds could read learned treatises by Halford Mackinder and many other scholars, explaining why conflict between Britain and Germany was inevitable; they could also take in serious fictional treatments of the subject such as George Chesney’s The Battle of Dorking and Saki’s When William Came, or comic versions such as P.G. Wodehouse’s The Swoop!.

Though most military thinkers remained stuck in the Napoleonic mode of conflict chronicled in the pages of Karl von Clausewitz’ On War, those observers of the military scene who paid attention to the events of the American Civil War’s closing campaigns might even have been able to sense something of the trench warfare that would dominate the coming war on the western front.

It’s only fair to remember that a great many prophecies in circulation at that same time turned out to be utterly mistaken. Most of them, however, had a theme in common that regular readers of this blog will find quite familiar: the claim that because of some loudly ballyhooed factor or other, it really was different this time.

Thus, for example, plenty of pundits insisted in the popular media that economic globalization had made the world’s economies so interdependent that war between the major powers was no longer possible. Equally, there was no shortage of claims that this or that or the other major technological advance had either rendered war impossible, or guaranteed that a war between the great powers would be over in weeks.

Then as now, those who knew their history knew that any claim about the future that begins “It’s different this time” is almost certain to be wrong.

All things considered, it was not exactly difficult in the late spring of 1914, for those who were willing to do so, to peer into the future and see the shadow of a major war between Britain and Germany rising up to meet them. There were, in fact, many people who did just that.

To go further and guess how it would happen, though, was quite another matter.  Some people came remarkably close; Bismarck, who was one of the keenest political minds of his time, is said to have commented wearily that the next great European war would probably be set off by some idiotic event in the Balkans. 

Still, not even Bismarck could have anticipated the cascade of misjudgments and unintended consequences that sent this particular crisis spinning out of control in a way that half a dozen previous crises had not done.

What’s more, the events that followed the outbreak of war in the summer of 1914 quickly flung themselves off the tracks intended for them by the various political leaders and high commands, and carved out a trajectory of their own that nobody anywhere seems to have anticipated.

That the Anglo-French alliance would squander its considerable military and economic superiority by refusing to abandon a bad strategy no matter how utterly it failed or how much it cost; that Russia’s immense armies would prove so feeble under pressure; that Germany would combine military genius and political stupidity in so stunningly self-defeating a fashion; that the United States would turn out to be the wild card in the game, coming down decisively on the Allied side just when the war had begun to turn in Germany’s favor—none of that was predicted, or could have been predicted, by anyone.

Nor were the consequences of the war any easier to foresee. On that bright summer day in 1914 when Gavrilo Princip burst from the crowd with a pistol in his hand, who could have anticipated the Soviet Union, the Great Depression, the blitzkrieg, or the Holocaust?

Who would have guessed that the victor in the great struggle between Britain and Germany would turn out to be the United States? 

The awareness that Britain and Germany were racing toward a head-on collision did not provide any certain knowledge about how the resulting crash would turn out, or what its consequences would be; all that could be known for sure was that an impact was imminent and the comfortable certainties of the prewar world would not survive the shock.

That dichotomy, between broad patterns that are knowable in advance and specific details that aren’t, is very common in history. It’s possible, for example, that an impartial observer who assessed the state of the Roman Empire in 400 or so could have predicted the collapse of Roman power outside the Eastern Mediterranean littoral.

As far as I know, no one did so—the ideological basis of Roman society made the empire’s implosion just as unthinkable then as the end of progress is today—but the possibility was arguably there. Even if an observer had been able to anticipate the overall shape of the Roman and post-Roman future, though, that anticipation wouldn’t have reached as far as the specifics of the collapse, and let’s not even talk about whether our observer might have guessed that the last Emperor of Rome in the west would turn out to be the son of Attila the Hun’s secretary, as in fact he was.

Such reflections are on my mind rather more than usual just now, for reasons that will probably come as no surprise to regular readers of this blog. For a variety of reasons, a few of which I’ll summarize in the paragraphs ahead, I think it’s very possible that the United States and the industrial world in general are near the brink of a convulsive era of crisis at least as severe as the one that began in the summer of 1914.

It seems very likely to me that in the years immediately ahead, a great many of the comfortable certainties of the last half century or so are going to be thrown overboard once and for all, as waves of drastic political, economic, military, social, and ecological change slam into societies that, despite decades of cogent warnings, have done precisely nothing to prepare for them.

I want to review here some of the reasons why I expect an era of crisis to arrive sooner rather than later.

One of the most important of those reasons is the twilight of the late (and soon to be loudly lamented) fracking bubble. I’ve noted in previous posts here that the main product of the current fracking industry is neither oil nor gas, but the same sort of dubiously priced financial paper we all got to know and love in the aftermath of last decade’s real estate bubble.

These days, the rickety fabric of American finance depends for its survival on a steady flow of hallucinatory wealth, since the production of mere goods and services no longer produces enough profit to support the Brobdingnagian superstructure of the financial industry and its swarm of attendant businesses.

These days, too, an increasingly brittle global political order depends for its survival on the pretense that the United States is still the superpower it was decades ago, and all those strident and silly claims that the US is about to morph into a "Saudi America" flush with oil wealth are simply useful evasions that allow the day of reckoning, with its inevitable reshuffling of political and economic status, to be put off a little longer.

Unfortunately for all those involved, the geological realities on which the fracking bubble depends are not showing any particular willingness to cooperate. The downgrading of the Monterey Shale not long ago was just the latest piece of writing on the wall: one more sign that we’re scraping the bottom of the oil barrel under the delusion that this proves the barrel is still full.

The fact that most of the companies in the fracking industry are paying their bills by running up debt, since their expenses are considerably greater than their earnings, is another sign of trouble that ought to be very familiar to those of us who witnessed the housing bubble go through its cycle of boom and bust.

Bubbles are like empires; if you watch one rise, you can be sure that it’s going to fall. What you don’t know, and can’t know, is when and how. That’s a trap that catches plenty of otherwise savvy investors.

They see a bubble get under way, recognize it as a bubble, put money into it under the fond illusion that they can anticipate the bust and pull their money out right before the bottom drops out...and then, like everyone else, they get caught flatfooted by the end of the bubble and lose their shirts. That’s one of the great and usually unlearned lessons of finance: when a bubble gets going, it’s the pseudo-smart money that piles into it—the really smart money heads for the hills.

So it’s anyone’s guess when exactly the fracking bubble is going to pop, and even more uncertain how much damage it’s going to do to what remains of the US economy. A good midrange guess might be that it’ll have roughly the same impact that the popping of the housing bubble had in 2008 and 2009, but it could be well to either side of that estimate.

Crucially, though, the damage that it does will be landing on an economy that has never really recovered from the 2008-2009 housing crash, in which actual joblessness (as distinct from heavily manipulated unemployment figures) is at historic levels and a very large number of people are scrambling for survival. At this point, another sharp downturn would make things much worse for a great many millions whose prospects aren’t that good to begin with, and that has implications that cross the border from economics into politics.

Meanwhile, the political scene in the United States is primed for an explosion. One of my regular readers—tip of the archdruid’s hat to Andy Brown—is a research anthropologist who recently spent ten weeks traveling around the United States asking people about their opinions and feelings concerning government.

What he found was that, straight across geographical, political, and economic dividing lines, everyone he interviewed described the US government as the corrupt sock puppet of wealthy interests. He noted that he couldn’t recall ever encountering so broad a consensus on any political subject, much less one as explosive as this. 

Recent surveys bear him out. Only 7% of Americans feel any significant confidence in Congress.  Corresponding figures for the presidency and the Supreme Court are 29% and 30% respectively; fewer than a third of Americans, that is, place much trust in the political institutions whose birth we’ll be celebrating in a few days. This marks a tectonic shift of immense importance. 

Not that many decades ago, substantial majorities of Americans believed in the essential goodness of the institutions that governed their country. Even those who condemned the individuals running those institutions—and of course that’s always been one of our national sports—routinely phrased those condemnations in terms reflecting a basic faith in the institutions themselves, and in the American experiment as a whole.

Those days are evidently over. The collapse of legitimacy currently under way in the United States is a familiar sight to students of history, who can point to dozens of comparable examples; each of these was followed, after no very long delay, by the collapse of the system of government whose legitimacy in the eyes of its people had gone missing in action.

Those of my readers who are curious about such things might find it educational to read a good history of the French or the Russian revolutions, the collapse of the Weimar Republic or the Soviet Union, or any of the other implosions of political authority that have littered the last few centuries with rubble: when a system loses legitimacy in the eyes of the people it claims to lead, the end of that system is on its way.

The mechanics behind the collapse are worth a glance as well. Whether or not political power derives from the consent of the governed, as American political theory insists, it’s unarguably true that political power depends from moment to moment on the consent of the people who do the day-to-day work of governing: the soldiers, police officers, bureaucrats and clerks whose job is to see to it that orders from the leadership get carried out.

Their obedience is the linchpin on which the survival of a regime rests, and it’s usually also the fault line along which regimes shatter, because these low-ranking and poorly paid functionaries aren’t members of the elite.

They’re ordinary working joes and janes, subject to the same cultural pressures as their neighbors, and they generally stop believing in the system they serve about the same time as their neighbors do. That doesn’t stop them from serving it, but it does very reliably make them unwilling to lay down their lives in its defense, and if a viable alternative emerges, they’re rarely slow to jump ship.

Here in America, as a result of the processes just surveyed, we’ve got a society facing a well-known pattern of terminal crisis, with a gridlocked political system that’s lost its legitimacy in the eyes of the people it governs, coupled with a baroque and dysfunctional economic system lurching toward another cyclical collapse under the weight of its own hopelessly inefficient management of wealth. This is not a recipe for a comfortable future.

The situation has become dire enough that some of the wealthiest beneficiaries of the system—usually the last people to notice what’s happening, until the mob armed with torches and pitchforks shows up at their mansion’s front door—have belatedly noticed that robbing the rest of society blind is not a habit with a long shelf life, and have begun to suggest that if the rich don’t fancy the thought of dangling from lampposts, they might want to consider a change in approach. 

In its own way, this recognition is a promising sign. Similar realizations some eighty years ago put Franklin Roosevelt in the White House and spared the United States the hard choice between civil war and authoritarian rule that so many other countries were facing just then.

Unless a great many more members of our kleptocratic upper class experience the same sort of wake-up call in a hurry, though, the result this time is likely to be far too little and much too late.

Here again, though, a recognition that some kind of crash is coming doesn’t amount to foreknowledge of when it’s going to hit, how it’s going to play out, or what the results will be.

If the implosion of the fracking bubble leads to one more round of bailouts for the rich and cutbacks for the poor, we could see the inner cities explode as they did in the long hot summers of the 1960s, setting off the insurgency that was so narrowly avoided in those years, and plunging the nation into a long nightmare of roadside bombs, guerrilla raids, government reprisals, and random drone strikes.

If a talented demagogue shows up in the right place and time, we might instead see the rise of a neofascist movement that would feed on the abandoned center of American politics and replace the rusted scraps of America’s democratic institutions with a shiny new dictatorship.

If the federal government’s gridlock stiffens any further toward rigor mortis, for that matter, we could see the states force a constitutional convention that could completely rewrite the terms of our national life, or simply dissolve the Union and allow new regional nations to take shape. 

Alternatively, if a great many factors break the right way, and enough people in and out of the corridors of power take the realities of our predicament seriously and unexpectedly grow some gonads—either kind, take your pick—we might just be able to stumble through the crisis years into an era of national retrenchment and reassessment, in which many of the bad habits picked up during America’s century of empire get chucked in history’s compost bin, and some of the ideals that helped inspire this country get a little more attention for a while. That may not be a likely outcome, but I think it’s still barely possible.

All we can do is wait and see what happens, or try to take action in the clear awareness that we can’t know what effects our actions will have. Thinking about that predicament, I find myself remembering lines from the bleak and brilliant poetic testament of the generation that came of age in the aftermath of those gunshots in Sarajevo, T.S. Eliot’s The Waste Land:
What are the roots that clutch, what branches grow
Out of this stony rubbish? Son of man,
You cannot say, or guess, for you know only
A heap of broken images, where the sun beats
And the dead tree gives no shelter, the cricket no relief,
And the dry stone no sound of water. Only
There is shadow under this red rock
(Come in under the shadow of this red rock),
And I will show you something different from either
Your shadow at morning striding behind you
Or your shadow at evening rising up to meet you:
I will show you fear in a handful of dust.

It’s a crisp metaphor for the challenges of our time, as it was of those in the time about which Eliot wrote. For that matter, the quest to see something other than our own shadows projected forward on the future or backward onto the past has a broader significance for the project of this blog. With next week’s post, I plan on taking that quest a step further.

The handful of dust I intend to offer my readers for their contemplation is the broader trajectory of which the impending crisis of the United States is one detail: the descent of industrial civilization over the next few centuries into a deindustrial dark age.


.

RIMPAC War on the Ocean

SUBHEAD: The unseen war on the Pacific Ocean led by the United States Navy is cranking up this summer.

By Oceans4Peace Staff on 2 July 2014 in Island Breath -
(http://islandbreath.blogspot.com/2014/07/war-on-ocean.html)

http://www.islandbreath.org/2014Year/07/140703oceanwarbig.jpg
Image above: Detail of image from poster for panel discussion and talk-story on July 12, 2014. Click to enlarge and see full poster (for printing and handing out).
 
WHAT:
Free panel discussion and talk-story on the impacts of military activity on the Pacific Ocean, focusing on Jeju Island, Okinawa and Hawaii (RIMPAC 2014).

PANEL:
"What happens when coral reefs are militarized: Jeju Island, South Korea"
Jim Maragos PhD - Expert on Pacific Ocean atoll ecology

"The last pristine coral reef ecosystem of Okinawa Island"
Kenta Watanabe MS - Tropical Botanist, Okinawa National College of Technology

"Effects of RIMPAC on Marine Life"
Katherine Muzik PhD - Director of Kuku Wai Environmental Education Institute

Moderator will be Juan Wilson, publisher of IslandBreath.org

WHEN:
Saturday July 12th 2014 at 5:00-7:00pm

WHERE:
Kapaa Public Library
4-1464 Kuhio Highway
Kapaa, HI, 96746

INFO:
Call (808) 822-7646

SPONSORS:
Kauai Alliance for Peace & Social Justice
Kohola Leo
Hawaii Sierra Club Kauai Chapter
Surfrider Foundation
People for the Protection of Kauai


Video above: Public service announcement about RIMPAC 2014 and its impact on Hawaii's ocean creatures by Surfriders. From (http://youtu.be/DIF9tdtmU0E).

INTRODUCTION:
Below is a draft of the planned introduction to the panel discussion to be presented by the moderator, Juan Wilson:


RIMPAC Panel Introduction

Introduction by Juan Wilson - moderator.

INTRODUCTION TO WAR ON THE OCEAN
Aloha to all of you,

Thank you so much for coming out this evening. I'll be moderating. My name is Juan Wilson - I'm a member of the Executive Committee of the Hawaii Sierra Club Kauai Group and the publisher of IslandBreath.
Tonight Oceans4Peace presents a panel discussion and talk-story. Our subject is the War on the Ocean and RIMPAC 2014. We'll focus on Jeju Island, Okinawa and Hawaii.
BATTLEGROUND OVERVIEW (see below).

As human beings, we have treated the ocean as the ultimate trashcan, assuming it could absorb as much garbage as we could put into it. When we were small in numbers and our technology immature, that was largely true. Not so now.

Living here on Kauai, our home is in the middle of the Pacific Ocean. It is the largest crucible of life on earth - and it is suffering. It is not just Americans who have damaged it, but also the Japanese, Chinese, Koreans, French, Russians, and others.
The Pacific Ocean has been used as a sushi bar, a strip mine, a junk yard, a toilet bowl, a bombing range, and a nuclear wasteland.

We now face ocean acidification, global warming, climate change, rising seas, poisonous trash gyres, radioactive pollution, over-fishing, and coral reef die-off.

Behind this destruction have been our financial, economic and industrial interests. Until now it has been cheaper to just look the other way. America's navy is the starring actor in the drama - securing the continuation of this large scale destruction through its military dominance in the Pacific.

Our navy adds to the mix the use of sonar, radar and microwave radiation that damage the ecosystem. Even worse are large military exercises and weapons systems testing that intensify the degradation of life in the ocean.

As destructive as RIMPAC exercises are, they are largely just a biennial public relations effort - a satin curtain over the tip of the sharpened spear.

The PMRF Open House and fireworks show, the outreach and TGI puff pieces, the balloons and NASA liftoffs, and the Rim of the Pacific "joint cooperation" are all part of the hype.
But underlying the gloss there are many ongoing and overlapping military activities in the Pacific that we do not see advertised and that are deeply troubling.

For example, as RIMPAC 2014 began, so did CARAT 2014 - the joint US and Philippine Cooperation Afloat Readiness and Training exercise on the shores of the South China Sea.

Also Japan has just announced it has reinterpreted its constitution to again allow it to adopt a more aggressive military posture in the Pacific.

This coincided with a US Presidential executive order hugely expanding "national marine monuments" by millions of square miles. This will greatly increase America's regulatory rights in the Pacific Ocean.

Add to this the military base expansions in Korea, Guam and the Philippines - as well as major extensions of our military test ranges in the Mariana and Hawaiian islands.
A pattern emerges that clarifies the purpose of the Trans-Pacific Partnership (or TPP). It is in effect a demand for the surrender of the Pacific Ocean to America's Asian Pivot.

It should be clear that the War on the Pacific has been going on since Pearl Harbor and that from 1945 on it has been a nuclear conflict.

The same American corporations - like General Electric, which built the Fukushima nuclear reactors - helped build our Nuclear Carrier Strike Forces and our Nuclear Submarine Strategic ICBM fleet.

This nuclear technology is the fist in the glove of the US Navy's Pacific Command - USPACOM. PACOM enforces American empire over half the surface of the world, from San Diego to the Indian Ocean.

The War on the Pacific may now be morphing into World War Three. The conflicting resource claims in the South China Sea may be the flash point - and the US will be at the center of the maelstrom.
This downward spiral must be stopped - and being informed helps!

This evening we have a panel of three experts to clarify what is happening in the Pacific. After their presentations will be a Q and A session with some talk-story. Our panelists include:

First, Jim Maragos PhD - Expert on Pacific Ocean atoll ecology
He will address -
"What happens when coral reefs are militarized: Jeju Island, South Korea"

Then, Kenta Watanabe MS - He is a tropical botanist at the Okinawa National College of Technology
He will describe the -
"The last pristine coral reef ecosystem of Okinawa"

And finally, Katherine Muzik PhD - Director of Kulu Wai Institute here on Kauai.
Her topic will be -
"Effects of RIMPAC on Marine Life"


Below is a map combining various US territorial claims in the Pacific Ocean, including enormous military test ranges and war simulation areas. For more details see Island Ea o Ka Aina: The Pacific Pivot 6/26/14.

http://www.islandbreath.org/2014Year/06/140626overall.jpg
Image above: BATTLEGROUND OVERVIEW - Thumbnail of map joining Maps A, B and C, showing Pacific Ocean U.S.A. states, territories and protectorates, as well as military range and testing facilities and US marine national monuments, existing and proposed. Together these areas provide a complex and overlapping set of protocols, regulation and control that clusterfuck much of the Pacific Command area. Map by Juan Wilson (www.islandbreath.org). Click for larger more complete view.

.

Standby Electronic Gluttons

SUBHEAD: Electronic devices waste $80 billion of power a year; fixing them would save the output of 200 500-megawatt coal-fired power stations.

By Matthew Carr on 2 July 2014 for Bloomberg News -
(http://www.bloomberg.com/news/2014-07-02/electronic-devices-waste-80-billion-of-power-a-year-iea-says.html)


Image above: Illustration of a standby pilot light for the article 'Energy Vampires'. From (http://savingenergy.co.za/energy-vampires/).

The world’s 14 billion television set-top boxes, printers, game consoles and other electronic devices waste $80 billion of power a year due to inefficient technology, according to the International Energy Agency.
“Electricity demand of our increasingly digital economies is growing at an alarming rate,” the Paris-based adviser to developed nations said today in a report. By 2020, an estimated $120 billion will be wasted, as many devices draw nearly as much power in standby as they do in use.

Networked devices worldwide used about 616 terawatt-hours of power in 2013, most of which was used in standby mode, according to the IEA. Of that amount, 400 TWh, or the amount consumed annually by the U.K. and Norway, was wasted because of inefficient technology, the agency said.

“The problem is not that these devices are often in standby mode, but rather that they typically use much more power than they should to maintain a connection and communicate with the network,” Maria Van der Hoeven, the IEA’s executive director, said in a statement accompanying the report. “Just by using today’s best-available technology, such devices could perform exactly the same tasks in standby while consuming around 65 percent less power.”

Power demand is increasing as network connectivity spreads to appliances and devices such as washing machines, refrigerators and lights, the IEA said. The use of network-enabled utensils is projected to expand to about 50 billion units by 2020 and 100 billion the following decade, the agency estimated in 2012.

Improving the energy efficiency of networked devices in the coming years would save 600 TWh of energy, or the equivalent of shutting 200 standard 500 megawatt coal-fired power stations, the IEA said.
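As a rough check on that equivalence, the figures can be run through a short sketch (the implied capacity factor is an inference, not a number from the IEA report):

```python
# Rough check: is saving 600 TWh/yr really like shutting 200 coal plants of 500 MW each?
plants = 200
capacity_mw = 500
hours_per_year = 8760

max_output_twh = plants * capacity_mw * hours_per_year / 1e6  # MWh converted to TWh
savings_twh = 600

print(f"Output of 200 plants at full uptime: {max_output_twh:.0f} TWh/yr")
print(f"Implied capacity factor: {savings_twh / max_output_twh:.0%}")
# About 68%, a plausible average utilisation for coal-fired plants, so the comparison holds.
```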

United Nations envoys are seeking to trim energy use and associated greenhouse-gas emissions to fight climate change. The negotiators are seeking to agree on a new global treaty next year that will come into force in 2020.

.

Efficiency of well-tended fires

SUBHEAD: Counter-intuitively, a well-tended wood fire can outperform modern cooking stoves.

By Kris De Decker on 23 June 2014 for Low Tech Magazine -
(http://www.lowtechmagazine.com/2014/06/thermal-efficiency-cooking-stoves.html)


Image above: Well-constructed three-stone fires protected from wind and tended with care score between 20 and 30% thermal efficiency. From original article.

Despite technological advancements since the Industrial Revolution, cooking remains a spectacularly inefficient process. This holds true for poor and rich countries alike. While modern gas and electric cooking stoves might be more practical and produce less indoor pollution than the open fires and crude stoves used in developing countries, they are equally energy inefficient.

In fact, an electric cooking stove is only half as efficient as a well-tended open fire, while a gas hob is only half as effective as a biomass rocket stove. And even though indoor air pollution is less of an issue with modern cooking stoves, research indicates that pollution levels in western kitchens can be surprisingly high.

Present-day cooking methods in poorer countries are quite well documented, as they are one of the main concerns of NGOs which promote appropriate technological development. An estimated 2.5 to 3 billion people still cook their food over open fires or in rudimentary cookstoves, and these numbers keep increasing due to population growth.

The most basic and widely used type of cooking device is the wood-fuelled "three-stone fire", which is made by arranging three stones to make a stand for a cooking pot. Alongside the three-stone fire -- which dates back to Neolithic times -- many types of home-made cooking stoves can be found. They are powered by burning coal or biomass, be it wood, crop residues, dung or charcoal. [1]

Image above: Indoor cooking in Guatemala. Source: Global Alliance for Clean Cookstoves.

The main concern with the use of crude biomass cooking stoves is their destructive influence on human welfare and natural resources. When used indoors, biomass cooking stoves lead to severe health issues such as chronic lung diseases, acute respiratory infections, cataracts, blindness, and adverse effects on pregnancy. The main victims are women, who do most of the housework, and young children, who are often carried on the mother's back while she is cooking.

Inefficient biomass stoves also force people (again, most often women) to spend much of their time collecting fuel. The environmental degradation caused by biomass stoves is equally problematic. When wood is used as a primary fuel, inefficient cooking methods lead to large-scale deforestation, soil erosion, desertification and emissions of greenhouse gases. For coal-fuelled stoves, the main issue is indoor air pollution.

The Thermal Efficiency of a Three-stone Fire
At the heart of the problem lies the low thermal efficiency of traditional cooking methods. For three-stone fires, thermal efficiency is stated to be as low as 10 to 15%. [1][2] In other words: 85 to 90% of the energy content in the wood is lost as heat to the environment outside the cooking pot.

Obviously, this low efficiency wastes natural resources, but it also boosts air pollution and greenhouse gas emissions because the relatively low temperature of the fire leads to incomplete combustion.
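To make those percentages concrete, here is a minimal sketch of what 10 to 15% thermal efficiency means for a single litre of water; the starting temperature and the energy content of air-dry wood are assumed values, not figures taken from the sources cited above:

```python
# Wood needed just to bring one litre of water to the boil on a basic three-stone fire.
# Assumptions: water starts at 20 degrees C; air-dry wood holds roughly 18,000 kJ per kg.
SPECIFIC_HEAT_WATER = 4.186   # kJ per kg per degree C
WOOD_ENERGY_DENSITY = 18_000  # kJ per kg (assumed)

def grams_of_wood(litres, start_temp_c, efficiency):
    heat_into_water = litres * SPECIFIC_HEAT_WATER * (100 - start_temp_c)  # kJ
    wood_energy = heat_into_water / efficiency                             # kJ burned
    return wood_energy / WOOD_ENERGY_DENSITY * 1000                        # grams

for eff in (0.10, 0.15):
    print(f"At {eff:.0%} efficiency: ~{grams_of_wood(1, 20, eff):.0f} g of wood per litre")
# Roughly 185 g at 10% and 125 g at 15%, before any simmering is even counted.
```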

Image above: An improved three-stone fire. Picture: Chef Cooke @ Flickr.

However, the issue is more complicated than it is usually presented. To begin with, the productivity and cleanliness of an open fire (and similar crude cooking stoves) greatly depends on the circumstances in which they are used and on the skills of the cook. In its test of 18 cooking stove designs from all over the world, the Partnership for Clean Indoor Air (PCIA) [3][4] concluded that:
"Well-constructed three-stone fires protected from wind and tended with care scored between 20 and 30% thermal efficiency. Open fires made with moister wood and operated with less attention to the wind can score as low as 5%. The operator and the conditions of use largely determine the effectiveness of operation. If the sticks of wood are burnt at the tips and pushed into the center as the wood is consumed, the fire can be hot and relatively clean burning."
Due to the influence of environmental factors such as wind, an indoor three-stone fire is generally more efficient than one operated outside. However, outdoor open fires can also be made more efficient by placing them in a hole in the ground or by shielding them with the use of earthen walls, which also adds thermal mass. Furthermore, PCIA remarks that "it is important to recognize that the open hearth and resulting smoke often have considerable cultural and practical value in the home, including control of insects".

The Thermal Efficiency of Improved Biomass Stoves
Especially since the 1970s and 1980s, many international NGOs have tried to improve cooking traditions in poorer countries. This has resulted in a large number of so-called "improved cooking stoves", which again vary in terms of design, performance and costs. Hundreds of variations exist. [1][4]
Image above: A collection of improved biomass stoves. Source: Global Alliance for Clean Cookstoves.

Some of these designs are exclusively aimed at minimising air pollution at the cost of higher fuel consumption, while other designs achieve a higher efficiency but increase air pollution. [4] In this article, we will focus exclusively on cooking stoves that address both issues simultaneously. This is not to suggest that other designs can't be preferable in certain circumstances.

For example, because biomass cooking stoves do not present direct health problems when used outdoors, saving fuel would be the most important aim in that context.

Compared to a basic three-stone fire with 10-15% thermal efficiency, improved cooking stoves can easily halve the fuel requirements of the cooking process. This can be achieved by providing an insulated combustion chamber, improving the air supply, and other measures.

In a laboratory comparison of five major types of biomass cooking stoves, it was found that an improved rocket stove uses 2,470 kJ to boil one litre of water and then simmer it for 30 minutes, while a basic three-stone fire requires 6,553 kJ to fulfill the same task (see the dark blue bars in the graphic above). [5][1] The rocket stove thus uses 60% less fuel than the three-stone fire. Furthermore, the rocket stove boils 2.5 litres of water more than 5 minutes faster (see the light blue points in the graphic above).
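The stated saving follows directly from those two numbers; a one-line check, using only the figures quoted above:

```python
# Energy saving of the improved rocket stove relative to the basic three-stone fire [5][1].
rocket_kj, three_stone_kj = 2470, 6553
print(f"Rocket stove uses {1 - rocket_kj / three_stone_kj:.0%} less energy")
# About 62%, consistent with the "60% less fuel" stated in the text.
```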

The values are the average of three tests and measure specific energy consumption instead of thermal efficiency. Both test methods have their shortcomings -- measuring the efficiency of cooking is surprisingly complex -- so applying both methods increases the accuracy of an experiment. [6] This was done by the Partnership for Clean Indoor Air, which compared the thermal efficiency and specific energy consumption of 18 cookstove designs, including a well-tended open fire with a thermal efficiency of 20-30%. [4]

In this study, one of the best performing improved biomass stoves -- a 20 liter can rocket stove (image at the right) -- convincingly beats the efficiency of the well-tended open fire. It requires 733 grams of wood (12,579 kJ) to bring five litres of water to boil and simmer for 45 minutes, only 65% of the 1,112 grams of wood (19,496 kJ) required by the well-tended open fire. The thermal efficiency of the rocket stove varies between 23 and 54%. [7]
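Those masses and energies can be cross-checked against each other; the reference range for air-dry wood is my assumption, not a number from the PCIA report:

```python
# Implied energy content of the test fuel, from the PCIA figures quoted above [4].
rocket_g, rocket_kj = 733, 12_579
open_fire_g, open_fire_kj = 1_112, 19_496

print(f"Rocket stove fuel: {rocket_kj / rocket_g:.1f} kJ/g")            # ~17.2 kJ/g
print(f"Well-tended open fire: {open_fire_kj / open_fire_g:.1f} kJ/g")  # ~17.5 kJ/g
# Both values sit in the typical range for air-dry wood (roughly 16-18 kJ/g, assumed),
# so the reported masses and energies are self-consistent.

print(f"Wood used, rocket vs open fire: {rocket_g / open_fire_g:.0%}")  # ~66%, i.e. the "only 65%" above
```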

The rocket stove also lowers air pollution: the emissions are only 26% of the carbon monoxide (CO) and 60% of the particulate matter (PM) produced by the well-tended open fire. Lastly, it shortens cooking time to 22 minutes for five litres of water, compared to 27 minutes for the open fire.
The top performing biomass stove in the test is a wood gas stove, with slightly more than one-third the wood consumption (459 grams of wood or 9,434 kJ) and 15-20% of the pollution levels of the three-stone fire. It has a thermal efficiency of 44-46%. However, it requires an electric fan to improve combustion efficiency, while all others are natural-draft stoves.

Cooking in Wealthy Households
There is great irony in the fact that the improved biomass stoves mentioned above are much more efficient than modern cooking stoves used in the western world and in wealthier households of developing nations. In fact, most modern cooking stoves have a thermal efficiency that is on par with that of a three-stone fire.

The western world switched from open fires to closed cookstoves from the eighteenth century onwards. Initially, these "kitchen stoves" were used for both heating and cooking, and were powered by coal, charcoal or biomass. When central heating systems were introduced in the early twentieth century, heating and cooking were split into separate appliances, and the kitchen stove gave way to the dedicated gas or electric cooking stove.

Conventional electric hobs use attached iron plates as their heating units, while more sophisticated models use infrared, halogen or induction units positioned beneath a glass-ceramic surface.

Of these, only induction-based cooking plates are more efficient than conventional electric hobs. The others mainly offer increased convenience, such as greater ease when cleaning. Most gas cooking stoves place burners on top of a stainless steel or ceramic surface, while others place them on top or beneath a glass ceramic surface. Again, the latter offers increased convenience, but no significant efficiency benefit. [8]

Image above: An electric glass-ceramic cooktop (Source: Wikimedia). Less efficient than a well-tended open fire.

Research into the efficiency of modern cooking stoves is rather limited. According to a study by the Dutch research institute VHK, a traditional electric cooktop (with vitro-ceramic plate) has a thermal efficiency of 13%, while that of an electric induction cooker is 15%. A microwave obtains 19% thermal efficiency. Only a classical gas cooking stove (23%) reaches the thermal efficiency of a well-tended three-stone fire. [8] While the study is aimed primarily at the preparation of hot drinks, it is the most complete study available and its results are applicable to cooking food with only a few small caveats. [9]

Now, if we compare the thermal efficiencies of modern cooking stoves with those of stoves used in poorer households, we see that the improved biomass stoves in developing countries beat our "high-tech" cooking technology by a factor of two to three (graphic below). Gas or electric ovens are not included in this comparison, but their efficiency is even lower than that of gas or electric hobs, because water is a much better conductor of heat than air.
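The "factor of two to three" can be read off directly from the efficiencies quoted in this article; the pairing of stoves in this sketch is mine, but the percentages are the ones given above:

```python
# Thermal efficiencies quoted in this article (primary-energy basis for the modern stoves).
improved_biomass = {"rocket stove": (0.23, 0.54), "wood gas stove": (0.44, 0.46)}
modern = {"electric cooktop": 0.13, "induction": 0.15, "microwave": 0.19, "gas hob": 0.23}

for stove, (low, high) in improved_biomass.items():
    midpoint = (low + high) / 2
    for appliance, eff in modern.items():
        print(f"{stove} vs {appliance}: ~{midpoint / eff:.1f}x more efficient")
# Most pairings land between two and three; the gas hob, the best of the modern stoves,
# is the only one that comes close to the improved biomass designs.
```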

The low efficiency of modern cooking devices may surprise people, as these are not the figures that are usually presented in sales brochures or consumer reports. For example, the Californian Consumer Energy Center gives an efficiency level of 90% for an electric induction cooker, 65% for a standard electric range, and 55% for a gas burner. [10]

Power Conversion Losses
The main discrepancy arises because these figures do not take into account that electricity first has to be produced in power plants, which sometimes convert less than a third of the primary energy into electricity. [11] This is not an issue with gas or biomass stoves, where a primary fuel is directly converted into heat for cooking. [12] But it does have a destructive effect on the thermal efficiency of any electric cooking device, be it an electric hob or a microwave. In the graphic below, power conversion losses are indicated by the dark blue bars.

The VHK study assumes an electric grid efficiency of 40%. This figure takes into account power generation and distribution losses, as well as fuel extraction and a projected saving on these issues over an average product life of 10-15 years. [8] It should be noted that this percentage corresponds to a global average, including the use of renewables and atomic energy. Depending on the country, grid efficiency can be higher or lower. [13]

Image above: Boiling water preparation energy impact (kWh primary energy for 1,000 litres of useful boiled water per year) for different cooking devices. Dark blue: power generation loss. Light blue: heat loss. Red: theoretical minimum. Pink: production, distribution, end-of-life. Pink: extra boiling time. Purple: standby. Green: over-filling. Source: [8].
If we look only at the different types of thermal power plants, we find that the thermal efficiency of a traditional coal plant (81% of all coal-based power plants in use) is only 25 to 37%, while that of a common direct-combustion biomass power plant is only 20%. [13][14] At the world level, the average energy efficiency of thermal power plants is 36%. [13] These percentages must be further reduced by electric transmission and distribution losses, which average 6% in Europe, 7% in the USA, and 9% worldwide. [13]

This means that if your electric stove is operated by electricity from a biomass power plant -- a fast growing "green" trend nowadays -- the power conversion efficiency is three to four times lower (11-14%) than the authors of the study assume, and thermal efficiency drops to about 5%. This is similar to the thermal efficiency of a neglected open fire, and one-tenth the thermal efficiency of a rocket stove. Likewise, a cookstove which uses coal or gas directly to heat food is much more energy efficient than a cookstove that runs on electricity produced by a coal or gas power plant.
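That "about 5 percent" can be reproduced from figures already given in this article; note that the at-the-plug efficiency below is derived by dividing VHK's 13% primary-energy figure by the 40% grid efficiency the study assumes, rather than quoted directly:

```python
# End-to-end efficiency of an electric cooktop fed by a direct-combustion biomass power plant.
vhk_primary_efficiency = 0.13          # electric cooktop, primary-energy basis [8]
vhk_grid_efficiency = 0.40             # grid efficiency assumed by the VHK study [8]
at_plug_efficiency = vhk_primary_efficiency / vhk_grid_efficiency  # ~0.325 (derived, not quoted)

biomass_plant_efficiency = 0.20        # common direct-combustion biomass plant [13][14]
distribution_loss = 0.09               # world-average transmission and distribution loss [13]

end_to_end = biomass_plant_efficiency * (1 - distribution_loss) * at_plug_efficiency
print(f"Cooking efficiency on biomass electricity: ~{end_to_end:.1%}")
# Roughly 6%, in the neighbourhood of the "about 5%" cited above and on a par with a neglected open fire.
```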

Evidently, there is something wrong with the western approach to sustainability. Converting heat into electricity which is then converted back into heat, at 20-40% efficiency, is similar to building a Rube Goldberg machine; it's a needlessly complex operation compared to simply converting the primary fuel into heat to boil water. Essentially, any electric cooking device is an insult to the science of thermodynamics.

Heat Transfer Loss
A second problem is that the high efficiency figures given in sales brochures and consumer reports underestimate the heat loss that occurs during the heat transfer from cooking stove to cooking pot (shown by the light blue bars in the graphic above). This heat loss is present with all cooking stoves, but is especially high in the case of gas hobs. In the graphic above, the red bar shows the minimum energy it takes to boil 1,000 litres of water, assuming no energy loss during the heat transfer between the cooking stove and the water. This value is 105 kWh/yr for a starting cold water temperature of 10 degrees Celsius.

Energy losses arise for three reasons. Firstly, some heat from the cooking fire escapes before it can reach the cooking vessel. Secondly, some heat from the cooking fire is used to heat up the cooking pot, which constantly loses heat to the environment. Lastly, heat is wasted because some of the boiling water escapes through evaporation. While the red bar is logically the same for every cooking device, the light blue bar showing the additional energy required to compensate for heat transfer loss varies from 57 kWh/yr for an electric induction stove to 255 kWh/yr for a gas hob.
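The 105 kWh minimum and the size of the light blue bars together imply a heat-transfer efficiency for each appliance; a short sketch of that arithmetic (the "implied efficiency" is a derivation of mine and lumps pot losses and evaporation together):

```python
# Minimum energy to bring 1,000 litres of water from 10 degrees C to the boil,
# plus the heat-transfer efficiency implied by the extra energy each appliance needs.
SPECIFIC_HEAT_WATER = 4.186  # kJ per kg per degree C

minimum_kwh = 1000 * SPECIFIC_HEAT_WATER * (100 - 10) / 3600  # kJ converted to kWh
print(f"Theoretical minimum (red bar): {minimum_kwh:.0f} kWh/yr")  # ~105 kWh/yr

extra_kwh = {"electric induction stove": 57, "gas hob": 255}  # light blue bars [8]
for appliance, extra in extra_kwh.items():
    implied_efficiency = minimum_kwh / (minimum_kwh + extra)
    print(f"{appliance}: implied heat-transfer efficiency ~{implied_efficiency:.0%}")
# Roughly 65% for the induction stove and 29% for the gas hob.
```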
Image above: Gas stoves have the largest heat transfer losses of all modern cooking stoves. Picture: Ashley Bischoff @ Flickr.

Heat transfer loss is not fully accounted for in most testing standards for cooking appliances. For example, the US standard uses a test in which the heat transfer efficiency of a cooking top is established by heating up aluminum cylinders of certain dimensions rather than pots of water. [15][16] This avoids the complex phase change from liquid to vapour and is thus more reproducible.

However, because all the heat absorbed by the cylinder is counted as useful, the test ignores that in real-life situations some energy -- notably the energy needed to heat up the pot or kettle itself -- is wasted. Accounting only for the energy lost in heating the pot itself, energy efficiency drops by about 10% from the figures given by standard tests, VHK concludes. [8]

Furthermore, the US test is modeled after the process of boiling food on all burners or hot plates simultaneously, which is not always the case. Heat transfer losses are larger when only one or two pots are on the fire.

Image above: An outdoor three-stone fire. Image: Global Alliance for Clean Cookstoves.

Apart from power conversion losses and heat transfer losses, the remainder of the energy losses are due to production, distribution and disposal of cooking devices (embodied energy), standby losses (which are only relevant for microwaves, induction stoves and sophisticated gas stoves), and cooking habits. These factors have a relatively small influence.

Of all the energy losses involved in modern cooking appliances, only heat transfer loss applies to cooking devices in poorer households. There are no power conversion losses, fuel is mostly gathered by hand, there are no standby losses, and embodied energy is negligible as most devices are home-made.

Indoor Air Pollution in Rich vs. Poor Households
While the thermal efficiency of modern cooking devices is clearly inferior to that of a well-tended three-stone fire or rocket stove, they do have an advantage when it comes to indoor air pollution. However, this is not a black-and-white issue either. Air pollution levels depend on what you're cooking, how skillful you are, and which technology you use.

In the worst case scenario, pollution levels in modern kitchens can be similar to those of a well-tended three-stone fire indoors. This is not to say that the problem of indoor pollution in poor households is overstated, but rather that cooking in modern kitchens is not always as clean as we assume it to be.

Particulate matter (PM) is considered the single best indicator of potential harm in air quality. [4] In poor households where indoor cooking happens with crude stoves or open fires, PM levels vary from 200 to 5,000 ug/m3 over a 24-hour period, and from 300 to 20,000 ug/m3 during the actual use of stoves. [17][18][19] The Partnership for Clean Indoor Air measured PM emissions for a well-tended three-stone fire, which resulted in values of between 281 and 2,004 ug/m3 while cooking. [4]

Image above: Indoor cooking with biomass stoves. Image: Global Alliance for Clean Cookstoves.

Similar research undertaken in a kitchen equipped with modern technology found PM concentrations in the kitchen, living room and bedroom ranging from below the detection limit to 3,880 ug/m3 during 32 different cooking tests with gas and electric ranges. [20] The median and average concentrations of PM during the 32 cooking tests exceeded ambient air quality standards (which are 150 ug/m3 for PM10 and 65 ug/m3 for PM2.5). These values come close to the best-case scenarios in poor households.

Importantly, cooking pollutants are not caused by the burning of gas or fuel alone, but also by the cooking process itself. PM2.5 concentrations were over 1,000 ug/m3 during stovetop stir-frying, baking lasagna in the gas oven, and frying tortillas in oil on the range-top burner. The authors conclude that:

"Very high levels of several pollutants were measured in indoor air during different types of cooking activities. The levels measured for some cooking activities exceeded health-based standards and guidelines, and could pose a risk to home occupants, especially susceptible groups of the population such as young children and the elderly."

Unfortunately, gas stoves -- which have the highest thermal efficiency of all modern cooking stoves -- produce the most air pollution in modern kitchens. [20] The average indoor PM emissions for gas stoves can amount to 25% of those of biomass cooking stoves. [19] A 2014 study estimates that 60 percent of homes in California that cook at least once a week with a gas stove can reach pollutant levels of CO, NO2 and formaldehyde that would be illegal if found outdoors. [21] The authors state that:


"If these were conditions that were outdoors the EPA (Environmental Protection Agency) would be cracking down. But since it's in people's homes, there's no regulation requiring anyone to fix it. Reducing people's exposure to pollutants from gas stoves should be a public health priority."

Air Pollution and Greenhouse Gas Emissions
Obviously, indoor cooking with an electric stove is the healthiest option, albeit not totally free from producing indoor air pollution. However, electric stoves are only "clean" because they emit most of their pollution elsewhere -- at the smokestacks of the power plant. Any biomass stove design with a chimney basically achieves the same. If a chimney is added to an indoor biomass stove, indoor air pollution drops to almost zero. [4]

Image above: A clean cookstove in India. Image: Global Alliance for Clean Cookstoves.

And while the burning of coal or gas emits less air pollution and greenhouse gases than the burning of biomass per unit of energy produced [22], you have to burn more fuel in order to make up for the power conversion losses. Especially if your electric stove runs on electricity from a biomass power plant, then air pollution and greenhouse gas emissions are much higher than in the case of a biomass stove.

On the other hand, if we consider biomass to be climate neutral over time because the harvested forest gets a chance to grow back, then a biomass stove beats all other cooking methods when it comes to greenhouse gas emissions. The same goes for the cooking stove powered by electricity from biomass, although it would produce considerably more air pollution than the biomass stove, and require a much larger area of sustainably managed forest.

What's the solution?
When the German Wuppertal Institute investigated the potential for improved energy efficiency of cooking stoves on a global scale, it concluded that energy use could be halved. [2] It is remarkable, though, how the proposed solutions for this inefficiency differ between poor and rich countries. In the developing world, the focus is mainly on designing more efficient biomass stoves that produce fewer pollutants. While the savings achieved by switching to biogas would be larger, the investment would be 30 times higher than that required to distribute improved wood cooking stoves. [2]
Image above: An improved biomass cookstove in India. Source: Global Alliance for Clean Cookstoves.

For the developed world, the Wuppertal Institute focuses on a much more costly measure: extending the use of the most efficient types of "western" stoves, such as the electric induction hob. However, as we have seen, these stoves are far less efficient than the improved biomass stoves, and they are also more expensive. The authors infer that, compared to developing countries, energy saving potentials with modern cooking stoves are far smaller and less cost-efficient. But as is apparent from the inefficiencies of western cooking technology, the energy savings potential is, in reality, larger.

One possibility for the West to improve the sustainability of its cooking stoves, not mentioned by the Wuppertal Institute, is to generate electricity by wind, solar or water energy. If electricity is generated by renewable energy, electric hobs and microwaves suddenly beat all other cooking stoves when it comes to efficiency, air pollution and greenhouse gas emissions. That being said, using renewable energy to produce electricity to create heat for cooking remains a needlessly complex and costly approach to make cooking more sustainable.

There are some obvious but often overlooked solutions that would make cooking close to 100% sustainable in rich and poor countries alike. See our follow-up article: "If we insulate our houses, why not our cooking pots?".



Notes & Sources
[1] "What users can save with energy-efficient cooking stoves and ovens", Oliver Adria and Jan Bethge, October 2013.
[2] "The overall worldwide saving potential from domestic cooking stoves and ovens", Oliver Adria and Jan Bethge, October 2013.
[3] As of 2012, the Partnership for Clean Indoor Air (PCIA) has integrated with the Global Alliance for Clean Cookstoves.
[4] "Test Results of Cook Stove Performance", Partnership for Clean Indoor Air, 2012. See Appendix C for the University of California Berkeley (UCB) Water Boiling Test (WBT) protocols.
[5] "A laboratory comparison of the global warming impact of five major types of biomass cooking stoves", Nordica MacCarthy, 2008
[6] Thermal efficiency rewards the production of excess steam, while specific consumption penalizes it. For the pros and contras of both testing approaches, see [4], page 76-77.
[7] These percentages concern the outer values of different test procedures and during different stages of the cooking process. The thermal efficiency of a rocket stove is especially high when bringing water to boil but its advantage is much smaller during simmering.
[8] "Quooker Energy Analysis", Part one, Van Holsteijn en Kemna B.V. (VHK), March 2010.
[9] The heat transfer efficiency figures chosen by [8] are based on a typical mixed use of cooking stoves, in which the energy is used both for preparing meals and for hot drinks. Since boiling smaller amounts of water for hot drinks is somewhat less efficient, this approach underestimates the heat transfer efficiency of cooking food. However, to be on the safe side, the researchers are rather conservative in their revision of heat transfer efficiencies (see chapters 2.2 & 2.4), so the difference must be small.
[10] "Stoves, Ranges and Ovens", Consumer Energy Center, California Energy Commission.
[11] The average efficiency of a coal plant is 35%. See: "Power generation from coal: Measuring and Reporting Efficiency Performance and CO2 Performance", OECD/IEA, 2010.
[12] It should be noted that the energy losses of the natural gas distribution network can be rather large, and this fact does not seem to be taken into account in the study. The thermal efficiency of gas stoves may thus be overstated. The same goes for the greenhouse gas emissions, mainly due to methane leaks during gas production.
[13] "The state of global energy efficiency: global and sectorial energy efficiency trends", Enerdata.
[14] "How is biomass energy used?", Canadian Centre for Energy Information.
[15] "Test Procedure for Residential Kitchen Ranges and Ovens", US Department of Energy, 1997. For related documents, see "Residential Kitchen Ranges and Ovens".
[16] "Evaluation of Kitchen Cooking Appliance Efficiency Test Procedures", Steven Nabinger, US Department of Commerce, 1999
[17] "Smoke, health and household energy. Volume 1", Liz Bates, 2005.
[18] "The health effects of indoor air pollution exposure in developing countries", WHO, 2002
[19] "health effects of chronic exposure to smoke from biomass fuel burning in rural areas", WHO India, 2007
[20] "Indoor Air Quality: Residential Cooking Exposures", R. Fortmann et al., State of California Air Resources Board, 2001
[21] "Pollutant exposures from natural gas cooking burners: a simulation-based assessment for southern california", Environmental Health Perspectives, January 2014.
[22] "Trees, Trash, and Toxics: How Biomass Energy Has Become the New Coal", Mary . Booth, Partnership for Policy Integrity, April 2014

See also:
http://www.lowtechmagazine.com/2014/07/cooking-pot-insulation-key-to-sustainable-cooking.html

.