Showing posts with label Elderly. Show all posts

Eat less - Live longer

SUBHEAD: The secret to a long, healthy life may be to seriously reduce your caloric intake.

By Alex Riley on 1 June 2017 for The BBC -
(http://www.bbc.com/future/story/20170601-the-secret-to-a-long-and-healthy-life-eat-less)


Image above: The Dinner Box Strategy - McDonald's profits from selling an insane amount of food for $9.99. From (http://www.businessinsider.com/mcdonalds-dinner-box-strategy-2014-6).

Permanently cutting the daily calories you consume may turn out to have a profound effect on your future life, according to some tantalising scientific studies.

In a restaurant setting sometime in the not-too-distant future, a man and a woman are on their first date. After the initial nerves subside, all is going well.

The man is 33, he says, has been single for most of those years, and, although he doesn’t mention it, knows he is looking to settle down and have a family. The woman replies that she is 52, has been married, divorced, and has children in their early 20s. He had no idea – she looked his age, or younger.

This is a dream of Julie Mattison from the National Institute on Ageing (NIA) in the United States. She envisions a time when chronological age ticks by with every year, but biological age can be set to a different timer, where elderly doesn’t mean what it does now.

It sounds far-fetched, but our society has already made great strides towards that goal, thanks to advances in medicine and improvements in healthy living. In 2014, for instance, the United States Health Interview Survey reported that 16% of people aged between 50 and 64 were impaired every day with chronic illness. Three decades earlier that number was 23%.

In other words, as well as benefiting from longer lifespans, we are also experiencing longer “healthspans” – and the latter is proving to be even more malleable. To paraphrase and update a speech from John F Kennedy given at the first White House Conference on Ageing in 1961, life can indeed be added to years, rather than just years added to life.



So, what do we need to do to enhance the length and quality of our lives even more? Researchers worldwide are pursuing various ideas, but for Mattison and colleagues, the answer is a simple change in diet.

They believe that the key to a better old age may be to reduce the amount of food on our plates, via an approach called “calorie restriction”. This diet goes further than cutting back on fatty foods from time-to-time; it’s about making gradual and careful reductions in portion size permanently.

Since the early 1930s, a 30% reduction in the amount of food consumed per day has been linked to longer, more active lives in worms, flies, rats, mice, and monkeys. Across the animal kingdom, in other words, calorie restriction has proven the best remedy for the ravages of life. And it’s possible that humans have just as much to gain.

The idea that what a person eats influences their health no doubt predates any historical accounts that remain today. But, as is often the case for any scientific discipline, the first detailed accounts come from Ancient Greece.

Hippocrates, one of the first physicians to claim diseases were natural and not supernatural, observed that many ailments were associated with gluttony; obese Greeks tended to die younger than slim Greeks, a pattern clearly noted and recorded on papyrus.

Spreading from this epicentre of science, these ideas were adopted and adapted over the centuries. And at the end of the 15th Century, Alvise Cornaro, an infirm aristocrat from a small village near Venice in Italy, turned the prevailing wisdom on its head, and on himself.

If indulgence was harmful, would dietary asceticism be helpful? To find out, Cornaro, aged 40, ate only 350g (12oz) of food per day, roughly 1000 calories according to recent estimates. He ate bread, panatela or broth, and eggs.

For meat he chose veal, goat, beef, partridge, thrush, and any poultry that was available. He bought fish caught from the local rivers.

Restricted in amount but not variety, Cornaro claimed to have achieved “perfect health” up until his death more than 40 years later.

Although he changed his birthdate as he aged, claiming that he had reached his 98th year, it is thought that he was around 84 when he died – still an impressive feat in the 16th Century, a time when 50 or 60 years old was considered elderly.

In 1591, his grandson published his posthumous three-volume tome entitled “Discourses on the Sober Life,” pushing dietary restriction into the mainstream, and redefining ageing itself.

With an additional boost of health into the evening of life, the elderly, in full possession of their mental capacities, would be able to put decades of amassed knowledge to good use, Cornaro claimed. With his diet, beauty became the aged, not the youthful.

Cornaro was an interesting man but his findings are not to be taken as fact by any branch of science. Even if he was true to his word and did not suffer ill health for nearly half a century, which seems unlikely, he was a case study of one – not representative of humans as a whole.

But since a foundational study in 1935 in white rats, a dietary restriction of between 30% and 50% has been shown to extend lifespan, delaying death from age-related disorders and disease. Of course, what works for a rat or any other laboratory organism might not work for a human.

Long-term trials, following humans from early adulthood to death, are a rarity. “I don’t see a human study of longevity as something that would be a fundable research programme,” says Mattison. “Even if you start humans at 40 or 50 years old, you’re still looking at potentially 40 or 50 more years [of study].”

Plus, she adds, ensuring that extraneous factors – exercise, smoking, medical treatments, mental wellbeing – don’t influence the trial’s end results is near impossible for our socially and culturally complex species.

That’s why, in the late 1980s, two independent long-term trials – one at NIA and the other at the University of Wisconsin – were set up to study calorie restriction and ageing in Rhesus monkeys. Not only do we share 93% of our DNA with these primates, we age in the same way too.

Slowly, after middle age (around 15 years in Rhesus monkeys) the back starts to hunch, the skin and muscles start to sag, and, where it still grows, hair goes from gingery brown to grey. The similarities go deeper.

In these primates, the occurrence of cancer, diabetes, and heart disease increases in frequency and severity with age. “They’re an excellent model to study ageing,” says Rozalyn Anderson, a gerontologist from the University of Wisconsin.


Image above: A young rhesus monkey eats while riding the back of its mother. From original article.


And they’re easy to control. Fed with specially made biscuits, the diets of the 76 monkeys at the University of Wisconsin and the 121 at NIA are tailored to their age, weight, and natural appetite.

All monkeys receive the full complement of nutrients and minerals that their bodies crave. It’s just that half of the monkeys, the calorie restricted (or CR) group, eat 30% less.

They are far from malnourished or starving. Take Sherman, a 43-year-old monkey from NIA. Mattison says that since being placed on the CR diet in 1987, aged 16, Sherman hasn’t shown any overt signs of hunger that are well characterised in his species.

Sherman is the oldest Rhesus monkey ever recorded, nearly 20 years older than the average lifespan for his species in captivity. As younger monkeys were developing diseases and dying, he seemed to be immune to ageing. Even into his 30s he would have been considered an old monkey, but he didn’t look or act like one.

The same is true, to varying extents, for the rest of his experimental troop at NIA. “We have a lower incidence of diabetes, and lower incidence of cancer in the CR groups,” says Mattison. In 2009, the University of Wisconsin trial published similarly spectacular results.

Not only did their CR monkeys look remarkably younger – with more hair, less sag, and brown instead of grey – than monkeys that were fed a standard diet, they were healthier on the inside too, free from pathology. Cancers, such as the common intestinal adenocarcinoma, were reduced by over 50%.

The risk of heart disease was similarly halved. And while 11 of the ad libitum (“at one’s pleasure,” in Latin) monkeys developed diabetes and five exhibited signs that they were pre-diabetic, blood glucose regulation seemed healthy in all CR monkeys. For them, diabetes wasn’t a thing.

Overall, only 13% of the monkeys in the CR group had died of age-related causes in 20 years. In the ad libitum group, 37% had died, nearly three times as many. In an update study from the University of Wisconsin in 2014, this percentage remained stable.



“We have demonstrated that ageing can be manipulated in primates,” says Anderson. “It kind of gets glossed over because it’s obvious, but conceptually that’s hugely important; it means that ageing itself is a reasonable target for clinical intervention and medical treatment.”

If ageing can be delayed, in other words, all of the diseases associated with it will follow suit. “Going after each disease one at a time isn’t going to significantly extend lifespan for people because they’ll die of something else,” says Anderson. “If you cured all cancers, you wouldn’t offset death due to cardiovascular disease, or dementia, or diabetes-associated disorders. Whereas if you go after ageing you can offset the lot in one go.”

Eating less certainly seemed to help the monkeys, but calorie restriction is much tougher for people out in the real world. For one, our access to regular, high-calorie meals is now easier than ever; with companies like Deliveroo and UberEats, there is no longer any need even to walk to the restaurant. And two, gaining weight simply comes more naturally to some people.

“There’s a huge genetic component to all of this and it’s much harder work for some people than it is for others to stay trim,” says Anderson. “We all know someone who can eat an entire cake and nothing happens, they look the exact same. And then someone else walks past a table with a cake on it and they have to go up a pant size.”

Ideally, the amount and types of food we eat should be tailored to who we are – our genetic predisposition to gaining weight, how we metabolise sugars, how we store fat, and other physiological fluxes that are beyond the scope of scientific instruction at the moment, and perhaps forever.

But a predisposition to obesity can be used as a guide to life choices rather than an inevitability. “I personally have a genetic history of obesity running through my family, and I practice a flexible form of caloric restriction,” says Susan Roberts, a dietary scientist at Tufts University in Boston.

“I keep my BMI at 22, and [have calculated] that that requires eating 80% of what I would eat if my BMI was at 30 like every other member of my family.”

Roberts stresses that it isn’t hard – she follows her own weight management programme using a tool called iDiet to help her eat less but avoid feeling hungry or deprived of enjoyment. If this wasn’t possible, she adds, she wouldn’t practise calorie restriction.

Not only has Roberts seen the problems of obesity first-hand in her family, she knows the benefits of CR better than most. For over 10 years she has been a leading scientist in the Comprehensive Assessment of Long-Term Effects of Reducing Intake of Energy trial, also known as Calerie.

Over two years, 218 healthy men and women aged between 21 and 50 years were split into two groups.  In one, people were allowed to eat as they normally would (ad libitum), while the other ate 25% less (CR). Both had health checks every six months.

Unlike in the Rhesus monkey trials, tests over two years can’t determine whether CR reduces or delays age-related diseases. There simply isn’t enough time for their development. But the Calerie trials tested for the next best thing: the early biological signs of heart disease, cancer, and diabetes.

Published in 2015, the results after two years were very positive. In the blood of calorie-restricted people, the ratio of “good” cholesterol to “bad” cholesterol had increased, molecules associated with tumour formation – called tumour necrosis factors (TNFs) – were reduced by around 25%, and levels of insulin resistance, a warning sign of diabetes, fell by nearly 40% compared to people who ate their normal diets. Overall blood pressure was lower, too.


Admittedly, some benefits may come from weight loss. Earlier trials from Calerie had included people who were obese as well as those with a healthy body mass index (BMI) of 25 or below, and slimming down would have certainly improved the welfare of the heavier participants.

“One thing that’s been very clear for a long time is that being overweight or obese is bad for you,” says Roberts. Diseases and disorders previously thought to be age-associated diseases are now popping up in the obese population, she adds.

But the latest results suggested that significant health benefits can be garnered in an already healthy body – a person who isn’t underweight or obese. That is, someone whose BMI lies between 18.5 and 25.

Despite these results, evidence from further trials will be needed before someone with an already healthy BMI should be advised to reduce their calorie intake. (And anyone wanting to change their diet would be advised to consult a medical professional beforehand.)
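The BMI figures quoted above follow the standard formula: weight in kilograms divided by the square of height in metres. As a purely illustrative sketch (the function names are ours, not from the Calerie trial; only the formula and the 18.5–25 healthy range come from the text), the arithmetic looks like this:

```python
def bmi(weight_kg, height_m):
    """Body mass index: weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def bmi_category(value):
    """Classify a BMI value against the healthy range quoted above (18.5-25)."""
    if value < 18.5:
        return "underweight"
    if value <= 25:
        return "healthy"
    return "overweight or obese"

# A 70 kg person who is 1.75 m tall sits comfortably in the healthy range.
print(round(bmi(70, 1.75), 1))      # 22.9
print(bmi_category(bmi(70, 1.75)))  # healthy
```

By this measure, Roberts' target BMI of 22 places her well inside the healthy band her trial participants occupied.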

In the meantime, the scientists will be hoping that their rhesus macaques may help us to understand exactly why calorie restriction may have these effects. With nearly 30 years of data on lives and deaths, and blood and tissue samples, from nearly 200 monkeys, the work at NIA and the University of Wisconsin aims to shine a light into the black box of calorie restriction, illuminating just how it delays ageing.

With less food, is the metabolism forced to be more efficient with what it has? Is there a common molecular switch regulating ageing that is turned on (or off) with fewer calories? Or is there an as-yet-unknown mechanism underpinning our lives and deaths? The importance of monkeys like Sherman far outspans their lives.


Image above: Old and obese people finding pleasure at an ice cream store. From original article.


Answers to such questions might be long in coming. “If I cloned 10 of myself and we all worked furiously, I don’t think we’d have it solved,” says Anderson. “The biology is inordinately complicated.”

It’s a worthwhile undertaking: if we can understand how CR works, other treatments could then be used to target that specific part of our biology. Ageing could be treated directly, that is, without the need for calorie restriction. “And I think that’s really the golden ticket,” says Anderson.

Although lacking a neat explanation, calorie restriction is one of the most promising avenues for improving health and how long it lasts in our lives. “There was nothing in what we saw that made us think caloric restriction doesn’t work in people,” says Roberts, from the Calerie trial.

And, unlike drug-based treatments, it doesn’t come with a long list of possible side effects. “Our people were not hungrier, their mood was fine, their sexual function was fine. We looked pretty hard for bad things and didn’t find them,” says Roberts.

One expected issue was a slight decrease in bone density that is often tied to gradual weight loss, says Roberts. But as a precaution, volunteers were provided with small calcium supplements throughout the trial.

Even with such promising findings, “this [the Calerie trial] is the first study of its kind, and I don’t think that any of us would feel confident in saying, ‘okay, we’re going to recommend this to everyone in the world,’” says Roberts.

“But it’s a really exciting prospect. I think that delaying the progression of chronic diseases is something that everyone can get behind and get excited about, because nobody wants to live life with one of those.”

.

Climate change as genocide

SUBHEAD: Not since WWII have more human beings been at risk from disease and starvation than now.

By Michael Klare on 21 April 2017 for Resilience -
(http://www.resilience.org/stories/2017-04-21/climate-change-genocide/)


Image above: Photo of a young man in drought conditions in Ethiopia in 2008. From original article.

On March 10th, Stephen O’Brien, under secretary-general of the United Nations for humanitarian affairs, informed the Security Council that 20 million people in three African countries — Nigeria, Somalia, and South Sudan — as well as in Yemen were likely to die if not provided with emergency food and medical aid.

“We are at a critical point in history,” he declared. “Already at the beginning of the year we are facing the largest humanitarian crisis since the creation of the U.N.”  Without coordinated international action, he added, “people will simply starve to death [or] suffer and die from disease.”

Major famines have, of course, occurred before, but never in memory on such a scale in four places simultaneously. According to O’Brien, 7.3 million people are at risk in Yemen, 5.1 million in the Lake Chad area of northeastern Nigeria, 5 million in South Sudan, and 2.9 million in Somalia.

In each of these countries, some lethal combination of war, persistent drought, and political instability is causing drastic cuts in essential food and water supplies. Of those 20 million people at risk of death, an estimated 1.4 million are young children.

Despite the potential severity of the crisis, U.N. officials remain confident that many of those at risk can be saved if sufficient food and medical assistance is provided in time and the warring parties allow humanitarian aid workers to reach those in the greatest need.

“We have strategic, coordinated, and prioritized plans in every country,” O’Brien said. “With sufficient and timely financial support, humanitarians can still help to prevent the worst-case scenario.”

All in all, the cost of such an intervention is not great: an estimated $4.4 billion to implement that U.N. action plan and save most of those 20 million lives.

The international response? Essentially, a giant shrug of indifference.

To have time to deliver sufficient supplies, U.N. officials indicated that the money would need to be in pocket by the end of March. It’s now April and international donors have given only a paltry $423 million — less than a tenth of what’s needed.

While, for instance, President Donald Trump sought Congressional approval for a $54 billion increase in U.S. military spending (bringing total defense expenditures in the coming year to $603 billion) and launched $89 million worth of Tomahawk missiles against a single Syrian air base, the U.S. has offered precious little to allay the coming disaster in three countries in which it has taken military actions in recent years.

As if to add insult to injury, on February 15th Trump told Nigerian President Muhammadu Buhari that he was inclined to sell his country 12 Super-Tucano light-strike aircraft, potentially depleting Nigeria of $600 million it desperately needs for famine relief.

Moreover, just as those U.N. officials were pleading fruitlessly for increased humanitarian funding and an end to the fierce and complex set of conflicts in South Sudan and Yemen (so that they could facilitate the safe delivery of emergency food supplies to those countries), the Trump administration was announcing plans to reduce American contributions to the United Nations by 40%.

It was also preparing to send additional weaponry to Saudi Arabia, the country most responsible for devastating air strikes on Yemen’s food and water infrastructure. This goes beyond indifference.  This is complicity in mass extermination.

Like many people around the world, President Trump was horrified by images of young children suffocating from the nerve gas used by Syrian government forces in an April 4th raid on the rebel-held village of Khan Sheikhoun.

“That attack on children yesterday had a big impact on me — big impact,” he told reporters. “That was a horrible, horrible thing. And I’ve been watching it and seeing it, and it doesn’t get any worse than that.” In reaction to those images, he ordered a barrage of cruise missile strikes on a Syrian air base the following day.

But Trump does not seem to have seen — or has ignored — equally heart-rending images of young children dying from the spreading famines in Africa and Yemen.

Those children evidently don’t merit White House sympathy.

Who knows why not just Donald Trump but the world is proving so indifferent to the famines of 2017?

It could simply be donor fatigue or a media focused on the daily psychodrama that is now Washington, or growing fears about the unprecedented global refugee crisis and, of course, terrorism.  It’s a question worth a piece in itself, but I want to explore another one entirely.

Here’s the question I think we all should be asking: Is this what a world battered by climate change will be like — one in which tens of millions, even hundreds of millions of people perish from disease, starvation, and heat prostration while the rest of us, living in less exposed areas, essentially do nothing to prevent their annihilation?

Famine, Drought, and Climate Change
First, though, let’s consider whether the famines of 2017 are even a valid indicator of what a climate-changed planet might look like.

After all, severe famines accompanied by widespread starvation have occurred throughout human history. In addition, the brutal armed conflicts now underway in Nigeria, Somalia, South Sudan, and Yemen are at least in part responsible for the spreading famines.

In all four countries, there are forces — Boko Haram in Nigeria, al-Shabaab in Somalia, assorted militias and the government in South Sudan, and Saudi-backed forces in Yemen — interfering with the delivery of aid supplies.

Nevertheless, there can be no doubt that pervasive water scarcity and prolonged drought (expected consequences of global warming) are contributing significantly to the disastrous conditions in most of them.

The likelihood that droughts this severe would be occurring simultaneously in the absence of climate change is vanishingly small.

In fact, scientists generally agree that global warming will ensure diminished rainfall and ever more frequent droughts over much of Africa and the Middle East. This, in turn, will heighten conflicts of every sort and endanger basic survival in a myriad of ways.

In their most recent 2014 assessment of global trends, the scientists of the prestigious Intergovernmental Panel on Climate Change (IPCC) concluded that “agriculture in Africa will face significant challenges in adapting to climate changes projected to occur by mid-century, as negative effects of high temperatures become increasingly prominent.”

Even in 2014, as that report suggested, climate change was already contributing to water scarcity and persistent drought conditions in large parts of Africa and the Middle East. Scientific studies had, for instance, revealed an “overall expansion of desert and contraction of vegetated areas” on that continent.

With arable land in retreat and water supplies falling, crop yields were already in decline in many areas, while malnutrition rates were rising — precisely the conditions witnessed in more extreme forms in the famine-affected areas today.

It’s seldom possible to attribute any specific weather-induced event, including droughts or storms, to global warming with absolute certainty.

Such things happen with or without climate change.  Nonetheless, scientists are becoming even more confident that severe storms and droughts (especially when occurring in tandem or in several parts of the world at once) are best explained as climate-change related.

If, for instance, a type of storm that might normally occur only once every hundred years occurs twice in one decade and four times in the next, you can be reasonably confident that you’re in a new climate era.

It will undoubtedly take more time for scientists to determine to what extent the current famines in Africa and Yemen are mainly climate-change-induced and to what extent they are the product of political and military mayhem and disarray. But doesn’t this already offer us a sense of just what kind of world we are now entering?

History and social science research indicate that, as environmental conditions deteriorate, people will naturally compete over access to vital materials and the opportunists in any society — warlords, militia leaders, demagogues, government officials, and the like — will exploit such clashes for their personal advantage.

“The data suggests a definite link between food insecurity and conflict,” points out Ertharin Cousin, head of the U.N.’s World Food Program.  “Climate is an added stress factor.”

In this sense, the current famines in Nigeria, Somalia, South Sudan, and Yemen provide us with a perfect template for our future, one in which resource wars and climate mayhem team up as temperatures continue their steady rise.

The Selective Impact of Climate Change
In some popular accounts of the future depredations of climate change, there is a tendency to suggest that its effects will be felt more or less democratically around the globe — that we will all suffer to some degree, if not equally, from the bad things that happen as temperatures rise.

And it’s certainly true that everyone on this planet will feel the effects of global warming in some fashion, but don’t for a second imagine that the harshest effects will be distributed anything but deeply inequitably.  It won’t even be a complicated equation.

As with so much else, those at the bottom rungs of society — the poor, the marginalized, and those in countries already at or near the edge — will suffer so much more (and so much earlier) than those at the top and in the most developed, wealthiest countries.

As a start, the geophysical dynamics of climate change dictate that, when it comes to soaring temperatures and reduced rainfall, the most severe effects are likely to be felt first and worst in the tropical and subtropical regions of Africa, the Middle East, South Asia, and Latin America — home to hundreds of millions of people who depend on rain-fed agriculture to sustain themselves and their families.

Research conducted by scientists in New Zealand, Switzerland, and Great Britain found that the rise in the number of extremely hot days is already more intense in tropical latitudes and disproportionately affects poor farmers.

Living at subsistence levels, such farmers and their communities are especially vulnerable to drought and desertification.

In a future in which climate-change disasters are commonplace, they will undoubtedly be forced to choose ever more frequently between the unpalatable alternatives of starvation or flight.  In other words, if you thought the global refugee crisis was bad today, just wait a few decades.

Climate change is also intensifying the dangers faced by the poor and marginalized in another way.  As interior croplands turn to dust, ever more farmers are migrating to cities, especially coastal ones.

If you want a historical analogy, think of the great Dust Bowl migration of the “Okies” from the interior of the U.S. to the California coast in the 1930s. In today’s climate-change era, the only available housing such migrants are likely to find will be in vast and expanding shantytowns (or “informal settlements,” as they’re euphemistically called), often located in floodplains and low-lying coastal areas exposed to storm surges and sea-level rise.

As global warming advances, the victims of water scarcity and desertification will be afflicted anew.  Those storm surges will destroy the most exposed parts of the coastal mega-cities in which they will be clustered.

In other words, for the uprooted and desperate, there will be no escaping climate change.  As the latest IPCC report noted, “Poor people living in urban informal settlements, of which there are [already] about one billion worldwide, are particularly vulnerable to weather and climate effects.”

The scientific literature on climate change indicates that the lives of the poor, the marginalized, and the oppressed will be the first to be turned upside down by the effects of global warming. “The socially and economically disadvantaged and the marginalized are disproportionately affected by the impacts of climate change and extreme events,” the IPCC indicated in 2014.

“Vulnerability is often high among indigenous peoples, women, children, the elderly, and disabled people who experience multiple deprivations that inhibit them from managing daily risks and shocks.”

It should go without saying that these are also the people least responsible for the greenhouse gas emissions that cause global warming in the first place (something no less true of the countries most of them live in).

Inaction Equals Annihilation
In this context, consider the moral consequences of inaction on climate change. Once it seemed that the process of global warming would occur slowly enough to allow societies to adapt to higher temperatures without excessive disruption, and that the entire human family would somehow make this transition more or less simultaneously.

That now looks more and more like a fairy tale.

Climate change is occurring far too swiftly for all human societies to adapt to it successfully.  Only the richest are likely to succeed in even the most tenuous way.

Unless colossal efforts are undertaken now to halt the emission of greenhouse gases, those living in less affluent societies can expect to suffer from extremes of flooding, drought, starvation, disease, and death in potentially staggering numbers.

And you don’t need a Ph.D. in climatology to arrive at this conclusion either.

The overwhelming majority of the world’s scientists agree that any increase in average world temperatures that exceeds 2 degrees Celsius (3.6 degrees Fahrenheit) above the pre-industrial era — some opt for a rise of no more than 1.5 degrees Celsius — will alter the global climate system drastically.

In such a situation, a number of societies will simply disintegrate in the fashion of South Sudan today, producing staggering chaos and misery. So far, the world has heated up by at least one of those two degrees, and unless we stop burning fossil fuels in quantity soon, the 1.5 degree level will probably be reached in the not-too-distant future.

Worse yet, on our present trajectory, it seems highly unlikely that the warming process will stop at 2 or even 3 degrees Celsius, meaning that later in this century many of the worst-case climate-change scenarios — the inundation of coastal cities, the desertification of vast interior regions, and the collapse of rain-fed agriculture in many areas — will become everyday reality.

In other words, think of the developments in those three African lands and Yemen as previews of what far larger parts of our world could look like in another quarter-century or so: a world in which hundreds of millions of people are at risk of annihilation from disease or starvation, or are on the march or at sea, crossing borders, heading for the shantytowns of major cities, looking for refugee camps or other places where survival appears even minimally possible.

If the world’s response to the current famine catastrophe and the escalating fears of refugees in wealthy countries are any indication, people will die in vast numbers without hope of help.

In other words, failing to halt the advance of climate change — to the extent that halting it, at this point, remains within our power — means complicity with mass human annihilation. We know, or at this point should know, that such scenarios are already on the horizon.

We still retain the power, if not to stop them, then to radically ameliorate what they will look like, so our failure to do all we can means that we become complicit in what — not to mince words — is clearly going to be a process of climate genocide.

How can those of us in countries responsible for the majority of greenhouse gas emissions escape such a verdict?

And if such a conclusion is indeed inescapable, then each of us must do whatever we can to reduce our individual, community, and institutional contributions to global warming. Even if we are already doing a lot — as many of us are — more is needed.

Unfortunately, we Americans are living not only in a time of climate crisis, but in the era of President Trump, which means the federal government and its partners in the fossil fuel industry will be wielding their immense powers to obstruct all imaginable progress on limiting global warming.  

They will be the true perpetrators of climate genocide.

As a result, the rest of us bear a moral responsibility not just to do what we can at the local level to slow the pace of climate change, but also to engage in political struggle to counteract or neutralize the acts of Trump and company.

Only dramatic and concerted action on multiple fronts can prevent the human disasters now unfolding in Nigeria, Somalia, South Sudan, and Yemen from becoming the global norm.



.