SUBHEAD: Intelligent beings descended from five-eyed, single-tentacled Opabinia were possible, but they didn’t happen.
By John Michael Greer on 3 July 2013 for the Archdruid Report -
(http://thearchdruidreport.blogspot.com/2013/07/a-peculiar-absence-of-bellybones.html)
Image above: Rendering of an Opabinia, a single-tentacled, five-eyed arthropod species from the Cambrian period, half a billion years ago. From (http://greenanswers.com/question/what-most-interesting-animal-you-have-seen-picture/).
The fixation on imaginary “perfect storms” critiqued in last week’s post is only one expression of a habit of thinking that pervades contemporary American culture and, to a lesser extent, most other industrial societies. I’ve referred to this habit in a couple of posts in this series already, but it deserves closer attention, if only to help make sense of the way that individuals, institutions, and whole societies so often get blindsided these days by utterly predictable events.
Like several of the other themes already explored in this sequence, the habit of thinking I have in mind was explored by Oswald Spengler in The Decline of the West. His way of discussing it, though, relies on turns of phrase that don’t translate well into English, and philosophical concepts that were familiar to every reader in 1918 Germany and completely opaque to most readers in 2013 America.
To make sense of it, I’ll need to reframe the discussion by way of an excursion into deep time, so we can talk about the difference between what can happen and what does happen.
Unlike the Marcellus shale, the Barnett shale, and some of its other distant geological cousins, the Burgess shale doesn’t contain any appreciable amounts of oil or natural gas. What it does contain is a vast number of delicate fossils from the Cambrian period. It’s been argued that your ancestors and mine are there in the Burgess shale, in the form of a tiny, wriggling whatsit called Pikaia with a little strip of cartilage running down its back, the first very rough draft of what eventually turned into your backbone.
There are plenty of other critters there unlike anything that has lived since, and it’s perfectly plausible to imagine that they, rather than Pikaia, might have left descendants who evolved into the readers of this blog, but that’s not what happened.
Intelligent beings descended from five-eyed, single-tentacled Opabinia were possible; they could have happened, but they didn’t, and once that was settled, a whole world of possibilities went away forever. There was no rational reason for that exclusion; it just happened that way.
Let’s take a closer look at Pikaia, though. Study it closely, and you can just about see the fish that its distant descendants will become. The strip of cartilage runs along the upper edge of its body, where fish and all other vertebrates have their backbones. It didn’t have to be there; if Pikaia had happened to have cartilage along its lower edge instead, then fish and all the other vertebrates to come would have done just as well with a bellybone in place of a backbone, and you and I would have the knobbly bumps of vertebrae running up our abdomens and chests.
Once Pikaia came out ahead in the struggle for survival, that possibility went wherever might-have-beens spend their time. There’s no logical reason why we don’t have bellybones; it simply turned out that way, and the consequences of that event still constrain us today.
Fast forward 200 million years or so, and a few of Pikaia’s putative descendants were learning to deal with the challenges and possibilities of muddy Devonian swamps by wriggling up out of the water, and gulping air into their swim bladders to give them a bit of extra oxygen. It so happens that these fish had four large fins toward the underside of their bodies.
Many other fish at the time had other fin patterns instead, and if the successful proto-lungfish had happened to come from a lineage with six fins underneath, then amphibians, reptiles, birds, and mammals would have six limbs today instead of four.
A six-limbed body plan is perfectly viable—ask any insect—but the vertebrates that ventured onto land had four, and once that happened, the question was settled. Nothing makes six-legged mammals impossible, but there aren’t any and never will be. In an abstract sense, they can happen, but in the real world, they don’t, and it’s only history that explains why.
Today, another 400 million years later, most of the possible variables shaping life in this planet’s biosphere are very tightly constrained by an intricate network of ecological pressures rooted in the long history of the planet.
Those constraints, among other things, drive convergent evolution—the process by which living things from completely different evolutionary lineages end up looking and behaving like each other. A hundred million years ago, when the Earth had its normal hothouse climate and reptiles were the dominant vertebrates, the ichthyosaurs, a large and successful family of seagoing reptiles, evolved what we now think of as the basic dolphin look; when they went extinct and a cooling planet gave mammals the edge, seagoing mammals competing for the same ecological niche gave us today’s dolphins and porpoises.
Their ancestors, by the way, looked like furry crocodiles, and for good reason; if you’re going to fill a crocodile’s niche, as the protocetaceans did, the pressures that the rest of the biosphere brings to bear on that niche pretty much require you to look and act like a crocodile.
The lesson to be drawn from these examples, and countless others, is that evolution isn’t free to do everything that, in some abstract sense, it could possibly do. Between the limits imposed by the genetics of the organism struggling to adapt, and the even stronger limits imposed by the pressures of the environment within which that struggle is taking place, there are only so many options available, and on a planet that’s had living things evolving on it for two billion years or so, most of those options will have already been tried out at least once.
Even when something new emerges, as happens from time to time, that doesn’t mean that all bets are off; it simply means that familiar genetic and environmental constraints are going to apply in slightly different ways. That means that there are plenty of things that theoretically could happen that never will happen, because the constraints pressing on living things don’t have room for them.
That much is uncontroversial, at least among students of evolutionary ecology. Apply the same point of view to human history, though, and you can count on a firestorm of protest.
Nonetheless, that’s exactly what I’ve been trying to do in this blog over the last seven years—to point out that historical change is subject to limits imposed by the historical trajectories of societies struggling to adapt, and the even stronger limits imposed by the pressures of the environment within which that struggle is taking place; worse still, to point out that societies have an equivalent of convergent evolution, which can be studied by putting different societies side by side and comparing their historical trajectories, and that this reveals otherwise untraceable constraints and thus allows meaningful predictions to be made about the future of our own civilization.
Each of those proposals offends several of the most basic assumptions with which most people nowadays approach the future; put them all together—well, let’s just say that it’s no surprise that each weekly post here can count on fielding its quota of spit-slinging denunciations.
As regular readers of this blog know, a great many of these quarrels arrange themselves around the distinction I’ve just drawn. Whether we’re talking about 2012 or near-term human extinction or the latest claim that some piece or other of energy-related vaporware will solve the world’s increasingly intractable energy and resource shortages, my critics say, “It could happen!” and I reply, “But it won’t.”
They proceed to come up with elaborate scenarios and arguments showing that, in fact, whatever it is could possibly happen, and get the imperturbable answer, “Yes, I know all that, but it still won’t happen.”
Then it doesn’t happen, and the normal human irritation at being wrong gets thrown in the blender with a powerful sense of the unfairness of things—after all, that arrogant so-and-so of an archdruid didn’t offer a single solitary reason why whatever it was couldn’t possibly happen!—to make a cocktail that’s uncommonly hard to swallow.
There’s a reason, though, why these days the purveyors of repeatedly disproved predictions, from economists through fusion-power proponents to believers in the current end of the world du jour, so constantly use arguments about what can happen and so consistently ignore what does happen. It’s a historical reason, and it brings us a big step closer to the heart of this sequence of posts.
When Nietzsche proclaimed the death of God to a mostly uninterested 19th century, as I mentioned in an earlier post in this sequence, he was convinced that he was doing something utterly unprecedented—and he was wrong. If he’d been a little more careful about checking his claims against what he’d learned as a classical philologist, he would have remembered that the gods also died in ancient Greece in the fourth century BCE, and that the rationalist revolt against established religion in the Greek world followed the same general course as its equivalent in western Europe and the European diaspora two millennia or so later.
Put the materialist philosophers of the Greek Enlightenment side by side with the corresponding figures in its European equivalent, or line up the skeptical barbs aimed at Homer’s portrayal of the gods and goddesses of Greece with those shot at the Bible’s portrayal of the god of Christianity—by Nietzsche among others!—and the similarities are hard to miss.
What’s more, the same thing has happened elsewhere. India went through its rationalist period beginning in the sixth century BCE, giving rise to full-blown atomist and materialist philosophies as well as an important school of logic, the Nyaya; it’s indicative of the tone of that period that the two great religious movements founded then, Buddhism and Jainism, in their earliest documented forms were wholly uninterested in gods.
The equivalent period in ancient China began about a century later, with its own achievements in logic and natural science and its own dismissal of formal religion—sacrifices and rites are important for social reasons, Confucius argues, but to busy oneself excessively with them shows that one is ignorant and unreasonable.
It’s a standard element of the trajectory of literate civilizations through time. Every human society comes out of the shadows of its origins well equipped with a set of beliefs about what does happen. Since most human societies in their early phases are either wholly innocent of writing, or have lost most of a former tradition of literacy in the collapse of some previous civilization, those beliefs are normally passed down by way of the oldest and most thoroughly proven system of information storage and transfer our species has invented—that is to say, mythology: a collection of vivid, colorful stories, usually in verse, that can be learned starting in early childhood and remembered letter-perfect into advanced old age.
Since the information storage capacity of myths is large but not limitless, each myth in a mature mythology is meant to be understood and interpreted on several levels, and learning how to unpack the stories is an essential part of education as an adult in these societies.
For human societies that rely on hunter-gatherer, nomadic pastoral, or village horticultural economies, mythology is amply suited to their information storage and transfer needs, and it’s rare for these to go looking for other options. Those societies that take to field agriculture and build urban centers, though, need detailed records, and that usually means writing or some close equivalent, such as the knotted cords of the old Incas. Widespread public literacy seems to be the trigger that sets off the collapse of mythic thinking.
Where literacy remains the specialty of a priesthood jealous of its privileges, as among the ancient Maya or in Egypt before the New Kingdom, writing is simply a tool for record-keeping and ceremonial proclamations, but once it gets into general circulation, rationalism of one kind or another follows in short order; an age of faith gives way to an age of reason.
That transformation has many dimensions, but one of the more important is a refocusing from what does happen to what can happen. At the time, that refocusing is a very good thing.
Literacy in an age of faith tends to drive what might be called the rationalization of religion; myths get written down, scribes quarrel over which versions are authentic and what interpretations are valid, until what had been a fluid and flexible oral tradition stiffens into scripture, while folk religion—for the time being, we can define that messy category “religion” in purely functional terms as the collection of customary rites and beliefs that go with a particular set of mythic narratives—goes through a similar hardening into an organized religion with its own creed and commandments.
That process of rigidification robs oral tradition of the flexibility and openness to reinterpretation that gives it much of its strength, and helps feed the building pressures that will eventually tear the traditional religion to shreds.
It’s the rise of rational philosophy that allows people in a literate civilization to get out from under the weight of a mummified version of what does happen and start exploring alternative ideas about what can happen. That’s liberating, and it’s also a source of major practical advantages, as life in a maturing urban civilization rarely fits a set of mythic narratives assembled in an older and, usually, much simpler time. It becomes possible to ask new questions and speculate about the answers, and to explore a giddy range of previously unexamined options.
That much of the story is hardwired into the historical vision of contemporary Western culture. It’s the next part of the story, though, that leads to our present predicament. The wild freedom of the early days of the rationalist rebellion never lasts for long. Some of the new ideas that unfold from that rebellion turn out to be more popular and more enduring than others, and become the foundations on which later rationalists build their own ideas.
With the collapse of traditional religions, in turn, people commonly turn to civil religions as a source of values and meaning, and popular civil religions that embrace some form of rationalist thought, as most do, end up imbuing it with their own aura of secondhand holiness. The end result of the rationalist rebellion is thus a society as heavily committed to the supposed truth of some set of secular dogmas as the religion it replaced was to its theological dogmas.
You know that this point has arrived when the rebellion starts running in reverse, and people who want to think ideas outside the box start phrasing them, not in terms of rational philosophy, but in terms of some new or revived religion.
The rebellion of rationalism thus eventually gives rise to a rebellion against rationalism, and this latter rebellion packs a great deal more punch than its predecessor, because the rationalist pursuit of what can happen has a potent downside: it can’t make accurate predictions of the phenomena that matter most to human beings, because it fixates on what can happen rather than paying attention to what does happen.
It’s only in the fantasies of extreme rationalists, after all, that the human capacity for reason has no hard limits. The human brain did not evolve for the purpose of understanding the universe and everything in it; it evolved to handle the considerably less demanding tasks of finding food, finding mates, managing relations with fellow hominids, and driving off the occasional leopard.
We’ve done some remarkable things with a brain adapted for those very simple purposes, to be sure, but the limits imposed by our ancestry are still very much in place.
Those limits show most clearly when we attempt to understand processes at work in the world. There are some processes in the world that are simple enough, and sufficiently insulated from confounding variables, that a mathematical model that can be understood by the human mind is a close enough fit to allow the outcome of the process to be predicted.
That’s what physics is about, and chemistry, and the other “hard” sciences: the construction of models that copy, more or less, the behavior of parts of the world that are simple enough for us to understand. The fact that some processes in the world lend themselves to that kind of modeling is what gives rationalism its appeal.
The difficulty creeps in, though, when those same approaches are used to try to predict the behavior of phenomena that are too complex to conform to any such model. You can make such predictions with fairly good results if you pay attention to history, because history is the product of the full range of causes at work in comparable situations, and if A leads to B over and over again in a sufficiently broad range of contexts, it’s usually safe to assume that if A shows up again, B won’t be far behind.
Ignore history, though, and you throw away your one useful source of relevant data; ignore history, come up with a mental model that says that A will be followed by Z, and insist that since this can happen it will happen, and you’re doomed.
Human behavior, individual as well as collective, is sufficiently complex that it falls into the category of things that rational models divorced from historical testing regularly fail to predict. So do many other things that are part of everyday life, but it’s usually the failure of rational philosophies to provide a useful understanding of human behavior that drives the revolt against rationalism.
Over and over again, rational philosophies have proclaimed the arrival of a better world defined by some abstract model of how human beings ought to behave, some notion or other of what can happen, and the actions people have taken to achieve that better world have resulted in misery and disaster; the appeal of rationalism is potent enough that it normally takes a few centuries of repeated failures for the point to be made, but once it sinks in, the age of reason is effectively over.
That doesn’t mean that the intellectual tools of rationalism go away—quite the contrary; the rise of what Spengler called the Second Religiosity involves sweeping transformations of religion and rational philosophy alike. More precisely, it demands the abandonment of extreme claims on both sides, and the recognition of what it is that each does better than the other. What comes after the age of reason isn’t a new age of faith—not right away, at least; that’s further down the road—but an age in which the claims of both contenders are illuminated by the lessons of history: an age of memory.
That’s why, a few centuries after the rationalists of Greece, India, and China had denounced or dismissed the gods, their heirs quietly accepted a truce with the new religious movements of their time, and a few centuries further on, the heirs of those heirs wove the values taught by the accepted religion into their own philosophical systems. That’s also why, over that same time, the major religions of those cultures quietly discarded claims that couldn’t stand up to reasonable criticism.
Where the Greeks of the Archaic period believed in the literal truth of the Greek myths, and their descendants of the time of Socrates and Plato were caught up in savage debates over whether the old myths had any value at all, the Greeks of a later age accepted Sallustius’ neat summary—“Myths are things that never happened, but always are”—and saw no conflict at all between pouring a libation to Zeus the Thunderer and taking in a lecture on physics in which thunderbolts were explained by wholly physical causes.
That state of mind is very far from the way that most people in the contemporary industrial world, whether or not they consider themselves to be religious, approach religious beliefs, narratives, and practices.
The absurd lengths to which today’s Christian fundamentalists take their insistence on the historical reality of the Noah’s ark story, for example, in the face of conclusive geological evidence that nothing of the sort happened in the time frame the Biblical narrative provides for it, are equaled if not exceeded by the lengths to which their equal and opposite numbers in the atheist camp take their insistence that all religions everywhere can be reduced to literal claims of just that kind.
Still, I’d like to suggest that this rapprochement is the most likely shape for the religious future of a declining industrial world, and that it also offers the best hope we’ve got for getting at least some of the achievements of the last three centuries or so through the difficult years ahead. How that process might play out is a complex matter; we’ll begin discussing it next week.