And yet-well, and yet, there is a seductive consistency to Toffler's prognostications, and he surely seems to know a lot about a lot of different kinds of things, and after all he might be right about the shape of things to come. And once we start to think this way- as Speaker Newt Gingrich apparently has-we are caught, for something lives in all of us that wants to shake loose from our petty little thoughts, that wants to move around and think big thoughts, that wants to get up and do the Big Think in just the way that Toffler and his fellow prophets of the technological future all seem able to do.
We ought to be at least initially skeptical whenever we encounter simplicity used to explain complexity, but the explanatory range of a thought is not exactly what makes the thought big. Big thinking almost always involves the prediction of specific future events, and thus not every big explanation is born from thinking big. Aristotle was not a big thinker, though Plato could be from time to time. St. Thomas Aquinas never did the Big Think, though St. Bonaventure felt the urge occasionally. Even Hegel was not prone to thinking big by predicting the specific future, though Marx was when the unbuttoned mood was on him.
Just predicting the future, however, is not enough to constitute a Big Think. St. John of Patmos was not a big thinker and neither was Nostradamus, but Swedenborg, Joachim of Flora, and H. G. Wells were. The key to thinking big is the determination of specific future events according to a theory; the key is the construction of a systematic morphology of time, an analogy in which the appearance of something in a known system of the past or the present requires the appearance of an analogue in an unknown but analogous system of the future. Big thinking is the replacement of direct causal relations with the weird and wonderful causality of analogy: if an event occurs in one system, then the parallel event must occur in an analogous second system-even though we see no direct cause for the event within that second system.
If no prediction of the future short of divine revelation were possible, then we could dismiss the futuristic prophecies of big thinkers as easily as we dismiss the astrological fantasies of the Psychic Friends Network on late night television. But the problem is that all social commentary, all political thought, and all moral judgment rely on the fact that the future is at least partially predictable, on the fact that we can have some knowledge of the shape of things to come. But the difference between big thinking and the kind of predictions we have to make to live our lives is not one of degree (as though technological futurism were normal scientific prediction carried just a little too far) but one of kind: big thinking scorns to link known causes to predictable effects.
Some people are better than others at discerning the future. All of us can figure out, a second before the train barrels past, whether we could have made it across the railroad tracks. But some people can judge much earlier what's coming down the tracks-when the train first rumbles into sight around the bend, when we still have time to try to make it through the crossroads. The future is partially predictable, and some people seem to have a talent for the sort of practical judgment that discerns the present links joining future events to past causes. But the big thinker scorns these little causal links of practical judgment. The big thinker is the man who doesn't need to see the long black train coming 'round the bend; he just knows-after careful consultation of his watch and minute calculations in the margins of his train schedule-that we can make it safe across the tracks.
Of all the thinkers of the Big Think, the biggest was Oswald Spengler- and the most self-congratulatory. "In this book," he wrote of his own Decline of the West (1918), "is attempted for the first time the venture of predetermining history." Spengler claimed to have found the Archimedean point for undertaking the "Copernican Revolution" of history and thus for seeing history in its "morphological" forms. The morphologist of history sees that each culture has in it elements that correspond to elements in other cultures: as the ancient Mediterranean decayed into the imperialism of Alexander the Great, so the modern West must shortly have its Alexander. This is not cause and effect, any more than the fact that a dog has paws causes a cat to have paws. The "form" of mammal is the same in both dogs and cats despite their differences. The morphologist of history looks not for cause and effect (the old structures of bad science), but for Destiny: the inescapable recreation of the stages of general Culture in every moment of each specific culture.
We do not reach down to what is wrong with the Big Think by listing Spengler's many factual errors about the past and failed predictions about the future. The Big Think is not wrong because Spengler was a sloppy historian, a proto-Nazi, and a terrible writer, though he was all three. Spengler's Big Think is wrong because it replaces fact with method, proof with illustration, and cause with parallel. Spengler is wrong not so much because the ostensible facts he deigns to gather are often mistaken, but because he gathers his facts solely to illustrate an idea that he just knows from the beginning has to be true since it explains so much.
Nothing is as dated as the future, and nothing more nostalgic than the prophetic notions we held in days gone by. The smart house and the paperless office, the classless society and the withering away of the state are exhibits now closed in Tomorrowland, but to remember them is to remember how things used to be. For twenty years, in his best-selling futurist trilogy Future Shock, The Third Wave, and Powershift, Alvin Toffler has been predicting the imminence of the future. About some events he turned out to be right, about others wrong, but the most remarkable thing about his prophecies is how rapidly they, like Spengler's, came to seem old.
Perhaps this rapid aging of the thesis of the book itself proves that there was something to Toffler's 1970 Future Shock, which asserted that the future comes faster in these late times than it used to. Under the baleful influence of his brother Brooks, Henry Adams put a similar thesis in a 1909 essay, "The Rule of Phase Applied to History," claiming that each new age is in length only the square root of the age before-and predicting that human thinking will come "to the limit of its possibilities in the year 1921." It didn't happen that way, of course, but the prophecies of Adams, and of Spengler shortly after him, captured the sense of a world worn out that many people felt at the time. And so, too, Toffler's Future Shock captured the sense of unassimilated change that many people felt in the silly season of the late sixties and early seventies.
It was with The Third Wave in 1980, however, that Toffler found at last the Spenglerian morphology for predetermining history that made it all explain so much. Just as the first wave of agricultural technology was in its ebbing overwhelmed by the flow of industrial technology, so the ebbing wave of industry is now being overwhelmed by the wave of electronic technology. To the shrimp and seaweed tossed by combers, swirled in the backwash and drowned in the breakers, life may seem confusing. But to the big thinker standing on the beach with his schedule of ocean tides, the pattern of the waves is plain.
The pattern that he claims to see is that the struggle for power is the struggle to control knowledge. In The Third Wave Toffler seems to argue that the source of power has always been the control of knowledge (and thus that even first-wave feudalism and second-wave industrialism ought to be explained as knowledge control), while in his 1990 Powershift he seems to argue that only with the third, electronic wave of technology does knowledge control become the source of power. But either way, all the changes apparently overtaking modern civilization are best understood as struggles to control the flood of information cast upon us by electronic technology.
There is an odd sort of optimistic pessimism about all big thinkers, for the future is determined and there is nothing we can do about it except understand it through big thinking. Certain family planning measures "could help us ease our way into tomorrow, minimizing for millions the pain of transition," Toffler writes in The Third Wave. "But whether painful or not, a new family system is emerging to supplant" our old one, and we had best just stand aside before the unstoppable wave sweeps us under. Just as the anti-industrial Luddites vainly tried to hold back the second wave before they sank, so-Toffler sees in good morphological fashion-anti-technologists vainly try now to hold back the third wave. They too will sink, and the only dry person will be the big thinker on the beach who stands above the unstoppable waves.
With this pattern in place, Toffler and the futurists can explain everything. When the Communist regimes were successful in Eastern Europe, it was because they controlled knowledge. When the Communist regimes collapsed, it was because they could no longer do so. Toffler was, one supposes from his work, an anti-Communist. But his writing about communism asserts an oddly valueless necessity. Communism failed because it foolishly tried to control communication in a global economy, and not because people thought democracy was better. An anarchic sort of democracy appears simply because no government is capable any longer of imposing intelligible order on information.
The content of this information is finally unimportant-though Toffler tends to think of it as always knowledge about how to do things. The third wave has a strange, self-fulfilling quality about it, since the knowledge for which companies and governments spy upon each other is mostly knowledge about the technologies for gaining knowledge. Moral knowledge seems not to exist in this wave, and Toffler generally equates morality with social necessity. The role of the churches in the overthrow of Communist regimes he praises only because the churches offered a system for the exchange of knowledge that was outside government control, and not because the churches held any knowledge that people desired.
But the Big Think is always impersonal in exactly this way, whether in Spengler, or Adams, or Toffler, or any of their innumerable imitators. The predetermined future is coming willy-nilly, and only the big thinker who knows the big picture will be calm. And in Toffler's vision, since power is knowledge of the ways to gain knowledge, only big thinkers like Toffler will have power. The rest of us-raising our children, trying to tell right from wrong, living our lives as best we can, using the technology that seems helpful to us and ignoring the rest-are apparently doomed to helpless lives swirling in the backwash.
Following the news that a Florida judge had condemned Paul Hill to the electric chair, the National Right to Life Committee expressed the view of most abortion foes. While mourning the deaths of "unborn children," the committee "unequivocally oppose[d]" Mr. Hill's decision to kill an abortion doctor to stop the man from going about his daily rounds. On the same day that I read the Committee's statement, I picked up the December 1994 issue of First Things containing a symposium among conservative religious thinkers on "Killing Abortionists." Cardinal O'Connor of New York, Cardinal Mahony of Los Angeles, the Christian Coalition's Ralph Reed, and others urged against accepting Paul Hill's statement that "Whatever force is legitimate in defending a born child is legitimate in defending an unborn child." Nearly all the symposium's participants agreed that the strictures of such authorities as Thomas Aquinas and John Calvin rule out the deliberate use of deadly force to stop abortions.
It was not that the contributors disputed Mr. Hill's view that the entity in the womb is unambiguously a "child." They spoke of abortion as "a moral species of murder" perpetrated against "unborn children," "innocent human beings." Indeed, the one thing Paul Hill has in common with the mainstream anti-abortion movement is a tendency to speak about abortion in the language of genocide. One hears constantly about the slaughter of "children" in the abortion "holocaust."
Thus after John Salvi let loose his own salvo against the abortion industry, it was no surprise to find Randall Terry of Operation Rescue writing in the New York Post about the "slaughter [of] our young," "the murder of innocent children," "thirty-five million innocent babies [torn] from their mothers' wombs"-while simultaneously opposing armed action on the grounds of unspecified "principles of Calvin, Knox, and Cromwell concerning 'lower magistrates.'"
Though abortion opponents favor the most colorful possible speech, one may oppose abortion without it. For instance, there is the Jewish approach (which is my own). Exodus 21:22 specifies what happens when a man violently brings a woman's pregnancy to a premature end: he pays a fine, a penalty hardly comparable to that imposed by God for murder (the death penalty) or for manslaughter (internal exile). Citing this and other verses, the rabbis of the Talmud concluded that abortion, while not the murder of a child, is to be strongly rejected as an interference in the divinely guided process of human reproduction.
So, as a Jew, I always stop short at terms like "holocaust" as applied to abortion. A holocaust is the mass murder of entities that are human beings in every sense in which Cardinal O'Connor or Randall Terry is a human being. And if a holocaust were going on in the United States today, one would think the responsibility to take up arms against it would be as great as it was when an undisputed Holocaust was going on in Europe. Within that part of the anti-abortion movement whose members decry the "murder" of "innocent children," a few, such as Paul Hill and John Salvi, have acted accordingly-bombing clinics and shooting staffers. That these men used force in a wild, uncontrolled way does not mean that sane abortion foes could not come up with a more careful strategy, using the minimum level of violence necessary to accomplish their end: say, by shooting abortion doctors in the legs instead of the head, or setting fire to clinics by night.
And yet when a Catholic priest in Alabama sought to justify the use of force to prevent abortions, his archbishop denounced and suspended him. Reacting to Paul Hill, Cardinal O'Connor said, "If anyone has an urge to kill an abortionist, let him kill me instead." Some abortion opponents I know disdain even the civil disobedience of Operation Rescue. And the archbishop of Boston has gone so far as to rule out even peaceful protests on the sidewalks outside abortion clinics. These polite people insist on the adequacy of words and votes.
To be sure, they offer earnest intellectual justifications for their inaction. Some allow that they might in theory accept the use of force to stop the murder of "babies," but "prudential considerations" regarding the practical effectiveness of violence rule it out. Yet I have never heard a sustained discussion comparing the strategic merits of peaceful persuasion (which so far has produced meager results) with the merits of force. Among this variety of abortion foes, as soon as the words "prudential considerations" (or some equivalent) are invoked, the discussion comes to a quick, relieved halt. Others present arguments opposing force altogether, on moral grounds.
A detailed example appeared in the November 1994 issue of Catholic World Report, in which Professor John M. Haas cited the Summa Theologica of Thomas Aquinas: "It is unlawful to take a man's life, except for the public authority acting for the common good." Professor Haas, who speaks of "the unborn" instead of "the children," reserves to the state the right to kill in defense of innocent life, and concludes that "two wrongs do not make a right." Alternatively, an abortion opponent may say the abortionist "kills children" but does not "murder" them, since murder implies an intent to take the life of an entity you know to be a full human being. Cardinal O'Connor asserts that, despite the mass "murder" going on, "The United States today is not Nazi Germany."
There is a gap in reasoning here. Assume that a fetus is a child. Then every five years, the United States allows the murder of more babies than the Nazis killed Jews. Yes, the authorities who guarantee the right of abortion and the doctors who carry out the surgery do not believe fetuses are human beings, but neither did the Nazis believe that Jews are human beings. In one case the government itself organized the act of genocide. In the other, the government guarantees the right of a subpopulation-abortionists-to commit genocide. So if a fetus is a child, what's the big difference?
The truth is, the distinctions offered by the intellectuals and activists I refer to have about them a distinct air of excuse-making. Imagine that fifty years ago a theologian of moral seriousness equal to that of Cardinal O'Connor found himself outside the gate of Auschwitz and holding a machine gun, given the opportunity to liberate some prisoners by shooting a couple of guards. Had you at that moment admonished him-saying "two wrongs do not make a right"-one assumes he would not have been deterred.
I know from experience that most abortion foes are people moved not by some psychosexual desire to enslave women, as pro-abortion activists frequently allege, but by a commitment to moral sentiments. If these intelligent and passionately engaged men and women believed with a whole heart that abortion is murder, if they really believed that the lives of a million actual children are at stake every year, then I have no doubt they would be organizing clandestine paramilitary units to move against abortion providers at this moment. They would find justification in Thomas Aquinas and John Calvin.
Thank God they are not arming for guerrilla war. Yet neither that fact, nor any admiration for their moral commitments, should excuse such sloppy language. For language has consequences. Just as you must not shout "fire" in a theater crowded with people, you must not say "murder" and "child" in a movement that includes people, however few, like John Salvi.
Some twenty years ago, when I joined an organization of abortion advocates, it was the word "right"-as in "a woman's right to choose"-that shaped my views. A few years later it was again the word "right"-as in "the right to life"-that drew me into the pro-life movement.
As an abortion advocate, I had learned never to give humanity to what was in the womb of a pregnant woman. "Don't use the words 'child' or 'baby,'" we were told. Talk instead about "a mass of tissue" or "the product of conception."
Mass of tissue? Who isn't? Product of conception? Aren't we all? If what is aborted is not a living human being with a claim to legal protection then why couldn't we defend abortion with words that were less generic? It seemed to me a strange kind of truth that required deception to promulgate it.
After months of seeking such words, I found none. I realized I either had to change my mind or continue to change reality by disguising the truth.
In America today the lives of unborn children have come to depend not on scientific fact but on the use of misleading words. "Fetus" is one of those. It is a good word, a medical word, but a Latin word. Why has a word from a "dead" language been resurrected for use in everyday conversation? After all, we don't congratulate parents on the birth of their "neonate."
Verbal smoke screens have long been used to accustom people to accept ideas, actions, and policies they would otherwise find obnoxious. "Fetus" serves the manipulative purpose of dehumanizing those in the womb who are unwanted, while words like "unborn children" or "child before birth" call forth an emotional and relational response.
A few years ago, in a high-profile trial of an abortion doctor, a picture was shown of an unborn child the same gestational age as the one who had been aborted. The press called it a "fetus." The doctor's lawyer called it a "fetus." The jury, though, called what they saw a "premature baby."
Shakespeare understood that it is easier to kill a snake than to kill a human being. Thus in Julius Caesar he has Brutus set the stage for Caesar's murder by using words that would dehumanize him: "Think him as a serpent's egg . . . and kill him in the shell."
Yet women throughout the ages have always known that what they carry are children. Before ever there was a pro-life movement there were baby showers, not fetus showers. Pregnant women are asked by friend and stranger alike, "When is your baby due?" Fathers say, "I heard the baby's heartbeat," and mothers say, "I felt the baby move." Expectant parents are quick to show ultrasound pictures of their baby in the womb. Women who miscarry grieve the loss of a child not of a fetus.
Over thirty years ago Planned Parenthood pamphlets warned that "an abortion kills the life of a baby after it has begun." To pretend that what is in the womb is anything but a child is, as George Will once said, "a revolution against the judgment of generations."
The unborn child, of course, is not the same as the child sleeping in a crib or playing in a sandbox. Yet, if someone were to say he or she had at home an infant and an adolescent, we surely would not wonder if that person was referring to different kinds of pets or variations of plants. Infant and adolescent are terms that describe human beings at various stages of development. That is all the word "fetus" does. It tells us where a child is on the life spectrum. Is the adolescent less human than the adult? Is the infant less human than the adolescent? Is the fetus less human than the infant? They are different only in terms of development and dependence.
Columnist Joan Beck, writing in the Chicago Tribune, said that "Obstetricians understand better than anyone else that an unborn baby isn't a blob of tissue or a growing tumor or an unnecessary appendix. Whatever the euphemisms, they know that it is living, that it is human, that it is unique, and that abortion kills it."
Children know it, too, and they know it instinctively. Late one night, as I viewed an abortion slide, my youngest child, then a sleepy three-year-old, unexpectedly entered the room. I heard his sharp intake of breath as he saw the body of a three-month-old, dismembered by a D & C abortion. With great sadness in his voice he asked, "Who broke the baby?" Here was a child too young to have his sight clouded by semantic subterfuge, and, with a wisdom that often escapes the learned, he could mentally assemble the body parts and call what he saw a broken baby.
What, then, is this thing called abortion? If a living human child is killed by it, is abortion murder? Not now. Not since the Supreme Court's 1973 Roe v. Wade ruling. "Murder," by dictionary definition, is "the unlawful and malicious or premeditated killing of one human being by another." Abortion qualifies as premeditated in that it is contracted for and carried out at a set time and place. What abortion is not is unlawful, but that does not change the nature of what abortion is or does.
There are some who taunt pro-lifers by saying that if we really believe abortion kills children, if what is in the womb is really a child, then we ought to commend, not condemn, the killing of abortion providers. They don't understand that it is possible to be anti-abortion but not pro-life, that pro-lifers oppose violence whether inside or outside abortion centers, that the goading of pro-lifers to respond in kind is more of a provocation to violence than are words that speak of a child's right to be born. Pro-lifers know that violence is too weak a weapon against the evil that is abortion. They know it cannot be called murder in a legal sense but that it can be called a holocaust.
This year marks the fiftieth anniversary of the end of World War II. While historically and socially the Nazi Holocaust was a unique episode in human history, the forces that created it are alive and well today. Those forces, the coupling of dehumanizing language with technology that dispatched victims in an efficient assembly-line manner, characterized the Third Reich. If we think that kind of evil is safely frozen in the past, we ignore the unpleasant truth about human nature: that the potential for great evil exists in each of us. And the human mind is never more resourceful than when it is involved in self-justification.
The Holocaust of half a century ago is too singular to be watered down by comparisons. But so, too, is the American holocaust of over thirty-two million children who have been killed and buried beneath words that have been emptied of their meaning. The words of Scripture, though, are clear and precise when they say of a pregnant woman simply that she is "with child." That is why, after twenty-two years, the Supreme Court's decision that was touted as having "settled" the abortion issue has not-because those who oppose it think in terms of reverence rather than rights.
The death of Darwinism, it now seems clear, will be chaos theory (or "complexity theory," as the researchers in the young science of emergent order seem to prefer). The problem with Darwinism, as honest Darwinists have always admitted, is that it has nothing to say about how new features of living things arise or how new instinctual behaviors originate. It has a great deal to say about how natural selection can preserve and refine these things once they appear, of course. It inspired a search in the fossil record for the lineages of living creatures that has given our natural science a unique historical depth. Darwinism is not wrong; it just is not the final answer to the question of how living things originate. We have been looking in the wrong place for that answer-which must be not so much a matter of genes as of the order that arises spontaneously from simpler units (particularly the geometries of complex molecules).
Even children often notice that the paw and foreleg of a dog are like a strangely distorted human hand, with the thumb appearing as a useless claw partway up the foreleg. The skeletons of all vertebrates, in fact, have long been known to be variations on a few basic structural themes. Anatomical analogies show up among all classes of living things. These analogies were the basis in the eighteenth century for the comprehensive classification of species drawn up by Carolus Linnaeus. And these variations on common anatomical themes occur not only between species. Within individual organisms, sophisticated features grow from the variation of more primitive ones, as illustrated by Goethe's still persuasive derivation of all the major parts of a plant from the basic form of the leaf.
A description of nature like this invites the search for common mechanisms in living things to generate the archetypical forms. If biology had maintained this perspective into the next century, however, the search might not have produced fruitful results, since the physics of the time was unable to address the question of the spontaneous generation of order. In any event, this way of looking at biology was reduced to a minor theme of scientific thought for several generations as more accessible avenues of research appeared.
In the nineteenth century, as Michel Foucault noted in The Order of Things, the structural diagrams of eighteenth-century biology became timelines, descriptions of the lineages of living things. This, of course, was only one of several areas in which knowledge was being given a genealogical twist. Much the same was happening in philology, as the comparative description of Indo-European languages became the description of the descent of language families. In the English-speaking world, however, the pervasive genealogical cast of modern Western thought was expressed primarily through the new, radically evolutionary biology that Darwin introduced. Darwinism became the governing mode of thought among biologists not because they felt it provided a foundation for early industrial capitalism or for some other ulterior motive, but because it produced results. It did not produce quite the results earlier scientists would have liked to see, of course. It led biologists to ask historical questions, to enquire "how" rather than "why" the living world came to be as it is, and evolutionary biology became a science of narratives rather than general principles. But the results Darwinism did produce were so important that only today do we perhaps see how the Darwinian perspective could ever be exhausted.
The problem is that the genes, the medium of natural selection, do not exhaust heredity. The DNA in the nucleus of a cell cannot even reproduce itself. It requires the special environment provided by the complex anatomy of the cell to do so. When a cell divides, many other bodies have to reproduce themselves besides the nucleus. What DNA does provide is information for the manufacture of proteins. What the proteins then do in turn, however, is a matter of structural chemistry. The British biologist Brian Goodwin, in his 1994 book How the Leopard Changed Its Spots, shows just how much autonomy proteins have in bending and forming themselves into the stuff of living things. There are unicellular organisms that will partially regenerate their quite complicated surfaces even if the nucleus is removed, guided by chemical geometries even more complicated than those that give us snowflakes and mineral crystals. You would never guess from looking at the formulae of these chemicals that they can form branches and other structures, quite without micromanagement from the nucleus. Researchers in complexity theory were not the first to notice that order can arise spontaneously. Thanks to some new mathematics and the number-crunching abilities of computers, however, we now have some understanding of how this is possible.
The unsettling thing about chaos theory is its casual dismissal of material reductionism. The stuff of which something is made does not necessarily determine its behavior. Rather, the behavior of material is guided by certain "shapes" that turn up throughout nature, in the living and the nonliving. It has long been known, for instance, that plants and animals often subtly incorporate familiar number sequences into their anatomies, such as those plants whose leaves are arranged around their stems in accordance with the dictates of the Golden Section, or with the Fibonacci series. Today we can see in detail how even simple, nonliving materials can also assume these forms. The mathematics that governs the "periodic" patterns formed by colonies of slime molds turns out to be the same mathematics that describes the function (and malfunction) of the heart. Like other complex systems, life manifests "emergent" properties, abilities of which there is no hint in the material that makes them up. Life's special property, however, is to be able to regenerate itself, and by extension to reproduce itself. Goodwin argues that this self-sustaining ability is neither a miracle nor an accident, but a probable state into which the appropriate materials will fall, just as water running down a drain will assume the shape of a whirlpool.
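The arithmetic behind the Golden Section claim above is easy to check: the ratios of successive Fibonacci numbers converge on the Golden Section, phi = (1 + √5)/2, the proportion said to govern those leaf arrangements. A minimal illustrative sketch (the function name is mine, not drawn from any source discussed here):

```python
def fib_ratios(n):
    """Return the ratios of successive Fibonacci numbers for n steps."""
    a, b = 1, 1
    ratios = []
    for _ in range(n):
        a, b = b, a + b          # advance the Fibonacci sequence
        ratios.append(b / a)     # ratio of each term to its predecessor
    return ratios

phi = (1 + 5 ** 0.5) / 2         # the Golden Section, about 1.618
print(fib_ratios(20)[-1])        # already very close to phi
```

After only a handful of terms the ratio settles near phi, which is why the proportion keeps reappearing wherever growth proceeds by such stepwise accretion.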
The random mutation of genes explains neither life's origin nor the complex structures of living things. Random accident simply will not provide the instructions needed to build an amoeba, much less an eye. If what the complexity theorists say is true, however, then the basic features of life are often-repeated variations on "accidents waiting to happen." The formation of the eye, for instance, seems to be a recapitulation on a smaller scale of the folding and differentiation process that produces the whole nervous system. The fact that the formation of eyes-from the eyes of flies to the eyes of eagles-may be triggered by a genetic "switch" common to all forms of seeing life illustrates how a single piece of genetic information can produce widely different anatomical structures. Genes are important, of course. They define the field (called the "morphospace") where the possible forms of an organism are found. The field is not flat, however. Irregularities do not arise from genes, but from general formal principles. For a normal developing organism, there are "potholes" in morphospace into which it falls. For an evolutionary line of organisms, morphospace is sloped in a way that guides evolution in some directions but not in others. Thus, for instance, there are no vertebrates with six limbs, useful though that arrangement might seem to be. Not just anything can happen in biology, time and natural selection notwithstanding.
Today's Neo-Darwinists are aware of these things, and they are not unduly upset by them. Stephen Jay Gould, for instance, has noted that some biological features seem to be easier to evolve than others. He has made the point a part of his deconstruction of meaning in evolution. Not only does evolution have no particular direction, he says, but organisms do not even make the kind of sense they would if they were perfectly adapted to their environments. Their basic structures are even more arbitrary than we thought, for they seem to be guided in large part by nothing more than formal accidents. Natural selection, Gould says, simply makes organisms that are good enough to survive. It does not make them perfect.
While this argument has some merit, it can only deliver its intended frisson of existential emptiness if the easy-to-evolve forms are both completely arbitrary and very numerous. Gould thought he had evidence for these propositions from the analysis of the Burgess Shale fossils from British Columbia, which seemed to show that the biological world of half a billion years ago was inhabited by weird organisms that were not, for the most part, ancestral to the living things we see today. This biological dispensation was ended by one of the great, sudden die-offs to which our planet is subject. The survivors survived merely because they were lucky, not because they were in any way superior.
This thesis, argued with great persuasiveness in Gould's 1989 book Wonderful Life, was badly punctured by fossil finds made in China from roughly the same period. The Burgess Shale fossils had suffered a history that left them rather jumbled, so Gould was relying on careful but problematical efforts to piece them back together. The Chinese fossils were much better preserved. They showed that the strange monsters so imaginatively reconstructed from the British Columbian rocks were in fact mostly more primitive but ancestral forms of creatures with which we are familiar. This suggests that evolution had less choice about following the course it did than the Neo-Darwinian synthesis would lead us to believe.
None of this implies a radical break with biology as we know it. It does suggest something of a return to the more purely structural biology of the eighteenth century. The difference is that we will be able to go a long way toward describing just how anatomical forms arise in a given organism, rather than satisfying ourselves with an account of the organism's lineage. Parallel evolution, the fact that very similar creatures can arise from quite different evolutionary histories, is likely to attract more attention now that we have a theory for it. Actually, a science of form suggests a return to a kind of biology even older than Linnaeus. How the Leopard Changed Its Spots does not even mention Aristotle in the index, but obviously complexity theory is well on the way to restating something like the notion of formal cause. The difference, of course, is that while Aristotle was very interested in the actual stuff in which forms were potentially present, complexity theory emphasizes the independence of forms from the materials they shape.
There was a strong Platonic streak in the biology of two centuries ago. The "archetypical" forms that supposedly defined the natural world were often incorporated into the "argument from design" for the existence of God. The forms described by complexity are far more abstract than anything conceived in the eighteenth century, however. An argument from design that attempted to make use of them would have to argue not from the design of actual living things but from that of the mathematical objects that govern their development.
If in fact the approach to evolution suggested by complexity theory is as important as its proponents say, one of the unforeseen side-effects is likely to be a noticeable demotion of the ontological status of the gene. Goodwin himself does not seem to appreciate this, since he ends his book with a tirade that, among other things, castigates genetic engineering as an impudent meddling with the forces of nature. But the whole point of complexity in biology is that genes just define the morphospace for other substances to do their work. They are not the "essence" of life. If life has an essence, it lies in the immaterial forms that govern its development. Furthermore, complexity suggests that genetic engineering may have surprisingly narrow limits. What the creature will be like cannot be computed from the DNA in the nucleus of a germ cell. This follows from the basic idea of emergent behavior: the activity of a complex system cannot be computed from the composition of its parts.
Genetics is a wonderful science, of course. We can alter the genetic makeup of a cell so that it produces useful substances, such as insulin, and we can spot mistakes in the genes of individuals. What we cannot do, however, is produce a super-race, or indeed any kind of radically new creature. Novelty in biology means a change not just in the genes, but in the whole organism. It is not in our power to create new archetypes to inform artificial creatures. Such archetypes might be waiting in morphospace for us to stumble on them in the course of doing something else, but that is hardly "engineering."
Something else we may soon be able to forget about is sociobiology, at least in its evolutionist incarnation. Emergent behavior belongs as much to societies of living things as to the life of individual organisms. It is emergent behavior that permits insect colonies, each individual member of which is almost completely stupid, to act in a wonderfully coordinated fashion. (As Goodwin demonstrates, it is actually simpler to show the mathematical basis of social behavior than of physical morphogenesis, since whole organisms are much easier to observe than molecules.) Attempts to use evolution to explain why people act the way they do tend to take much the same form as evolutionary explanations of anatomy. If you believe writers like Robert Wright (author of The Moral Animal), it is all perfectly simple. Natural selection ensures that those genes that support life-sustaining behavior will be passed on to future generations, just as it ensures that genes associated with useful physical traits will be passed on. In Wright's scheme of things, of course, it is communitarian and altruistic behavior that is survival-enhancing, rather than the highly competitive and aggressive behavior emphasized by earlier Darwinists. And yet, while it is possible to show by studying primates why a particular behavior may be good for human genes, the only real evidence that a given human behavior may be life-sustaining is that some living people act that way today. In other words, their lineages survived because their lineages survived-which is true enough as far as it goes, but not really very helpful. Certainly it gets us not a millimeter nearer to understanding where these life-sustaining behaviors came from in the first place. Natural selection is not false, of course: an unfortunate inherited disposition, such as an irresistible impulse to jump off high places, is going to diminish the survival prospects of any family line.
Natural selection refines behavioral repertoires that have other origins. It does not explain them.
Yet while the theory of natural selection does not do everything we might wish, at least it does not do too much. Sociobiology is often criticized for suggesting that we are slaves to our genetic inheritances, to which sociobiologists reply that our genes compel us to do almost nothing, but simply give certain predispositions. An attempt to explain human behavior using complexity would actually be much more deterministic. Complexity would turn sociobiologists from looking for etiological myths in the anthropological literature and set them to trying to define archetypes of behavior. One can imagine a whole new sociology that is interested not in fundamental causes, but in the perception of ideal states, like discerning the hexagrams of the "I Ching" in daily life. Doubtless these states will be defined empirically, using the sort of computer models that have proven so helpful in explaining how all the ants in a nest can manage to start and stop in a synchronous rhythm. Post-Darwinian sociobiology, too, will be something best taken with a grain of salt.
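The flavor of the computer models alluded to above can be conveyed in a few lines. The toy simulation below (hypothetical parameters; a sketch of the idea of emergent synchrony, not a reconstruction of Goodwin's actual model) treats each "ant" as a simple oscillator that nudges its phase toward the colony's average. No individual knows the rhythm, yet the colony as a whole falls into near-perfect synchrony:

```python
import math
import random

# Each agent repeatedly shifts its phase a little toward the colony's
# circular-mean phase. The returned "order parameter" is near 0 for
# scattered phases and 1.0 for perfect synchrony.
def simulate(n=50, steps=500, coupling=0.3, seed=1):
    rng = random.Random(seed)
    phases = [rng.uniform(0, 2 * math.pi) for _ in range(n)]
    for _ in range(steps):
        # circular mean of all phases
        mx = sum(math.cos(p) for p in phases) / n
        my = sum(math.sin(p) for p in phases) / n
        mean = math.atan2(my, mx)
        phases = [p + coupling * math.sin(mean - p) for p in phases]
    return math.hypot(sum(math.cos(p) for p in phases) / n,
                      sum(math.sin(p) for p in phases) / n)

print(simulate())
```

The synchrony is nowhere written into the rule an individual follows; it is a property only of the colony, which is the point of calling such behavior emergent.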
Goodwin makes the point that Darwin's model of evolution closely parallels what he, and others, imagine is the Christian model of history. Both deal with imperfect creatures slowly redeeming their natures as they struggle forward in time. Goodwin hopes that evolution as seen from the perspective of complexity will have an appeal beyond the Christian West. He sees it as an expression of the "original blessing" theology of Matthew Fox. The whole natural world participates in the same fundamental forms, and so trees should have rights like those of human beings, and people should be nice to each other and establish a cooperative, community-based economy. The last two chapters of his book are devoted to these ideas, but they are, I think, something of a non sequitur. Complexity theory is simply not a metaphysics of universal harmony. The more likely effect on popular culture of this view of evolution would not be communitarian, but apocalyptic.
There is not much difference between a Hegelian synthesis and the radically unpredictable emergent behavior of a complex system. The vitalist biologists of the first half of the twentieth century were criticized, doubtless correctly, for appealing to mysterious, unmeasurable organic forces. Complexity theory requires no new physical forces, but its emphasis on unpredictable leaps in nature is strongly reminiscent of Henri Bergson. It invites the creation of philosophies like dialectical materialism, but minus the material. Darwinism went well with a view of history characterized by slow human progress. The new view of evolution, on the other hand, suggests the possibility of sudden cosmic transformation. In New Age eschatology, it is not just mankind that will be made new, but the whole biosphere. Goodwin may have a point when he says that Darwinism bears a family resemblance to some Christian views of history, though not entirely for the reasons he cites. Like the Augustinian model of history, Darwinian time is linear, of indefinite duration, and essentially "forward-looking," if not invariably meliorative. Both discourage the idea that we happen to live at a time of unique importance to the historical process. But another major Christian view of history is the millenarian, the daily expectation of unknown wonders. Like chaos theory, it is "nonlinear," and so friendly to surprises and discontinuities. This mode of thought may be the mode of the future-and not just in biology.