Copyright (c) 1999 First Things 96 (October 1999): 78-100.
The British media let it all hang out for the funeral of Basil Cardinal Hume, Archbishop of Westminster. But then, as we learned from the lachrymose sensationalism surrounding Princess Di’s death, something dreadfully embarrassing seems to have happened to the nation of the famously stiff upper lip. In any event, Niall Ferguson, the noted Oxford historian, writing in the Daily Telegraph, complains that at the time of the cardinal’s funeral last June "a visitor from, say, Poland might have been forgiven for thinking he was in another Roman Catholic country." Part of the problem, as he sees it, is that there are so many committed Catholics in Fleet Street. He allows that there are a lot of Jews and atheists, too, but the depressing thing "is how few committed Protestants there seem to be." If there were more Protestants, Ferguson opines, so much attention lavished on a cardinal’s funeral "would have elicited a certain amount of bristling." Ferguson misses the bristling.
And it’s not only Fleet Street. Catholics, he notes, were 5.9 percent of the population of the United Kingdom in 1900 and are now 9.8 percent. "That Britain has become one–tenth Catholic is a remarkable historical phenomenon. But more remarkable is that it is so much less Protestant than it used to be." Ferguson chiefly blames Protestants for the change he so dislikes. Their clergy have made so many accommodations in doctrine, liturgy, and morals that "most have resignedly accepted their role as the organizers of ill–attended sing–songs. By comparison, the Catholic Church has achieved a clever and successful makeover: not only united where the Protestants are divided, but also solemn where they tend to be silly, and socially conservative where they tend to be trendy." It used to be that people associated Catholicism with "Irish backwardness; it was Bogside, not Brideshead." The change started with Newman, who gave Catholicism "some intellectual respectability," and was exemplified by the upper–class and eminently proper Cardinal Hume.
But the fact remains, says Ferguson, that Catholicism is obscurantist and reactionary. For example, "Is it really sensible that a senescent Polish bachelor should have the last word on the reproductive rate of hundreds of millions of women?" The "link between Catholicism and stagnation does seem suspiciously close. No doubt, as the life of Basil Hume well illustrated, Catholicism does more good than harm in a developed society like ours. But I for one would not fancy the chances of a Catholic Britain in the twenty–first century." When it comes to bristling, what Fleet Street may lack Oxford seems ready to supply in abundance.
Cardinal Hume was, by almost unanimous agreement, a charmingly self–effacing and amiable man. Among some in this country, as well as in the UK and Rome, he was thought to be orthodox, although not assertively so. He had that British air of somewhat amused tolerance toward the American cousins. In an interview with the ultramundane National Catholic Reporter, he reflected on his American encounters. "It came across to me very strongly that there was this fearful side to [the U.S. bishops]. That, I think, frustrates some of their theologians. You see, we’re different over here. We have a minority church. We’re not important worldwide. We can get on with our own jobs. It’s a different situation." In the same interview, he reportedly expressed puzzlement about Cardinals Bernard Law and James Hickey in the U.S. who took public issue with the late Joseph Cardinal Bernardin’s Common Ground Initiative. "I was a good friend of Joe’s," said Hume. "I had stayed with him in Chicago. I couldn’t understand that."
Cardinal Hume was, as the British say, a wet, and it may be that his accommodating manner was what was needed for the last two decades of the quiet insinuation of Catholicism as the largest (in terms of actual practice) communion in Britain. In the view of the Niall Fergusons, it seems, it’s simply not fair for popery to return in such nonthreatening guise. As for Cardinal Hume, while we wish he had understood America and some of its other cardinals better, requiescat in pace.
People write memoirs for many reasons, some more noble than others: to set the record straight, to settle scores, to discharge a duty to history, to justify one’s actions, to bid for public approval, to indulge the pleasures of nostalgia, to instruct the successor generation, or simply to try to make sense of one’s life and the world of which it is part. All these reasons are evident in Henry Kissinger’s Years of Renewal (Simon & Schuster, 1,151 pp., $35), the third and concluding volume in the account of his service as National Security Advisor and Secretary of State in the Nixon and Ford administrations. Together with White House Years and Years of Upheaval, this project surely vies for the title of the century’s most comprehensive, instructive, and readable record of major league diplomacy. For anyone seriously interested in foreign affairs and, more particularly, the place and prospects of America in the world, it is essential reading.
Years of Renewal covers the time from the swearing in of Gerald Ford, after the August 1974 resignation of Richard Nixon, to the inauguration of Jimmy Carter in January 1977. Of course, to make the story coherent, Kissinger must frequently cut back and forth in time, displaying both the sources and later consequences of the problems with which he dealt. One of the great merits of a book such as this is that it rescues us from the flurry of daily headlines, the events, alarums, and crises both real and fake that once seemed of such enormous moment. If one result is a stronger appreciation of how ephemeral are many of our news–generated anxieties, another is an understanding of how the crises of the moment are, more often than not, symptoms of perduring dilemmas. Consider but a few of the dizzying array of excitements recounted in this volume that kept Kissinger and his colleagues up late at night, shuttling to and fro, jockeying for advantage, threatening and cajoling, promising and seducing, tasting intermittent exultations amid recurring tedium and occasional despair: Turks, Greeks, and Cyprus; détente and nuclear disarmament; civil war in Lebanon; Israel and the "Peace Process"; apartheid in South Africa; civil war in Angola; Marxist insurgencies in Latin America; and on and on.
But the most interesting and most wrenching part of this very big book is about the U.S. and Indochina. For those of us who were caught up in the turbulent disputes over the Vietnam War, and also for those born later, Kissinger’s account makes for gripping reading. This is his telling of the much told story, and he tells it with an intense mix of sadness, disillusionment, resentment, and outrage—while striving to maintain a modicum of professorial detachment. Anyone who cares about that story, which continues to shape our domestic politics and foreign policy in sometimes convoluted ways, will want to attend to Kissinger’s telling.
Kissinger does not go so far as some who say that the war was effectively won, but he believes there was a very reasonable chance that South Vietnam could have maintained its freedom and independence if the U.S. had continued its modest support after the withdrawal of its troops. The support required was much less than what has been committed to South Korea, which has maintained its independence for almost fifty years since the aggression from the North. But after Walter Cronkite and the herd of independent minds in the media had misrepresented the Tet Offensive of 1968 as a great defeat, domestic support for continued assistance to Saigon was shattered. Of course Kissinger blames political figures such as George McGovern and William Fulbright, but his harshest and saddest words are reserved for Senator Henry (Scoop) Jackson, who represented himself as something of a conservative and realist in foreign policy but was, in Kissinger’s view, as guilty as McGovern in leaving South Vietnam to the tender mercies of Hanoi.
His judgment on the war’s outcome is severe:
The United States devoted two decades of blood and treasure to help a group of newly independent, fledgling societies avoid conquest by their merciless and militarily more powerful Communist neighbor in North Vietnam. Yet, when the precarious peace wrought by the Paris Agreement was challenged, the United States, in the throes of physical and psychological abdication, cut off military and economic assistance to people whom we had given every encouragement to count on our protection. This consigned those we had made our wards to an implacable—and, in Cambodia, genocidal—Communist conqueror.
The shame and the terror of the last helicopter leaving the U.S. embassy in Saigon in April 1975 should be indelibly imprinted on our memory of the past century of horror. Kissinger’s account of the confusions and humiliations surrounding that retreat is particularly poignant. Then there was Cambodia, where the Khmer Rouge were at the same time taking over Phnom Penh and would go on to systematically murder between one and two million Cambodians, 15 to 30 percent of the population. The American ambassador offered to evacuate Sirik Matak, the former prime minister, who on April 12 responded, as the U.S. evacuation was underway, with a handwritten note in elegant French:
Dear Excellency and Friend:
I thank you very sincerely for your letter and for your offer to transport me towards freedom. I cannot, alas, leave in such a cowardly fashion. As for you, and in particular for your great country, I never believed for a moment that you would have this sentiment of abandoning a people which has chosen liberty. You have refused us your protection, and we can do nothing about it.
You leave, and my wish is that you and your country will find happiness under this sky. But, mark it well, that if I shall die here on the spot and in my country that I love, it is no matter, because we all are born and must die. I have only committed this mistake of believing in you [the Americans].
Please accept, Excellency and dear friend, my faithful and friendly sentiments.
The week of the U.S. evacuation, the headline in the New York Times read: "Indochina Without Americans: For Most, a Better Life." The Khmer Rouge first executed all members of the former government who had stayed behind, then all government employees and their families, and then forced the two million inhabitants of Phnom Penh to leave the city for a war–ravaged countryside that could not support them. Sirik Matak was shot in the stomach and left without medical aid. He died after three days.
It is a bitter story, and the bitterness with which Kissinger tells it is fully warranted. In 1965, Father Dan Berrigan, Rabbi Abraham Joshua Heschel, and I were the first cochairmen of what then was called Clergy Concerned About Vietnam. In the present book, Kissinger indicates, but does not develop, the point that already by 1965 he thought U.S. policy in Vietnam a dangerous mistake. The question was how to withdraw with honor, which meant, above all, keeping faith with those who counted on us. In January 1969, four of us met with Kissinger, newly installed as Nixon’s National Security Advisor, in the White House. He told us that, if a year later we did not see persuasive evidence of a policy of U.S. withdrawal of troops from Indochina, he would join our protest against the war. This is compatible with the account of his intentions—and Nixon’s and, later, Ford’s policies—offered in the present book. By 1971, I had largely withdrawn from what had become Clergy and Laity Concerned About Vietnam, which was probably the largest continuing organization in what was known as the peace movement. I was thoroughly disillusioned with the "radicalized" liberalism that seemed increasingly to want not a just settlement but a Communist victory, and was on my way to joining the company that would go by the name of "the neoconservatives."
As in the case of Senator Jackson, who was championed by this company, Kissinger’s book is very hard on conservatives and, especially, on "neocons" who, he charges, were complicit in the callous cutting off of even meager assistance to allies in Indochina. He expected better of these people, and his particular disappointment with Commentary and its then editor, Norman Podhoretz, is acute. Later, in 1982, Podhoretz would publish Why We Were in Vietnam, a scathing account of the abandonment of Indochina, but by then it was too late to make any difference in policy, as distinct from the never–ending polemics of the intellectual class. I expect that Podhoretz, Jeane Kirkpatrick, and other major players in the neoconservative world of the time will, in due course, have something to say about Kissinger’s view of their part in what led up to the catastrophe of 1975. As for my part, I was simply weary of Vietnam and paid slight attention. That is no excuse. I was stirred to action again in the summer of 1975 when I joined with Jim Forest of the Fellowship of Reconciliation, Fr. Berrigan, and a few others in protesting to Hanoi its brutal treatment of those in the conquered South. That, too, was too late. We appealed to a little over a hundred people who were identified as national leaders in the movement against the war to join us in our protest. The split was almost exactly fifty–fifty. Those who declined made clear that they subscribed to the maxim, "No enemies on the Left." They were pleased by Hanoi’s victory. (For further reflection on those turbulent days, see "Remembering the Movement" in my 1992 book, America Against Itself, University of Notre Dame Press.)
The Nixon–Kissinger "breakthrough" with China is almost universally acclaimed, and Years of Renewal contains fascinating details about meetings with Chinese leaders, especially with Chairman Mao, who in his last years expressed his preference for "rightists" in world affairs and spoke repeatedly about his soon going to meet the judgment of God. One gets the impression that Kissinger was embarrassed by the second subject. We are not told what he thinks about President Clinton’s strange notion of U.S. "partnership" with the brutally repressive regime in Beijing, which is perhaps just as well, since the entanglement of his Kissinger Associates Inc. with corporations operating in China after he left office has compromised the credibility of what he says about China policy. On other questions, Kissinger takes pains to answer his critics, especially his conservative and neoconservative critics, about his role as it appears in light of the much later ending of the Cold War.
In the 1970s, Kissinger was much criticized for policies of "détente" and "coexistence" that seemed to assume that the Soviet empire was a permanent feature in world affairs. He now takes much credit for the "third basket" of the 1975 Helsinki agreement of the European Security Conference, dealing with human rights. It is true that leaders such as Vaclav Havel and Lech Walesa in Central Europe courageously used the third basket to challenge "the evil empire," but it must be candidly said that little thanks is owed Kissinger for that. The critics at the time had ample reason to say that Helsinki looked like a ratification of Soviet hegemony and the division of Europe created by Yalta. The conservative criticism, Kissinger writes, "was a pity because we did not differ with their analysis of the nature of the Soviet system. Where we disagreed was in assessing its implications for American foreign policy. . . . Far better, we thought, to seize the initiative and control the diplomatic process. In the meantime, we would keep open the possibility that what had begun as tactics might evolve into a more reliable pattern of coexistence."
His critics did not think U.S. policy should be aimed at "a more reliable pattern of coexistence." They thought the goal should be, however long it took and through however circuitous a policy, the end of the Soviet empire. For Kissinger, the paramount concern was for stability and order so that he could "control the diplomatic process." Looking back on the end of the Cold War, Kissinger goes further: "Reagan’s policy was, in fact, a canny reassertion of the geopolitical strategies of the Nixon and Ford Administrations clothed in the rhetoric of Wilsonianism—a quintessentially American combination of pragmatism and idealism. In an important sense, the victories of the 1980s derived from a Reaganite variant—not a rejection—of the strategies of the 1970s."
One can understand why Kissinger says that now, but it is utterly implausible. Reagan was neither a historian nor a philosopher, but he made no secret of his belief that the Soviet empire was a temporary aberration, a brutal yet fragile artifact that, if carefully but firmly pressed, could not be sustained. There is no evidence from the time that Kissinger shared that view. Reagan’s position was strongly supported also by Prime Minister Margaret Thatcher. In Kissinger’s account of the ending of the Cold War, she is not mentioned. Most astonishingly, there is not even one mention of John Paul II. Reagan, Gorbachev, Thatcher, John Paul II—in almost every telling these four were the crucial players in the collapse of the evil empire. In some of the more persuasive scholarly accounts, John Paul II was the most crucial player, since without him there would have been no Walesa, no Solidarity, and no "Polish miracle" that began the unraveling of Soviet control. In Kissinger’s extended discussion of one of the most important political, moral, economic, and military events in human history, John Paul II did not exist, Gorbachev was a pawn of events beyond his control, Thatcher goes unmentioned, and Reagan is grudgingly commended for following Kissinger’s lead. (I say grudgingly, recalling that "the rhetoric of Wilsonianism" is a deeply pejorative phrase in the Kissinger lexicon.) Such a reconstruction of history is as unseemly as it is incredible.
Kissinger is commonly described as a practitioner of Realpolitik, one who favors stability and order over justice. He writes that his devotion is "to evolution over revolution," which is sensible enough. He wants always to be "realistic" and to deal with the problems of the "real world," which, he says again and again, are "complicated," "convoluted," "ambiguous," etc. One is reminded of Herbert Butterfield’s observation that realism is not a school of thought but a boast. Given the moralistic boastings and sentimental fuzzy–mindedness of some foreign policy practitioners, not least in the current Administration, such realism is not without its merits. But it also has its severe limitations. From the evidence of his memoirs, Kissinger is never more in his element than when relishing the complexities of making an improbable deal. There is a frequently anti–intellectual tone in his dismissal of critics who presumably don’t know about that real world. Kissinger presents himself as the consummate deal–maker, and never more pleased than when a deal can be made circuitously and secretively, producing a public surprise. It is not unlike, I suppose, the satisfactions in doing magic tricks. At the risk of psychologizing, the impression is not entirely dissimilar from Kissinger’s extended and fascinating description of the personality and habits of Richard Nixon. In any event, Kissinger’s "real world" is a world largely impervious to big ideas and clear moral judgments. In this connection, one notes that Kissinger has a thing, as they used to say, about religion.
In underscoring the complexity of the real world of international affairs, Kissinger employs religious vocabulary in criticizing those who are simplistic or deceived by abstractions. He has no use for the "theology" of the arms control crowd; Hanoi employs "sacramental phraseology" in negotiating; Cambodian genocide has a "near liturgical quality"; and the bureaucracy is given to delivering "homilies." "Sacramental" is the word employed repeatedly to signify what is unreal, evasive, or delusive. One is left wondering where this habit of speech—and, one supposes, of mind—comes from in Kissinger’s life experience. He certainly gives no indication of having thought much about religion, or Christianity in particular. Apparently indifferent to philosophy and tone deaf to religion, he makes embarrassing missteps. For instance, in analyzing his difficulties with Archbishop Makarios, the president of Cyprus, he twice asserts that the source of the problem was "a legacy of the Greek Orthodox clergy’s historic claim to both secular and religious leadership." Anyone who knows the slightest thing about Orthodoxy knows that the tradition, from the Byzantine Empire to Moscow’s evil empire, is exactly the opposite: the subordination of the Church to the state in what is pejoratively called caesaropapism.
More than with the first two volumes, I am struck by the ways in which Years of Renewal underscores Kissinger’s observation that anyone entering upon public service had better get a good education first because afterwards he has to live on his intellectual capital. Kissinger evidences very little intellectual curiosity. These thousands of pages are infinitely detailed about tactics, maneuverings, logistics, and personalities but are impatient of ideas, except for the one big idea of power. Apart from memos and briefs related to the tasks at hand, he seems to have had little time for reading, which is perhaps not surprising, given the exhausting pace at which he worked, caught up, one gathers, in a perpetual shuttle from crisis to crisis, from deal to deal. His critics on the domestic front are, in distinction from those who gladly betrayed their country’s interests, persistently described as Wilsonians, ideologues, idealists, moralists, and others trapped "in our nation’s historic quest for moral purity." In this context, "moral purity" drips with authorial derision.
I suppose Kissinger may have written these three big books for most of the reasons mentioned at the outset of this reflection, but certainly not for the last one—"to try to make sense of one’s life and the world of which it is part." At least there is slight evidence in these curiously impersonal books that that was among his purposes. In a time of narcissism and tell–all exhibitionism, there is a lot to be said for such reticence. At the same time, his determination to, as he might put it, stick to the facts of the real world may reveal more about Henry Kissinger than the true confessions he did not write. The third volume concludes on a personal note, however. It is a letter from his father, who was then very ill, written in 1946.
In all the hard times of the war, the confidence lived in me that God will protect you. I am grateful to Him that He was with you. . . . Always keep in mind that we find real satisfaction only in what we are doing for others. Try always to be good, faithful, helpful, reliable, selfless.
I would have liked to see you grown up and to be witness of your success and happiness. God bless you.
The last words of the book are, "My father recovered and lived until 1982 through all the events described in these memoirs." The clear implication is that his father got to see what he hoped to see. One must hope that Henry Kissinger is right about that.
Salvation by "faith alone" has been given a new twist, except in this case we’re talking about the salvation of institutions. Addressing a Salvation Army conference in Atlanta, Vice President Al Gore endorsed the "charitable choice" proposal originally advanced by conservative Republican Senator John Ashcroft of Missouri. The proposal would allow government funding of "faith–based" social services. Said Gore, "For too long, faith–based organizations have wrought miracles on a shoestring. With the steps I’m proposing today, they will no longer need to depend on faith alone." Michael Horowitz of the Hudson Institute, who has been a formidable champion of efforts to get U.S. foreign policy on the side of religious freedom, is among the strong opponents of the Ashcroft–Gore direction, fearing that it will mean the co–optation of religious programs by the government. "Who do you want running these groups: the best saver of souls or the best reader of the Federal Register?" he asks. (The Register is the government publication scanned by grant seekers for funding opportunities.)
James Pinkerton, policy advisor in the Reagan Administration, shares Horowitz’s misgivings. "Churches," he says, "will be sorely tempted to grab for the apple of federal funding. But once they bite, the last remaining reservoir of political innocence will be gone." Government funding of religiously based programs is a question frequently addressed in these pages. The late Dean Kelley, a dear friend and relentless proponent of religious freedom, was insistent in cautioning, "The Queen’s shilling is followed by the Queen’s command." He, Horowitz, Pinkerton, and others are entirely right to warn about the perils involved in proposals such as "charitable choice," as well as in education vouchers and similar measures. But I believe the risks must be taken.
First, there are millions of people, especially the poor, who would greatly benefit by programs aimed not simply at "delivering services" but at transforming lives. Second, it is a long–standing and grave injustice that religious programs serving a public purpose are discriminated against simply because they are religious. Third, sustained political attentiveness, combined with growing judicial "accommodation" on church–state questions, can help prevent funded programs from becoming government agents. Fourth and most important, it is up to those who run such programs to make sure that their religious integrity is not compromised, and that can be done in a new climate in which what are pejoratively referred to as "sectarian" purposes are no longer penalized.
The critics are right in noting that the vitality of religious schools and social services is related to the sacrifices, financial and other, of those who voluntarily support them (by "faith alone," if you will). Could that be undermined by government support? The answer is clearly yes. We may, however, be moving into a new era in which that peril is more than matched by high promise. Religious institutions will be free to decline government funding, and no doubt many will. But this much is certain: the unjust and irrational discrimination against religion, and the systematic denial of services to those who need them, are no longer tolerable. Conceivably, we may live to rue the day when law and public policy abandoned the old strictures of a rigid "separation of church and state." The law of unintended consequences has not been repealed. Whether or not things will turn out as the critics fear, however, is entirely up to us.
Along with words of high praise, there have been some critical reviews of my extended reflection, "Bill Clinton and the American Character," in the June/July issue. The gist of the criticism is that I let the American people off too lightly, suggesting that, in the sleazy soap opera running from January 1998 through the failed impeachment effort, they were more sinned against than sinning.
I do not wish to indulge in what William J. Bennett calls "wishful assertions," but my conclusion is somewhat different from his. I might have avoided some misunderstandings had I made it clear that my question is: What have we learned about the character of the American people from these events that we did not know before? Sensible people know that the idea of a national character is as dubious as the discussion of it is inevitable. Those who think I let the citizenry off too lightly produce a long and depressing list of pervasive cultural and moral debilitations that were on display during Clinton’s Year of Lies. To which my response is, Yes, but we knew that before; we did not learn it from these recent events.
Over the last decade Bennett himself has provided a useful service in promulgating what he calls "an index of leading cultural indicators." He and others have amply documented a dreary record of decline and fragmentation with respect to marriage, divorce, teenage sexuality, falling educational standards, and weakened adherence to the virtue traditions in American life. At a different level of analysis, I offered in 1984, in The Naked Public Square, an examination of the ways in which religion and religiously grounded morality have been systematically excluded from public life. Religiously informed morality has been hermetically sealed in the sphere of privacy and personal preference, from which it is near impossible to make judgments regarding life in the public square. Indeed public moral judgment has been largely discredited as "judgmentalism." The philosophical causes of this fateful turn were brilliantly analyzed by Alasdair MacIntyre in his 1981 book, After Virtue. The consequences of this intellectual undermining of the virtue tradition for moral formation and deformation are evident from grade school through graduate school, and, of course, also in our political culture.
What we have known about the American character for a long time was incisively set forth by Philip Rieff in The Triumph of the Therapeutic, first published in 1966. Rieff, you may recall, contended that Western history has passed through several defining models of human life and community. In the Greco–Roman period the model was that of political man; in Christendom it was religious man; in the Enlightenment it was what he calls economic man, meaning the life of rational calculation. After Freud, says Rieff, we have psychological man. The triumph of the therapeutic is a condition in which there is no publicly acknowledged good beyond that of a "sense of well–being." The good society is one in which no ego is offended or constricted, and most certainly not constricted by moral judgment. We heard William Bennett worry that history may conclude that William Jefferson Clinton really was the representative man of our time. That worry is given credibility by the prescient words of Philip Rieff, written when Mr. Clinton was still a teenager. The passage deserves quotation in full:
The wisdom of the next social order [the therapeutic society], as I imagine it, would not reside in right doctrine, administered by the right men, who must be found, but rather in doctrines amounting to permission for each man to live an experimental life. Thus, once again, culture will give back what it has taken away. All governments will be just, so long as they secure that consoling plenitude of option in which modern satisfaction really consists. In this way the emergent culture could drive the value problem clean out of the social system and, limiting it to a form of philosophical entertainment in lieu of edifying preachment, could successfully conclude the exercise for which politics is the name. Problems of democracy need no longer prove so difficult as they have been. Psychological man is likely to be indifferent to the ancient question of legitimate authority, of sharing in government, so long as the powers that be preserve social order and manage an economy of abundance. The danger of politics lies more in the ancient straining to create those symbols or support those institutions that narrow the range of virtues or too narrowly define the sense of well–being; for the latter seems to be the real beatitude toward which men have always strained. Psychological man, in his independence from all gods, can feel free to use all god–terms; I imagine he will be a hedger against his own bets, a user of any faith that lends itself to therapeutic use.
Bill Clinton as the representative man of his time? It is worth considering. The demand for permission "to live an experimental life." Politics as the feeling of any pain that limits the "plenitude of option." In independence from the gods who impose and judge, the freedom to use all god–terms and any faith, especially those that, when one is caught, communicate to more traditional folk one’s contrition for violating the constraints by which one is not bound. I hope this does not sound cynical. (With respect to the continuing low theater of our politics, a friend complains, "I’m trying to be cynical, but I just can’t keep up.")
A further observation by Philip Rieff sounds disturbingly familiar now. Of culture as therapy he writes:
The rules of health indicate activity; psychological man can exploit older cultural precepts, ritual struggle no less than play therapy, in order to maintain the dynamism of his culture. Of course, the newest Adam cannot be expected to limit himself to the use of old constraints. If "immoral" materials, rejected under earlier cultural criteria, are therapeutically effective, enhancing somebody’s sense of well–being, then they are useful. The "end" or "goal" is to keep going. Americans, as F. Scott Fitzgerald concluded, believe in the green light.
Americans believe in the green light, as we were constantly reminded by Mr. Clinton and his allies who urged us to put this behind us and move on. A yellow light, never mind stopping for a red light, is suspiciously un–American. On score after score, Philip Rieff’s depiction of the therapeutic society and psychological man eerily anticipated what we have been through in the past year and a half, and indeed the two terms of the Clinton presidency.
Books such as After Virtue, The Naked Public Square, and The Triumph of the Therapeutic attempted to say things that we have known about the American character for a long time now. These are not the only things that we know about the American character. Recall the caution that America is a society so vast and so various that almost any generalization made about it is amply supported by evidence. Yet we have no choice but to make generalizations, in the hope of devising a conceptual framework or story–line by which we can understand the historical moment and our place in it. I suggest this generalization: The coercive privatization of religion and religiously grounded morality, the abandonment of the virtue traditions, and the triumph of the therapeutic are very important parts of what we know about our society and our culture. But they are not the whole of it.
Already in the 1970s, I was discussing the great conflicts in our public life in terms of a "culture war." That phrase has been criticized for being excessively polemical, but I am afraid that the phrase is all too accurate. Alasdair MacIntyre has observed, paraphrasing Clausewitz, that our politics has become warfare carried on by other means. The war is over the definition of American culture, and over whether there is or should be a definable American culture. Culture is that which gives expression to who we are and aspire to be, and is, of course, rooted in cultus, indicating that culture reflects what we revere, and even what we worship. Thus—inevitably and ominously—culture wars are not entirely unlike wars of religion. American culture has never been entirely one thing; and certainly it has not been one thing since the collapse of the liberal Protestant cultural hegemony in the last half century. More generally, no culture is just one thing. In the 1920s the great G. K. Chesterton visited America and declared it to be a nation with the soul of a church. The British journalist Alistair Cooke agreed, adding that it is also a nation with the soul of a whorehouse. Both generalizations are amply supported by the evidence.
What I believe we have learned from the recent unpleasantness is what happens when the office of the presidency is not on the side of the nation with the soul of a church. Or, put differently, when the presidency is captive to a different church which employs old rituals and god–terms in devotion to psychological man’s religion of the actualized self following his bliss (Joseph Campbell) to a sense of secured well–being. The culture of which Bill Clinton may be the representative man has been with us for a long time. But it has always had to contend against a countervailing culture, and this contention gives shape to what is aptly described as the culture war.
So, in light of the past year and a half, what have we learned about the American character that we did not know before? I suggest that we have not learned very much. We have learned what happens to a nation with the soul of a church and the soul of a whorehouse when it has a President such as this. Maybe, just maybe, we have learned never again to entrust the presidency to a person of such reckless habits and suspect character. We have learned that our political system is defenseless against a talented and shameless representative of a therapeutic culture that recognizes no appeal beyond a basely defined sense of well–being. In sum, we have learned to hope with renewed intensity that Mr. Dooley was right when he said that God looks out for drunks, little children, and the United States of America.
Few arguments have stirred as much discussion in recent years as Francis Fukuyama’s "The End of History," first published as an article in 1989 and then, in 1992, as a book by the same title. More recently, his The Great Disruption: Human Nature and the Reconstitution of Social Order (Free Press) has appeared, and we will be giving it major attention in these pages. While admiring the author’s manifest brilliance and imaginative powers, I confess that I was not much taken with the "end of history" thesis. My reasons were similar to those expressed by historian Gertrude Himmelfarb in her essay in a symposium on Fukuyama published in the summer issue of the National Interest. She is suspicious of grand "philosophies of history," recalling Fernand Braudel’s writing of history in terms of "inanimate forces" and "deeper realities," by comparison with which human interests, passions, and ideas were of slight consequence. Braudel exclaimed, "Down with occurrences, especially vexing ones! I had to believe that history, destiny, was written at a much more profound level." For Himmelfarb, history is occurrences, and much of the time they are vexing ones. As I have probably written too often, whether in philosophy, theology, or social criticism, we should profoundly distrust theories that slight the thusness and soness of things.
The symposium is in response to Fukuyama’s "second thoughts" on his 1989 article—thoughts which have led him to a greater appreciation, evident in the most recent book, of the "new science" that is transforming human nature and the social order. Writing in the Weekly Standard (June 28, 1999), Andrew Ferguson takes on both Fukuyama and the new science for their construction of an essentially unscientific (in the sense of testable or falsifiable) mythology of materialism and determinism. Ferguson and some in the National Interest symposium accuse Fukuyama of wanting to have his cake and eat it, too: to assert a continuing "human nature" that is, at the same time, infinitely malleable, to affirm scientific determinism while leaving room for human decision and even for something like the soul. Harvey Mansfield of Harvard takes up a somewhat different but related concern:
Fukuyama concentrates his doubts on the new developments in biotechnology, particularly on two new drugs, Ritalin and Prozac, illustrating the scary character of modern science. Such drugs may seem capable of creating a "new type of human being." But in fact they simply help to constitute the Last Man, whose definition has been available at least since the early writings of Marx. Ritalin tempers the high spirits of boys, and Prozac raises the low spirits of women. The result is that we will no longer be troubled by psychic sexual differences and all will be equally capable of the same equanimity. Anyway, why would we want to be troubled if, life being perfect, there is nothing to be troubled about? Fukuyama can see that these drugs contribute to the belittlement, not the esteem of man. Men are belittled when they do not feel joy or despair, even though, or precisely because, such sentiments are often mistaken or excessive. Nothing great is gained for us if nothing important can be lost. The modern project for reducing risk can be seen at work not only in economics but even in the element of esteem, where it moves us to equalize our chances and to smooth out life’s ups and downs. Fukuyama rightly wonders whether, when you take such drugs or other soothing therapy, you are still yourself; or have you given your self away to keep it safe? The situation gets worse if one were to push Fukuyama to decide whether esteem is really recognition in the Hegelian manner. Hegel conceived that recognition is equal because in recognizing what is other, one recognizes oneself. But if the other is always the alienated self, there is nothing in the universe except the self. Then what are we to say of a life devoted to finding the self that is always ready to abandon the self, that vacillates between desiring recognition and settling for prudent submission? 
We know that the modern, well–adjusted self belittles man because we know from religions and philosophies wiser than Hegel that it cannot satisfy human nature. Men cannot be satisfied if there is nothing above them to admire and strive for. At the end of history we may decide to rob ourselves of our humanity, and indeed the most telling charge against American education today is that it gives our children nothing to look up to. But if we do that, it will have been our delusion and our fault.
Readers will recognize the similarity to the future envisioned by Walker Percy in books such as Love in the Ruins and The Thanatos Syndrome. The discontents of civilization are "vanished" as the self leaves civilization and presses a button to become another self. Also of particular interest in the symposium is the contribution of Harvard’s E. O. Wilson, author of On Human Nature and Consilience: The Unity of Knowledge. "In a nutshell," writes Wilson, "human nature is not the genes that prescribe it. Nor is it the cultural universals, such as incest taboos and rites of passage, which are its products. Rather, human nature is the epigenetic rules, the highly diverse inherited regularities of development in mental traits and their physiological modulators. These rules are the genetic biases in the way our senses perceive the world, the symbolic coding by which we represent the world, the options we open to ourselves, and the responses we find easiest and most rewarding to make." Then comes the really interesting part:
In sum, in heredity as in the environment, you cannot do just one thing. When a gene is changed by mutation or replaced by another gene, unexpected and possibly unpleasant side effects are likely to follow. Although a complete base pair sequencing and then gene mapping are expected to be completed within a decade, we are probably generations away from a complete genomics—the genetic maps plus all the molecular steps by which the code is read out in final phenotypic traits. By the time the treacherous waters of possible genomic intervention and replacement are charted, I suspect a moral argument will keep Homo sapiens from traveling there except for gene therapy and minor enhancement. The epigenetic rules—human nature—are not just the algorithms by which individuals are assembled. They are the essence of humanity, the product of millions of years of adaptation to this unique, life–giving planet. They are all we have that separates us from carbon–based all–purpose computers, or transformation into jerrybuilt artifacts. It is one thing to evolve toward taller, brighter, more sociable beings; it is quite another to change or even lose our humanity. In my opinion, human beings will never choose to become posthuman.
There is a measure of comfort in Wilson’s words, but note that his oft–reiterated dogma is "scientific materialism." Although he doesn’t like the word, his dogma (as Ferguson says, the mythology of the new science) is also thoroughly deterministic. But behold! No matter to what manipulations we subject Homo sapiens (meaning ourselves), "a moral argument" will prevent us from taking the last and catastrophic step. We will choose not to become posthuman. Of course we must hope that Wilson is right about that, and that we will choose sooner rather than later—as in choosing today to prohibit, for instance, cloning and the eugenic projects associated with embryonic stem cell experimentation. But Wilson’s moral argument and choosing cannot be squared with his relentlessly deterministic materialism. One is tempted to conclude that, while Fukuyama may be wrong about the end of history, the discussion surrounding his work suggests that the end of clear thinking is already upon us. So long, however, as we recognize that as a temptation to be resisted, it has not happened, at least not definitively.
The phrase "Christian America," I have suggested, is a description under the judgment of an aspiration. If we ask what is the quintessentially American aspiration, a number of candidates come to mind: freedom, equality, opportunity, justice. But most, if not all, of the possible answers to that question are summed up in the word "democracy." Departing from the usual format of this section, I begin in this issue a three–part reflection on ten proposals about what makes democracy both possible and necessary. The proposals are closely connected to Judeo–Christian presuppositions and, more specifically, to Catholic social teaching, which is thoroughly ecumenical in substance. It is also necessary to be attentive to Catholic social teaching because, among other reasons, many non–Catholics of good will are honestly puzzled about how the Catholic Church can support democracy. After all, the Catholic Church itself is not organized democratically. It is a hierarchical institution in the service of advancing authoritative (some would say authoritarian) truth claims. And for all Christians there would seem to be nothing very democratic about the belief that Jesus Christ is Lord.
Surely, it is said, there is a credibility problem with respect to the Church’s affirmation of democracy. I should quickly add that it is not only non–Catholics who are dubious about the Church’s support for democracy. Many Catholics, especially those who think of themselves as traditionalists, are deeply skeptical about democracy. Democracy, in this view, is one of a horde of pernicious doctrines that modernity unleashed in its attack on Catholic truth. If, as it is said, the Second Vatican Council of the early 1960s came around to a more friendly disposition toward democracy, that is but further evidence that a doubtfully inspired Council caved in to the Zeitgeist of a historical moment marked by madness unbounded. This, it should be noted, is a distinctly minority view among Catholics, but it should not be ignored. One reason it should not be ignored is that it reminds us that democracy is still a relatively new thing on the world scene. Asked what he thought about the French Revolution, Chairman Mao’s foreign minister, Chou En–lai, said, "It’s too early to tell." Christians who have a historical memory and an appreciation of the many different kinds of regimes under which the Church lives today and will likely live in the future should not absolutize liberal democracy. At the same time, we can uncompromisingly affirm the principles which today find expression in democratic theory and practice.
Although not so commonly encountered today, there is an old argument that used to be pressed vigorously by conservatives in America, namely, that our government is not a democracy but a republic. For many reasons—not least because what is now called conservatism is more populist than earlier conservatisms—that argument is not so frequently heard today. Suffice it that the founders certainly intended to establish a republic and the institutions of American government are republican in character. I think it accurate to say that ours is a democratic government conducted through republican means. It is not a direct but a representative democracy. But it is a democracy because political sovereignty is vested in the people, the demos. That is evident in the foundational assertion of the Declaration of Independence that "governments are instituted among men, deriving their just powers from the consent of the governed." In this understanding, such consent is given not just once and for all time, but must be constantly elicited if the government is to continue to be legitimate.
A new century and a new millennium is a time to think anew, and some think the problem with democracy is that it is old hat. Lecturing a while back at a major state university in the midwest, I was challenged by a political science student. "You say so much about openness to the future, yet aren’t you captive to a democratic theory and practice of the past?" That is, as lecturers inevitably say, a good question. Longtime readers will not be surprised if I say that I am not embarrassed by the charge that I am respectful of the past. Yet, in response to the question posed, I believe the idea of democracy is as audaciously new today as when it was first proposed. If it does not have to be reinvented, it certainly has to be rethought, by every generation. Today there is a particular urgency about rethinking democracy in relation to its moral and religious grounding.
Let it frankly be admitted that in this connection Protestantism is in some ways historically privileged. One thinks, for instance, of A. D. Lindsay’s The Modern Democratic State, which, first published in 1943, has not been surpassed in demonstrating how democratic theory and practice emerged from the left wing of the Reformation, notably from the revolution of Cromwell in England. This is not to deny that the seeds of democracy are to be found in pre–Reformation Christian ideas about the person, community, and freedom. There is, for instance, Thomas Aquinas and his defense of mendicant communities as necessarily free agents in service of the truth, an argument fraught with implications about the connections between freedom and governmental legitimacy. Many other examples of the anticipation of liberal democracy are explored in the work of Christopher Dawson, that great and sadly neglected historian. But that is a subject for another time.
Most recent writers on democracy have been of a thoroughly secularist bent, being either indifferent to or ignorant of its roots in Christian history, and in Judaism before that. The political philosopher Leo Strauss wrote insightfully on the theme of Athens and Jerusalem, but in his view, though he was religiously attuned to Jerusalem, Athens was Athens and Jerusalem was Jerusalem and never the twain shall meet. A recent book on the development of democracy by a noted Straussian leaps with erudite insouciance from the fifth century Athens of Pericles to the U.S. Constitution. More commonly, contemporary writers on democracy seem determined to begin from scratch, so to speak. They write in a mode of apparent historical amnesia, and sometimes the amnesia is quite deliberate. One thinks, for instance, of John Rawls’ A Theory of Justice, in which the definition of democratic justice emerges from behind a "veil of ignorance," where we are to deliberate as though we quite literally do not know who we are. It is a striking instance of what is frequently examined in these pages, the attempt to step outside the civilizational circle of conversation. Quite literally, Rawls and others create a story of fictional human beings and their interests, and then suggest that, on the basis of such fictions, we should adopt principles for action and see how things go.
My proposals for democracy are intended to be keenly attentive to what we know about who we are, what is in our interest, and what is in the interest of others. Democracy needs to be proposed because not everybody is convinced that it is the polity we should want for the future. Along the way of making these proposals, I hope it will become apparent what I mean by democracy, and then perhaps there will be greater agreement that a more democratic society is the only "new society" that the political science student who challenged me and the rest of us should be working toward.
In Catholic social teaching, the most recent and impressive argument on democratic theory and practice is the 1991 encyclical of John Paul II, Centesimus Annus. My understanding of what we call liberal democracy, and of church–state relations in democracy, is also informed by reflection on political philosophy and history, with specific reference to the experience of the United States of America. The American experience is both a model and a warning for the rest of the world. It is important to emphasize both of those aspects, the model and the warning.
For a number of years now, I have been involved in the Centesimus Annus Seminar in Krakow, Poland. In that seminar we bring together young academics and clergy from Poland and other Central and Eastern European countries to study Catholic teaching regarding the free and just society. Since the Revolution of 1989, I have noticed a striking change in the attitudes of Central European students toward democracy, and more particularly toward the American version of democracy. In the first few years, everything American was thought to be good. We had to caution the students against an uncritical admiration of everything American.
More recently, a certain disillusionment has been evident. In Poland, the Czech Republic, Hungary, Ukraine—and especially in Russia—people have discovered that democracy is not easy. After the Communist era, these societies declared their determination to be "normal" societies. Now they have discovered that normality is difficult. It is difficult in Poland as it is difficult in the United States, as it is difficult in Italy, Zimbabwe, Brazil, and everywhere else in the world. Our seminar students have learned that they cannot simply imitate things American but must find their own way to democracy. And yet—as the largest, the most influential, and arguably the oldest and most vital democratic experiment in world history—America remains both model and warning. In Central and Eastern Europe today, the question of American democracy and the question of American power get entangled in confusing ways. There is a sometimes latent but nonetheless strong streak of anti–Americanism in many European minds. That is in part a leftover from Marxist ideological influences, in part an understandable envy of America’s preeminence on so many scores, and in part a justifiable resentment of the imperious, if not imperial, ways the U.S. throws its weight around. (International affairs are not, and probably cannot be, conducted democratically.)
There is no dispute that the U.S. is the only remaining superpower in the world. Some go further and say that America is the world’s "lead society," meaning that what happens in America will, sooner or later, mutatis mutandis, happen elsewhere. There is a strong element of truth in this. As we have had frequent occasion to observe, the world deserves a better "lead society" than the U.S., but this is what the world is stuck with, at least for the foreseeable future.
Catholic social teaching in the past, I think it fair to say, has been somewhat indifferent to the American experience. When Rome addressed questions such as democracy and church–state relations, it was usually with the French Revolution of 1789 chiefly in mind. In recent years, and most impressively in the pontificate of John Paul II, that has changed dramatically. It seems to me, however, that in the United States and elsewhere Catholic intellectuals have hardly begun to internalize the impressive teaching initiatives of the pontificate. As George Weigel contends in his magnificent Witness to Hope: The Biography of John Paul II (HarperCollins), coming to terms with the arguments of John Paul is a great task for years to come. We are dealing here with what is called, following John Henry Newman, the development of doctrine. That development will be clearer if, as with Centesimus Annus, we give the Revolution of 1776 at least equal billing with the Revolution of 1789 in addressing democracy, church–state relations, and related questions.
There are ten propositions about democracy that command our attention (they could, of course, be divided or combined into more or less than ten). The first proposition is this: The sovereignty of the democratic state is answerable to a higher sovereignty. Critics of democratic theory and practice, including Christian critics, frequently claim that the chief problem with democracy is that it acknowledges no sovereignty higher than the sovereignty of the people. The democratic state, claiming to represent the vox populi, presents itself as the vox Dei, thus turning the democratic state into an "idol." In response, we must recognize that the democratic state, and democracy itself, can indeed become an idol. When that happens, it is a profound distortion of democracy.
To be sure, in democracy political sovereignty is vested in the people. But a free people is free to acknowledge and hold itself accountable to a sovereignty higher than itself. In this respect also, 1776 is very different from 1789. In the Jacobin version of democracy, the state is assumed to embody what Rousseau called the General Will, beyond which there is no higher court of appeal. In the American founding, by way of dramatic contrast, it is recognized that society is prior to, and superior to, the state. The Declaration of Independence speaks of "Nature and Nature’s God," from whom all rights are derived and to whom the people constitute themselves as accountable. Although it is of much later provenance, the Pledge of Allegiance in the U.S. speaks of "one nation under God." That means, first of all, "under judgment"—the nation is answerable to a judgment higher than that of the state, higher than positive law, and higher even than the will of the people. The paradox, of course, is that only the will of the people can maintain the effective awareness of being under the judgment of a higher will. (God will attend to the reality of our being under judgment.)
In the actual decision–making of the American polity, this higher sovereignty is asserted obliquely rather than directly. That is to say, it depends upon the people continuing to acknowledge such a higher sovereignty. The point was made emphatically and repeatedly by the founders, including Thomas Jefferson, whom some contemporary writers erroneously depict as a premature member of the American Civil Liberties Union. John Adams put it this way: "We have no government armed with power capable of contending with human passions unbridled by morality and religion. Our constitution was made only for a moral and religious people. It is wholly inadequate for the government of any other." The constitutional order is not "a machine that runs of itself." It must be sustained by a virtuous citizenry, and by popular religion that publicly appeals to a sovereignty that transcends the sovereignty of the state. These conditions cannot be guaranteed. That is why democracy is always a risky enterprise. That is why the founders called our constitutional order an "experiment." It is in the nature of experiments that they can succeed or they can fail, and they succeed only when we keep in mind the possibility of their failing.
The Religion Clause of the First Amendment to our Constitution contains two provisions. One forbids the "establishment" of a religion, and the other guarantees the "free exercise" of religion. The "no establishment" provision is in the service of the "free exercise" provision. Free exercise is the end, and no establishment is one means in the service of that end. This understanding of the Religion Clause, as every student of the subject knows, has not always prevailed in our jurisprudence. Indeed, in recent years, the courts have frequently acted as though "no establishment" is the end, and in the service of that end they have sharply curtailed the free exercise of religion, at least in the public sphere. This has resulted in the unhappy circumstance that I have described as the naked public square.
The naked public square is a thoroughly secularized public life from which religion and religiously grounded moral judgment have been excluded. At present, this constitutes the most severe crisis in church–state relations, and, I think it not too much to say, the most severe crisis in the American democratic experiment itself. In the constitutional order rightly understood, the state acknowledges a sovereignty higher than itself, and acknowledges that that sovereignty is defined by the people. In "Christian America" the institution that bears witness to that higher sovereignty is the church. (I here speak of the lower–case "church," including all the Christian denominations and the synagogue, as well as—and this may become more important in the course of the coming century—the mosque.) The state recognizes the integrity of the church, not simply as a voluntary association of individuals, but as a communal bearer of the witness to a higher sovereignty from which, through the consent of the governed, the legitimacy of the state itself is derived.
(To be continued next month with a second proposition: In a democratic society, people live under several, and sometimes conflicting, sovereignties.)
In the beginning was the Word,
and the creative Word brought forth the cosmos.
Then things really get going as the great Columbia river is surrounded
by the cries of the birds
the eagle and owl,
merganser and magpie,
osprey and raven
and other winged people.
The bishops, according to what it says here, take flight in deploring what people have done as "the Earth was transformed" by everything from lumbering to "atom–based plants" and "aluminum plants for new airplanes." But they dream of a new day when
the peoples of the sky, the land and the waters
lived each in relation to all and to Earth . . .
new energy sources soon came into being,
that worked with Earth’s energy, sunlight, and wind. . .
eco–justice, eco–consciousness walked hand in hand,
and communities called themselves neighbors again.
It goes on and on like that. Whoever is responsible for spreading the word that Catholic bishops could associate themselves with such dopiness that tries but fails to rise to the dignity of heresy should be ashamed of himself. At the same time, one cannot entirely shake the suspicion that the report just may be true. That’s another thing about the Internet.