148 Solzhenitsyn’s Warnings

            During the 1970s I read most of Alexander Solzhenitsyn’s novels (One Day in the Life of Ivan Denisovich; First Circle; Cancer Ward) as well as The Gulag Archipelago, a massive (three-volume) documentation of Soviet brutality under Lenin and Stalin, and The Oak and the Calf, an account of his struggles with censorship in the USSR.  By the decade’s end, thanks to Solzhenitsyn, I was delivered from some of the academy’s gilded portraits of the USSR and a bit better prepared to discern the Marxist rhetoric so glibly infusing many analyses of American history.  And I was also prompted to re-examine, during the next decade, America’s role in the world vis-à-vis both Communism and similarly aggressive ideologies such as Islam. 

          Recently, assessing Spanish elections, wherein a docile public wilted in the face of terrorism, I find myself thinking about, and re-reading, some of Solzhenitsyn’s addresses.  He was awarded the Nobel Prize in 1970, and his Nobel Lecture (New York:  Farrar, Straus and Giroux, c. 1972) focused on art and the role of the artist.  “One kind of artist,” markedly evident in the avant-garde individualists of the West, “imagines himself the creator of an independent spiritual world and shoulders the act of creating that world and the people in it, assuming total responsibility for it” (p. 4).  Such self-serving rebels against convention enjoy moments of fame but do little good.  The other kind, endorsed by Solzhenitsyn, rightly understands his sacred vocation and “acknowledges a higher power above him and joyfully works as a common apprentice under God’s heaven” (p. 4). 

          To work wisely and well as an artist is a truly noble calling, for as Dostoevsky said, “Beauty will save the world.”  Great art, truthful art, weathers the winds of time and gives wings to our souls.  Indeed, Plato’s “old trinity of Truth, Goodness, and Beauty” (p. 7) retains its ancient grandeur, and nothing rivals the importance of investing one’s life in illuminating and defending such transcendent realities, the “permanent things.”  Speaking personally, Solzhenitsyn noted that he miraculously survived his years in the Gulag, while thousands perished.  So he had a sacred mission:  to record, to explain, to embed their story in the nation’s literature.  “Our twentieth century has turned out to be more cruel than those preceding it, and all that is terrible in it did not come to an end with the first half” (p. 22).  Millions died because too few believed in “fixed universal human concepts called good and justice” while the oppressors disdained them as “fluid, changing, and that therefore one must always do what will benefit one’s party” (p. 22).  Sadly enough, might-makes-right philosophies forever enlist devotees, and hijackers and terrorists ever wreak their carnage.  But despite the fact that (as Dostoevsky lamented) there is much “slavery to half-cocked progressive ideas” (p. 24), one must courageously seek to refute them.     

          This means refuting the “spirit of Munich” that has spread cancerously throughout the West.  That spirit, Solzhenitsyn says, “is dominant in the twentieth century.  The intimidated civilized world has found nothing to oppose the onslaught of a sudden resurgent fang-baring barbarism, except concessions and smiles.  The spirit of Munich is a disease of the will of prosperous people; it is the daily state of those who have given themselves over to a craving for prosperity in every way, to material well-being as the chief goal of life on earth” (p. 24).  He referred, of course, to the agreement Neville Chamberlain made with Adolf Hitler in 1938, declaring:  “How horrible, how fantastic, how incredible it is that we should be digging trenches and trying on gas masks because of a quarrel in a far-away country between people of whom we know nothing!”  Returning to the cheering masses in England, he proclaimed the arrival of “peace for our time.” 

Replying to Chamberlain, Winston Churchill said: “I do not grudge our loyal, brave people . . . the natural, spontaneous outburst of joy and relief when they learned that the hard ordeal would no longer be required of them at the moment; but they should know the truth.  They should know that . . . we have sustained a defeat without a war, the consequences of which will travel far with us along our road.”  The next year, of course, Germany invaded Poland.  Even then, however, many Europeans sought to remain “neutral,” numbly paralyzed in their pacifism.  This, Churchill said, was “lamentable; and it will become much worse.  They bow humbly and in fear of German threats.  Each one hopes that if he feeds the crocodile enough, the crocodile will eat him last.  All of them hope that the storm will pass before their turn comes to be devoured.  But I fear–I fear greatly–the storm will not pass.  It will rage and it will roar, ever more loudly, ever more widely.” 

The ghastly carnage of WWII, of course, might have been avoided had Churchill’s warnings been heeded.  But Chamberlain’s appeasement postponed the conflict until it could only be waged against desperate odds.  Neither the League of Nations nor Europe’s politicians had the courage to resist.  So it’s up to writers such as himself, Solzhenitsyn said, to speak the truth to the world.  While struggling against the autocracy of the USSR, he’d found an international fraternity of writers who rallied to his side when Communist hardliners sought to suppress him.  His weapon, naturally, was the writer’s pen enlisted to proclaim the truth.  Tyranny thrives by lying.  Truth tellers expose and ultimately defeat the tyrants.  Writers “can VANQUISH LIES!  In the struggle against lies, art has always won and always will” (p. 33).  And so, he memorably declared in closing:  “ONE WORD OF TRUTH OUTWEIGHS THE WORLD” (p. 34). 

* * * * * * * * * * * * * *

          Exiled from the USSR soon after receiving the Nobel Prize, Solzhenitsyn found refuge in the mountains of Vermont, where he continued to write and declare the truth.  Initially lionized by the American intelligentsia, he was invited to deliver the 1978 commencement address at Harvard University, published as A World Split Apart (New York:  Harper & Row, c. 1978).  He began his speech abrasively, noting that though Harvard’s motto is Veritas, graduates would find that “truth seldom is sweet; it is almost invariably bitter” (p. 1).  But he would speak truly anyway!  And his words proved “bitter” to many who heard him! 

          After assessing various developments around the world, he questioned the resolve of the West to deal with them.  Alarmingly, he said, “A decline in courage may be the most striking feature that an outside observer notices in the West today.  The Western world has lost its civic courage, both as a whole and separately, in each country, in each government, in each political party, and, of course, in the United Nations.  Such a decline in courage is particularly noticeable among the ruling and intellectual elites, causing an impression of a loss of courage by the entire society” (pp. 9-11).  This decline, “at times attaining what could be termed a lack of manhood,” portended a cataclysmic cultural collapse.  

          Solzhenitsyn also lamented the West’s materialism, litigiousness, licentiousness, and irresponsible individualism.  Personal freedom is, of course, a great good, but irresponsible freedom erupts in evil acts and “evidently stems from a humanistic and benevolent concept according to which man–the master of this world–does not bear any evil within himself, and all the defects of life are caused by misguided social systems, which must therefore be corrected” (p. 23).  If so, it would seem that affluence would eliminate crime!  Strangely enough, however, crime was more rampant in the wealthy West than in the impoverished USSR! 

          Then he upbraided the media.  Granted virtually complete “freedom,” journalists in the West used it as a license for irresponsibility.  Rather than working hard to discover the truth, they slip into the slothful role of circulating rumors and personal opinions.  Though no state censors restrict what’s written, “fashionable” ideas get aired and the public is denied free access to the truth.  Fads and fantasies, not the illumination of reality, enlist the mainstream media.  “Hastiness and superficiality–these are the psychic diseases of the twentieth century and more than anywhere else this is manifested in the press” (p. 27).  Consequently, “we may see terrorists heroized, or secret matters pertaining to the nation’s defense publicly revealed, or we may witness shameless intrusion into the privacy of well-known people according to the slogan ‘Everyone is entitled to know everything’” (p. 25). 

          Solzhenitsyn was further disturbed by the widespread pessimism and discontent Westerners displayed regarding economic development.  Amazingly, elite intellectuals celebrated the very socialism that had destroyed his homeland.  (Remember that Harvard’s superstar economist, John Kenneth Galbraith, still trumpeted the virtues of socialism in the 1980s!)  This, Solzhenitsyn warned, “is a false and dangerous current” (p. 33).  In the East, “communism has suffered a complete ideological defeat; it is zero and less than zero.  And yet Western intellectuals still look at it with considerable interest and empathy, and this is precisely what makes it so immensely difficult for the West to withstand the East” (p. 55).  But the capitalist system in the West is no panacea either.  Both East and West, he said, need “spiritual” rather than “economic” development, and the spirit has been “trampled by the party mob in the East, by the commercial one in the West” (p. 57).  

          American politicians who appeased Communism especially elicited Solzhenitsyn’s scorn.  In fact, looking at the nation’s recent withdrawal from Vietnam, he said:  “the most cruel mistake occurred with the failure to understand the Vietnam War.  Some people sincerely wanted all wars to stop just as soon as possible; others believed that the way should be left open for national, or Communist, self-determination in Vietnam (or in Cambodia, as we see today with particular clarity).  But in fact, members of the U.S. antiwar movement became accomplices in the betrayal of Far Eastern nations, in the genocide and the suffering today imposed on thirty million people there.  Do these convinced pacifists now hear the moans coming from there?  Do they understand their responsibility today?  Or do they prefer not to hear?  The American intelligentsia lost its nerve and as a consequence the danger has come much closer to the United States.  But there is no awareness of this.  Your short-sighted politician who signed the hasty Vietnam capitulation seemingly gave America a carefree breathing pause; however a hundredfold Vietnam now looms over you” (p. 41).  The future he envisioned would be shaped by a “fight of cosmic proportions,” a battle between the forces of Goodness and Evil.  Those who are morally neutral, those who exult in their moral relativism, are the true enemies of mankind.  Thus, two years before Ronald Reagan was elected President, Solzhenitsyn insisted that only a moral offensive could turn back the evil empire. 

          Cowardice had led to retreat in Southeast Asia.  Democracies themselves, Solzhenitsyn feared, lack the soul strength for sustained combat.  Wealthy democracies, especially, seem flaccid.  “To defend oneself, one must also be ready to die; there is little such readiness in a society raised in the cult of material well-being.  Nothing is left, in this case, but concessions, attempts to gain time, and betrayal” (p. 45).  More deeply, the “humanism” that has increasingly dominated the West since the Renaissance explains its weakness.  When one believes ultimately only in himself, when human reason becomes the final arbiter, when human sinfulness is denied, the strength that comes only from God will dissipate.  Ironically, the secular humanism of the West is almost identical with the humanism of Karl Marx, who said:  “communism is naturalized humanism” (p. 53). 

          Consequently, he said, “If the world has not approached its end, it has reached a major watershed in history, equal in importance to the turn from the Middle Ages to the Renaissance. It will demand from us a spiritual blaze; we shall have to rise to a new height of vision, to a new level of life, where our physical nature will not be cursed, as in the Middle Ages, but even more importantly, our spiritual being will not be trampled upon, as in the Modern Era” (pp. 60-61).  This speech ended Solzhenitsyn’s speaking career in the United States.  The nation’s elite newspapers–the New York Times and Washington Post–thenceforth ignored him.  Prestigious universities, such as Harvard, closed their doors.  He became something of a persona non grata and spent the last 15 years of his life in America living as a recluse, working industriously on manuscripts devoted to Russia’s history. 

* * * * * * * * * * * * * * * *

          In the years immediately prior to Solzhenitsyn’s Harvard speech, he spoke to several American and British audiences, setting forth themes summarized at Harvard.  His speeches were published in Warning to the West (New York:  Farrar, Straus and Giroux, c. 1976).  He particularly assailed the appeasement proposals of Bertrand Russell, summed up in the slogan “Better Red than dead.”  To Russell and his fifth-column ilk, Solzhenitsyn replied:  “Better to be dead than a scoundrel.  In this horrible expression of Bertrand Russell’s there is an absence of all moral criteria” (p. 119). 

          Delivering an address over the BBC in 1976, Solzhenitsyn noted that “until I came to the West myself and spent two years looking around, I could never have imagined the extreme degree to which the West actually desired to blind itself to the world situation, the extreme degree to which the West has already become a world without a will, a world gradually petrifying in the face of the danger confronting it, a world oppressed above all by the need to defend its freedom” (p. 126).  “There is a German proverb,” he continued, “which runs Mut verloren–alles verloren:  When courage is lost, all is lost.  There is another Latin one, according to which loss of reason is the true harbinger of destruction.  But what happens to a society in which both these losses–the loss of courage and the loss of reason–intersect?  This is the picture which I found the West presents today” (pp. 126-127).  This predicament, he thought, proceeds from centuries of philosophical and theological development and colonial expansion. 

The First World War, culminating this process, virtually destroyed Europe, and in its wake the evils of socialism inundated Russia, annihilating 100 million or more of its people.  Europeans, eschewing moral criteria to follow narrowly pragmatic policies, stood by silently.  England’s prime minister Lloyd George actually said:  “Forget about Russia.  It is our job to ensure the welfare of our own society” (p. 131).  So Russia’s erstwhile “allies,” ignoring her wartime sacrifices, did nothing to stop the Bolsheviks’ triumph, tyranny, and terror.  Even as millions were executed or sent into the Gulag Archipelago, even as six million peasants died in the Ukraine in the 1930s, Westerners ignored it all.  Sadly, Solzhenitsyn said:  “Not a single Western newspaper printed photographs or reports of the famine; indeed, your great wit George Bernard Shaw even denied its existence.  ‘Famine in Russia?’ he said.  ‘I’ve never dined so well or so sumptuously as when I crossed the Soviet border.’” (p. 133). 

Similarly, during WWII England and the Allies benefited from Russia’s assistance.  But following the war Stalin continued, with little criticism in the West, to oppress his people.  “Twice we helped save the freedom of Western Europe,” he said.  “And twice you repaid us by abandoning us to our slavery” (p. 136).  Frankly, he believed that Westerners preferred peace and security, pleasure and comfort, to demanding justice for Russia’s oppressed.  So they ignored the mass deportations of “whole nations to Siberia” and the occupation of Estonia, Latvia, and Lithuania!  Having stopped Hitler, they seared their consciences and remained untroubled by Stalin.   

          Indeed, rather than seriously evaluating and learning from Russia’s disaster, Western intellectuals seemed (in the 1970s) willing to replicate it!  “And what we see is always the same as it was then:  adults deferring to the opinion of their children; the younger generation carried away by shallow, worthless ideas; professors scared of being unfashionable; journalists refusing to take responsibility for the words they squander so easily; universal sympathy for revolutionary extremists; people with serious objections unable or unwilling to voice them; the majority passively obsessed by a feeling of doom; feeble governments; societies whose defensive reactions have become paralyzed; spiritual confusion leading to political upheaval” (p. 130). 

          Solzhenitsyn was particularly incensed by the “misty phantom of socialism” so prevalent in places like England.  “Socialism has created the illusion of quenching people’s thirst for justice:  Socialism has lulled their conscience into thinking that a steamroller which is about to flatten them is a blessing in disguise, a salvation.  And socialism, more than anything else, has caused public hypocrisy to thrive,” enabling Europeans to ignore Soviet atrocities (p. 141).  There’s actually no logic to socialism, for “it is an emotional impulse, a kind of worldly religion,” embraced and followed with blind faith (p. 142).  As an ideology, it is spread and embraced by immature, sophistic believers. 

          The British, of course, had drifted toward socialism under the post-WWII Labour leaders.  Consequently, “Great Britain, the kernel of the Western world, has experienced this sapping of its strength and will to an even greater degree, perhaps, than any other country.  For some twenty years Britain’s voice has not been heard in our planet; its character has gone, its freshness has faded” (p. 144).  The land of Churchill had vanished!  “Contemporary society in Britain is living on self-deception and illusions, both in the world of politics and in the world of ideas” (p. 144).  What was true about Great Britain, he insisted, was equally true about much of the West. 

          As one would anticipate, Solzhenitsyn’s BBC career ended abruptly!  Neither British nor American politicians, labor leaders, professors or journalists wanted to be rebuked for their failures!  In the 1970s, neither the United Nations nor the Europeans, neither Richard Nixon nor George McGovern, neither Gerald Ford nor Jimmy Carter, neither J. William Fulbright nor John F. Kerry had the courage to oppose Communism in Southeast Asia.  Nor do many of their successors today seem ready to deal with the violence and injustices in the Middle East.  Let us, however, never say that no one warned us about appeasement’s deserts!

147 The Cornucopia of Capitalism

                As an adolescent, growing up in Stockholm, Sweden, Johan Norberg espoused anarchism, tried (with John Lennon) to “imagine there’s no countries,” and decried multinational capitalism. He longed for a world wherein folks would be free. Fifteen years later, having seriously studied economics and become a professor of that discipline, he’s still fervently committed to freedom–particularly the small but critical daily liberty to “pick and choose” what one eats, where one lives, how one works. But he’s changed his mind as to how best to extend it and has written In Defense of Global Capitalism (Washington, D.C.: Cato Institute, c. 2003). Originally published in Sweden, the book was picked up by the Cato Institute (well known for its “libertarian” economic ideals), translated into English and published. Graphs and charts, footnotes and citations, references to trustworthy sources indicate the book’s research foundations, but it’s engagingly written and quite understandable for anyone concerned with economics. 

            Contrary to the slogans shouted by today’s anarchists protesting “globalization” (the World Bank and International Monetary Fund, multinational corporations and international trade agreements), despite doctrinaire leftist claims that exploitation and deprivation are spreading everywhere, Norberg demonstrates that during the past three decades a transformation has taken place around the world.  People in countries such as India and China have made startling, unprecedented economic gains.  “Consumerism,” so often labeled evil by Western critics, appears to have stimulated developments that have dramatically raised the standard of living.  This took place not as a result of a socialist revolution, “but rather from a move in the past few decades toward greater individual liberty” (p. 23).   Attuned to an ancient Chinese proverb, “When the wind of change begins to blow, some people build windbreaks while others build windmills,” Norberg wants us to flow with the wind of free enterprise and make the world a better place. 

            Dealing with facts, rather than utopian fantasies, leads one to discover that “between 1965 and 1998, the average world citizen’s income practically doubled, from $2,497 to $4,839, adjusted for purchasing power and inflation” (p. 25).  Amazingly, though one would never expect it if one listened to leftist pundits and social gospel preachers, “world poverty has fallen more during the past 50 years than during the preceding 500” (p. 25).  Population has, indeed, soared during these decades, but “the number of absolute poor has fallen by about 200 million” (p. 26) because of rapid economic development.  Poverty in Asia declined from 60 to 20 percent in 20 years!  Economic growth erases poverty. 

            There’s far less hunger in the world today because “we have never had such a good supply of food” (p. 31).  This results, primarily, from the “green revolution” once strongly opposed by environmentalists who issued dire warnings as to its long-term impact.  Germ- and insect-resistant crops, better sowing and harvesting methods, more efficient use of available water, have resulted in an amazingly productive agricultural system.  Though best illustrated in the United States, the same pattern is evident world-wide.  Famines now occur less often, in large part, Norberg says, because democracies (and their freedoms for individuals) seem never to experience them.  Famines strike places like North Korea, the former Soviet Union, Cambodia, Ethiopia–all ruled by tyrants.  Dictatorships, not agricultural failures, not ecological abuses, cause famines. 

            China especially reveals the positive impact of global capitalism.  In the 1970s Deng Xiaoping “realized that he would have to distribute either poverty or prosperity, and that the latter could only be achieved by giving people more freedom” (p. 47).  Peasants were allowed to lease land, to grow and market crops for themselves.  They did so “to such a huge extent that nearly all farmland passed into private hands in what may have been the biggest privatization in history.  It paid off, with crop yields rising between 1978 and 1984 by an incredible 7.7 percent annually.  The same country that 20 years earlier had been hit by the worst famine in human history now had a food surplus” (p. 47).  Half a billion Chinese–nearly twice the American population!–climbed out of poverty simply because they could participate in a capitalistic economy.  “The World Bank has characterized this phenomenon as ‘the biggest and fastest poverty reduction in history'” (p. 48). 

            Vietnam, surprisingly, shows the same trend.  Though impoverished by its Marxist straitjacket for several decades, “Vietnam since the end of the 1980s has introduced free trade reforms and measures of domestic liberalization” (p. 133).  Exports, especially rice, have boomed.  “This has resulted in rapid growth and a uniquely swift reduction of poverty.  Whereas 75 percent of the population in 1988 were living in absolute poverty, by 1993 this figure had fallen to 58 percent, and 10 years later had been reduced by half, to 37 percent; 98 percent of the poorest Vietnamese households increased their incomes during the 1990s” (p. 133).  Similar currents have streamed through India and South Korea.  In the 1960s South Korea was poorer than Angola, but today it’s the world’s 13th largest economy.  Conversely, North Korea, sentenced to the nightmare endemic to Communism, sank even deeper into the pit of deprivation and desperation. 

What huge investments in “foreign aid,” what highly touted “compassionate Christian ministry” endeavors failed to significantly impact, an unleashed free enterprise capitalism accomplished in two decades!  There are, manifestly, many global inequities.  But “the fantastic thing,” Norberg says, “is that the spread of democracy and capitalism has reduced them so dramatically” (p. 61).  Wherever government steps aside and lets individuals flourish, they freely invest and innovate and forge associations that precipitate prosperity.  In such free systems, folks like Bill Gates will, of course, become fantastically wealthy.  But the system that sustains them also provides a rising tide that lifts everyone’s boat.  If my income doubles in two decades, while Gates’ quadruples, why should I complain?   Unless I’m consumed by envy, I won’t!    In a capitalist system, the “poor benefit from growth to roughly the same extent and at the same speed as the rich.  They benefit immediately from an increase in the value of their labor and from greater purchasing power” (p. 81).  It’s obvious that capitalism accentuates inequalities.  But this occurs not because capitalism makes some folks poor, but because it makes “its practitioners wealthy.  The uneven distribution of wealth in the world is due to the uneven distribution of capitalism” (p. 154). 

            Such a capitalistic order “requires people to be allowed to retain the resources they earn and create” (p. 66).  Private property, so demonized by socialistic thinkers, proves to be the essential lynchpin for widespread economic development.  The folks at the bottom benefit the most from private property.  “The Peruvian economist Hernando de Soto has done more than anyone else to show how poor people lose out in the absence of property rights” (p. 91).  Conversely, public spending–even on behalf of the poor–ultimately harms its intended beneficiaries.  Taking from the rich to enrich the poor harms the poor.  Do-gooders, especially the enlightened elites who direct the welfare state and feel highly righteous in distributing the dole, feel good about themselves but actually do little good!  In Asia, where poverty has declined so rapidly, there has been almost no “redistribution” of wealth, no “social justice.”  Only millions of free people lifting themselves up!  East Asia’s “miracle shows an open, free-enterprise economy to be the sine qua non of development” (p. 103).  Several million ordinary people, pooling their wisdom in the marketplace, working and saving, buying and selling, investing and losing investments, prove far more prescient and trustworthy than a few dozen bureaucrats orchestrating a planned economy. 

The division of labor basic to capitalism means that “one hour’s labor today is worth about 25 times more than it was in the mid-19th century.  Employees, consequently, now receive about 25 times as much as they did then in the form of better pay, better working conditions, and shorter working hours” (p. 68).  The alleged “victims” of multinational corporations, workers in “the poorest developing countries” employed by American-affiliated companies like Nike, earn “eight times the average national wage!” (p. 217).  Unlike the “sweatshops” denounced by Leftists marching in the streets, American factories in the Third World pay their employees handsomely and contribute to the rapid development of once impoverished lands. 

In short:  the world is, in fact, much better than it was a century ago.  And it’s almost exclusively the result of the spread of democracy and free enterprise. 

* * * * * * * * * * * * * * * *

Norberg’s work confirms the portrait painted in It’s Getting Better All the Time:  100 Greatest Trends of the Last 100 Years by Stephen Moore and Julian L. Simon (Washington, D.C.:  Cato Institute, c. 2000).  Moore took the economic data collected by the late Julian Simon, a noted economist, and distilled it (with colorful charts) to illustrate the book’s thesis:  “Freedom works” (p. 12).  Simon became somewhat notorious for publicly challenging various “doomsayers,” most notably the alarmists trumpeting the environmental crisis.  In 1980 he challenged Paul Ehrlich to put his money where his mouth was:  to wager $1000 on his pessimistic predictions.  “A few years before that Ehrlich wrote:  ‘I would take even money that England will not exist in the year 2000.’  He wrote in 1969, on the eve of the green revolution, that ‘the battle to feed humanity is over.  In the 1970s the world will undergo famines.  Hundreds of millions of people will starve to death.’  Although Professor Ehrlich continues to make dimwitted statements like this, he is still taken quite seriously by the American intelligentsia.  He even won a MacArthur Foundation ‘genius’ award after he made these screwball predictions” (pp. 20-21). 

But Simon, unimpressed with Ehrlich’s Stanford credentials and bombastic assertions, dismantled his façade.  Setting forth the terms of his wager, he allowed Ehrlich to choose any five natural resources that he thought would become more expensive in the next 10 years.  “By 1990 not only had the optimist (Simon) won the bet, but every one of the resources had fallen in price, which, of course, is another way of saying that the resources had become less scarce, not more” (p. 20).  Things were, by every measurable index, getting better.  The 20th century also witnessed incredible economic and political advances.  The authors contend, “there has been more improvement in the human condition in the past 100 years than in all of the previous centuries combined since man first appeared on earth” (p. 1).  Blessed with such improvements, many of us fail to realize how significant they are.  “No mountain of gold 100 years ago could have purchased the basics of everyday life that middle-income Americans take for granted in 1999” (p. 6).  Underlying this spectacular development, and largely explaining it, are three things:  electricity; modern medicine; and the microchip. 

Of the 100 positive trends the book highlights, increased longevity is one of the most impressive.  Since the beginning of the industrial revolution, life expectancy has doubled, perhaps “‘the greatest miracle in the history of our species’” (p. 26).  Americans live 30 years longer than they did in 1900.  In China, in 1950, life expectancy was 40 years; today it’s 63, an amazing 50 percent gain in 50 years.  Infant mortality has sharply declined.  Deadly diseases, such as tuberculosis, smallpox, and diphtheria, have been largely eliminated.  Miracle drugs, cures for cancer, treatments for heart disease all make for longer lives, freedom from killer diseases. 

Contrary to Malthusian predictions, we now have more food and less threat of famine than ever, despite the globe’s population growth.  Today’s farmer produces 100 times as much food as did his counterpart a century ago.  Prices for food have declined steadily.  Wealth, rather than shrinking as more people share the planet, has dramatically increased.  The true “wealth” is human ingenuity.  The world’s resources are not a finite pie that must be cut and divided into ever-smaller portions.  True wealth is the result of creative persons finding ever better ways to live.  So more and more people have been getting more and more wealthy. 

            For all the good news contained in the book, it’s also obvious that the 20th century was, in some respects, the worst of all centuries.  Multiplied millions of people died in wars–and four times as many were liquidated by totalitarian governments.  Somewhere between 150 and 200 million innocent folks were sacrificed on the altars of (largely socialist, whether fascist or communist) ideology (p. 16).  These ghastly evils were done, almost exclusively, by regimes that deprived people of individual freedom.  Despots determined to dictate economic systems, always for “the good of the people,” launched their programs by confiscating guns and restricting free speech, by stamping out the free press and restricting the opportunity to worship God.  Virtually “every great tragedy of the 20th century has been a result of too much government, not too little” (p. 15). 

The book is succinct, clear, and persuasive.  As Lawrence Kudlow, the chief economist for CNBC, writes:  “This book is so chock full of good news that it’s virtually guaranteed to cheer up even the clinically depressed.  Moore and Simon dismantle the doomsday pessimism that’s still so commonplace in academia and the media.  The evidence they present is irrefutable:  Give people freedom and free enterprise and the potential for human progress is seemingly limitless” (back cover). 

* * * * * * * * * * * * * * * * *

The libertarian humorist P.J. O’Rourke provides much the same evidence in Eat the Rich (New York:  Atlantic Monthly Press, c. 1998).  “I had one fundamental question about economics,” he says, beginning his book:  “Why do some places prosper and thrive while others just suck?  It’s not a matter of brains.  No part of the earth (with the possible exception of Brentwood) is dumber than Beverly Hills, and the residents are wading in gravy.  In Russia, meanwhile, where chess is a spectator sport, they’re boiling stones for soup” (p. 1).  Why?  It’s a good question!  And it’s a question O’Rourke clearly answers:  free people, under the rule of law, prosper. 

            Reared in a privileged American home, O’Rourke went off to college, where he and his peers imbibed the intellectual currents of the ’60s, posed as hippies, and styled themselves “Marxists” without much of a clue as to what that entailed.  In time, he became a journalist, started thinking like an adult, and began to notice, as he traveled the globe in the 1990s, the importance of economics.  Curious, he pulled out the economics textbook he’d been assigned in college, Samuelson and Nordhaus’ Economics, widely used throughout the country for 40 years.  “Professor Samuelson,” O’Rourke discovered, “turns out to be almost as much of a goof as my friends and I were in the 1960s” (p. 8).  To Samuelson, Karl Marx was “the most influential and perceptive critic of the market economy ever” (p. 8), and Samuelson blessed Marx’s memory by embracing his theories, arguing that socialist improvements to the American economy would make life better for all concerned.   

            Having personally witnessed Marx’s influence in various world areas, O’Rourke resolved to discard Samuelson–and fellow travelers like John Kenneth Galbraith–and find better answers to his questions in countries that have embraced either capitalism or socialism.  He discovered that there can be “good capitalism,” like that found on Wall Street, largely responsible for America’s amazing prosperity.  There can be “bad capitalism” such as developed in Albania following the collapse of Enver Hoxha’s tyranny.  Albanians in the 1990s were “free,” but not doing well.  “The Albanian concept of freedom approaches my own ideas on the subject, circa late adolescence.  There’s a great deal of hanging out and a notable number of weekday, midafternoon drunk fellows” (p. 47).  But not much productive labor!  Lots of freedom, but little enterprise!

            In Sweden, O’Rourke checked out what’s often called a “good socialism.”  One can do very little and get quite a lot in this workers’ paradise.  A mere 2.7 million of the 7 million Swedes work to pay for folks getting benefits or working for the government.  Unfortunately, bills come due in time.   The nation’s economy is slowly shrinking.  In 1950 the nation was among the richest on earth.  Swedes were taxed at about the same rate as are Americans today, with the government spending 31 percent of GDP.  Then came the socialist takeover, when the welfare state replaced capitalism.  Productivity slipped.  Crime boomed.  The Swedes mortgaged the future and bought momentary comfort.  But the good times will end, O’Rourke predicts, and cracks in the social fabric indicate that the end may be near at hand.

            Checking out Cuba, O’Rourke found a “bad socialism.”  Everything seems shattered by Castro’s revolution.  Simply looking out his hotel window, he saw “holes in everything:  holes in roofs, holes in streets, holes where windows ought to be” (p. 77).   The island looks war-ravaged, and the people seem shell-shocked into silence.  The fading beauty of Havana, where folks were in fact rather free under Batista, gives witness to the losses Cuba has suffered.  The inescapably totalitarian aspects of socialism manifest themselves in Castro’s “paradise.”  The residue–or the debris–of socialism now litters Russia a decade after the “collapse” of communism.  O’Rourke noted that Russians were certainly more active and alive than in the 1980s, but the “system” still hardly works.  Notably absent, he says, is the rule of law.  So “businessmen” behave more like thugs than entrepreneurs.  “What would be litigiousness in New York is a hail of bullets in Moscow.  Instead of a society infested with lawyers, they have a society infested with hit men.  Which is worse, of course, is a matter of opinion” (p. 129).  The rampant corruption, he believes, is directly tied to Marx and Lenin, the men who laminated their amoral, nihilistic worldview onto the nation. 

            The African nation of Tanzania, O’Rourke says, illustrates “how to make nothing from everything.”  It’s one of the world’s truly impoverished nations.  By comparison, “Papua New Guinea is almost ten times more prosperous, never mind that some of its citizens have just discovered the wheel” (p. 166).  There’s plenty of arable land and abundant natural resources.  The people were little affected by European colonialism and have suffered few wars.  What went wrong is attributable to Julius Nyerere, the celebrated “teacher” who led the nation for nearly three decades.  He imposed a stern, rigorously egalitarian collectivism, styled “familyhood,” designed to make Tanzania a peoples’ paradise.  Everything’s regulated, everything’s prescribed by government.  To be blunt:  Tanzania is poor because Nyerere and his socialistic enthusiasts “planned it” (p. 175). 

            By contrast there’s Hong Kong, which demonstrates “how to make everything from nothing.”  One of the best examples of laissez-faire economics, Hong Kong’s British colonial government did little but “keep the peace, ensure legal rights, and protect property” (p. 199).  Individuals took the initiative and fueled an economic “miracle.”  “With barely one-tenth of 1 percent of the world’s population, Hong Kong is the world’s eighth-largest international trader and tenth-largest exporter of services” (p. 205).  What will happen with its absorption by mainland China, of course, remains to be seen.

            Summing up his discoveries in economics, O’Rourke admits it’s pretty much what his parents told him before he went off to the university: “Hard work, Education, Responsibility; Property rights; Rule of law; Democratic government” (pp. 233-34) ensure economic prosperity.  Especially important is the rule of law, for rampant freedom (as in Albania) or rampant crime (as in Russia) prevents economic development.  People will work hard, save, invest, risk and innovate only when the law protects their property.  All in all–professor Samuelson notwithstanding–Adam Smith was right:  the free market provides the best for the most. 

            Discerning, clear-headed, witty and understandable, O’Rourke’s treatise provides a remarkably astute world tour of diverse economies, locating their sources and detailing their consequences.  Fun to read, but memorable in its message! 

146 Child Care? Who Cares?

“Train up a child in the way he should go:  and when he is old, he will not depart from it” (Proverbs 22:6).   Caring for children ever characterizes healthy cultures.  Even “primitive” cultures invested much in rearing the coming generation–as evident in an Iroquois tradition that encouraged folks to consider the next seven generations when charting tribal policies.   If you want to make a “good society” you need to rear “good kids.”  Robert Coles, long-time Harvard professor and highly-regarded authority on children, says:  “Good children are boys and girls who in the first place have learned to take seriously the very notion, the desirability, of goodness–a living up to the Golden Rule, a respect for others, a commitment of mind, heart, soul to one’s family, neighborhood, nation–and have also learned that the issue of goodness is not an abstract one, but rather a concrete, expressive one:  how to turn the rhetoric of goodness into action, moments that affirm the presence of goodness in a particular life lived” (The Moral Intelligence of Children).

Given such ancient wisdom, given the need, which most everyone acknowledges, to rear “good” children, their conditions–as documented in several recent studies–should concern us all.  Robert M. Shaw, M.D., a child and family psychiatrist who once taught at the Albert Einstein College of Medicine, now directs the Family Institute of Berkeley, California, and maintains his psychiatric practice.  He has recently published, with Stephanie Wood, The Epidemic:  The Rot of American Culture, Absentee and Permissive Parenting, and the Resultant Plague of Joyless, Selfish Children (New York:  ReganBooks, c. 2003).  Though written without any clear religious commitment, the book echoes profoundly religious themes; coming from a writer comfortably settled in the liberal environs of Berkeley, California, it champions a thoroughly conservative message.

The book’s lengthy subtitle encapsulates its message, and Shaw writes with a deep sense of outrage at the ways parents, for the past 30 years, have failed their kids.  He claims the killings at Columbine High School in Littleton, Colorado, “did not surprise” him.  Hardly “an aberration,” killers Eric Harris and Dylan Klebold simply demonstrated what one would expect to result from “the childrearing attitudes and practices that have spread like a virus from home to home in this country” (p. x).  Spending time with youngsters–or merely walking through a shopping mall–should alert us to their sullen, angry, whining, self-absorbed attitudes, ample “signs that our society has become toxic to children” (p. xi).

The big problem, as James Dobson indicated long ago, is parents’ failure to discipline their children.  In truth, “No!” is a good word!  Kids need boundaries, limits, restrictions.  They actually welcome “limits on when they go to bed, when they do their homework, when they watch TV, what they eat, who they play with.  And they thrive in tightly managed environments” (p. 129).  Permissive parenting is poor parenting!  “When parents let a child run wild, they are in fact abandoning him” (p. 147).  Without careful guidance, Shaw says, children fail to develop into caring, sensitive adults.  But because they spend so much time away from their kids, today’s parents internalize a great deal of guilt and are overly anxious to please rather than direct their offspring.  They even try to be friends with their youngsters, consulting rather than correcting them.  Whenever a mom or dad tells a child “Let’s go” and appends an “OK?”, there’s a problem!  Adults, not children, must make such decisions.

Parents have also allowed themselves to be brainwashed by “the parenting gurus who preach child-centric theories, asserting:  ‘Never let your baby cry,’ ‘He’ll use the potty when he’s ready,’ ‘Discipline is disrespectful,’ ‘The child’s feelings should come first’” (p. 15).  And when ill-disciplined kids get out of control, there’s always Ritalin and Prozac, which doubled in usage within a single decade.  All sorts of verbal evasions proliferate like crab grass!  Kids are called “difficult,” “oppositional,” “high-maintenance,” etc.  In fact, they’re just spoiled!  Rather than dealing with the real issues, the “experts” have simplistically prescribed a singular cure:  self-esteem!  Whatever’s wrong, self-esteem will correct it!  Bumper stickers and awards ceremonies, incessant praise and mandatory applause, all seek to make children “feel good” about themselves.  A sense of “self-esteem,” it’s said, develops when kids enjoy incessant approval.  Nonsense! says Shaw.  The self-esteem peddled by pop psychologists “is nothing less than self-worship, narcissism,” and it sizably contributes to the many problems youngsters struggle with.  Real self-esteem, on the other hand, is a by-product of authentic accomplishments.  Actually scoring a touchdown–not getting praised for trying–gives one self-esteem.  Just do something worthwhile, something good, and forget the smiley faces.

Doing things means viewing less TV.  Watching too much, and thinking about it too little, proves toxic to youngsters.  Most kids are largely unsupervised as they absorb anywhere from 20 to 50 hours of programming each week, much of it whetting appetites for consumption, sex, and violence.  Consequently, they read less and learn less, have fewer friends and like their parents less.  They are also much more discontented with things in general.  Shaw urges parents to monitor and control their children’s TV time.  The medium–like computers and music–has much to offer.  But we need to choose what’s right and protect our kids from what’s wrong.

What children most need isn’t more TV or awards or drugs but, rather, more parental care.  As John Locke observed, centuries ago, “Parents wonder why the streams are bitter, when they themselves have poisoned the fountain.”   Especially in the early years, a baby needs a mother’s arms and words.  “She alone has that unique instinctual drive that prepares her to engage in a developmental dance with her newborn” (p. 26).  Without what Shaw calls “motherese” during a baby’s first two years, his cognitive and emotional development suffers.   “This incredible relationship between mother and child is absolutely unique, the single most sacred thing in our culture” (p. 34).  And yet, amazingly enough, this “sacred thing” has been ruthlessly assailed and ridiculed, rejected by powerful elites in this country.

Those who have urged women to pursue full-time careers–feminists of all shades who have urged women to ignore their own inner promptings–have created a world profoundly hostile to children’s wellbeing.  Truth to tell, institutionalized childcare is mainly defended by those who place parents’ concerns above children’s.  Considerable dishonesty pervades the social sciences, where studies are hyped or ignored in accord with their support of working mothers and day schools.  Two parents, both pursuing careers full-time, Shaw insists, can hardly provide “the optimum environment for raising children” (p. 80).  He writes with deep conviction, for his life has been spent dealing with “anguished parents and their children, and I can tell you this much is true:  at least one of the parents has to make raising the children the top priority” (p. 82).  Anything less puts kids in harm’s way.  There’s much harm, for example, in childcare.  The more time a child spends in childcare facilities the less closely he will bond with his mother–and the more behavioral problems he will have thereafter.


Shaw’s contentions are buttressed by Brian C. Robertson’s Day Care Deception:  What The Child Care Establishment Isn’t Telling Us (San Francisco:  Encounter Books, c. 2003).   This is a modest updating of his earlier publication, Forced Labor:  What’s Wrong with Balancing Work and Family (Dallas:  Spence Publishing Company, c. 2002).  Robertson works as a research fellow at the Family Research Council’s Center for Marriage and Family, and he edits the Family Policy Review.  He argues that young children need constant, loving, motherly attention; a healthy attachment, early established, enables babies to develop well.  No paid substitutes can actually “mother” a baby.  “As G.K. Chesterton remarked over eighty years ago, ‘If people cannot mind their own business, it cannot possibly be more economical to pay them to mind each other’s business, and still less to mind each other’s babies'” (p. 154).  That truth, however, has been systematically denied and rejected by the elites who shape public opinion and establish public policy.  Consequently, more and more children suffer a variety of behavioral problems that ultimately affect American culture.

Basic to Robertson’s case is “attachment theory,” best represented by the noted psychologist John Bowlby and popularized by Benjamin Spock, who urged moms to stay home with their children as much as possible until they were at least four years old.  To separate a child from his mother was widely understood to endanger the child’s well-being.  During the past 30 years, however, vigorous critics have denied attachment’s import.  Though no evidence supported their case, the critics basically silenced (through intimidation) the attachment theorists.  Consequently, Dr. Spock’s 1992 edition of Baby and Child Care says nothing about the need for any infant-mother attachment and even encourages parents to elevate self-fulfillment over concern for children.  Explaining his radical about-face on this issue, Spock said that too many women “pounced” on him and blamed him for making them feel “guilty.”  Convinced they would work whether he approved or not–and unable to withstand feminist wrath–he says:  “I just tossed it.  It’s a cowardly thing that I did; I just tossed it in subsequent editions” (p. 73).

Spock represents the almost universal capitulation of elite academic and media “authorities” on childcare.  They deny the data Robertson presents, which is, indeed, alarming.  It’s evident that professors and journalists care much more for their agenda than the truth.  In Bernard Goldberg’s lengthy experience as a journalist, he witnessed the success of feminists, who “are the pressure group that the media elites (and their wives and friends) are most aligned with.”  Consequently, “America’s newsrooms are filled with women who drop their kids off someplace before they go to work or leave them at home with the nanny.  These journalists are not just defending working mothers–they’re defending themselves” (Bias, 163, 178).  This explains why “research” justifying day care for kids gets prominent exposure, whereas equally valid “research” condemning it is rarely reported.

The same holds for professors in elite universities.  Despite a great deal of preening about “academic freedom” and fearlessly pursuing the truth, no tolerance is granted  “research” suggesting children suffer when deprived of their parents’ presence.  Like Social Security for politicians, daycare for children is the “third rail” for academics–touch it and you die!  Professors hoping to be published, to get tenure, to enjoy advancement and prestige in their profession, simply cannot challenge feminist orthodoxies.  Indeed, Dr. Louise Silverstein, in the American Psychologist, urged her colleagues to “‘refuse to undertake any more research that looks for the negative consequences of other-than-mother care'” (p. 103).  One of the few who dared to do so is a highly regarded scholar, Jay Belsky, who initially defended (in the ’70s and ’80s) the notion that children fared well in daycare facilities.  In time, however, mounting evidence prodded him to reverse himself.  Suddenly, he found himself attacked as an enemy of working women–indeed of women in general!  Publishing his research proved difficult.  He was “shunned at scientific meetings” (p. 43).  He’d become an outcast, a nobody!  Consequently, he’s accepted an appointment in England!

What the professors and journalists refuse to report, however, should be reported.  For children increasingly suffer as a result of parental deprivation.  On a purely physical level it’s clear that children in day care institutions are far more likely to be sick than their counterparts at home.  One epidemiologist actually called day care centers “the open sewers of the twentieth century” (p. 87).  Chronic inner ear infections, diarrhea, dysentery, jaundice, hepatitis A all thrive when small children are mixed together, and “high quality” centers are as disease-ridden as their less esteemed rivals.  Harder to document, of course, is the soul-suffering endured by young children.  Kids now spend more time alone, more time with TV, less time eating meals at home, less time talking with adults.  They’re more likely to demonstrate anti-social behavior and less likely to internalize solid ethical principles.

These problems are fully understood by America’s parents, though denied by the nation’s elites!  More than three-fourths of ordinary moms and dads would prefer for moms to stay home with young children.  When day care is needed, they much prefer it be provided by a relative or friend.  Three-fourths of the alleged “experts” (generally highly educated, and especially academic, women), however, prefer day care centers.  And, though these scholars and journalists are quite wealthy, they want the government to subsidize their “child care.”  Some, like Hillary Clinton, propose aggressive interventions by the state.  So, Mrs. Clinton urged:  “Every home and family should be taught through parenting education and family visitation by social service intermediaries, how to raise children.  This would begin in the prenatal stages and continue through childhood” (pp. 156-157).

Senators Hillary Clinton, Edward Kennedy, and Christopher Dodd set the tax policies and national agenda to comply with the radical feminist agenda.  Though few parents want what Clinton et al. seek to dictate, they are subjects of the welfare state and struggle to cope with its policies.  It’s a daunting struggle, but Robertson provides data and perspectives with which to resist it.


Though rather unwieldy (640 pp.) and repetitious at times, William D. Gairdner’s The War Against the Family:  A Parent Speaks Out on the Political, Economic, and Social Policies That Threaten Us All (Toronto:  Stoddart Publishing Co., c. 1992) gives us a Canadian parent’s perspective on a variety of issues.  A graduate of Stanford University and an Olympic athlete, Gairdner weaves together history, philosophy, theology, education, psychology, sociology and jurisprudence, touching on everything from abortion to taxation.  Much of the book’s value derives from his quotations, sources, and interesting synthesis of his studies regarding the state of the modern family.

Let me focus on only one of his major themes:  the doleful impact of the Welfare State, the deleterious effect of all utopian schemes that propose to improve upon the natural order of things.  As he writes in his Preface, this book “shows how the political, economic and social/moral troubles that play themselves out in the nation at large inevitably trickle down to alter our most private lives and dreams; how any democracy based on freedom and privacy will strangle itself if it drifts toward, or is manoeuvred into, a belief in collectivism of any kind” (p. ix).  To the extent socialism triumphs, Gairdner argues, the family suffers.

This is graphically evident in Sweden, often touted as a grand example of  “democratic socialist” success.  Following the ideological schemes of the economist Gunnar Myrdal and his wife Alva, a “radical feminist sociologist” (p. 138), Sweden engineered a cradle-to-grave welfare state.  (The Myrdals’ work, incidentally, helped prompt the U.S. Supreme Court’s 1954 Brown v. Board of Education of Topeka decision mandating public school desegregation.)  In fact, early plaudits for the Swedes’ egalitarian economic system have paled of late as its debts are now mounting.  In the words of Goran Bruhner, “Sweden used to be a welfare paradise on earth.  Now it is the sick man of Europe” (p. 14).  Swedes pay the world’s highest taxes, and two-thirds of the nation’s GNP is devoted to government spending.  One-third of the people produce goods while the other two-thirds redistribute the money derived from taxing the producers.  Ten percent of the workforce fails to work on any given day–rising to 20 percent on Monday and Friday!  Swedes are “sick” 23 days a year.

Social, as well as economic, decay also marks Sweden.  The government has pursued a markedly secular agenda, evident in a 1968 publication titled “The Family Is Not Sacred.”  The author of the article declared:  “I should like to abolish the family as a means of earning a livelihood, let adults be economically independent of each other and give society a large share of responsibility for its children . . .  In such a society we could very well do without marriage as a legal entity” (p. 139).  To a great extent that has taken place in Sweden.  Fewer people marry in Sweden than in any Western nation.  Two-thirds of the people in Stockholm live alone!  Swedes who do marry usually cohabit beforehand–getting motivated to marry only when a child results from their intimacy.  In the midst of it all, the Swedes are having fewer and fewer children.  And those who are born are quickly lodged in daycare facilities.  Following the Myrdals’ socialist agenda, Sweden pursued policies pushing women into the workforce.  Today 60 percent of the women work–45 percent of them for the government.

The Swedish Welfare State, Gairdner insists, has delivered a lethal blow to the family.  But to the enlightened elitists in Canada–the “Court Party”–Sweden serves as a model to follow!  Beginning with Pierre Trudeau’s ascent to power in 1968, Canada’s leaders have systematically orchestrated a radical swerve to the left, quickly imposing state controls in virtually every area of life.  Should Canadians–and Americans–wonder what happens to the family when socialism triumphs, they need simply look to Sweden.  Doing so, Gairdner says, should prompt us to reverse directions!

One of the great reversals needed involves education, to which Gairdner devotes several chapters.  State-controlled education–one of the goals listed in The Communist Manifesto–illustrates the damage children suffer when subjected to a centrally-planned, bureaucratic system.  Amazingly, Americans in New England and the old Northwest demonstrated a higher rate of literacy in 1840 than they do today!  If you think clearly about it, “there is little difference between a collectivized, command economy and collectivized, command education.  Neither can work well, and the unit cost of the product is very great–about double the cost of the same education rendered privately” (p. 198).  Failing schools demand more money and more teachers in smaller classes, ignoring solid evidence showing that neither makes much of a dent in students’ performance.  Public school problems cannot be solved by the public schools, for they are in fact the problem!

The public school movement, strongly championed by “reformers” like the Fabians in England, dislodged churches and private schools as mentors of the young.  In 1905, the Intercollegiate Socialist Society was formed, with John Dewey as a founding member.  He and his associates envisioned “education for a new social order,” and his highly influential Democracy and Education said nary a word about home and family while stressing grand themes like “social unity” and “State consciousness.”  An admirer of the communist endeavors in Russia in the ’20s and ’30s, Dewey wanted to abolish private property and install a state-controlled economic system.  To secure those ends, he taught successive generations of educators to be “change agents” who would transform the public schools into centers for collectivist ideology.

“History will surely show,” says Gairdner, “that one of the tragic links in the long chain of Western decline was the surrender by families, to the nation State, of control over their children’s education.  As Yale historian John Demos has aptly argued, the school is one of the institutions responsible for the long-term ‘erosion of function’ of the family.  And Stanford’s Kingsley Davis writes that ‘one of the main functions [of the school system] appears to be to alienate offspring from their parents’” (p. 208).  But we need not abandon our young to the state!  To reverse the harm being done to our kids, Gairdner urges us to support private schools, vouchers, anything possible to take back some of the power from the omnivorous state.  And, perhaps, headway is being made in the U.S. today!  Ultimately, truth prevails, and it’s difficult to evade the truth G. K. Chesterton expressed a century ago:  “This triangle of truisms, of father, mother and child, cannot be destroyed; it can only destroy those civilizations which disregard it” (p. 584).

145 The Homosexual Agenda

In 1986, United States Supreme Court Chief Justice Warren Burger, concurring in Bowers v. Hardwick, which upheld a Georgia law forbidding sodomy, said:  “Decisions of individuals relating to homosexual conduct have been subject to state intervention throughout the history of Western civilization.  Condemnation of those practices is firmly rooted in Judeo-Christian moral and ethical standards . . . . [Sir William] Blackstone described ‘the infamous crime against nature’ as an offense of ‘deeper malignity’ than rape, a heinous act ‘the very mention of which is a disgrace to human nature’ and ‘a crime not fit to be named.’  To hold that the act of homosexual sodomy is somehow protected as a fundamental right would be to cast aside millennia of moral teaching.”  His historical perspective was accurate, and his citing Blackstone revealed his reliance upon one of the masterful authorities in jurisprudence.

Soon thereafter, however, the Court discarded Blackstone and millennia of moral teaching!  Seventeen years after the Bowers decision, the Supreme Court, in Lawrence v. Texas, reversed itself and effectively legalized sodomy.  A few months later the Massachusetts Supreme Judicial Court ordered the state legislature to draft legislation facilitating same-sex marriage.  Thus we’re alerted to what Alan Sears and Craig Osten consider in The Homosexual Agenda:  Exposing the Principal Threat to Religious Freedom Today (Nashville:  Broadman & Holman, Publishers, c. 2003).  R. Albert Mohler Jr., President of The Southern Baptist Theological Seminary, recommends the book, declaring that “The sexual revolution of the last half-century amounts to the most sweeping and significant reordering of human relationships in all of human history.”  The revolution was orchestrated by a cadre of activists who now make “the legitimation and celebration of homosexuality” the next stage of sexual liberation.

Indeed, “As one observer of the homosexual movement [the Orthodox Jewish columnist Don Feder] has warned, ‘Gay activists are sexual Marxists.  Legitimizing same-sex unions is a warm-up act.  Ultimately they want to eliminate any barriers, and signposts, that limit or channel the exercise of human sexuality’” (p. 96).  As is evident in Sweden, they also want to eliminate any criticism of, much less opposition to, their behavior.  The Swedish parliament recently forbade “all speech and materials opposing homosexual behavior and other alternative lifestyles.  Violators could spend up to four years in jail” (p. 183).  Deeply influenced by sociologists Gunnar and Alva Myrdal, the Swedes have sought, as Alva Myrdal urged, to treat all adults “in the same manner by society, whether they lived alone or in some sort of common living arrangement.”  Same-sex, as well as heterosexual, “living arrangements” are fine.  In the U.S., under the guise of “hate crimes” legislation largely written to appease homosexual activists, teachers and pastors may very well become liable to prosecution simply for upholding biblical standards regarding sexual conduct.  Indeed, Senator Ted Kennedy, a constant supporter of hate-crimes bills, has “called religious objections to homosexual behavior ‘an insidious aspect of American life’” (p. 202).

To attain their goals, sexual revolutionaries have known they must destroy (or at least immobilize) the family and the church, the two social institutions most opposed to sexual license.  To expand the definition of “family” to include many sorts of “loving” relationships, to force the church (in the name of “love”) to validate such bonds, has been a basic part of the homosexual agenda.  Courts have increasingly granted gay and lesbian couples the right to adopt children.  Revealingly, “Al Gore and his wife Tipper donated $50,000 to the Human Rights Campaign to help its ‘FamilyNet’ campaign promote homosexual adoption.  Their book, Joined at the Heart:  The Transformation of the American Family, prominently featured homosexual ‘families’” (p. 111).

One of the longest levers slowly easing the public’s hostility to homosexual activity is the entertainment industry.  Portraying gay and lesbian activities as healthy–and branding any criticism of them as hateful–has become pervasive in films and television.  Comedies have been especially effective, first disarming viewers and then appealing to them for “tolerance.”  As Michael Medved noted:  “A Martian gathering evidence about American society, simply by monitoring our television, would certainly assume that there were more gay people in America than there are evangelical Christians” (p. 28).

Despite the ancient opposition of Christians to homosexual acts, today’s churches have gradually moved from “loving the sinner” to endorsing sodomy as an appropriate expression of sexuality so long as it occurs within the context of “love.”  Though this is most evident in the consecration of an openly gay bishop in the Episcopal Church, evangelical activists, such as Tony Campolo and his wife Peggy, have aggressively promoted “the radical homosexual agenda” (p. 128).  Peggy, particularly, has argued that Paul’s apparent condemnation of same-sex relations in “Romans 1 does not apply to monogamous, ‘loving,’ homosexual relationships, and that evangelicals who feel differently than her are ‘grossly misinformed’” (p. 129).  Such statements appear in one of the book’s most disquieting chapters, entitled “The Silence (and Silencing) of the Church.”  Rarely these days does one hear words such as Martin Luther’s:  “The heinous conduct of the people of Sodom is extraordinary, inasmuch as they departed from the natural passion and longing of the male for the female, which was implanted by God, and desired what is altogether contrary to nature.  Whence comes this perversity?  Undoubtedly from Satan, who, after people have once turned away from the fear of God, so powerfully suppresses nature that he beats out the natural desire and stirs up a desire that is contrary to nature” (p. 123).

In part, as the authors carefully document, Christians have been silenced through violence and intimidation–as when gay activists invaded St. Patrick’s Cathedral and disrupted a mass being conducted by Cardinal John O’Connor.  Others threw condoms during a service at Village Seven Presbyterian Church in Colorado Springs because a prominent layman, Will Perkins, had supported an amendment to the state constitution which would have banned any preferential treatment of homosexuals.  In part, homosexuals have moved into the church through doors opened by radical feminists “who have tried to reshape the church and the gospel in their own image.  That dodge can be best summarized as ‘the Bible has to be interpreted in the context of the time it was written and therefore that passage is no longer relevant today’” (p. 126).  When churches rewrite Scripture with “gender-inclusive language” and approve praying to “Mother and Father God,” there is no reason to deny homosexual arguments for a new version of the Christian faith, suited to gay and lesbian desires.

In response to the homosexual agenda, Sears and Osten urge Christians to be true to Scripture and Tradition.  They cite the words of Titus Brandsma, a Dachau martyr, who said:  “Those who want to win the world for Christ must have the courage to come into conflict with it” (p. 205).  There’s no question that opposing homosexual activists requires courage.  It’s the courage evident in the words of the Anglican archbishop of Sydney, who said:  “The Christian Gospel is the insertion of truth into the untrustworthy discourse of the world.  Some of us want to be kind, so loving that we will not speak the truth.  The therapeutic model of pastoral care has been perverted into mere affirmations of human behaviour.  Our love is no love, for it refuses this great test:  will it speak boldly, frankly, truthfully?”  Sadly enough, he continued:  “We have contributed towards the gagging of God, perhaps because we are frightened of suffering.  But there is one fundamental task to which we must be committed, come whatever may:  Speak the truth in love” (p. 211).

* * * * * * * * * * * * * * * * * * * *

Christopher Wolfe has edited Same-Sex Matters:  The Challenge of Homosexuality (Dallas:  Spence Publishing Company, 2000).  In his introductory essay, Wolfe argues that “there is no question that our current family instability–and the growing acceptance of homosexuality–reflects, among other factors, the influence of changing social mores on contraception, premarital sex, cohabitation, and no-fault divorce.”  The moral relativism pervading contemporary culture justifies “whatever is pleasant and does not immediately harm others in some relatively tangible way” (p. 17).  Refusing to condemn fornication and adultery, so long as they involve “consenting adults,” one cannot easily express outrage at homosexual acts.

Patrick Fagan, along with several of the other essayists, roots today’s sexual permissiveness in the contracepting culture that emerged in the 1940s.  He cites Sigmund Freud, interestingly enough, who asserted in “The Sexual Life of Human Beings” that the separation of procreation and sexual activity is the most basic of perversions, and that all other sexual perversions are rooted in it:  “The abandonment of the reproductive function is the common feature of all perversions.  We actually describe a sexual activity as perverse if it has given up the aim of reproduction and pursues the attainment of pleasure as an aim independent of it” (p. 29).  The Anglican Church abandoned its historic opposition to contraception at the Lambeth Conference in 1930; other Protestant denominations soon followed.

Religious reservations regarding homosexuality were also weakened as divorce and abortion gained acceptance.  Focusing attention on “hard cases,” cultivating compassion for “victims,” effectively pulled public opinion toward greater acceptance of what earlier generations condemned.  Just as “love” grown cold justified divorce, so could “love” powerfully felt justify homosexual relationships.  Just as opposition to abortion was effectively branded “hateful” toward women, so opposition to sodomy was labeled “hate” and “homophobia.”  Indeed, as Michael Medved makes clear in his essay, “the main threats to the family in America do not come from the gay community.  They come from infidelity, they come from divorce, they come from all the temptations heterosexuals fear and feel in a hedonistic culture” (p. 167).

For anyone interested in Church history, Robert Louis Wilken’s essay, “John Boswell and Historical Scholarship on Homosexuality,” is most helpful, since Boswell’s “scholarship” is routinely cited by homosexual activists anxious to suggest that the Early Church tolerated their lifestyle.  Though highly praised by The New York Times and similarly leftist media, Boswell’s work is, in fact, deeply flawed, indeed “bogus.”  His writings, such as Christianity, Social Tolerance, and Homosexuality and Same-Sex Unions in Pre-Modern Europe, illustrate “advocacy scholarship, pseudohistorical learning yoked to a cause, tendentious scholarship at the service of social reform, a tract in the culture wars” (p. 198).  From a first-rate scholar such as Wilken, this is a damning indictment.  And it properly extends to all “scholars” who try to reinterpret either biblical or historical materials to “christianize” homosexuality.

The impossibility of doing so becomes clear in Bishop Fabian Bruskewitz’s “Homosexuality and Catholic Doctrine.”  After citing the official teaching documents of the Church, Bruskewitz reminds Catholics that their opposition to homosexuality derives from a theology of creation, crediting God with the goodness and design of all that is.  By nature, homosexual acts run counter to the created order.  They violate the essence of love.  Livio Melina, a professor of moral theology at the Pontifical Lateran University in Rome, makes this clear:  “In the homosexual act, true reciprocity, which makes the gift of self and the acceptance of the other possible, cannot take place.  By lacking complementarity, each one of the partners remains locked in himself and experiences contact with the other’s body merely as an opportunity for selfish enjoyment.  At the same time, homosexual activity also involves the illusion of a false intimacy that is obsessively sought and constantly lacking.  The other is not really the other.  He is like the self; in reality, he is only the mirror of the self which confirms it in its own solitude exactly when the encounter is sought.  This pathological narcissism has been identified in the homosexual personality by the studies of many psychologists.  Hence, great instability and promiscuity prevail in the most widespread model of homosexual life, which is why the view advanced by some, of encouraging stable and institutionalized unions, seems completely unrealistic” (p. 222).

* * * * * * * * * * * * * *

Though it was written nearly a decade ago, I still regard Jeffrey Satinover’s Homosexuality and the Politics of Truth (Grand Rapids:  Baker Books, 1996) as the best book on the subject.  The author, a medical doctor, has been involved in treating AIDS patients since the epidemic’s outbreak in the early ’80s.  He knows the truth and is bold to declare it.  He is also deeply compassionate, distressed by the pain endured by those afflicted with the deadly HIV virus.

The truth is:  like alcoholism, homosexual behavior is deadly.  One study “found that the gay male life span, even apart from AIDS and with a long-term partner, is significantly shorter than that of married men in general by more than three decades.  AIDS further shortens the life span of homosexual men by more than 7 percent” (p. 69).  They inordinately suffer chronic diseases–syphilis, gonorrhea, hepatitis, rectal cancer, and bowel disorders.  They disproportionately suffer mental illness and take their own lives.

Amazingly, when AIDS began to do its deadly work, “the first priority” in the gay community “was to protect homosexuality itself as a perfectly acceptable, normal, and safe way of life” (p. 15).  Rather than trying to protect individuals from disease, something that would have required amending one’s lifestyle, the gay community orchestrated a political movement designed to protect it by misleading the public, asserting that homosexuality is genetically programmed, irreversible, and normal.  In fact, there is no evidence for a “gay gene,” and homosexuality is largely a learned behavior.  It can, therefore, be reversed–and thousands of individuals have been restored, through “reparative therapy,” to heterosexuality.  And it is, in fact, utterly abnormal, running counter to the most basic laws of nature.

To promote their deceit, homosexual activists engaged, skillfully, in politics!  In the ’70s they persuaded the American Psychiatric Association’s Board of Trustees to declassify homosexuality as a “disorder,” though a large majority of psychiatrists still judged it such.  Political pressure applied behind the scenes, not scientific evidence, dictated the change.  Homosexual activists, by disrupting meetings and intimidating officials, gained “scholarly” validation for their sexual behavior.   The American Psychological Association soon followed suit.  With “science” supporting their cause, homosexuals then turned to legislatures and courts, slowly overturning the nation’s moral consensus.

What’s needed, Satinover says, is a recovery of the Judeo-Christian ethos that once characterized this nation.  Secularists have opened the gates to a resurgent Gnostic paganism, ever tolerant of “diversity” in many forms.  Himself Jewish, Satinover urges Orthodox Jews and Christians to join together in promoting a biblically based public, as well as private, morality.  Sin must be called sin!  Christians, especially, need to recover veneration for the Old Testament Law!  Ultimately, “it is not really a battle over mere sexuality, but rather over which spirit shall claim our allegiance, [for] the cultural and political battle over homosexuality has become in many respects the defining moment for our society.  It has implications that go far beyond the surface matter of ‘gay rights.’  And so the more important dimension of this battle is not the political one, it is the one for the individual human soul” (p. 250).

* * * * * * * * * * * * * * * *

In Legislating Immorality:  The Homosexual Movement Comes Out of the Closet (Chicago:  Moody Press, c. 1993), George Grant and Mark A. Horne evaluate the issue from a strongly Evangelical perspective.  Thus their main concern is “an uninformed and compromised church” which needs to discern “that whatever is right is actually good, that whatever is good is actually just, and that whatever is just is actually merciful.  The kindest and most compassionate message Christians can convey to homosexuals and their defenders is an unwavering Biblical message” (p. 5).

The authors provide both contemporary and historical illustrations, showing the pervasiveness of homosexuality, especially in non-Christian cultures.  With the resurgence of paganism in the Enlightenment, the West has increasingly tolerated it.  With refreshing candor, Camille Paglia, a highly secularized writer, asserts:  “Happy are those periods when marriage and religion are strong. . . .  Unfortunately, we live in a time when the chaos of sex has broken out into the open. . . .  Historiography’s most glaring error has been its assertion that Judeo-Christianity defeated paganism.  Paganism has survived in the thousand forms of sex, art, and now the modern media. . . .  A critical point has been reached.  With the rebirth of the gods in the massive idolatries of popular culture, with the eruption of sex and violence into every corner of the ubiquitous mass media, Judeo-Christianity is facing its most serious challenge since Europe’s confrontation with Islam in the Middle Ages.  The latent paganism of western culture has burst forth again in all its daemonic vitality” (p. 54).

Homosexuals have successfully infiltrated the media and schools, using their influence to dissolve opposition to their orientation and behavior.  School administrators, such as Joseph Fernandez in New York, seek to impose curricula containing books like Daddy’s Roommate and Heather Has Two Mommies–titles published by a company “that specializes in subversive homosexual works” (p. 79)–promoting the acceptance of homosexuality.  Though angry parents ousted Fernandez from his position as School Chancellor, the schools have for two decades increasingly urged tolerance–indeed often admiration–for homosexuals.

Churches, too, have eased or eliminated their opposition to homosexual acts.  Mainline denominations, especially, have divided over the issue.  Grant and Horne see this as a symptom of a more basic issue:  their integrity.  For one’s position on homosexuality cannot be severed from “the issue of biblical authority, the nature of church ministry, the scope of church discipline, and the church’s responsibility and relationship to the civil sphere” (p. 165).  Citing official declarations from United Methodists, Episcopalians, Presbyterians, et al., the authors demonstrate the degree to which the nation’s churches have gradually embraced the homosexual agenda.  Even self-professed evangelicals, such as Tony Campolo, Virginia Ramey Mollenkott, and Letha Scanzoni, open doors of acceptance to gay rights.

What’s needed, the authors argue, is a recovery of the true biblical and historically Christian position.  In the Early Church, believers separated themselves from the sexually perverse Greco-Roman culture.  This included homosexual practices–something staunchly condemned by every extant pre-Constantinian text.  In time, as Christians became numerically dominant, laws reflected their convictions.  Thus Emperor Theodosius, in 390 A.D., “declared sodomy a capital crime and various Christian realms continued to enforce that standard for almost two millennia” (p. 209).

144 Judicial Tyranny?

            When he argued the case for the ratification of the United States Constitution in Virginia, James Madison, the document’s most influential architect, warned:  “I believe there are more instances of the abridgment of the freedom of the people by gradual and silent encroachments of those in power than by violent and sudden usurpation.”  To protect the people’s freedom, the Constitution balanced powers in the federal government, safeguarding the rule of law from tyrannical usurpers.  Madison’s concern for the loss of freedom to “gradual and silent encroachments” was recently revived in an issue of Commentary (October 2003), wherein distinguished contributors addressed the question:  “Has the Supreme Court Gone Too Far?”  Their essays demonstrate the extent to which recent Supreme Court decisions regarding affirmative action and sodomy–simply the latest of a list of similar judicial edicts–have forced many thoughtful folks to ponder the fate of constitutional law in America.

            One of the contributors, Lino A. Graglia, a professor at the University of Texas Law School, argued that the Court has abandoned its assigned role–interpreting the Constitution–and now pursues “policy choices” designed to empower an elite, enlightened minority of like-minded liberals.  To Graglia, “Virtually every one of the Court’s rulings of unconstitutionality over the past fifty years–on abortion, capital punishment, criminal procedure, busing for school racial balance, prayer in the schools, government aid to religious schools, public display of religious symbols, pornography, libel, legislative reapportionment, term limits, discrimination on the basis of sex, illegitimacy, alien status, street demonstrations, the employment of Communist-party members in schools and defense plants, vagrancy control, flag burning, and so on–have reflected the views of this same elite.  In every case, the Court has invalidated the policy choice made in the ordinary political process, substituting a choice further to the political left.  Appointments to the Supreme Court and even to lower courts are now more contentious than appointments to an administrative agency or even to the Cabinet–matters of political life or death for the cultural elite–because maintaining a liberal activist judiciary is the only means of keeping policymaking out of the control of the American people.”

            Another contributor to the Commentary symposium, Judge Robert H. Bork, had earlier and more amply set forth his views in Coercing Virtue:  The Worldwide Rule of Judges (Washington:  The AEI Press, 2003).  As the book’s subtitle indicates, Bork believes that judicial activism now threatens peoples’ liberties everywhere, for activist judges “are enacting the agenda of the cultural left” (p. 2).  As tenured members of the intelligentsia (labeled the “New Class” by Bork), judges increasingly consider themselves entitled to impose their political and cultural worldview.  They illustrate what G.K. Chesterton noted as a universal phenomenon:  “In all extensive and highly civilized societies groups come into existence founded upon what is called sympathy, and shut out the real world more sharply than the gates of a monastery. . . .  The men of a clique live together because they have the same kind of soul, and their narrowness is the narrowness of spiritual coherence and contentment, like that which exists in hell” (Heretics, 5th ed., 1905, pp. 180-181).  C.S. Lewis similarly observed the persistent desire we all possess to enter the “inner ring” and thereby gain power over others.

            When the “inner ring” consists of irreligious intellectuals, utopian ideologies replace theological dogmas and guide their thinking.  As Max Weber noted, in The Sociology of Religion, intellectuals who reject religion easily embrace “the economic, eschatological faith of socialism.” Most 20th century secular utopians have embraced a socialist agenda and seek to attain it through political means.  “The socialist impulse remains the ruling passion of the New Class” (p. 6), though it now focuses on cultural issues such as sex and education rather than economics.  Modern “liberalism,” with its commitment to social change through political coercion, is thoroughly socialistic, Bork says.  And it is equally authoritarian, for the cultural elites, everywhere failing to persuade the masses to democratically embrace their values, now seek to impose them through the courts. 

            Consequently, “What judges have wrought is a coup d’etat–slow-moving and genteel, but a coup d’etat nonetheless” (p. 13).  They also lend support to a collage of special interest groups–environmentalism, feminism, multiculturalism–which share a socialistic commitment to reshaping the world.  Bork’s view was earlier espoused by the esteemed sociologist Robert Nisbet, who noted that “crusading and coercing” courts have preempted power so as to precipitate “the wholesale reconstruction of American society,” aiming to implement what Jean-Jacques Rousseau and Jeremy Bentham championed:  “sovereign forces of permanent revolution” (p. 10).  This revolution, embodied in activist judges, is both political and cultural and has significantly, if subtly, replaced “traditional moralities with cultural socialism” (p. 137).

            Of particular interest to Bork is the internationalization of this agenda.  He devotes two chapters to Canada and Israel, whose courts are on the cusp of judicial activism.  European and international courts, such as the International Criminal Court, established in 1998, have become aggressive in asserting the prerogatives of “international law”–generally understood as the edicts of elite tribunals.  “Crimes against humanity” were cited to justify legal moves against Chile’s Augusto Pinochet and Yugoslavia’s Slobodan Milosevic, but not against China’s Li Peng or Cuba’s Fidel Castro!  Wars to combat communism are labeled unjust, whereas wars that advance causes favored by elite jurists are justified as advancing “universal human rights.”  American Supreme Court justices have, alarmingly, begun to cite non-American courts in issuing decisions.  Thus Justice Stephen Breyer has cited court decisions in India, Jamaica, and Zimbabwe!  The U.S. Constitution may have little bearing on the Court’s decisions, but Zimbabwe’s jurists apparently do!

* * * * * * * * * * * * * * * * * * * *

            Reading Coercing Virtue prompted me to re-read Bork’s Slouching Towards Gomorrah:  Modern Liberalism and American Decline (New York:  ReganBooks, c. 1996), a work of cultural commentary rather than legal analysis.  Modern liberalism, as Bork defines it, espouses apparent antinomies:  radical egalitarianism and radical individualism.  It triumphed as the New Left of the ’60s, represented by Bill and Hillary Clinton, Tom Hayden and Jane Fonda, gained control of the nation’s institutions in the ’90s.  Teaching at Yale Law School, Bork saw a radical change in the class that entered in 1967.  Radicalized in their undergraduate years, they “were angry, intolerant, highly vocal, and case-hardened against logical argument” (p. 36).  Simultaneously angry and hedonistic, crusading for “social justice” while freely cohabiting, they espoused a nihilism that now pervades the nation.

            In time, the young radicals took their ideals with them and became “part of the chattering class, talkers interested in policy, politics, and culture.  They went into politics, print and electronic journalism, church bureaucracies, foundation staffs, Hollywood careers, public interest organizations, anywhere attitudes and opinions could be influenced” (p. 51).  They established a variety of special interest groups–environmental, feminist, abortion rights, ethnic, etc.  And they are leading us, Bork believes, down the slope to moral degradation, to Gomorrah!

* * * * * * * * * * * * * * * * * *

            Judge Bork also wrote an essay for a symposium provocatively titled “The End of Democracy?” and published in the journal First Things (November 1996).  At the heart of the controversy, says Richard John Neuhaus, the journal’s editor-in-chief, is the degree to which we still have a constitutional republic.  Neuhaus once attended a conference wherein a legal scholar concluded his presentation with the assertion that “we are no longer living under the Constitution of the United States of America.”  To which a Supreme Court justice in attendance responded, “Welcome to the second half of the twentieth century” (p. 244).  Though many were amused at the moment by the justice’s quip, the truth seems to be that we no longer live under the Constitution ratified in 1789.

            The essays elicited a flurry of controversy, with dozens of responses printed in various periodicals–and in subsequent issues of First Things.  All the relevant materials, plus a lengthy “Anatomy of a Controversy” by Richard John Neuhaus (including the anecdote regarding the Constitution cited in the prior paragraph), were collected into a single volume, edited by Mitchell S. Muncy, entitled The End of Democracy?  The Celebrated First Things Debate with Arguments Pro and Con (Dallas:  Spence Publishing Company, 1997).

            The journal’s editors, introducing the essays, wondered “whether we have reached or are reaching the point where conscientious citizens can no longer give moral assent to the existing regime” (p. 3).  The term “regime” ignited a storm of protest, but the editors used it by design to indicate the possibility that “we the people” no longer rule our own country.  “Democratic” means too routinely fail to attain the people’s ends, and policies they clearly oppose, such as unrestricted abortion rights, are routinely imposed through judicial fiat, as was especially evident in two Supreme Court decisions:  Romer v. Evans (1996) and Planned Parenthood v. Casey (1992).

            Such decisions prompted Justice Antonin Scalia to declare in dissent:  “Day by day, case by case, [the Court] is busy designing a Constitution for a country I do not recognize” (p. 10).  In Romer, the Court overturned the clearly expressed will of the people of Colorado, who had adopted, through a statewide referendum, a constitutional amendment specifically denying homosexuals the special protections and rights granted them by some municipalities.  Commenting on the case, Robert Bork notes that “Romer is a prime instance of ‘constitutional law’ made by sentiment having nothing to do with the Constitution.”  Rather, it established “the newly faddish approval of homosexual conduct among the elite classes from which the justices come and to which most of them respond” (p. 12).

            Russell Hittinger, a professor of law at the University of Tulsa, examines the amazing claim set forth by the Court in Casey:  that even if Roe v. Wade was legally questionable, it was legitimate since the American people had accepted it as law.  One of the dissenting justices, Byron White, called his colleagues’ decision in Roe an “exercise of raw judicial power,” and Casey locked that decision in concrete, making “abortion the benchmark of its own legitimacy, and indeed the token of the American political covenant” (p. 18).  The Court behaves as if the American people had established a new “regime” ruled by judicial edicts, not legislative enactments.  After examining crucial decisions, Hittinger asserts that the new regime is “a very bad regime” (p. 27) because it leaves the weakest among us–unborn children and the helplessly infirm–at the mercy of those who want them dead.  It excludes the people from political power, rightly exercised through legislative elections and deliberations.  And, sadly enough, “it has made what used to be its most loyal citizens–religious believers–enemies of the common good whenever their convictions touch upon public things” (p. 28).

              Hadley Arkes, a professor of law at Amherst College when the essays were published, carefully considers the implications of Romer v. Evans, the Supreme Court decision which nullified a constitutional amendment secured through a referendum whereby the people of Colorado sought to invalidate the preferential treatment homosexual activists had secured in certain localities.  This decision illustrates the propensity of judges to “advance the interests of gay rights and other parts of the liberal agenda” (p. 31).  Ultimately, Arkes insists, the gay activists want to redefine the family and legalize same-sex marriages.  This is evident in the words of Nan Hunter, a lesbian activist Bill Clinton appointed, in 1993, “deputy general counsel/legal counsel” in the Department of Health and Human Services, who sought “to dismantle the legal structure of gender in every marriage” (p. 35).  Such radical changes, of course, cannot be won through the democratic process, whenever the people are allowed to express and implement their convictions.  Only by enlisting an “enlightened” elite, only by pushing their agenda through the courts, can gay and lesbian activists attain their goals.

            In “Kingdoms in Conflict,” one of the more radical essays in the symposium, Charles Colson argued that we are now witnessing the culmination “of a long process I can only describe as the systematic usurpation of ultimate political power by the American judiciary–a usurpation that compels evangelical Christians and, indeed, all believers to ask sobering questions about the moral legitimacy of the current political order and our allegiance to it” (p. 41).  Supreme Court decisions, especially those securing abortion rights, cannot but prod devout citizens to ponder their allegiance to a regime responsible for the deaths of millions of unborn children.  Citing theologians such as Calvin and Aquinas, who endorsed Augustine’s aphorism that “an unjust law is no law at all,” Colson wonders how much more must take place before Christians begin to challenge and even disobey their masters.

            Sharing Colson’s discontent, Robert P. George, a professor of politics at Princeton University, suggested that we may very well be subjects of “The Tyrant State.”  Though America is still a democratic society, “even a democratic regime may compromise its legitimacy and forfeit its right to the allegiance of its citizens” (p. 54) when it endorses what John Paul II called “the culture of death.”  This has occurred, in the U.S., primarily through legalized abortion.  Sadly enough, in our democracy “our judges–whose special responsibility it is to preserve the core democratic principle of equality before the law–are the ones whose edicts have betrayed this principle” (p. 56).  Reflecting on what we should do, right now, given the significant freedoms we still enjoy, Professor George urges us to heed Pope John Paul II, “‘to have the courage to look the truth in the eye and to call things by their proper names, without yielding to convenient compromises or to the temptation of self-deception.’  Let us, therefore, speak plainly:  The courts, sometimes abetted by, and almost always acquiesced in by, federal and state executives and legislators, have imposed upon the nation immoral policies that pro-life Americans cannot, in conscience, accept” (p. 61).

            These five essays constitute the heart of The End of Democracy.  The rest of the book contains a variety of responses, ranging from letters to First Things to lengthy essays published in other periodicals.  Most of them are quite critical, and some members of the First Things editorial board (Peter Berger and Gertrude Himmelfarb) resigned lest they be implicated in the questioning of America’s “democracy.”  Others (James Dobson and Mary Ann Glendon) endorsed the endeavor.

            What becomes clear, in both the original essays and the responses to them, is the fact that abortion deeply divides this nation.  In an essay for National Review, William Kristol, a “neoconservative” Jewish writer, explained:  “the truth is that abortion is today the bloody crossroads of American politics.  It is where judicial liberation (from the Constitution), sexual liberation (from traditional mores), and women’s liberation (from natural distinctions) come together.  It is the focal point for liberalism’s simultaneous assault on self-government, morals, and nature.  So, challenging the judicially-imposed regime of abortion-on-demand is key to a conservative reformation in politics, in morals, and in beliefs” (p. 94).

            Kristol’s analysis is amplified by Hadley Arkes, one of the original essayists, in an explanation of their intent.  Rooted in the Declaration of Independence’s appeal to the natural law–the claim that by nature all men are entitled to certain rights, especially the right to life–he and the other contributors “spoke no treason, and they took care not to incite people to a course of lawlessness.  But . . . we come to the very edge when our government tells us that the killing of unborn children must be regarded as a private right; that we may have no proper concern about the terms on which killing is carried forth in our neighborhoods; and that the meaning of ‘homicide’ is no longer part of the business of people living together in a republic” (p. 169).

143 War Against Terrorism

One of the nation’s finest and most able scholars, Jean Bethke Elshtain, Professor of Social and Political Ethics at The University of Chicago, provides helpful perspectives on our nation’s role in the Middle East in Just War Against Terror:  The Burden of American Power in a Violent World (New York:  Basic Books, c. 2003).  Prodded to write after Muslim terrorists attacked America, Elshtain defends President Bush’s response, especially when seen in the light of Osama bin Laden’s 1998 declaration of war against America, entitled “Jihad Against Jews and Crusaders” (p. 3).  Unlike many naive folks in the West, bin Laden declares that the current conflict is at heart a religious struggle, a rekindling of a battle that has waxed and waned since Mohammed launched his conquests in 622 A.D.

Summing up bin Laden’s agenda, Yale historian Donald Kagan has written:  “he and other terrorists have made it clear that the U.S. is ‘the great Satan,’ the enemy of all they hold dear.  And what these terrorists hold dear includes the establishment of an extreme and reactionary Muslim fundamentalism in all currently Muslim lands at least, which is a considerable portion of the globe.  Such a regime would impose a totalitarian theocracy that would subjugate the mass of people, especially women. . . .  No change of American policy, no retreat from the world, no repentance for past deeds or increase of national modesty can change these things.  Only the destruction of America and its way of life will do, and Osama bin Laden makes no bones about this” (p. 85).

As a careful philosopher, Elshtain draws important distinctions between the terrorism of bin Laden and the just war tradition that has developed in Christian theology.  Islamic Jihad bears many of the marks of terror!  There is a striking similarity between the “reign of terror” orchestrated by the Jacobins in France in the 1790s and the policies of Moslem jihadists.  Elshtain grasped this when she attended a conference in Jerusalem in 1993 and heard a distinguished scholar, Bassam Tibi, explain:  “[The] Western distinction between just and unjust wars linked to specific grounds for war is unknown in Islam.  Any war against unbelievers, whatever its immediate ground, is morally justified.  Only in this sense can one distinguish just and unjust wars in Islamic tradition.  When Muslims wage war for the dissemination of Islam, it is a just war. . . .  When non-Muslims attack Muslims, it is an unjust war.  The usual Western interpretation of jihad as a ‘just war’ in the Western sense is, therefore, a misreading of this Islamic concept (emphasis mine)” (p. 131).  Islamic jihad is simply a religious version of might-makes-right aggression.  Wars of conquest, waged to expand and install Islam, are just wars; terror tactics, so long as they advance the cause of Islam, are defensible.  The Muslim world is ever at war with the non-Muslim world.

The just war tradition, conversely, has ever sought to distinguish between moral and immoral conduct.  Given human sinfulness, there is a need for government, whose primary responsibility is to protect people.  Thus laws, judges, police, and soldiers are necessary to maintain order and punish evil-doers.  Early Christians, especially Augustine, accepted this and provided guidelines for Christians to follow in supporting the political order.  Elshtain rightly dismisses the historical errors of those who argue that the Early Church was pacifist, indicating that the primary advocates of this position were theologians like Origen and Tertullian “who fell outside the Christian mainstream” (p. 51).  (In fact, soldiers were admired sufficiently to justify branding model Christians milites Christi, soldiers for Christ!)  “Jesus preached no doctrine of universal benevolence.  He showed anger and issued condemnations.  These dimensions of Christ’s life and words tend to be overlooked nowadays as Christians concentrate on God’s love rather than God’s justice.  That love is sometimes reduced to a diffuse benignity that is then enjoined on believers.  This kind of faith descends into sentimentalism fast” (p. 100).

Following the admirably non-sentimental Augustine, Christian thinkers formulated the just war position, convinced that “To save the lives of others, it may be necessary to imperil and even take the lives of their tormenters” (p. 57).  Careful criteria were enumerated over the centuries and attained something of a consensus in virtually all branches of Christendom.  The late Reinhold Niebuhr, arguably the most influential American ethicist of the 20th century, was “hardheaded [in his] insistence that Christianity is not solely a religion of love” (p. 109), showing that love and justice ever work together in solid Christian ethics.

Thus, Elshtain wonders:  “is the war against terrorism just?”  After examining the evidence, she responds:  yes, it is!  There are, of course, thousands of Elshtain’s colleagues in the academy and pundits in the press who say no!  They resemble, she thinks, the “humanists” portrayed in Albert Camus’s novel The Plague.  Such folks refuse to judge things in terms of black and white–all is a fuzzy mixture of gray.  There are many sides to every question, and every conclusion must be tentative.  Their talk, their terms, their preferences–but not discomfiting realities–define their world.  Consequently, they “are unwilling or unable to peer into the heart of darkness.  They have banished the word evil from their vocabularies.  Therefore, it cannot really exist.  Confronted by people who mean to kill them and to destroy their society, these well-meaning persons deny the enormity of what is going on” (pp. 1-2).  So, when Ronald Reagan called the U.S.S.R. an “evil empire,” a chorus of criticism was unleashed against him.  When George Bush labeled Iraq, Iran, and North Korea “an axis of evil,” the same singers united in denouncing him.

In fact, the mark of the modern academic is negativity!  Criticizing, finding fault, and disagreeing with traditional views earn one a seat in the faculty lounge.  As one experienced professor noted, “You don’t get tenure by praising American policy” (p. 88).  In most American universities, professors retain their anti-Vietnam War stance, ever questioning the legitimacy of the military and condemning America’s power in the world.  They seem trapped in a strange time-warp, unable to see the world apart from their youthful anger at the Vietnam engagement.  Thus you find professors alleging that America’s foreign policy is “fascist” and historian Mary Beard actually insisting that “the United States had it coming” when the terrorists attacked the nation on 9/11 (p. 93).

Something of the same marks mainline churches.  In a chapter entitled “The Pulpit Responds to Terror,” Elshtain demonstrates the degree to which American clergy share the leftism of the academy, for “a position best described as ‘pseudo’ or ‘crypto’ pacifism now dominates, certainly from our mainline pulpits” (p. 112).  Such preachers state that the 9/11 devastation was less a murderous attack upon innocent people than a “wake-up call” for us Americans, who need to examine ourselves and change our ways, to eliminate the “root causes” of terrorist anger.  A prominent evangelical, Tony Campolo (never one to allow historical ignorance to temper his rhetoric), as well as former President Bill Clinton, harked back to the Crusades, suggesting that Muslim terrorists were simply avenging the evils their ancestors suffered at the hands of Christian Crusaders.  Campolo and Clinton, of course, often seemed mute when confronted with the far greater numbers of Christians who have been slaughtered and enslaved as a result of Islamic Jihad!

Significantly, Albert Camus, in a 1948 statement, insisted that the “Christian has many obligations, and that the world today needs Christians who remain Christians.”  Camus professed that he did not share the Christian hope.  But he did share “the same revulsion from evil” (p. 123).  Elshtain clearly prefers the atheist Camus to the sentimental Christians, arguing that when confronting terrorism we must have the courage both to denounce it and to actively oppose it.

* * * * * * * * * * * * * * * * * * * *

Immediately following September 11, 2001, one of the nation’s premier military historians, Victor Davis Hanson, wrote a series of articles that provided Americans with historical and analytical insights with which to put things in perspective.  Those essays have been collected and published as An Autumn of War:  What America Learned from September 11 and the War on Terrorism (New York:  Anchor Books, c. 2002).  “At the very outset,” he says, “I was convinced that September 11 was a landmark event in American history, if not the most calamitous day in our nation’s 225 years” (p. xiii).  In response, “we must be retold that we war to remember the dead, to save the innocent, and to end the violence” (p. 12).

We must understand the Muslim threat, which Hanson thinks is primarily a reaction against the West’s economic and military success in the Middle East.  Islamists hate Israel primarily because “Israelis have defeated Muslims on the battlefield repeatedly, decisively, at will, and without modesty” (p. 195).  Even more, the very existence of Israel illustrates “that it is a nation’s culture–not its geography or size or magnitude of its oil reserves–that determines its wealth or freedom” (p. 195).  Similarly, the United States, both as Israel’s ally and as the world’s foremost example of a successful modern society, incurs Muslim wrath.  “The Taliban, the mullahs of Iran, and other assorted fundamentalists despise the United States for its culture and envy it for its power” (p. 15).  We should have heeded early alarms, such as the PBS documentary American Jihad and intelligence reports that showed the rapid growth of terrorists rather openly operating in the United States during the 1990s.  Taking advantage of the freedoms–and frequently the generosity–of their host country, they malignantly awaited opportunities to destroy her.

Importantly, Hanson says, weakness–even the voluntary weakness of pacifism–never copes with the systemic hatred that fuels radical Muslims.  Long ago the Greeks decided that war, though often horrendous, was at times necessary to destroy evil powers and preserve civilization.  Today, if we understand the world, we must understand what we really are fighting for–”preserving Western civilization and its uniquely tolerant and human traditions of freedom, consensual government, disinterested inquiry, and religious and political tolerance” (p. 73).  Only military power can do this.  “It is an iron law of war that overwhelming military superiority, coupled with promises to the defeated of resurrection, defeats terrorists–in the past, now, always–whether they be zealots, dervishes, or Ghost dancers” (p. 155). 

But he wonders if we have the will to empower the military to do the same today.   Americans responded to Pearl Harbor, in 1941, with an anger and resolve that enabled them to support military action, despite horrendous battles, such as that at Okinawa, where kamikaze attacks destroyed 34 American ships and 12,000 servicemen died.  Folks at home wept, but they never marched in the streets demanding an end to the war.  Following the 9/11 attacks on New York and Washington, however, many Americans seemed more fearful of offending Muslims’ feelings than responding with strength to their assault.

As was evident in Vietnam, “One of the first casualties of war is language” (p. 75).  This is often true of those who wage it, but it also distinguishes many of those who oppose it.  Hanson pointedly condemns those American university professors who posture and pontificate while undercutting their nation’s morale.  Having taught at Fresno State University for two decades, he anticipated that his colleagues would protest any military response to terrorism, such as the attack upon Afghanistan.  The ordinary working class people he knows (in addition to teaching, Hanson works his family farm) supported President Bush’s response.  But “nearly all of the opposition to our conduct in this war was expressed by professors and those in law, the media, government, and entertainment, who as a general rule lead lives rather different from those of most Americans” (p. xvii).  This elite tenth of the population, dominating “the media, the university, politics, foundations, churches, and the arts–is adamantly and vocally at odds with most Americans” (p. 92).  Anti-war protesters, marching in the streets and urged on by folks in classrooms and churches, imagine “peace” comes through appeasement.  The courage of Churchill, the toughness of an earlier America, seem absent in too many sectors.

Though some anti-war rhetoric appeals to an authentic Christian conscience, Hanson argues that it is secular, humanistic pacifists, including the “deviant offspring of the Enlightenment–Marxists and Freudians–[that] gave birth to even more pernicious social sciences that sought to ‘prove’ to us that war was always evil and therefore–with help from Ph.D.s–surely preventable” (p. 66).  Consequently:  “Pacifists shamed us into thinking that all wars were bad, relativism convinced us that we are no different from our enemies, conflict resolution and peace studies hectored us that there was no such thing as a moral armed struggle of good against evil” (p. 98), and the nation’s elite chattered about a principled policy of appeasement.

Hanson reminds us of the intelligentsia’s penchant to distort the truth to serve its own ends.  Remember Vietnam!  American soldiers fought well, but the war was lost when the people lost the will to support it.  Yet the truth was rarely told.  “At the so-called bloodbath at Hue, the U.S. Marines lost 147, killed over 5,000 of the enemy, and freed the city in the worst street-fighting since the Korean War.  The siege of Khe Sanh was an enemy failure and resulted in 50 communist dead for each American lost.  In the horrific Tet offensive, a surprised American military inflicted 40,000 fatalities upon the attackers while losing fewer than 2,000” (p. 20).  Watching Walter Cronkite on CBS–or listening to Peter Arnett’s fabrications on CNN–in those years, however, one would never have guessed that Americans were prevailing in the struggle for Vietnam.

* * * * * * * * * * * * * * *

Much that Victor Davis Hanson says in his essays grows out of his study of military history.  Carnage and Culture:  Landmark Battles in the Rise of Western Power (New York:  Doubleday, c. 2001) illustrates the scholarship and analytical skill he brings to the discussion.  Importantly, as he admits, his “interests are in the military power, not the morality of the West” (p. xv), and there is little of the Christian concern for “just war” principles in the book.  He mainly argues that the citizen soldiers of the West, for a variety of sound reasons, have proved militarily superior to tribal warriors (such as Shaka Zulu’s), mercenaries (such as Hannibal’s corps), and despots’ conscripts (from Darius’ Persians to Japanese aviators).  “Warriors are not necessarily soldiers” (p. 446).

The famed Greek physician Hippocrates’ perceptive observation wears well:   “Now where men are not their own masters and independent, but are ruled by despots, they are not really militarily capable, but only appear to be warlike. . . .  For men’s souls are enslaved and they refuse to run risks readily and recklessly to increase the power of somebody else.  But independent people, taking risks on their own behalf and not on behalf of others, are willing and eager to go into danger, for they themselves enjoy the prize of victory.  So institutions contribute a great deal to military valor” (Airs, Waters, Places {16, 2}).

Hanson begins his book with an examination of the Greeks’ victory over the Persians at Salamis in 480 B.C.  The numerical odds certainly favored Xerxes’ forces, but one thing enabled the Greeks to prevail:  personal freedom, eleutheria, something unknown elsewhere in antiquity.  “No Greek citizen could be arbitrarily executed without a trial.  His property was not liable to confiscation except by vote of a council,” and to the Greek “the ability to hold property freely . . . was the foundation of freedom” (p. 36).  At Salamis, the Greeks were not merely resisting a despot’s designs; they were struggling to secure their most absolute moral values.  Free people, as Herodotus emphasized, “are better warriors, since they fight for themselves, their families and property, not for kings, aristocrats, or priests.  They accept a greater degree of discipline than either coerced or hired soldiers” (p. 47).  A century later Aristotle noted: “Infantrymen of the polis think it is a disgraceful thing to run away, and they choose death over safety through flight.  On the other hand, hired soldiers, who rely from the outset on superior strength, flee as soon as they find out they are outnumbered, fearing death more than dishonor” (Nicomachean Ethics {3.1116b16-23}). 

Thus began the West’s military tradition fully evident in republican Rome’s legions.  “The Roman republican army was not merely a machine.  Its real strength lay in the natural elan of the tough yeoman infantry of Italy, the hard-nosed rustics who voted in the local assemblies of the towns and demes of Italy and were every bit as ferocious as the more threatening-looking and larger Europeans to the north.  In the tradition of constitutional governance . . . the Romans had marshaled a nation of free citizens-in-arms” (p. 118).  In fact, the Romans transformed the Greeks’ allegiance to the polis and developed “the concept of nation:  Romanness” that extended beyond the barriers of race and geography.  In time, Roman citizenship could be attained and enjoyed throughout the world by those who were willing to assent to and abide by its provisions.  As Hanson catalogues subsequent world-shaping battles at Poitiers, Tenochtitlan, Lepanto, and Midway, it becomes clear that a free people–and only the West granted such freedom–almost always prevails against totalitarian regimes.  Economic freedom, for example, provides the wealth that ultimately translates into superior weapons. 

The final chapter deals with the Vietnam War, with the Tet offensive as the focus.  Hanson demonstrates that the American military fought well and could have won the war.  But the media systematically distorted the truth concerning the war, exaggerating American atrocities and refusing to report far greater Viet Cong atrocities, manipulating the public to support an essentially socialistic agenda.  So too dissenters at home–Tom Hayden and Jane Fonda, David Halberstam and Noam Chomsky–helped jaundice the nation concerning the war’s conduct and prospects.  These folks, of course, turned their backs on the millions who were liquidated by Communists when the U.S. withdrew from Southeast Asia in the 1970s. 

Broad in its scope, Carnage and Culture provides a thoughtful and readable overview of military history, certainly a significant segment of man’s past. 

142 P.C. Tyranny

 

P.C. TYRANNY  

 

                Smith College, an elite women’s school, recently decided to eliminate the pronouns “she” and “her” from its student constitution.  This was done, a college representative explained, because “a growing number of students identify themselves as transgender, and say they feel uncomfortable with female pronouns.”  Lest anyone feel “uncomfortable” on campus, free speech is limited.  For more than a decade, Smith College administrators have been trying to disabuse students of various prejudices.  Thus any language smacking of oppressive attitudes must be banned.  Categories spelled out for students include: ableism; ageism; classism; heterosexism; lookism; racism; and sexism.  Even positive comments concerning a person’s appearance, it seems, make one guilty of “lookism.”  Ludicrous as it may seem, such speech codes have become normative on thousands of American campuses. 

                Political correctness has slowly constricted Americans’ ability to speak freely, illustrating Justice Louis D. Brandeis’ warning that “The greatest dangers to liberty lurk in insidious encroachment by men of zeal, well-meaning but without understanding” (p. 4).  The “Newspeak” of George Orwell’s 1984 has subtly extended its tentacles throughout America’s culture.  Nowhere is this more evident than in the nation’s schools, as Diane Ravitch, a professor of education at New York University, demonstrates in The Language Police (New York: Alfred A. Knopf, 2003).  Ravitch has carefully sifted through various state examinations, board of education policies, and textbook publishers’ guidelines (which often proved difficult to obtain) prescribed by the language police. 

                Some pressure comes from social conservatives.  Fearing to offend them (as in the case of Darwinian evolution or Islam), educators increasingly avoid dealing with subjects that may incite their protests.  The religious right clearly wants to censor certain educational materials, but its limited success is evident in the established position still enjoyed by evolution in science texts and curricula.  The secular left, on the other hand, has quite effectively imposed its agenda.  Revering the New Left’s idols of race, class, and gender (standard mantras of current neo-Marxist philosophy), school boards and textbook publishers are carefully imposing a hardened ideology upon the nation’s students. 

                Flying the flag of “multiculturalism,” educators carefully crusade against prejudice and discrimination, even if it means deleting great works of literature and glossing over historical truths.  Mark Twain’s Huckleberry Finn, for example, is routinely attacked as a “racist” book and left unread in the schools.  History must be rewritten so as to favorably portray formerly slighted or disparaged groups.  To avoid hints of ethnocentrism, no culture can be called “primitive.”  “Even those that had no literacy and only meager technology are described as advanced, sophisticated, complex, and highly developed” (p. 141).  Democratic, constitutional political systems are no better than dictatorial, nepotistic regimes.  One would never suspect, reading today’s textbooks, that Mao and Castro were brutal, genocidal killers, since they are generally accorded a sympathetic treatment. 

                Accordingly, a widely-used history text, To See a World, “lauds every world culture as advanced, complex, and rich with artistic achievement, except for the United States” (p. 142).  Such texts “condemn slavery in the Western world but present slavery in Africa and the Middle East as benign, even as a means of social mobility, by which slaves become family members, respected members of the community, and perhaps achieved prosperity and high office.  The Aztec ritual of human sacrifice is glossed over as something that their religion required to ensure that the sun would rise the next day, a minor detail in what was otherwise a sophisticated and complex culture that valued education and learning” (p. 143). 

                The same slant appears whenever “class” and “gender” are considered.  Today’s language police insist that the “poor” be defended and the “rich” despised.  Mathematics is now taught so as to emphasize economic inequalities!  Consider one exam question:  “Jose’s mother is a prizefighter, and his father is a receptionist in a hair salon.  If his mother makes $40,000 in a fight, and his father earns minimum wage, how many years will it take for Jose’s people to throw off the yoke of colonial oppression?”  Marx’s “proletariat” (the working class) now appears under the rubric of the “marginalized” and “exploited” of the world.  To advance their commitment to an egalitarian society, textbooks also portray a utopian world in which “class distinctions did not exist, not now and not in the past, either” (p. 13).  Consequently, Democrats are generally given positive treatment whereas Republicans receive condemnation for their support of the “rich.”

                Even more pervasively–and reflecting the powerful presence of feminists in educational circles–there is an effort to abolish “gender” distinctions.  Women must never be presented as homemakers or as even minimally domestic or emotionally tender-hearted.  Men must never be portrayed as brave or strong or working with tools–though it’s fine to show them as weak and emotional.  Men must never be shown to be bigger, or stronger, than women.  Female plumbers are acceptable–males, never!  Female, but never male, attorneys grace the pages of today’s texts.  Daddy may stay at home with the kids, but Mommy always goes to work in a plush office.  In general, the historical role of women is exaggerated and their “rights” and eminence in non-Western cultures falsely portrayed.  Language, especially, must be rigorously controlled in this area.  “Gender bias is implied by any use of the term man, as in ‘mankind’ or ‘man in the street’ or ‘salesman’” (p. 25).  In a 30-page appendix, “a glossary of banned words, usages, stereotypes and topics,” Ravitch documents, simply by listing words proscribed by the language police, how far this censorship extends.  Banned words include: actress; average man; boyish figure; brotherhood; busboy; cameraman; career woman; cattleman; caveman; chairman; clergyman; cowboy; cowgirl; craftsmanship. 

                Responding to all this, Ravitch concludes: “The question before us, the battle really, is whether we have the will to fight against censorship.  I, for one, want to be free to refer to ‘the brotherhood of man’ without being corrected by the language police.  I want to decide for myself whether I should be called a chairman, a chairwoman, or a chairperson (I am not a chair).  I want to see My Fair Lady and laugh when Professor Higgins sings, ‘Why can’t a woman be more like a man?’  As a writer, I want to know that I am free to use the words and images of my choosing” (p. 169).  She endorses an ancient American commitment, expressed by John Adams in 1765 when he wrote:  “‘Let us dare to think, speak, and write. . . .  Let every sluice of knowledge be opened and set a-flowing.’  Even in our schools” (p. 170). 

                The evidence set forth in The Language Police should concern us all.  Whether in the schools, press, or churches, there are folks determined to sanitize our speech, even when truth is compromised.  To speak or write sensitively, tactfully, does not require politically correct shackles.  A free people must be free to think and speak without fear.  That freedom is currently eroding, and it will take a struggle to regain it.

* * * * * * * * * * * * * * * *

                Impressed with Ravitch’s scholarship, I secured a copy of her Left Back: A Century of Battles Over School Reform (New York: Touchstone, c. 2000) and found it to be a fine historical account of the past century, one whose developments help explain the current concern for political correctness.  In brief: school reformers–“progressives” personified by John Dewey–sought to dislodge traditional academics (developing proficiency in subjects such as Latin and mathematics) and establish societal change as the main aim of education.  Though stoutly contested until mid-century, progressivism was finally implanted in the nation’s schools by the cultural revolution of the ’60s.

                Dewey–and his less famous colleagues at Columbia University’s Teachers College–saw the public schools as a tool with which to collectivize society.  Rather than liberating individual students’ minds–the traditionalists’ notion of education–these progressives wanted to make society more egalitarian and socialistic.  Teachers were to be social workers rather than scholars.  Indeed, to Dewey, the teacher is “‘the prophet of the true God and the usherer in of the true kingdom of God’” (p. 459).  Students were to have “fun” doing various things, engaging in group projects and discussions instead of working hard to master difficult subjects.  To William Kilpatrick, a highly-regarded colleague of Dewey, “dancing, dramatics, and doll playing” were preferable to classical languages and mathematics (p. 181).

                Both Kilpatrick and Dewey admired the U.S.S.R. in the 1930s and wanted to reconstruct the U.S. in accord with it.  Dewey admired “the Soviet’s efforts to dismantle the traditional family, which Marxists considered ‘exclusive and isolating in effect and hence as hostile to a truly communal life’” (p. 206), and replacing the family with the school has always been a mainstay in the progressives’ agenda.  Dewey especially praised the Soviets’ use of the school to promote social change, shaping children in accord with socialist ideology, what he termed “‘a unified religious social faith’” (p. 208).  Dewey and Kilpatrick particularly admired the “project method” used in Soviet classrooms–students working together to solve problems rather than listening to teachers lecture.  “We teach children, not subject matter” was their mantra.

                Another Columbia professor, George Counts, also visited the USSR and lavished praise on both the nation and its dictator, Stalin.  During the 1930s, he “became the most forceful advocate for radical ideas in American education” (p. 211).  That Stalin relied on censorship and propaganda hardly disturbed the professor, for his cause was noble and the world was being transformed.  Doing his own propagandizing back home, Counts addressed the National Education Association in 1932, calling for “elimination of capitalism, property rights, private profits, and competition, and establishment of collective ownership of natural resources, capital, and the means of production and distribution” (p. 217).  He also worked within the American Historical Association, helping draft a 1934 report declaring that the era of individualism and laissez-faire economics was ending, to be replaced by a “‘new age of collectivism’” (p. 228).  Broadus Mitchell, an economist at Johns Hopkins, urged teachers, “‘above all others, to become propagandists’ against the economic system and to stir discontent ‘into the mind of the millions’” (p. 230).

                Ironically, at the very time Americans were praising the USSR’s educational system, it was being discarded by the Soviets!  In the mid-30s, Russian schools reverted to a very traditional curriculum!  Subsequently, especially following Stalin’s bloody purges, some scholars (including George Counts) changed their minds and began to support the formerly-despised liberal tradition in education.  Counts even turned against state-controlled education and its potential for mind-control.  Dissenters and critics of progressive education, notably Robert Maynard Hutchins, made the case for traditional, academic studies, and large numbers of ordinary Americans supported them.  Consequently, students in the ’40s and ’50s (such as myself) continued to take Latin and math–though we’d been subjected to progressivism’s “see and say” reading techniques rather than phonics in grammar school.

                The cultural revolution of the ’60s, however, revived progressivist ideology, and “the zeitgeist in American education swung wildly toward the liberationist, pseudorevolutionary consciousness that was roiling the rest of the culture” (p. 384).  Radical books, such as Summerhill, by A.S. Neill, and Teaching as a Subversive Activity, by Neil Postman and Charles Weingartner, proved highly influential.  (I confess to using the Postman book in my Philosophy of Education classes for several years in the ’70s!)  Carl Rogers’ psychological views, calling for “personal growth” through “encounter groups” and “sensitivity training,” powerfully impacted teachers and preachers alike. 

                Consequently, Ravitch shows, SAT scores steadily declined.  Foreign language enrollments collapsed.  Mathematics and science classes lost allure.  Students took fewer classes, studied less, learned less.  They did, however, enjoy “values clarification” classes that allowed them to construct (in small discussion groups) their own ethics.  Indeed, constructivism became something of a religious dogma for educators–students do not discover, but rather design for themselves, what they take to be true.  Flattered by their “facilitators” in the classroom, students excelled in self-esteem but little else.  In sum: “the hedonistic, individualistic, anarchic spirit of the sixties was good for neither the educational mission of the schools nor the intellect, health, and well-being of young people” (p. 407).  And neither was the progressive education it implemented!

                Ravitch writes well, making the story she tells both compelling and alarming.  To understand why our schools are as they are, Left Back provides answers.

* * * * * * * * * * * * * * *

                The developments Ravitch describes in the public schools have also occurred in the nation’s universities, as Alan Charles Kors and Harvey A. Silverglate demonstrate in The Shadow University: The Betrayal of Liberty on America’s Campuses (New York: The Free Press, c. 1998).  Kors is a history professor at the University of Pennsylvania; Silverglate is a criminal defense attorney who has taught at Harvard Law School and is an active member of the ACLU.  They wrote the book because they wanted to alert the nation to an imminent peril:  “Universities have become the enemy of a free society, and it is time for the citizens of that society to recognize this scandal of enormous proportions and to hold these institutions to account” (p. 3).  Since, “morally and practically, none of us enjoys more freedom of speech than is accorded the least popular speaker” (p. 101), what takes place on campuses should concern everyone.  Commending the book, Wendy Kaminer wrote: “unlike most critics of political correctness,” the authors “have no political agenda of their own to advance, except the preservation of liberty.  They take seriously the obligation to defend the rights of all individuals, adversaries as well as friends.  The Shadow University is a scrupulously fair, painstakingly documented account of repression on America’s campuses, where students and faculty members are regularly denied fundamental rights of speech, conscience, and due process.  I never knew it was quite this bad” (book jacket endorsement). 

                For Professor Kors this threat became very real at his own university when a Jewish student, Eden Jacobowitz, was disciplined for yelling “Shut up, you water buffalo!” at a noisy group of black women disturbing the peace of his dormitory.  The women claimed they’d been subjected to “racial harassment,” and the university’s disciplinary machinery swung quickly into action.  Jacobowitz was accused of violating Penn’s speech code and faced expulsion.  Thanks to the intervention of Kors and Silverglate, as well as national media attention, and after a drawn-out series of hearings, the charges were dropped.  But the case illustrates the lengths to which university administrators will go in seeking to suppress free speech, and the deviousness of their techniques.

                The authors carefully examine the constitutional meaning of free speech and the university tradition of academic freedom, noble principles basic to America’s free society.  (Though at times a bit technical, the book boasts sterling scholarship, citing court cases and telling examples.  The authors have examined hundreds of university speech codes, and anyone wanting to truly understand the implications of these issues will benefit from their analyses.)  Such freedoms have always had their foes, but today’s threat comes mainly from the “political and cultural left” (p. 67).  Emory University’s faculty senate, for example, rejected a resolution “specifying that ‘all judgments under this policy related to freedom of expression would be consistent with First Amendment standards’” (p. 160).  PC preempts freedom!

                Though university professors contribute to the fervor for political correctness, the real assault on individual liberty, Kors and Silverglate say, comes from the “shadow university” that has boomed under the aegis of “student services.”  “Increasingly, offices of student life, residence offices, and residence advisors have become agencies of progressive social work whose mission is to bring students to mandatory political enlightenment” (p. 211).  Here we find compulsory orientation sessions, designed to browbeat students into accepting feminist rhetoric and homosexual activity.  Wendy Shalit, for example, was forced to attend a “Feel-What-It-Is-Like-To-Be-Gay” sensitivity session at Williams College (p. 226).  Dormitories are policed to make sure nothing offends racial or sexual sensitivities.  

                Identifying the source of such views, the authors write: “The contemporary movement that seeks to restrict liberty on campus arose specifically in the provocative work of the late Marxist political and social philosopher Herbert Marcuse . . . who gained a following in the New Left student movement of the ’60s” (p. 68).  Though he claimed to believe in “freedom,” he redefined its meaning in accord with the thought of Rousseau, Marx, and Gramsci, something quite different from Jefferson and Madison.  And his “prescriptions are the model for the assaults on free speech in today’s academic world” (p. 71).  Marcuse’s freedom was highly selective and admittedly “repressive”!  Some should enjoy it, others should not.  Radicals should be free to say literally anything, but conservatives should be gagged.  “The use of the epithet ‘nigger’ by a white toward a black would be outlawed as racist, whereas Malcolm X’s famous characterization of Caucasians as the ‘white devil’ would not” (p. 75).  Spike Lee’s rants must be allowed, but not Mark Twain’s novels.  Education should be propaganda aimed at social leveling; teachers should be revolutionaries intent on social change.  “Thus, for example, history would be taught so that the student understands ‘the frightening extent to which history was made and recorded by and for the victors, that is, the extent to which history was the development of oppression’” (p. 71).  To see Marcuse’s shadow in the workings of today’s “language police” requires no great imagination!  “The struggle for liberty on American campuses is, in its essence, the struggle between Herbert Marcuse and John Stuart Mill” (p. 110). 

# # #

141 “The New Faithful”

Just when the advocates of “contemporary worship” and “user-friendly” churches have succeeded in revamping large sectors of the Christian world, young people seem to be rejecting it.  There’s a growing hunger, it seems, for a more traditional, more ancient, more orthodox version of the Faith–a hunger for the spiritual disciplines of prayer, Bible study, and the sacraments–rather than self-esteem psychobabble and entertainment.  Such is evident in The New Faithful: Why Young Adults are Embracing Christian Orthodoxy (Chicago: Loyola Press, c. 2002), by Colleen Carroll.  An award-winning and widely-published journalist, Carroll (a Roman Catholic) became fascinated with this phenomenon and took a year to research and write her account, traveling extensively and interviewing some 500 appropriate spokesmen.  “If you are making plans for your church in the next decade, you can’t afford to leave this book unread,” says Benedict Groeschel (book cover), and I suspect he’s right.  The “contemporary” worship and “relevant” preaching that have frequently alienated “senior citizens” now seem spurious to the coming generation as well.  Marrying the spirit of the age, it’s often said, leaves one a widow in the next!

As a journalist, Carroll provides illustrations to document her thesis, summed up by Boston College philosopher Peter Kreeft:  “It’s a massive turning of the tide,” a fundamental rejection of “the old tired, liberal, modern” version of Christianity (p. 3).  Young folks aren’t drawn to the watered-down Catholicism set forth by the ’60s generation.  A convention of liberal Catholics in 2000 featured “gray-haired radicals, priests wielding canes, and nuns dressed as defiantly as septuagenarians can.  But young adults were scarce” (p. 281).  Liberalism, in both its theological and political forms, has lost its luster.  Younger priests are more conservative than their baby-boomer elders, supporting priestly celibacy and opposing women’s ordination.  Traditional seminaries bulge with candidates, while their liberal counterparts lament empty corridors.  There is a resurgence of interest in Thomas Aquinas, who “argued that laws are just when they are based on the way God designed the universe.  So certain moral actions are right or wrong in their very nature, depending on their conformity to God’s law.  And human beings instinctively know it” (p. 171).

The relativism that shaped their parents’ culture seems less alluring to younger folks hungry for some sound moral standards.  A former Wall Street financier, John McCloskey, became an Opus Dei priest and now works with Ivy League students.  He says “that when students are presented with ideas and teachings that sharply contrast with campus culture–church teachings against abortion and contraception, for instance, and orthodoxy’s insistence on absolute standards of right and wrong–they often respond with surprise and interest” (p. 21).  McCloskey notes that “College campuses are the refuge of the sixties liberals,” who now discover that “they are the old fogies” (p. 182) upholding increasingly antiquated ideologies.  “‘People are getting sick of trite little phrases.  “God is love” and “God loves you”–what does that mean?’” asks a young Notre Dame student, planning to enter the priesthood.  Demanding, ascetic orders, such as Mother Teresa’s Missionaries of Charity, the Friars of the Renewal, and the Legionaries of Christ, are far more appealing to younger Catholics than temporizing organizations like the Jesuits.  “Today, it is increasingly those hard-core, demanding religious orders and seminaries that are experiencing a surge in religious vocations” (p. 98).

In Evangelical circles the same trend appears in the growth of “Campus Crusade for Christ, a conservative evangelical group that stresses strict moral standards and salvation by Jesus Christ” which grew, in five years, from 21,000 to 40,000 members (p. 8).  “About a thousand graduate students belonged to the e-mail list of Harvard’s InterVarsity Christian Fellowship in 2000–twice the number that were signed up four years earlier” (p. 161).  Evangelical colleges are booming and attract some of the nation’s finest young thinkers.  Churches upholding the inerrancy of Scripture, traditional devotions, and rigorous morality enroll the children of liberal Protestants.  Such young people often lament the fact that they heard little about sin and salvation (the “hard gospel”) in their childhood, while platitudes espousing tolerance, social reform and leftist politics abounded.  Consequently, as a 2001 Hartford Institute for Religion Research study demonstrates, there is “a strong correlation between the vitality of a congregation and its commitment to high moral standards.”  According to the survey, “Two out of three congregations that emphasize personal and public morality also report healthy finances and membership growth” (p. 69).

Part of the “hard gospel,” of course, is sexual chastity.  Remarkably, growing numbers of the “young faithful” favor high sexual standards, and there is a significant surge of support for sexual abstinence before marriage.  “‘The new sexual revolution is not being led by adults, but by young people,’ roared Mary-Louise Kurey, Miss Wisconsin 1999, top-ten finalist for Miss America, and author of a book about abstinence.  ‘We are seeing a complete turnaround in young attitudes toward sex and relationships’” (p. 121).  (Interestingly enough, the reigning Miss America also emphasizes chastity and had to do battle with the pageant’s hierarchy to make it the standard theme of her public addresses!)  This is truly significant, for, as Carroll makes clear, “The connection between faith and sex is a powerful one.  Pastors often say that transgressions of Christian sexual morality lead young believers away from the faith faster than any other moral lapses.  Their explanation: sexual intercourse is an intimate, potent experience, and the desire for sexual activity often clouds moral judgment” (p. 140).

This is a readable, well-organized book, meriting study by anyone concerned about the future of the Church.  Chuck Colson’s endorsement is telling: “Colleen Carroll does more than simply chronicle the embrace of Christianity by young adults, as important as that is.  Her interviews and meetings with young American adults serve as documentation of the spiritual and intellectual bankruptcy of postmodernism.  The New Faithful is a reminder that when the idols of our age crumble, it is the truth of Christianity that remains standing” (book jacket).

* * * * * * * * * * * * *

Carroll’s findings in The New Faithful have been anticipated for decades by Thomas C. Oden.  His Agenda for Theology (1979), After Modernity . . . What? (1990), three-volume Systematic Theology, and editorial supervision of the Ancient Christian Commentary on Scripture all indicate the depth of his commitment to rediscovering orthodoxy.  What he’s called for seems to be happening, and he describes it in The Rebirth of Orthodoxy: Signs of New Life in Christianity (San Francisco: HarperSanFrancisco, c. 2003).  “Turning from the illusions of modern life, the faithful are now quietly returning to the spiritual disciplines that have profoundly shaped their history, and in fact have enabled their survival.  This is the rebirth of orthodoxy” (p. ix).  Orthodoxy, as he defines it, is the “integrated biblical teaching as interpreted in its most consensual classic period” (p. 29).  The spiritual disciplines sustaining it, consequently, include:  “close study of scripture, daily prayer, regular observance in a worshiping community, doctrinal integrity, and moral accountability” (p. ix).

Orthodoxy is a very personal issue for Oden, and some of the most interesting sections of this book are autobiographical.  Reared in Oklahoma, he was baptized in his parents’ (fashionably liberal, social-gospel) Methodist church and nominally embraced the Christian faith.  Off to college, he was prepped to acquire “my agnosticism from Nietzsche, my social views from radical Methodists and existentialists, and my theology (God help me, I confess) from Alan Watts” (p. 84).  He then acquired a Ph.D. at Yale and began his teaching career.  “Although it was assumed that I was teaching theology, my heart was focused on radical visions of social change and on the blatant politicizing of the mission of the church” (p. 84).  He now confesses that he entered the “ministry” as a political strategy, looking for a lever of power with which to foment revolutionary social change.  In a revealing footnote he confesses that “For me Marxism became radicalized early in the 1950s, and personalized in the figure of Ho Chi Minh, whom I unreservedly idolized as an agrarian Communist patriot ten years before America’s entry into the Vietnam war.  My major mentors were almost all socialist or quasi-Marxist.  Long before Vietnam I was a pacifist.  Before Vietnam my ideology was formed around the group that wrote the Port Huron Statement; that same group later shaped the founding of the Students for Democratic Action” (p. 197).

His early years very much resembled Hillary Clinton’s!  Both were reared Methodists, attended Yale University for graduate work, avidly embraced the radical rhetoric of Saul Alinsky (an “unprincipled amoralist” who was the subject of Hillary’s senior thesis in college), espoused situation ethics, and avidly read motive, a radical religious journal for college-age Methodists.  “That magazine fueled me intellectually during my heady years as a pacifist, existentialist, Tillichian, and aspiring Marxist” (pp. 84-85).  Hillary has kept all the issues she received, notes Barbara Olson in Hell to Pay: The Unfolding Story of Hillary Clinton.  Consequently, Oden says, “When I look now at Hillary’s persistent situational ethics, political messianism, statist social idealism, and pragmatic toughness, I see mirrored the self I was a few decades ago.  Methodist social liberalism taught me how to advocate liberalized abortion and early feminism almost a decade before the works of Germaine Greer and Rosemary Radford Ruether further raised my consciousness” (p. 85).  He was a prototypically “modern” religious studies professor, “only pretending to be a theologian” (p. 84).

The devastation wrought by thinkers such as himself cannot be ignored.  Liberal churches have been imploding for 40 years.  Liberal leaders, controlling mainline seminaries and denominations, refuse to accept responsibility for the massive loss of members, still caressing “the fantasy that they have the high moral ground on sexuality issues, politically correct policing, and standard theological issues such as universal salvation” (p. 149).  Conversely, conservative Evangelical churches have prospered, proving the thesis of Dean Kelley’s Why Conservative Churches Are Growing.  They uphold “scripture as the norm of faith and life, with a stress upon the believer’s experience of a personal relationship with Jesus as Lord and Savior, the only Son of God, and the Holy Spirit as enabler of a world-wide mission of proclamation.  They maintain a biblical doctrine of the incarnation, atonement, and the Lord’s return.”  They believe the Bible is God’s Word and that they are “saved through faith active in love” (p. 149).

Providentially, Oden (though remaining within his denomination) shifted from a Liberal to an Evangelical position as a result of his reading of Scripture and the Church Fathers.  He discovered what he’d not found in his formal education:  life-changing Truth, a Truth preserved, for 20 centuries, by “consensual” teaching, clearly evident in Church tradition.  Eminent Fathers (especially Augustine, Ambrose, Jerome, Gregory I, Athanasius, Basil, Gregory of Nazianzus, John Chrysostom) and Church councils (especially the first Seven Ecumenical Councils) laid a sound foundation for biblical interpretation and theological assertions.  As a Methodist, Oden reveres John Wesley, and he cites, with approval, Wesley’s reliance upon “‘the most authentic commentators on Scripture, as being both nearest the fountain, and eminently endued with the Spirit by whom all Scripture was given. . . .  I speak chiefly of those who wrote before the Council of Nice.  But who would not likewise desire to have some acquaintance with those that followed them?  With St. Chrysostom, Basil, Jerome, Austin [Augustine]; and above all, the man with a broken heart, Ephraim Syrus?’” (p. 99).

There is, thus, today a significant theological return to the sources of Christian dogma.  If nothing else, postmodernism has freed folks from the shackles of modernity.  One can even espouse the allegedly antiquated positions of premodernism!  To take the Bible as God’s Word, to uphold the facticity of the Resurrection, to take seriously the positions of Augustine and Aquinas and Wesley–all are now permitted.  And Oden shows how numbers of unusually talented young theologians are doing precisely that.

In that consensual tradition one also finds a basis for ecumenical harmony.  What modern churches have failed to find through bureaucratic maneuvers is remarkably evident in a “new ecumenism” drawing together devout Evangelicals, Catholics, and Orthodox.  This makes sense, since from the beginning Christianity has been gloriously multicultural!  All around the globe believers respond to the Gospel, embrace Christ, and are brought into the fellowship of the redeemed.  And they increasingly find themselves bound together by shared commitments to the same Lord.  “The decisive classic text for orthodox ancient ecumenical method,” Oden says, is Vincent of Lerins’s Commonitory, a fifth-century synthesis of those positions widely espoused by the Church.  Vincent explained that when believers differed in their interpretation of Scripture they heeded traditional judgments.  Vincent recognized man’s “‘insatiable lust for error,’” graphically evident in “‘a permanent desire to change religion, to add something and to take something away’” (p. 175).  Thus, though they all embraced the Bible as their ultimate authority, they recognized that not everyone had the right to interpret it on his own.  To resolve differences he proposed what we know “as the Vincentian rule:  In the worldwide community of believers every care should be taken to hold fast to what has been believed everywhere, always, and by all.  Its Latin form reads: Quod ubique, quod semper, quod ab omnibus creditum est” (p. 162).

* * * * * * * * * * * * *

Somewhat similar views are set forth by Robert E. Webber, a distinguished Evangelical professor (long at Wheaton, now at North Park Seminary), in The Younger Evangelicals: Facing the Challenges of the New World (Grand Rapids: Baker Books, c. 2002).  Endorsing the words of one of his sources, Steve Gerali, Webber asserts: “‘the Contemporary Church, having been built and enmeshed in the generational values of the baby boomer, is alienating a generation of adolescents’” (p. 156).  Younger Evangelicals are divorcing themselves from their “boomer” parents–much as their parents too frequently divorced each other!  In a series of chapters–essentially repeating the same story, and generally citing the same sources and informants–Webber explains the positions and portrays the pastors, youth ministers, educators, and worship leaders who personify them.

Webber believes that Evangelicals have passed through three distinct stages since WWII.  First, folks like myself (now “senior citizens”) identify with “traditionalists” like Billy Graham.  Second, baby-boomers, born in the post-war era, developed the “pragmatic” approach best evident in mega-churches such as Willow Creek and Saddleback.  Third, the coming generation–the “younger” evangelicals–seems increasingly distinguished (especially following 9/11/01) by its interaction with “a new form of American patriotism, a wave of conservative political philosophy, a new form of civil religion, a new economic tightening of resources, and a more disciplined life” (p. 47).  Importantly, this new “world has led to the recovery of the biblical understanding of human nature.  The language of sin, evil, evildoers, and a reaffirmation of the deceit and wickedness of the human heart has once again emerged in our common vocabulary” (p. 48).  Accordingly, though reared in a relativistic culture, they hunger for absolute truths sufficient for guidance in life.

These younger evangelicals, in brief, reject modernity and look for guidance in the pre-modern world of orthodox theology and traditional morality.  They’re interested in history, especially the story of the Ancient Church.  They’re open to theological dogmas, particularly as defined by the Nicene Creed–sensing, as Flannery O’Connor said, that “‘dogma is an instrument for penetrating reality’” (p. 74).  Preaching the Cross, calling for self-denying commitment, upholding high moral standards, they envision and hope to establish a different form of “church,” one more akin to that of primitive believers.  Exposed in public schools to classes in “values clarification” that prescribe purely subjective standards for morality, they understand the need for objective ethics.  Though they seem anxious to find some absolutes, however, Webber’s younger evangelicals (impaling themselves on the horns of an overt contradiction) also embrace some of postmodernism’s relativism.  Taking an “anti-foundationalist” stance, they insist that Christianity is a story to be pondered, not a proposition to be understood.

As a seasoned professor, Webber richly documents his presentation.  Anyone interested in the subject will find, in his notes and bibliography, ample books and web sites to pursue.  He has contact with a large number of the younger evangelicals and obviously endorses their endeavors.  (In part, one suspects, this is because they endorse positions he has advocated for some time!)  Unfortunately, there are some distracting glitches and disquieting generalizations that detract from the book.  Webber refers to “Armenians” when he means “Arminians.”  He routinely refers to the “Early Church” as a pattern for both himself and today’s younger evangelicals, but too often his assertions lack solid basis in the sources of that era.  His notion, for example, that early Christians were uninterested in intellectual apologetics, preferring to illustrate their convictions through their lives, cannot square with the actual writings of Justin Martyr and Tertullian, two of the earliest apologists.

Finally, though there is a refreshing desire to escape “modernity” and be fully counter-cultural, these “younger evangelicals,” I suspect, are as enmeshed in their secular culture as were their predecessors in theirs.  If one compares some of the tenets of Postmodernism with the views of Webber and his protagonists, one wonders if they have merely replaced the cultural compromises of modernism with similar compromises with postmodernism!

# # #

140 Dylan, Colson, Graham

In unique ways Bob Dylan, Chuck Colson, and Billy Graham have left formative impressions on post-WWII America.  During my 17 years as Chaplain at PLNU, I often quoted lyrics from Bob Dylan’s songs.  Restless Pilgrim: The Spiritual Journey of Bob Dylan (Lake Mary, FL: Relevant Books, c. 2002), by Scott Marshall, illustrates why I did so.  For Dylan has not only been one of the major forces in popular music for 40 years; he has also displayed a persistent hunger for God, evident in the biblical themes that resound in his songs.  Marshall makes little effort to probe Dylan’s lyrics, relying instead on published interviews and books to illustrate his concerns.  He wisely acknowledges the difficulty of determining exactly where the singer stands, noting his Dylanesque disclaimer, “Well, you never know.”  He’s sung folk, rock, country, and Christian music.  Neither Jews nor Christians nor agnostics know exactly what to make of him.  In fact: “Bob Dylan refused to be categorized–or, perhaps better, simply cannot be categorized” (p. xiv).  He’s “always simply been his own man.  More accurately, Bob Dylan has always been God’s own man, long before he knew it” (p. xiv).

For a few years (ca. 1980) he openly espoused Christianity, releasing three distinctively “Christian” albums.   Then he seemed to move in different directions, and many folks assumed he’d abandoned his faith in Christ.  But he still includes some of his Christian songs in his concerts, and (in 1997) when he received an award at the Kennedy Center for the Performing Arts he led a standing ovation in response to Shirley Caesar’s rendition of “Gotta Serve Somebody,” probably his most famous Christian tune.  “If Caesar had not been permitted to perform that night, Dylan would have been a no show” (p. 3).

Dylan’s interest in Gospel music began when he listened, late at night, to music broadcast from Shreveport, Louisiana.  Jewish by birth, he has always read the Bible, and his music has consistently reflected its influence.  He told a Rolling Stone reporter that he was a “literal believer” in the Bible, holding both the Old and New Testaments to be inspired of God (p. 74).  The lyrics of his 1965 album, Highway 61 Revisited–one of his greatest–were described by one journalist “as a translation of the Bible in street terms” (p. 8).  In his notes to Biograph, a magnificent multi-record collection of his music, he said he “‘wanted to expose people to [gospel music] because [he] loved it and it’s the real roots of all modern music, but nobody cared'” (p. 89).

Dylan’s religious quest became quite public when, in 1979, he embraced the Christian faith.  Influenced by “born again” musicians like T-Bone Burnett and Jerry Scheff, he was “‘willing to listen about Jesus’” (p. 27).  A Vineyard Church pastor, Larry Myers, visited him and remembered that no one tried to pressure him, but “‘God spoke through His Word, the Bible, to a man who had been seeking for many years.  Sometime in the next few days, privately and on his own, Bob accepted Christ and believed that Jesus Christ is indeed the Messiah’” (p. 28).  As he explained, in 1980, “‘Jesus put his hand on me.  It was a physical thing . . . I felt my whole body tremble.  The glory of the Lord knocked me down and picked me up’” (p. 143).  To another journalist he said, “‘Let’s just say I had a knee-buckling experience’” (p. 143).

Subsequently Dylan involved himself in serious Bible studies and even attended some classes.  He recorded Slow Train Coming, with its explicitly Christian lyrics, including “Gotta Serve Somebody.”  He began singing his new faith in concerts–and quickly encountered mounting hostility.  While many of his fans adjusted to the ever-questing pilgrim, others protested.  Secular critics, particularly, panned his performances.  Ever willing to be controversial, however, Dylan was undeterred, producing, Michael Long said, “some of the greatest songwriting and recording of his career” (p. 59).  He released Saved and Shot of Love, with their fervently evangelical message, greatly distressing Columbia Records, which had long profited from his productions.  He also spoke his mind, condemning homosexuality, for example, eliciting predictable venom from the Hollywood and media elite, who were “downright ruthless in their coverage of the ‘new Dylan’” (p. 53).  In a radio interview, Dylan was asked if Jesus is the answer to the world’s needs.  “‘Yeah, I would say that,’ Dylan replied.  ‘What we’re talking about is the nature of God . . . in order to go to God, you have to go through Jesus’” (p. 56).

Then, after publicly espousing his faith in Jesus, Dylan seemed to abandon it.  His new tunes moved in different directions.  Some critics suggested he’d returned to Judaism; others declared he’d lost his religious interests.  Dylan’s explanation is simple:  “‘I’ve made my statement, and I don’t think I could make it any better than in some of those songs.  Once I’ve said what I need to say in a song, that’s it.  I don’t want to repeat myself’” (p. 56).  He clearly, in the mid-80s, explored Judaism with new intensity.  An Orthodox Jewish community used one of his songs in a charity telethon, though they conveniently omitted one of the verses that explicitly acknowledged Jesus.  Indeed, the question of how a Jewish person can believe in Jesus and still be Jewish “is perhaps the one that ultimately gets to the heart of Dylan’s spiritual journey” (p. 110).  In Marshall’s judgment, however, this fits in with his “completed Jew” belief in Jesus as Messiah.  Those who have interviewed him, and musicians who work with him, describe him as still a believer.  Backup singers who recently (2001) performed with him say they “prayed with Dylan before each show.  These were Christian prayers” (p. 70).

As Marshall explores the past 20 years of Dylan’s life, he finds much evidence of his continued Christian commitment.  His songs, for example, often deal with biblical themes of sin and salvation, the need for repentance and righteousness.  In a 1986 tour of Australia, he closed each concert with “In the Garden,” one of his most moving Gospel songs.  Explaining the song, he said:  “‘This last song now is all about my hero.  Everybody’s got a hero.  Where I come from, there’s a lot of heroes.  Plenty of them.  John Wayne, Clark Gable, Richard Nixon, Ronald Reagan, Michael Jackson, Bruce Springsteen.  They’re all heroes to some people.  Anyway, I don’t care nothing about those people [as heroes].  I have my own hero.  I’m going to sing about Him right now'” (p. 86).  In Jerusalem, facing a Jewish audience, he sang “Gotta Serve Somebody” and “Slow Train.”

Albums like Oh Mercy, recorded in 1989, though not explicitly Christian, certainly have biblical messages.  “Shooting Star,” the song which rather sums up the album’s message, declares that it’s the “‘last time you might hear the Sermon on the Mount’” (p. 97).  In Marshall’s opinion, Oh Mercy “was practically a companion piece to the album of a decade earlier, . . . Slow Train Coming” (p. 98).  At his concerts during the ’90s he routinely included songs from his “Christian albums.”  For example, in 1991, he sang “Gotta Serve Somebody” some 80 times, “I Believe in You” 29 times, and “In the Garden” 10 times (p. 107).  In 1997 he performed, as requested by Pope John Paul II (who later spoke), at a concert in Bologna, attended by several hundred thousand people, singing, among others, “Knockin’ on Heaven’s Door” and “A Hard Rain’s A-Gonna Fall.”  He opened a 1999 concert in Pensacola with an old Christian hymn, “Rock of Ages,” and a few weeks later sang Fanny Crosby’s “Pass Me Not, O Gentle Savior” in Buffalo, doing the same a few days later in Amherst, Mass.  Later that year, touring with Paul Simon, he sang “Hallelujah, I’m Ready to Go,” a song which became something of a staple during the tour and included these lyrics: “Sinner don’t wait / Before it’s too late / He’s a wonderful Savior to know / I fell on my knees / He answered my pleas / Hallelujah, I’m ready to go” (p. 142).

In 2001, as Dylan turned 60, he “agreed to participate on a track for the forthcoming tribute album, Pressing On:  The Gospel Songs of Bob Dylan.  Considering that the project only featured songs from Slow Train Coming and Saved, his participation would have seemed odd if he no longer believed in Jesus as the Messiah” (p. 156).  In his 2002 concerts he included “Hallelujah, I’m Ready to Go” and “I am the Man, Thomas,” a “song about Jesus’ crucifixion and resurrection” (pp. 170-171).  Still more: “when Dylan included ‘Solid Rock’ in the first set list of his European spring tour of 2002” (p. 172), a song he’d not sung since 1981, he astounded many of his fans because it is one of his most clearly Christian compositions.  To Marshall, in words summing up this fine treatise, “These are not the words and sentiments of a man who has forsaken belief in Jesus” (p. 172).

For Dylan fans such as myself, this book provides a handy guide to Dylan’s spiritual journey.  Drawing upon published interviews, some of them in obscure periodicals, Restless Pilgrim brings us up to date on one of the nation’s most enigmatic, but engrossing, songwriters.

* * * * * * * * * * * * * * * * * * * * * * *

In Charles Colson: A Story of Power, Corruption, and Redemption (Nashville: Broadman & Holman Publishers, c. 2003), John Perry focuses on the crucial years when Colson served in the White House, followed by his spiritual transformation in the wake of Watergate, emphasizing the difference Christ made in the life of President Nixon’s “hatchet man.”  Though many of the details will be familiar to anyone who read Colson’s best-selling Born Again, Perry brings to the story information gleaned from Patty Colson and other sources as well as providing an outsider’s perspective of the man.  Though not an “authorized” biography, it benefited from interviews with Colson and carries his informal approval. 

Born in 1931 to hard-working parents in Boston, Charles Colson excelled academically and was accepted by both Brown and Harvard universities.  Harvard’s elitist snobbery alienated him, however, and he attended Brown on a ROTC scholarship.  Fulfilling his ROTC commitment, he joined the Marine Corps and proved himself to be an able officer.  After two years of active duty, he joined the reserves, found a job with the Navy Department in Washington, D.C., and entered law school at George Washington University, taking evening classes.  He would graduate in 1959 and be admitted to the bar later that year.  The next year he was offered a job in Senator Leverett Saltonstall’s office, making him “the youngest senior congressional staff member on Capitol Hill” (p. 24).  He orchestrated the Massachusetts senator’s successful re-election campaign in 1960 and was touted as one of the ten Outstanding Young Men of 1960 by Boston’s Chamber of Commerce.  The next year Colson opened a law office as a trade representative of the New England Council in Washington and quickly succeeded in attracting clients and wielding influence in the nation’s capital.

Impressed with Richard Nixon, Colson supported his 1968 election campaign and was asked to join his administration as a special counsel.  Though it meant considerable financial sacrifice, Colson readily accepted the invitation and quickly became a trusted insider, though he often clashed with others in the Nixon White House.  He particularly delighted the President by getting things done, even when it meant cutting various bureaucratic corners.  “By the summer of 1970 Nixon was regularly giving Colson direct assignments, bypassing White House protocol and, in particular, cutting Bob Haldeman out of the loop” (p. 61).  Nixon once boasted, to some guests, “Colson–he’ll do anything!” (p. 103).

As the election of 1972 approached, Colson supervised various endeavors to assure Nixon’s re-election.  Some of this involved trying to get him portrayed positively in the press.  Dealing with the growing dissent concerning the Vietnam War also called for considerable attention.  When Daniel Ellsberg clandestinely orchestrated the release of the “Pentagon Papers”–a major setback for Nixon and a blow to Henry Kissinger’s diplomatic work with the North Vietnamese delegation in Paris–Colson was ordered to expose Ellsberg.  Complying, he leaked a damaging FBI file on Ellsberg to a reporter, one step in discrediting him.  Colson also secured the cash which enabled Howard Hunt and associates to break into Ellsberg’s psychiatrist’s office, hunting damaging details, though he knew nothing about the burglary itself.  The more famous burglary, at the Watergate Hotel, was also done without Colson’s knowledge.  Though critics sought to implicate him, the famous “tapes” and other documents demonstrate his innocence.

Nixon’s landslide victory in 1972 was followed, within a month, by Colson’s resignation from his administration.  Haldeman and Ehrlichman, apparently, desired to minimize his influence, and Colson was uninterested in anything less than a major position, so he left the White House, still deeply committed to the President.  Quickly reestablishing his law practice, he assumed the next few years would be devoted to acquiring wealth and solidifying his position within Washington’s beltway.  Quickly, however, the Watergate scandal swept him into a cauldron of controversy.  Informally he offered advice to Nixon and his inner circle, urging them to simply tell the truth.  Publicly he defended the President and denied any personal awareness of (much less involvement in) the Watergate burglary.  Nevertheless he had to deal with accusations in the press–some the result of John Dean’s duplicity and mendacity.  (Dean sought to save his own skin by incriminating others, however innocent!)  As the Nixon presidency collapsed, Colson was sucked into the chaos.

In the midst of it all, he met Tom Phillips, president of the Raytheon Company, who briefly testified to “the most marvelous experience of my whole life,” coming to faith in Jesus Christ (p. 140).  He further encouraged Colson to chat with him about it later.  Burdened by all the pressures of Watergate, Colson decided to visit Phillips at his home on August 12, 1973.  One of the most successful men in America, Phillips  had increasingly found life meaningless, and out of curiosity went to a Billy Graham Crusade in New York.  There he responded to the message and accepted Jesus as Savior.  Having told his story, Phillips then gave Colson a copy of C.S. Lewis’s Mere Christianity and urged him to consider its claims.  Following the conversation, Colson drove a short distance, parked, and broke into tears.  Sobbing, he prayed, “God, I don’t know how to find You, but I’m going to try.  I’m not much the way I am now, but somehow I want to give myself to You.  Take me!” (p. 146).    He’d turned life’s most important corner.

Subsequently, Colson carefully read Mere Christianity and found it intellectually persuasive.  He returned to Washington and found a growing circle of Christian friends (unanticipated folks like senators Mark Hatfield and Harold Hughes) who confirmed and encouraged his new-found faith.  Along with his faith, however, he faced a growing conviction that he'd wronged Daniel Ellsberg by seeking to discredit him.  To be a Christian, he sensed, meant doing what's right without regard for the consequences.  Thus, though he was innocent of many accusations, he voluntarily confessed to slandering Ellsberg.

The judge, inexplicably, decided to make an example of Colson and sentenced him to prison.  It was devastating, but it also opened up an entirely new world for him.  While in prison he developed a compassion for inmates and later established Prison Fellowship to minister to them.  This increasingly led him to speak not only in prisons but in other venues.  He wrote the best-seller, Born Again, and quickly became one of the most prominent Christians in the country.

* * * * * * * * * * * * * * * * * * *

For several years I've intended to read Just As I Am: The Autobiography of Billy Graham (San Francisco: HarperSanFrancisco, c. 1997), but was put off by its length–750 pages!  Recently tackling the tome, however, proved to be most rewarding.  There is no doubt that Billy Graham is one of the greatest Christians the Church has produced in two millennia.  Though such things are difficult to tabulate, he's no doubt preached the Gospel to more people than any other evangelist.  He's also met–and witnessed to–many of the most notable people of his generation, served as a trusted counselor for eight (no doubt nine now, since he's certainly close to the Bush clan) American presidents, and (most importantly) distinguished himself with personal integrity and graciousness.

Graham tells us about his early years, working on his father’s farm in North Carolina, his “180-Degree Turn” in response to a Mordecai Ham revival message, his call and preparation to preach, culminating at Wheaton College, where he met his wife, Ruth, and several friends who would become an important part of his growing ministry.  Opportunities to preach in the Chicago area led to an association with Youth for Christ, as well as an invitation to become President of Northwestern Schools in Minneapolis.  These positions opened doors for him to preach across the nation, holding “campaigns” in various places.  Early on it was markedly evident that Billy Graham had the gift of evangelism, as numbers of hearers routinely responded to his simple Gospel messages.

In 1949 a major turning point occurred in Los Angeles, where he preached for eight weeks, attracting significant numbers of people and gaining national attention as a result of William Randolph Hearst’s instructions to “puff Graham” in his newspapers.  Prominent entertainers, such as Stuart Hamblen, were converted, enhancing Graham’s image.  Featured in Time Magazine and other publications, he became (almost overnight) the spokesman for a resurgent revivalism in America, a movement that would soon be identified as “evangelicalism.”  This led, during the next five decades, to an endless number of “crusades” throughout the world, detailed in this autobiography, impacting millions of people.  He’s preached in Roman Catholic and Russian Orthodox churches, baseball and soccer stadia, open fields and primitive huts.  Graham also began to utilize other outlets to spread the Good News–films, radio, TV, conferences, training centers.  Christianity Today, for example, came into existence solely because of his vision and commitment to provide the reading public with a thoroughly Evangelical journal.

Especially interesting, to me at least, are Graham's chapters on the presidents he's known.  Early on he met Harry Truman, but alienated the president through immature aggressiveness.  Having learned his lesson, he developed cordial, often deeply warm, relationships with every subsequent president.  Though he may have (privately) differed with them politically, he related to them as a friend and spiritual advisor.  He admires and commends, perhaps a bit naively, each of them.  One certainly finds in Graham a helpful correction to some of the critical views of Johnson, Nixon, Reagan, and Clinton, all of whom he liked and trusted.  Importantly, we find many of these men sincerely hungry for spiritual counsel and assurance.  One cannot but be grateful that there was a man named Billy Graham who could speak, with authority, about God to the leaders of the nation.

Graham also met the world's most eminent leaders, ranging from Margaret Thatcher to Jawaharlal Nehru to Mikhail Gorbachev to John Paul II.  In one way or another, he seems to have encountered many of the most important people of his era.  And he unfailingly sought to talk personally with them about Jesus Christ.  Whether addressing thousands of people in a mass meeting, or speaking privately and confidentially within the corridors of the Kremlin, Billy Graham has sought to be an evangelist.  One reads this book with growing amazement at the sheer scope of his influence around the world!

Graham also tells us much about his family, his many friends, his personal perspectives.  Whatever he discusses, one senses that Billy Graham is an honest, authentic man, fully devoted to God, and ever aware of his own limitations.  There’s a winsome humility–never false or overly self-critical–that explains much of his success.  Just As I Am is not only a great invitation hymn (routinely used in the Graham crusades) but an apt title for his life.  Well worth the reading!

# # #

139 “No Free Lunch”

“NO FREE LUNCH”  

One of the most self-evident propositions is this:  from nothing comes nothing.  Similarly self-evident is the observation of Thomas Reid, the great 18th century Scottish common sense philosopher:  "From marks of intelligence and wisdom in effects, a wise and intelligent cause may be inferred."  To establish such eminently reasonable propositions, a number of erudite thinkers have launched the "intelligent design" movement, intent on refuting naturalistic theories of origins with the increasing evidence provided us by contemporary science.  William Dembski, one of the most articulate and aggressive proponents of this view, recently set forth his position in No Free Lunch:  Why Specified Complexity Cannot Be Purchased without Intelligence (New York:  Rowman & Littlefield Publishers, Inc., c. 2002). 

The book's initial paragraph echoes Aristotle and sets the agenda:  "How a designer gets from thought to thing is, at least in broad strokes, straightforward:  (1) A designer conceives a purpose.  (2) To accomplish that purpose, the designer forms a plan.  (3) To execute the plan, the designer specifies building materials and assembly instructions.  (4) Finally, the designer or some surrogate applies the assembly instructions to the building materials.  What emerges is a designed object, and the designer is successful to the degree that the object fulfills the designer's purpose" (p. xi).  Few question the clear evidence of design in houses and cars, hybrid corn and computers.  But the big question Dembski wants to answer is this:  does creation–both the cosmos and the living world we inhabit–indicate a similarly clear, rational design?

He argues that of the three proposed cosmological explanations–necessity, chance, design–the last makes the most sense as "a legitimate and fundamental mode of scientific explanation" (p. 3).  This becomes evident as he sets forth "specified complexity" as a "third mode of explanation" when dealing with empirical data.  A ball rolling off a roof, for example, necessarily falls to earth as a result of gravity's tug.  A ball breaking a window, as a result of an errant and unexpectedly long hit in a sandlot baseball game, is largely due to chance.  A series of balls (curves, sliders, fastballs) thrown by a pitcher, consistently hitting the strike zone, leading to a no-hitter, nicely reveals design.  Intelligent, well-executed effects illustrate "specified complexity," the trademark of design.

To defend his position, he explains and rejects alternative explanations.  This involves intricate mathematical formulae and discussions of probability far beyond my ability to fathom.  What is clear is Dembski's conclusion that "the universe is too small a place" for random events, following the laws of probability, to produce the world we confront.  Citing one scientist's work with proteins, for example, he shows that for accidental collisions of particles to produce "proteins of length 200" (p. 84), the universe would have to be almost infinitely old, rather than the currently estimated 10-15 billion years.  When one fully understands the incredibly complex information we find in tiny cells, to say nothing of truly complex organisms like ourselves, it takes an almost irrational faith in "chance and necessity" to insist that everything that exists is a result of purely natural, irrational causes. 
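
The arithmetic behind that claim can be sketched in a few lines of Python–my own back-of-the-envelope illustration, not Dembski's formulae; the 10^80, 10^45, and 10^25 figures follow the values commonly cited for his "universal probability bound":

    import math

    # Back-of-the-envelope sketch (an illustration, not Dembski's own derivation).
    # A protein 200 residues long, built from the 20 standard amino acids,
    # admits 20^200 possible sequences.
    log_sequences = 200 * math.log10(20)   # log10 of 20^200, about 260

    # A generous ceiling on events the universe could host: ~10^80 particles
    # times ~10^45 state changes per second times ~10^25 seconds.
    log_events = 80 + 45 + 25              # log10 of about 10^150

    print(f"possible sequences: ~10^{log_sequences:.0f}")
    print(f"available events:   ~10^{log_events}")

On those numbers, even if every physical event in cosmic history were a separate trial, fewer than 1 in 10^110 of the possible sequences could ever be sampled.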

But that, of course, is the strongly entrenched scientific mindset, evident in an assertion by Harvard biologist Richard Lewontin in The New York Review of Books:  "'We take the side of science in spite of the patent absurdity of some of its constructs, in spite of its failure to fulfill many of its extravagant promises of health and life, in spite of the tolerance of the scientific community for unsubstantiated just-so stories, because we have a prior commitment, a commitment to materialism [i.e., naturalism].  It is not that the methods and institutions of science somehow compel us to accept a material explanation of the phenomenal world, but, on the contrary, that we are forced by our a priori adherence to material causes to create an apparatus of investigation and a set of concepts that produce material explanations, no matter how counterintuitive, no matter how mystifying to the uninitiated'" (pp. 370-71). 

Such counterintuitive commitments, Dembski argues, explain the "free lunch" form of magic, "in which it is possible to get something for nothing" (p. 367).  Almost precisely like medieval alchemists, convinced they could transform lead into gold, today's scientists propound even more fanciful notions.  Learned cosmologists "claim that this marvelous universe could originate from quite unmarvelous beginnings," and biologists reduce the mystery of life to simple mechanical processes (p. 368).  Neurologists claim that the human brain is nothing but the accidental result of a "cobbled together" process that "gave rise to human consciousness, which in turn produces artifacts like supercomputers, which in turn are not cobbled together at all but instead are carefully designed.  Out pop purpose, intelligence, and design from a process that started with no purpose, intelligence, or design.  This is magic" (p. 369). 

Rather than resort to magical explanations, Dembski urges us to weigh the evidence for a world packed with information–evidence that prods us to grant that such a world not only appears to be but actually is intelligently designed. 


Dean L. Overman–an attorney who served as Special Assistant to Vice President Nelson Rockefeller, wrote several law books, and ranks among the nation's best thinkers in finance and banking–employs some highly mathematical arguments in A Case Against Accident and Self-Organization (New York:  Rowman & Littlefield Publishers, Inc., c. 1997).  Alister McGrath, an eminent English theologian, says:  "This is a well-argued and immensely readable engagement with some profound questions centering on the origins of our universe and ourselves.  Overman's clear and informed arguments cast serious doubt on the plausibility of the naturalist approach, and reopen the case for divine design."  One of his German counterparts, Wolfhart Pannenberg, agrees, writing in his Foreword:  "This book argues persuasively against the assumption that the origin of life and the origin of the universe can be accounted for as random events.  According to Overman, it is mathematically not possible to derive the origin of the high level of information necessary for organic life in terms of random fluctuations in pre-organic processes" (p. xiii). 

In his Preface, Overman notes that he "never intended to write this book" (p. xvii).  As a lawyer he had little interest in cosmological and theological debates.  He had, however, devoted his professional life to ascertaining the "logic and the validity of premises, inferences and conclusions as they relate to an examination of evidence" (p. xvii).  By chance he read an article describing the Miller and Urey experiment, which led him to write a letter objecting to some of its assertions.  That letter led to more study and more writing, and ultimately to this book.  As he studied, he soon realized that "Many otherwise rational persons make unwarranted conclusions which are not based on evidence, but are made in the absence of evidence and contrary to mathematical probabilities because of their faith in the ideology of materialism" (p. 1). 

To free one from irrational ideologies, Overman prescribes logic!  This leads him to explain, in a very understandable chapter, the value of "verbal and mathematical logic."  A valid syllogism–ab universali ad particulare valet–provides as much certainty as is available to us.  Conversely, invalid syllogisms–a particulari ad universale non valet consequentia–lead one astray.  Mix in various fallacies–extrapolating from limited data to unwarranted inferences, equivocations in the use of words, hidden assumptions (often buried in mathematical formulae), circular reasoning, post hoc, ergo propter hoc–and you have the tools with which to detect erroneous reasoning.

Logical errors abound, Overman shows, whenever molecular biologists propound theories explaining how living beings emerged from lifeless matter, because "proponents of the origin of life by accident or chance processes rarely make the mathematical calculations of the probabilities which lie at the foundation of their hypothesis" when in fact "the odds are so overwhelmingly against" it (p. 31).  After showing that "life" is best defined in terms of non-material information, he demonstrates how "chance" or "random abiogenesis," so routinely cited in biology textbooks, simply cannot account for it.  "The information filled molecules of life are much more complex and structured than previously thought, and calculations of the mathematical probabilities of unguided, chance processes forming life call the theory of accidental abiogenesis into question" (p. 40). 

The Miller-Urey experiments, often cited to explain life's origin, ignored the fact that oxidizing conditions in the early earth's atmosphere would have prevented it.  The "prebiotic soup" often credited with incubating life surely would have left some deposit in ancient sedimentary rocks, but no trace of abiotic compounds appears.  As Herbert Yockey explained it, the "'absence of evidence'" is the "'evidence of absence'" for the prebiotic soup (p. 47).  Miller and Urey simply manipulated elements known to constitute life in a test tube, ignoring the extraordinary mathematical improbability that such an event could have accidentally happened.  "To paraphrase Louis Pasteur, in experience only life produces life" (p. 49).  

The best current estimates, dealing with the age of the earth and the emergence of life, indicate that "only a maximum of 130 million years were available for random processes to produce life.  Calculations of mathematical probabilities unequivocally demonstrate that it is mathematically impossible for unguided, random events to produce life in this short period of time" (p. 51).  To illustrate this, Overman takes a passage from Shakespeare's Macbeth containing 379 letters.  The mathematical probability of these 12 lines being accidentally typed is 1 in 26^379, or roughly 1 in 10^536–an incredibly improbable number, given that "there are only 10^80 atoms in the known universe" (p. 55).  Twelve lines from Shakespeare are relatively simple, however, compared to a simple cell.  Fred Hoyle and Chandra Wickramasinghe calculated the odds against accidentally producing a simple bacterium at 1 in 10^40,000–an utterly impossible event.  As Wickramasinghe noted:  "'The chances that life just occurred are about as unlikely as a typhoon blowing through a junkyard and constructing a Boeing 747'" (p. 60).  
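
Overman's arithmetic is easy to check; the following short Python sketch (mine, simply re-running his 379-letter figure) confirms the order of magnitude:

    import math

    # Overman's illustration: a 379-letter passage typed by random keystrokes
    # from a 26-letter alphabet yields 26^379 possible strings.
    letters, alphabet = 379, 26
    exponent = letters * math.log10(alphabet)    # log10 of 26^379
    print(f"26^{letters} is roughly 10^{exponent:.0f}")   # about 10^536
    # For scale, the commonly cited count of atoms in the known universe is ~10^80.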

Then, even if one grants that life could have appeared by chance, the odds against its developing in such profusion during the short time allowed on earth defy all logical canons.  The various complexities of the living world, especially evident in our growing grasp of DNA and RNA, reinforce the words of Michael Polanyi:  "'all objects conveying information are irreducible to the terms of physics and chemistry'" (p. 88).  The non-material information that's important in such molecules is markedly different from their material sugars and phosphates. 

Moving from the microscopic realm of molecules to the macroscopic world of the universe, the same truth stands clear:  the world is intricately designed.  Though controversial when first set forth, few physicists today question the "Big Bang" beginning of the cosmos.  An incredibly dense point, of quarks and leptons, exploded and hurled into space all the matter that makes up the universe.  Studying the intricate balance of strong and weak forces, gravity and thermodynamics, grasping mind-bending theories such as superstring theory, with its ten dimensions, makes one cognizant of the intricacies of our world.  Astronomers and physicists, calculating how it could all happen within 10-15 billion years, increasingly rule out purely material chance and necessity. 

In fact, when all the factors are figured, the probability of the universe simply taking its present form is calculated as one part in 10^(10^123)–"an extraordinary figure" (p. 140) impossible even to write out in full.  It even looks as if the whole universe was designed for us!  Freeman Dyson, one of the finest physicists of the 20th century, noted the "fine tuning of the universe" by saying:  "'The more I examine the universe and the details of its architecture, the more evidence I find that the universe in some sense must have known we were coming'" (p. 159).  More theologically, Dyson said:  "'God is the Creator with a plan and an intention for the existence of the entire universe.  The very structures of the universe itself, the rules of its operation, its continued maintenance, these are the more important aspects of creation'" (p. 167). 
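
The notation matters here:  10^(10^123) is not 10 multiplied by itself a few hundred times but a 1 followed by 10^123 zeros.  A brief Python sketch (my illustration, not Overman's) shows why such a number cannot be written out:

    # 10^(10^123) has 10^123 digits.  Writing one digit per atom would exhaust
    # the ~10^80 atoms of the known universe 10^43 times over.
    digits_log10 = 123   # log10 of the number of digits required
    atoms_log10 = 80     # log10 of atoms in the known universe
    print(f"shortfall factor: 10^{digits_log10 - atoms_log10}")   # 10^43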

This is a well-argued, persuasive treatise.  Even though Overman, on a very high level, deals with mathematical and scientific matters, he almost always explains things in ordinary language and easily guides the reader through some tangled thickets.  Unlike scientists and theologians, who often make assumptions simply as a result of residing within a certain intellectual arena, lawyers like Overman bring to the discussion a fresh spirit of inquiry, an ability to get at the essence of the question, and a lack of concern for how their presentations will be received by the academic guilds that tend to stifle unorthodox views. 

* * * * * * * * * * * * * * * * * * *

For readers not quite ready to plow through the heavy mathematical and scientific material in the two books reviewed above, William Dembski and James M. Kushiner have edited a collection of essays, Signs of Intelligence:  Understanding Intelligent Design (Grand Rapids:  Brazos Press, c. 2001).  These essays originally appeared in a special issue of Touchstone (a monthly “journal of mere Christianity” that I highly recommend for its staunchly orthodox stance, committed to the tradition extending from Athanasius to C.S. Lewis).  The essays in this book are short, to the point, and written for the general reader.  Contributors include some of the guiding lights of the Intelligent Design movement, such as Dembski, Michael Behe, and Jonathan Wells, as well as gifted analysts like John Mark Reynolds and Patrick Henry Reardon. 

In chapter one, Phillip E. Johnson, for years a professor of law at the University of California, Berkeley, explains the Intelligent Design movement's challenge to naturalistic evolution.  Something of the progenitor of the movement, Johnson announced that challenge in Darwin on Trial more than a decade ago; his forte is the clear explanation of terms and carefully reasoned argument.  He never pretends to be a scientist, but he insists that scientific claims be made rationally, with demonstrable evidence and coherent logic.  Indeed, he believes that "Once it becomes clear that the Darwinian theory rests upon a dogmatic philosophy rather than the weight of the evidence, the way will be open for dissenting opinions to get a fair hearing" (p. 26). 

The dogmatism of Darwinism stands clearly illustrated in a statement by Francisco Ayala, former president of the prestigious American Association for the Advancement of Science, who put it this way:  "'The functional design of organisms and their features would therefore seem to argue for the existence of a designer.  It was Darwin's greatest accomplishment to show that the directive organization of living beings can be explained as the result of a natural process, natural selection, without any need to resort to a Creator or other external agent. . . .  Darwin's theory encountered opposition in religious circles, not so much because he proposed the evolutionary origin of living things (which had been proposed many times before, even by Christian theologians), but because his mechanism, natural selection, excluded God as the explanation accounting for the obvious design of organisms'" (p. 146). 

Ayala's assertion, Robert DeHaan and John Wiester insist, is "monumental."  He and others like him present evolution not merely as a process explaining observable changes, such as the variations in finch beaks Darwin observed.  Ayala is making vast metaphysical claims, crediting "natural selection" for the existence of all that is.  He also dismisses any notion of "intelligence" in the cosmos.  There's no order or logic to creation, since all comes about through chance.  "In the pre-Darwinian view, life was planned and purposeful.  In the Darwinian view, life arose and evolved solely by what Ayala calls 'the creative duet of chance and necessity,' without purpose or a 'preconceived design'" (p. 146). 

Statements such as Ayala's enable Phillip Johnson to point out the difference between "materialistic science," the philosophically entrenched worldview ("naturalism") that prevails in the modern academy, and "empirical science," the openness to data that leads to hypotheses equally open to experimentation and testing.  The dogma of materialistic science, as Nancy Pearcey shows in her essay, is fully evident in the opening sentence of Carl Sagan's Cosmos:  "The Cosmos is all that is or ever was or ever will be."  Nothing else matters because matter is all there is.  And yet, when read carefully, Johnson shows, materialists like Sagan routinely invoke invisible (and thus possibly non-material) factors such as agency and reason to explain things.