168 Europe at Risk

In The Cube and the Cathedral:  Europe, America, and Politics Without God (New York:  Basic Books, c. 2005), George Weigel (the noted biographer of John Paul II and Benedict XVI) ponders the plight of a Europe losing the religious faith that birthed it.  He compares “the cube”–La Grande Arche de la Defense, the massive (40-story), modernist glass cube built in Paris by the late socialist French president, François Mitterrand–with the nearby cathedral, Notre Dame de Paris, one of the grandest monuments of the Middle Ages.  The two structures symbolize the worldviews contending for the heart of the continent–a struggle that is “fundamentally a problem of cultural and civilizational morale” (p. 6).

Certain trends in post-WWII Europe deeply distress Weigel:  a failure to condemn either communism or Islamofascism; a spineless pacifism vis-à-vis terrorists and criminals; a mindless support for international organizations, such as the EU and UN; an irrationality evident in the high percentage of French and Germans who think the U.S. actually orchestrated the 9/11 attacks; a marked economic decline; a startling demographic devolution–the dramatic evidence of a people without concern for future generations; a growing contempt for the elderly and deceased; an unwillingness to face the bankruptcy of social welfare and pension systems; and a militantly anti-Christian agenda embraced and imposed by Europe’s elite.

All of these problems, as Aleksandr Solzhenitsyn noted in his 1983 Templeton Prize Lecture, began at the dawn of the 20th century with World War I and Europe’s subsequent “lost awareness of a Supreme Power” and “rage of self-mutilation” that explain why it stood by and allowed “the protracted agony of Russia as she was being torn apart by a band of cannibals. . . .  The West did not perceive that this was in fact the beginning of a lengthy process that spells disaster for the whole world” (pp. 33-34).  Similarly, said Henri de Lubac (in his magisterial The Drama of Atheist Humanism), the 20th century’s great disasters stemmed from “an attempt to promote a vision of man apart from God and apart from Christ. . . .  Forgetfulness of God [has] led to the abandonment of man” (p. 119).

Europe’s loss of faith stands exposed in the proposed constitution for a new Europe that was recently rejected (for economic self-interest, not religious concern) by the people of France and Holland.  This “constitution” had a lengthy historical section that said nothing about the role of Christianity in the making of Europe!  To Weigel, this “self-inflicted amnesia” provides “a key to the ‘Europe problem’ and its American parallels” (p. 55).  What’s largely forgotten is the freedom for excellence embedded in the thought of St. Thomas Aquinas and evident in Western Christian Culture.  “Freedom is the capacity to choose wisely and act well as a matter of habit–or, to use an old-fashioned term, as a matter of virtue.  Freedom, on this understanding, is the means by which we act, through our intelligence and our will, on the natural longing for truth, goodness, and happiness that is built into us as human beings” (pp. 79-80).   To be fully human is to be truly free.  A concert organist freely plays Bach’s music after mastering highly technical disciplines through practice.  One is, likewise, a free person by virtue of mastering the cardinal virtues.  A free society is sustained by similar disciplines and is “characterized by tolerance, civility, and respect for others, societies in which the rights of all are protected by both law and the moral commitments of ‘we the people’ who make the law” (pp. 81-82).

Rejecting and rivaling the realism of Thomas Aquinas was the voluntaristic nominalism of William of Ockham, which has “had a great influence on Christian moral theology” (p. 83).  There is no question that the slow and steady growth of nominalism, during the past 500 years, has weakened the foundations of the West.  In Ockham’s nominalism there are no universals, no absolutes apart from the arbitrary edicts of God–which may or may not be sustained tomorrow.  In modern nominalism, the only edicts are our own–or those manufactured by elites such as the United States Supreme Court.  To Servais Pinckaers (a Belgian Dominican who crafted the phrase “freedom for excellence” regarding Aquinas), “Ockham’s work was ‘the first atomic explosion of the modern era,'” and it brought into being “a new, atomized vision of the human person and ultimately of society.  In Ockham we meet what Pinckaers calls the freedom of indifference” (p. 83).  Freedom is doing whatever one wants to do and “has nothing to do with goodness, happiness, or truth” (p. 85).  Nietzsche’s will-to-power is the modern manifestation of Medieval nominalism, and his pernicious nihilism is everywhere evident in everything from Nazism to the “postmodernism” of Derrida and Foucault, from Supreme Court decisions such as Lawrence v. Texas to the terrorism of Islamofascists  blowing up themselves and their innocent victims in London subways or Amman hotels.

Europe–and America–Weigel insists, must choose either the freedom of excellence or the freedom of indifference, the cathedral or the cube.  This short book is more of a journalistic sketch than an in-depth study, but Weigel’s concerns are both prescient and compelling.

* * * * * * * * * * * * * * * * * * * *

Whereas Weigel is a serious scholar who writes for a popular audience, Tony Blankley is a journalist who’s written a thoughtful treatise entitled The West’s Last Chance:  Will We Win the Clash of Civilizations? (Washington, D.C.:  Regnery Publishing, Inc., c. 2005).  Like Weigel he thinks Europe is at risk and Americans should be concerned.  “The threat of the radical Islamists taking over Europe is every bit as great to the United States as was the threat of the Nazis taking over Europe in the 1940s.  We cannot afford to lose Europe” (p. 21).  Both America and Europe have the resources needed to resist this, but the question we now face is whether they have the will to do so.

The Nazis in the 1920s and 1930s were a small, militant group determined to take control of both Germany and Europe.  “They particularly targeted German youth” (p. 47).  The same may be said of Islamofascists today.  Large numbers of Muslims are angry at their plight in the world, humiliated by their economic and military weakness vis-à-vis Israel and the West, and willing to heed the radical voices calling for jihad and terror.  In response, Europe’s leaders (following the pacifist path of those Oxford University students who declared, in the ’30s, they “would not fight for King or country”) engage in denial and accommodation and appeasement.  Like Stanley Baldwin 70 years ago, they court popularity by funding public housing rather than arms and actions, talking about peace and security rather than conflict and struggle.

This was clear to George Orwell in 1940, when he wrote:  “‘I thought of a rather cruel trick I once played on a wasp.  He was sucking jam on my plate, and I cut him in half.  He paid no attention, merely went on with his meal, while a tiny stream of jam trickled out of his severed esophagus.  Only when he tried to fly away did he grasp the dreadful thing that had happened to him.  It is the same with modern man.  The thing that has been cut away is his soul, and there was a period–twenty years, perhaps–during which he did not notice.’  In Orwell’s view, Western man has lost his soul in the aftermath of World War I” (p. 133).

But Baldwin and the Oxford students were refuted by events in 1939, when men of sterner stuff (namely Winston Churchill) were needed.  Perhaps in WWII the Allies recovered some of the West’s legacy and breathed life back into their culture.  But it was, Blankley thinks, ephemeral.  Whether the soul Orwell declared dying is forever dead, only time will tell.  But “Europe’s future is in danger because Europe has forgotten its past.  In the Middle Ages, Europeans held a healthy respect, even fear and awe, of the power and vigor of Islamic culture” (p. 96).  Not so today!  Muslim immigrants have flooded Europe but, unlike other ethnic groups, have refused to assimilate in host countries.  They seek domination, not integration!  Yet Europeans fail to identify and fight them as mortal foes, and Islamist jihadists, if unopposed, will triumph.

* * * * * * * * * * * * * * * * * * * *

Roger Scruton, a fine English philosopher, focuses his finely honed analytic mind on The West and the Rest:  Globalization and the Terrorist Threat (Wilmington:  Intercollegiate Studies Institute, c. 2002).  He especially stresses the critical philosophical differences between Islam and Christianity.  The very word, Islam, denotes submission, surrender, whereas Christendom celebrates the personal dignity and freedom that result from a careful separation of church and state.  “The Muslim faith, like the Christian, is defined through a prayer.  But this prayer takes the form of a declaration:  There is one God, and Muhammad is his Prophet.  To which might be added:  and you had better believe it.  The Christian prayer is also a declaration of faith; but it includes the crucial words:  ‘forgive us our trespasses, as we forgive them that trespass against us'” (p. 36).  The two faiths are, quite simply, radically different, and this has led to different “social contracts.”

The West has also been shaped by the Enlightenment, with its commitment to reason and the civic virtues of “law-abidingness, sacrifice in war, and public spirit in peacetime” (p. 55).  To the extent irrationalism, lawlessness, pacifism, and the demise of patriotism pervade the West, they subvert the civilization that has sustained them.  Contempt for objective truth, championed by Nietzsche and his postmodern epigones, easily leads to the disdain for any form of authority so evident on many university campuses.  Nietzschean skepticism inculcates moral relativism, and fewer and fewer Europeans seem willing to risk anything, much less their lives, for anything or anyone.  They’re increasingly uninterested in even marrying and rearing children, much less fighting for what’s right.  “Religious societies generate families automatically as the by-product of faith” (p. 69).  Secular societies, however, have little concern for anything sacred such as the family, repudiating it as “patriarchal” and oppressive.

Weakened by the loss of religious faith and Enlightenment values, the West now faces the onslaught of a revived worldwide Islam, committed to the “holy law” of Mohammed.  Scruton guides the reader in a careful survey of Islamic history and thought, emphasizing that “Conquest, victory, and triumph over enemies are a continual refrain of the Koran, offered as proof that God is on the side of the believers” (p. 120).  Still more:  “For the first time in centuries Islam appears, both in the eyes of its followers and in the eyes of the infidel, to be a single religious movement united around a single goal” (p. 123).  And wherever they have come to power, we have witnessed “murder and persecution on a scale matched in our time only by the Nazis and the Communists.  The Islamist, like the Russian nihilist, is an exile in this world; and when he succeeds in obtaining power over his fellow human beings, it is in order to punish them for being human” (p. 127).

Nothing but a revival of a commitment to the West and its virtues can withstand the Muslim onslaught that is now taking place around the world.  To understand why this is necessary, Scruton proves enlightening.  How to do it, however, is less clear!

* * * * * * * * * * * * * * * * *

For 25 years Bat Ye’or (an Egyptian scholar living in Switzerland, writing under a pseudonym to help shelter her from the violence routinely directed at scholars who dare criticize Islam) has explored the somber reality of dhimmitude–the status of non-Muslims in Islamic societies.  Her scholarly works include:  The Dhimmi:  Jews and Christians Under Islam (1985), The Decline of Eastern Christianity under Islam:  From Jihad to Dhimmitude (1996), which I reviewed in my “Reedings” #131, and Islam and Dhimmitude:  Where Civilizations Collide (2002).  These are historical works, but her latest treatise, Eurabia:  The Euro-Arab Axis (Madison, N.J.:  Fairleigh Dickinson University Press, c. 2005), describes what’s taking place right now, particularly among the bureaucratic elites who increasingly control the continent.  Above all, she challenges readers to look clearly at the documents, many included as appendices to the text, that challenge the stories told by the largely pro-Islamic culture czars (the journalists and professors and politicians) in both Europe and America.  She’s clearly partisan in her presentation, finding little of value in Islam.  But she’s an enormously well-informed partisan, and her facts and perspective simply cannot be ignored by anyone seriously concerned with world affairs.

“This book,” she says, delineating her thesis, “describes Europe’s evolution from a Judeo-Christian civilization, with important post-Enlightenment secular elements, into a post-Judeo-Christian civilization that is subservient to the ideology of jihad and the Islamic powers that propagate it.  The new European civilization in the making can be called a ‘civilization of dhimmitude.’  The term dhimmitude comes from the Arabic word ‘dhimmi.’  It refers to subjugated, non-Muslim individuals or people that accept the restrictive and humiliating subordination to an ascendant Islamic power to avoid enslavement or death.  The entire Muslim world as we know it today is a product of this 1,300-year-old jihad dynamic, whereby once thriving non-Muslim majority civilizations have been reduced to a state of dysfunctional dhimmitude.  Many have been completely Islamized and have disappeared.  Others remain as fossilized relics of the past, unable to evolve” (p. 9).

“For well over a millennium,” Bat Ye’or continues, “following the seventh-century Muslim military offensives against Byzantium, European powers instinctively resisted jihad–militarily when necessary–to protect their independence.  The response of the post-Judeo-Christian Europe of the late twentieth century has been radically different.  Europe, as reflected by the institutions of the EU, has abandoned resistance for dhimmitude, and independence for integration with the Islamic world of North Africa and the Middle East.  The three most apparent symptoms of this fundamental change in European policy are officially sponsored anti-Americanism, anti-Semitism/anti-Zionism and ‘Palestinianism'” (p. 10).  The targets may change, but Muslim objectives and strategies remain constant:  “Hostage taking, ritual throat slitting, the killing of infidels and Muslim apostates are lawful, carefully described, and highly praised jihad tactics recorded, over the centuries, in countless legal treatises on jihad” (p. 159).

The French have pioneered the current process of accommodation.  Charles de Gaulle, whose pride was injured by his exclusion from the Yalta Conference as WWII ended, and who witnessed the demise of France’s colonial empire, determined to create a French-led alliance of Mediterranean states that would effectively counteract the growth of American power.  His successors hoped France would become the “protector of Islam and Palestinians against America and Israel.  They hoped that a pro-French Islam would facilitate the quiet control of former colonies within the French orbit and spread French culture,” ultimately establishing “an enormous market” that would restore France’s former glory (p. 148).  The unsuccessful Egyptian-Syrian assault on Israel and the Arab oil embargo in 1973 accelerated such developments.  Europeans abruptly began discussing and defending the “Palestinian people,” a new name for Arabs living in the disputed region between Israel and Jordan.  The nation of Israel, which had enjoyed Europe’s support for 25 years, suddenly became the bête noire of the Middle East.  Hungry for oil, Europe began to support the Muslim dictators who supplied it.  Needing workers during that era of economic expansion, Europe encouraged immigrants from North Africa and Turkey to relocate.

In the midst of all this, innumerable conferences were held and papers published as part of the EAD (the “Euro-Arab Dialogue”) regarding the new coalition.  Bat Ye’or seems to have read every report of every such conference, and she takes seriously their wording, for they seem to have slowly shaped EU policies.  At such gatherings, attended by Arab and European elites, impressive words were uttered regarding human rights, religious rights, women’s rights, workers’ rights, etc.  And the Muslim workers flooding Europe have enjoyed such “rights.”  Still more:  they have secured special privileges and protections under the auspices of the increasingly powerful European Union.  This was, of course, a one-way process.  Arabs, for instance, demand absolute respect for Islam in European lands but continue to persecute Jews and Christians wherever Islam (with its Shari’a) reigns.

Muslims even insist that Islam rather than Christianity be recognized as the primary “civilization”–indeed “the spiritual and scientific fountainhead of Europe” (p. 98).  Robin Cook, the British Foreign Secretary, actually said that “Islamic art, science and philosophy,” along with Greek and Roman culture, had helped shape England (p. 172)!  A fantasyland of Muslim rule in Medieval Spain–”the Andalusian utopia”–is routinely cited as evidence of past Islamic tolerance and educational sophistication.  And, as the recent draft of an EU constitution indicates, Muslim influence grows.  Arab delegates to various EAD sessions ever demand that Europe’s schools and textbooks favorably portray Muslims and Islamic history, whereas Jews and Christians are routinely demonized by Muslim educators.  “Through the ‘Dialogue,'” Bat Ye’or insists, “Arab League politicians and economists have gained a firm ascendancy over Europe’s policy and economy” (p. 123).

Multiplied billions of dollars have been extorted from Europe, sent as “aid” and “development funds” to Muslim countries–the dar al-islam world ruled by Shari’a.  “The huge sums that the EU pays to Arab Mediterranean countries and the Palestinians amount to another tribute exacted for its security within the dar al-sulh [i.e. the subservient state of dhimmitude].  Europe thereby put off the threat of a jihad aimed at the dar al-harb [the world Muslims are obligated to attack and control] by opting for appeasement and collusion with international terrorism–while blaming the increased world tensions on Israel and America so as to preserve its dar al-sulh position of subordinate collaboration, if not surrender, to the Islamists” (p. 77).  By supporting the economic and cultural policies Arabs demand, Europeans have, Bat Ye’or thinks, surrendered their lands and traditions.   By surrendering, they have become dhimmis, for “dhimmis do not fight.  Dhimmitude is based on peaceful surrender, subjection, tribute, and praise” (p. 204).

When President Bush fought back, following the 9/11 attacks, subduing the Taliban in Afghanistan and invading Iraq, “the Anti-Americanism that had been simmering for years among European Arabophiles, neo-Nazis, Communists, and leftists in general” (p. 227) boiled over.  In a profound sense, Bat Ye’or says, such anti-Americanism thrives in “cowardly or impotent societies, which have chosen surrender through fear of conflict” (p. 242).  It’s the resentment of the weak regarding the strong, “an intellectual totalitarianism disguised as a virtue for states which have entrusted their security to those who threaten them” (p. 242).

Still more:  “By implicitly enlisting in the Arab-Islamic jihad against Israel–under labels such as ‘peace and justice for Palestinians’–Europe has effectively jettisoned its values and undermined the roots of its own civilization.  It even struck a blow against worldwide Christianity, abandoning the Christians in Lebanon to massacres by Palestinians (1975-83), those of Sudan to jihad and slavery, and the Christians of the Islamic world to the persecutions mandated by the dhimma system” (p. 115).  Despite this, however, “the EU is implicitly abetting a worldwide subversion of Western values and freedoms, while attempting to protect itself from Islamic terrorism by denying that it even exists, or blaming it on scapegoats” (p. 227).

In short:  with precious oil and prolific immigrants the Muslims have moved to impose a state of dhimmitude upon Europe.  Jihad is succeeding as Islam extends its sway–though oil has replaced the sword as the weapon of choice and immigrants rather than warriors serve as agents of occupation!  There’s a war going on in Europe, and Europeans have closed their eyes to preserve an “illusion of peace” (p. 252), much as they did while National Socialism and Communism devoured the continent in the 20th century.

# # #

167 “Built to Last” and “Good to Great”

I rarely read books on “leadership” or business management, finding them generally focused on “bottom line” issues and generally irrelevant (or even contrary) to the academic and religious world that’s always been my primary concern.  Some recent treatises, however, merit consideration, for they explore both the realm of economics and the deeper recesses of human nature.  Economics, in its most basic sense, means household management and obligates one to act wisely for the good of one’s family and community–certainly a central concern for any ethic.  And the ways people do business, in any society, obviously offer many clues to human nature.

A decade ago James C. Collins and Jerry I. Porras published Built to Last:  Successful Habits of Visionary Companies, setting forth the data and insights gained from a six-year research project at the Stanford University Graduate School of Business, and it became the number one business book for 1995.  Two years later the authors added a new introduction and concluding chapter (New York:  HarperCollins, c. 1997), making the paperback edition both more extensive and conclusive, showing how the book’s business principles apply to individuals and small groups within corporations, and to non-profits as well as for-profit organizations.

The authors focused on some “truly exceptional companies that have stood the test of time” (p. xxiii), wondering how they lasted while competitors came and went.  What they discovered, first of all, was the difference between “clock building and time telling.”  Patiently constructing a well-honed organization, with less attention to quarterly statistics, matters much in the long run.  Lasting success comes through institutional soundness, not dramatic leadership or ephemeral enthusiasm.  “Luck favors the persistent.  This simple truth is a fundamental cornerstone of successful company builders.  The builders of visionary companies were highly persistent, living to the motto:  Never, never, never give up” (p. 29).

The best companies were almost uniformly devoted to “more than profits.”  Their main concern has been to preserve their “core values.”  The premier “architects” of great companies generally established a “core ideology” that has persisted, in some instances, for more than a century.  Though Henry Ford himself certainly had some unattractive traits, he seemed to care more for making lots of cars than maximizing his fortune.  “I don’t believe we should make such an awful profit on our cars,” said Ford.  “A reasonable profit is right, but not too much,” said he.  “I hold that it is better to sell a large number of cars at a reasonably small profit . . .  I hold this because it enables a larger number of people to buy and enjoy the use of a car and because it gives a larger number of men employment at good wages.  Those are the two aims I have in life” (p. 53).  Though one should always place such rhetoric in perspective, Ford’s claim rings true for his and a number of built-to-last companies.  Collins and Porras conclude:  “Contrary to business school doctrine, we did not find ‘maximizing shareholder wealth’ or ‘profit maximization’ as the dominant driving force or primary objective through the history of most visionary companies.  They have tended to pursue a cluster of objectives, of which making money is only one–and not necessarily the primary one” (p. 55).

Paul Galvin, the founder of Motorola, consistently defined profits as the means to the company’s goal–making a good product–not its raison d’être.  Galvin’s son and successor, Robert, wrote a series of essays in 1991, stressing such things as “creativity, renewal, total customer satisfaction, quality, ethics, innovation, and similar topics; not once did he write about maximizing profits, nor did he imply this was the underlying purpose–the ‘why’ of it all” (p. 82).  Motorola’s competitor, Zenith, by contrast, lacked such a commitment and, following the death of its founder, focused almost singularly on profits and market share, losing its way in the process.

Importantly:  “You do not ‘create’ or ‘set’ core ideology.  You discover core ideology.  It is not derived by looking to the external environment; you get at it by looking inside.  It has to be authentic” (p. 228).  Furthermore:  “You cannot ‘install’ new core values or purpose into people.  Core values and purpose are not something people ‘buy in’ to.  People must already have a predisposition to holding them.  Executives often ask, ‘How do we get people to share our core ideology?’  You don’t.  You can’t!  Instead, the task is to find people who already have a predisposition to share your core values and purpose, attract and train these people, and let those who aren’t disposed to share your core values go elsewhere” (pp. 229-230).

In addition to preserving core values, great companies continually find innovative ways to “stimulate progress.”  Their identity remains constant but their strategies ever evolve.  This, in fact, is “the central concept of this book:  the underlying dynamic of ‘preserve the core and stimulate progress’ that’s the essence of a visionary company” (p. 82).  Successfully doing so involves five things, each given a separate chapter by Collins and Porras:  1) Big Hairy Audacious Goals; 2) Cult-like Cultures; 3) Try a Lot of Stuff and Keep What Works; 4) Home-grown Management; 5) Good Enough Never Is.

Henry Ford’s Big Hairy Audacious Goal was “to democratize the automobile.”  General Electric, under the legendary Jack Welch, sought to “become #1 or #2 in every market we serve and revolutionize this company to have the speed and agility of a small enterprise” (p. 95).  Boeing, in 1965, determined to build the 747 jumbo jet at all costs–and it nearly cost everything, stretching the company to its absolute maximum.  McDonnell Douglas, by contrast, consistently refused to risk losses and thus failed to successfully compete with Boeing.

Cult-like Cultures characterize companies like Nordstrom.  All employees start at the bottom, working on the floor as salesmen.  There they’re on trial, seeing whether they truly satisfy customers.  Employees receive a card–WELCOME TO NORDSTROM–stating the company’s character:  “We’re glad to have you with our Company.  Our number one goal is to provide outstanding customer service.  Set both your personal and professional goals high.  We have great confidence in your ability to achieve them.  Nordstrom Rules:  Rule #1:  Use your good judgment in all situations.  There will be no additional rules” (p. 117).  The company is fanatically committed to this simple rule, and customer satisfaction has validated its effectiveness.  “Nordstrom reminds us,” say the authors, “of the United States Marine Corps–tight, controlled, and disciplined, with little room for those who will not or cannot conform to the ideology” (p. 138).

Still more:  “This finding has massive practical implications.  It means that companies seeking an ’empowered’ decentralized work environment should first and foremost impose a tight ideology, screen and indoctrinate people into that ideology, eject the viruses, and give those who remain the tremendous sense of responsibility that comes with membership in an elite organization.  It means getting the right actors on the stage, putting them in the right frame of mind, and then giving them the freedom to ad lib as they see fit.  It means, in short, understanding that cult-like tightness around an ideology actually enables a company to turn people loose to experiment, change, adapt, and–above all–to act”  (pp. 138-139).

Built-to-last companies continually adapt to the evolving marketplace by trying “a lot of stuff” and keeping “what works.”  In the methodological sense they are totally pragmatic, remarkably Darwinian.  They tenaciously retain their core values but freely change their modus operandi.  They have a “vision” but rarely craft detailed “long range” plans.  Thus “Bill Hewlett told us that HP ‘never planned more than two or three years out’ during the pivotal 1960s” (p. 144), and they learned from their mistakes.  As R.W. Johnson Jr., said, regarding Johnson & Johnson:  “Failure is our most important product” (p. 147).   A Wal-Mart store in Louisiana placed friendly “people greeters” at the store’s entrance primarily to deter shoplifters, only to discover that it was a marvelous public relations strategy that soon spread throughout the giant retail chain.  GE’s Jack Welch, reading Johannes von Moltke’s writings on military strategy, coined the phrase “planful opportunism” to describe the fact that in business as well as in war “detailed plans usually fail, because circumstances inevitably change” (p. 149).

Welch personifies the “Home-Grown Management” the authors find in most successful companies.  Bringing in an outsider–whether because of his charismatic gifts, her politically correct sex, or some alleged need for “fresh blood”–rarely helps a company.  Welch succeeded at GE because he followed a century of highly successful CEOs.  Amazingly, “across seventeen hundred years of combined history in the visionary companies, we found only four individual cases of an outsider coming directly into the role of chief executive” (p. 173).  Companies that nourish employees’ development, recognize their talent, and reward their commitment find able leaders to assume control of the corporation.  Companies that don’t–such as Disney in the ’70s–flounder while hiring outsiders like Michael Eisner.

“Good Enough Never Is” means that successful companies never rest on their laurels.  Their CEOs demand continual improvement.  Thus J. Willard Marriott, Sr., said:  “Discipline is the greatest thing in the world.  Where there is no discipline, there is no character.  And without character, there is no progress. . . .  Adversity gives us opportunities to grow.  And we usually get what we work for” (p. 188).  His son sustained his “Mormon work ethic,” putting in 70-hour weeks and diligently traveling to make sure his facilities were first-rate.  Howard Johnson’s son, however, left the details of the organization to others while he enjoyed the “good life” in New York.  Before long Howard Johnson was failing while Marriott continued to prosper.  Momentary successes, however impressive, are but stepping stones to ever-higher goals.  “Like great artists or inventors, visionary companies thrive on discontent” (p. 187).   Like great coaches, thriving companies continually recruit the best available talent, work incessantly on training and motivating employees, and challenge everyone to excel in everything they do.

* * * * * *  * * * * * * * * * * * * * *

Having analyzed companies that were “built to last,” Jim Collins (assisted by 10 researchers) sought to explain why a few of them truly excel in Good to Great:  Why Some Companies Make the Leap and Others Don’t (New York:  HarperBusiness, c. 2001).  “Good is the enemy of great,” he says in his first sentence (p. 1).  There are lots of “good” schools, teams, churches, and businesses.  Because they’re “good enough,” however, they never generate the commitment necessary to become truly great, an attribute Collins grants to only eleven of the Fortune 500 companies–Abbott; Circuit City; Fannie Mae; Gillette; Kimberly-Clark; Kroger; Nucor; Philip Morris; Pitney Bowes; Walgreens; Wells Fargo.  So he sought to discover the “timeless principles–the enduring physics of great organizations–that will remain true and relevant no matter how the world changes around us” (p. 15).

Some of the things they didn’t find are striking.  “Larger-than-life, celebrity leaders who ride in from the outside are negatively correlated with taking a company from good to great” (p. 10).  Financial packages for top executives matter little.  Long-range planning strategies aren’t a factor.  Nor do cutting-edge technologies, mergers and acquisitions, motivational novelties, or various other voguish “keys” make for success.  Conversely–and far less flamboyantly–what truly mattered, as companies climbed to greatness, was “a process of buildup followed by breakthrough, broken into three broad stages:  disciplined people, disciplined thought, and disciplined action” (p. 12).  Great “companies have a culture of discipline.  When you have disciplined people you don’t need hierarchy.  When you have disciplined thought, you don’t need bureaucracy.  When you have disciplined action, you don’t need excessive controls.  When you combine a culture of discipline with an ethic of entrepreneurship, you get the magical alchemy of great performance” (p. 13).

It all begins with what Collins calls a “Level 5 Executive,” such as Darwin E. Smith at Kimberly-Clark, who blends “personal humility and professional will” (p. 20) and orchestrates the transformation.  A shy, self-effacing, hard-working farm boy who slowly moved up the ranks of the company, Smith brought a “ferocious resolve” to renew an aging paper producer and did so.  Like Smith, Level 5 leaders are “incredibly ambitious–but their ambition is first and foremost for the institution, not themselves” (p. 21).  The eleven CEOs whose companies “met the exacting standards for entry into this study” (p. 28) were remarkable men, but they’re largely unknown!  They rarely graced the cover of People Magazine or appeared on 60 Minutes or dined with Barbra Streisand!  They rarely talked about themselves, and admiring outsiders tended to focus on the companies, not the executives who ran them!  Level 5 Executives take responsibility for failures and generously praise others for successes.  Lee Iacocca, by contrast, often seemed to lead Chrysler as a means of self-promotion, so “that insiders at Chrysler began to joke that Iacocca stood for ‘I Am Chairman of Chrysler Corporation Always'” (p. 30).  And Iacocca’s flamboyant success in the 1980s resembled a soaring, then quickly deflated, balloon.

The second phase in moving from good to great is finding the right (and eliminating the wrong) people.  People matter more than plans.  “First who, then what,” guides the great companies.  Nucor succeeded because the company found “that you can teach farmers how to make steel, but you can’t teach a farmer work ethic to people who don’t have it in the first place” (p. 50).  So the company established steel plants in rural areas and focused on hiring men who knew how to work.  Great companies consider an employee’s character more important than job training or school degrees.

Thirdly, great companies “confront the brutal facts (yet never lose faith)” (p. 65).  In the 1960s, while the grocery giant A&P faltered by clinging to antiquated practices, “Kroger began to lay the foundations for a transition” (p. 65) that made it the number one grocery chain by 1999.  Facing facts, not dreaming dreams, distinguishes solid leaders.  The reason “charismatic” leaders often fail, in the long run, is their penchant for casting unrealistic visions.  Winston Churchill had great oratorical ability, and his words inspired the world in the 1940s.  But he was adamantly realistic, demanding to know the “brutal facts” during the war, and his decisions were rooted in reality, not rhetoric.  Leaders aren’t cheerleaders.  “If you have the right people on the bus, they will be self-motivated.  The real question then becomes:  how do you manage in such a way as not to de-motivate people?  And one of the single most de-motivating actions you can take is to hold out false hopes, soon to be swept away by events” (p. 74).

Next Collins explains “the hedgehog concept,” an important aspect of the breakthrough phase.  Unlike foxes, who dash in a dozen different directions, pursuing the freshest trail, hedgehogs “simplify a complex world into a single organizing idea, a basic principle or concept that unifies and guides everything” (p. 91).  Walgreens, for example, decided to establish “the best, most convenient drugstores, with a high profit per customer visit” (p. 92).  Committed to that task, Walgreens prospered while Eckerd (hungry for growth in any area, such as video games) withered.  The hedgehog concept brings together three essentials:  1) determining “what you can be the best in the world at;” 2) knowing that you can be well paid for your efforts; and 3) discovering that you deeply care for and love what you do (p. 96).

The fifth step in becoming great is “a culture of discipline” that is nourished rather than imposed.  Holding employees responsible, but granting them freedom to make their own distinctive contributions, distinguishes great organizations.  This requires rigorous recruitment and hiring–getting “self-disciplined” employees who are committed to the company.  “In a sense, much of this book,” says Collins, “is about creating a culture of discipline.  It all starts with disciplined people.  The transition begins not by trying to discipline the wrong people into the right behavior, but by getting self-disciplined people on the bus in the first place” (p. 126).  “Throughout our research, we were struck by the continual use of words like disciplined, rigorous, dogged, determined, diligent, precise, fastidious, systematic, methodical, workmanlike, demanding, consistent, focused, accountable, and responsible” (p. 127).

Finally, there are “technological accelerators.”  Good-to-great companies freely utilize the latest technologies, but they think differently about them.  Their core values, their hedgehog tenacity, determine the use of technologies.  Rather than insisting that everything be the latest and finest, they pick and choose precisely what new things will actually contribute to the organization’s goals.  Technologies may help, but they never create the momentum needed for success.  If the latest technology fits the goal, then every possible effort must be made to master and utilize it.  Be the very best in making it work for you.  If not, let it alone.

* * * * * * * * * * * * * * * * * *

Quite different in its approach to organizational success is The Way of the Shepherd:  7 Ancient Secrets to Managing Productive People (Grand Rapids:  Zondervan, c. 2004), by Kevin Leman and William Pentak.  The short book tells a story about an ambitious MBA student at the University of Texas who wanted to learn everything his professor could impart–and ended up making weekly visits to the professor’s nearby ranch, learning the art of shepherding.

First, you must “know the condition of your flock.”  This means keeping constantly in touch with employees, getting to know them personally, attending to their daily needs.  Isolated executives inevitably fail to understand the true condition of their organizations.  Nothing substitutes for walking about the workplace, asking questions, answering questions, being available.  Second, you must “discover the shape of your sheep.”  Before you hire an employee, discern whether he or she will contribute to the health of the organization.  Make sure you get healthy sheep and monitor their condition on a regular basis.  It’s the sheep, not the shepherd, who produce the wool and mutton!  Thirdly, you must “help your sheep identify with you.”  To do this you must model “authenticity, integrity, and compassion” (p. 51).  Living out the high standards you expect of your employees, carefully and continually communicating your own values and vision, elicits commitment from your “sheep.”  Being “professional” isn’t enough, because good leaders are primarily “personal” and they treat their workers as subjects rather than objects.

To “make your pasture a safe place” means making sure your employees don’t fight over scarce resources.  There must be enough good grass to eat.  Folks who are content where they are rarely look for “greener pastures.”  Just as sick sheep must be culled from the flock before they spread contagious diseases so too must disgruntled employees be dismissed.  This helps explain the need for a shepherd’s “rod and staff.”  At times one must tap a straying ewe with a staff to rightly direct her.  When a lamb gets stuck in a crevice, the crook on the staff enables one to rescue him.  Ever out front, leading, the shepherd uses the tools necessary to direct and encourage, to nudge or correct, his sheep.  Persuasion, not coercion, is most often the key–but there must be fence lines and limits to the freedom granted one’s flock.  The shepherd’s rod provides the means to protect the sheep from predators–both outsiders and insiders.  A shepherd uses a rod, when necessary, to discipline a wayward lamb or a deviant rebel.

Finally–the seventh point–a good shepherd has a good heart.  Hirelings often do the shepherd’s work, but they often do it poorly because they’re hirelings.  Getting paid is not a sufficient motivation to do the demanding work of a real shepherd.  Having a heart for people, actually caring for them and their situation, makes one a really good leader.  A good business is a good place to work, and a good workplace makes good things.

The Way of the Shepherd is a quick read, obviously rooted in biblical principles running from Psalm 23 to Jesus’ words concerning His shepherd’s role.

166 Benedict XVI

Following the election of Joseph Cardinal Ratzinger as Pope Benedict XVI, I have read or re-read half-a-dozen of his works in an effort to better understand the new pontiff.  Doing so illuminates both the man and (through him) the Roman Catholic Church and the modern world.  Ratzinger provides us a brief overview of his first 50 years in Milestones:  Memoirs 1927-1977 (San Francisco:  Ignatius Press, 1998).  Born and reared in Bavaria by devout parents, he enjoyed a blessed childhood.  His father, a rural policeman, moved frequently about the region between the Inn and Salzach rivers, and he retired at the age of 60 (in 1937) to a house outside Traunstein.  The area is richly rooted in history, reaching back several millennia to the Celts and Romans.  It was early christianized by Irish missionaries.  Ratzinger conveys the sense that he knows his land and people and finds stability therein.

Ratzinger attended the gymnasium in Traunstein, where he thoroughly mastered Latin and Greek, a linguistic foundation for his later mastery of theology.  He began such studies just in time, for Hitler’s National Socialist regime soon required students to study science and modern languages rather than the classics.  Students such as himself, however, were “grandfathered” in and allowed to complete their classical curriculum.  Entering adolescence, he decided to enter the priesthood.  In 1943, as Hitler’s war effort began crumbling, all boarding school students (Ratzinger included) were required to serve in a civil defense force.  When he became eligible for military service, he was spared active duty, but he was forced to work in a labor camp (which he fled as the war was ending) and thus support the regime.

When the war ended, Ratzinger resumed his seminary education at Freising.  Despite the lack of virtually everything material, the students joined together and zestfully studied for the priesthood, delving into a broad spectrum of philosophy and literature as well as theology.  From Freising, Ratzinger went to Munich to study at the university.  Here he encountered outstanding scholars and relished the challenge of new ideas and diverse perspectives.  He also dug deeply into biblical studies and the thought of St. Augustine.  “When I look back on the exciting years of my theological studies,” he recalls, “I can only be amazed at everything that is affirmed nowadays concerning the ‘preconciliar’ Church” (p. 57).  Rather than being a tradition-bound static era, it was a time of ferment and radical questioning.

His intellectual brilliance fully evident, Ratzinger was encouraged to pursue the doctorate and did so while serving as an assistant pastor in Munich.  He worked hard in youth ministry, received his degree, and then began teaching in the seminary in Freising.  Subsequently he moved to Bonn, where he was awarded the chair in fundamental theology.  Soon thereafter (moving quickly up the academic ladder) he was invited to Munster, then Tubingen and Regensburg.  In the midst of his moves, he was fully involved in the theological discussions of the ’50s and ’60s–including the efforts of some to reduce Revelation to the historical-critical method of biblical exegesis.  While at Tubingen, he saw existentialism literally collapse, to be replaced by the pervasive Marxism that continues to shape European universities.  His encounters with Karl Rahner ultimately led him to note that “despite our agreement in many desires and conclusions, Rahner and I lived on two different theological planets” (p. 128).  Scripture and the Fathers, not Kant and modern thought, were his beacons of truth.

Fully expecting to remain in academia for a lifetime, Ratzinger was, quite unexpectedly, appointed archbishop of Munich and Freising in 1977.  He chose, as his episcopal motto, a “phrase from the Third Letter of John, ‘Co-worker of the Truth'” (p. 153).  To fulfill that calling, he sought to anchor his diocese to the eternal Rock of Christ.  Committing one’s all to “the side of God,” of course, never guarantees worldly success, even in the Church.  But it does give stability to one’s decisions.  And it explains why Pope John Paul II soon called on Ratzinger to take control of the Congregation for the Doctrine of the Faith.

* * * * * * * * * * * * * * * * * * * *

Not long after assuming his new position in Rome, Cardinal Ratzinger was interviewed by Vittorio Messori, an Italian journalist.  The written record of that meeting, The Ratzinger Report (San Francisco:  Ignatius Press, c. 1985), provides considerable insight into both Ratzinger himself and his concerns for the Church.  He appears as a deeply devout man, clearly troubled by certain developments in the Catholic world following Vatican II that changed the Church more in 20 years than in the previous 200.  Especially troubling were theological currents, recklessly justified as in “the spirit of Vatican II,” which undermined the very foundations of faith.

To Ratzinger, orthodoxy–right belief–must ever remain preeminent in the life of believers, for “faith is the highest and most precious good–simply because truth is the fundamental life-element for man.  Therefore the concern to see that the faith among us is not impaired must be viewed–at least by believers–as higher than the concern for bodily health” (p. 22).  Timeless truths (e.g. sin and grace) ever offend secularists, but the Church must proclaim them.  Providing an example, the cardinal noted that he hoped some day to have the time to return to “the theme of ‘original sin’ and to the necessity of a rediscovery of its authentic reality” (p. 79).  Failing to take seriously this doctrine “and to make it understandable is really one of the most difficult problems of present-day theology and pastoral ministry” (p. 79).  But unless we’re sinners, we need no salvation!  Yet all around us “Christian” preachers refuse to tell folks the truth about sin!  Poorly informed, many folks just assume that everyone somehow goes to heaven because we’re all good enough to deserve it.  The Church has been entrusted with one great task:  to tell Truth to the world.  You don’t discern Truth by counting ballots.  The Church is sacramental and hierarchical, not social and democratic.  By teaching the Credo, the Our Father, the Decalogue, and the sacraments, Christ’s ministers can effectively lift up the Truth that’s sufficient for man’s redemption.

Thus the Church is, Ratzinger insisted, a divinely directed rather than a purely human institution and as such must live differently from the world.  Sadly enough, “We have lost the sense that Christians cannot live just like ‘everybody else'” (p. 115).  There is, along with orthodoxy, an orthopraxy that should characterize the Church.  So he insists on the ancient phrase:  Ecclesia semper reformanda.  But true reform comes neither from bureaucrats nor critics.  Reform ever results from the leaven of saints!  “Saints, in fact, reformed the Church in depth, not by working up plans for new structures, but by reforming themselves.  What the Church needs in order to respond to the needs of man in every age is holiness, not management” (p. 53).

The manifest lack of–indeed, contempt for–personal holiness characterizes the sexual revolution birthed in the ’60s.  In time we will lament, Ratzinger said, “the consequences of a sexuality which is no longer linked to motherhood and procreation.  It logically follows from this that every form of sexuality is equivalent and therefore of equal worth” (p. 85).  Unfettered from any real end, “the libido of the individual becomes the only possible point of reference of sex” and everyone does pretty much as he desires.  Consequently, “it naturally follows that all forms of sexual gratification are transformed into the ‘rights’ of the individual.  Thus, to cite an especially current example, homosexuality becomes an inalienable right” (p. 85).  Yet another current of the sexual revolution, radical feminism, has greatly troubled the Church, seeking to alter her very structure and message.  The cardinal was “in fact, convinced that what feminism promotes in its radical form is no longer the Christianity that we know; it is another religion” (p. 97).

In the face of much moral chaos, however, Ratzinger insists the Church must retain her focus and restate her message, come what may!

* * * * * * * * * * * * * * * * * * * *

A decade later another journalist, Peter Seewald, interviewed Ratzinger and published their conversations in Salt of the Earth:  Christianity and the Catholic Church at the End of the Millennium (San Francisco:  Ignatius Press, c. 1996).  Seewald provides a personal introduction, indicating that he had, as a youngster, rejected the Faith and thus interviewed Ratzinger with some genuine personal concerns regarding himself as well as his subject.  So his first section focused on “The Catholic Faith:  Words and Signs.”

Ratzinger’s interested mainly in philosophy, theology, doctrine, ethics.  He grants that knowing theology doesn’t make one a better person, but when rightly studied and appropriated it matters eternally–both for an individual and the Church.  Though more celebrated “problems” may capture newspaper headlines, the real crisis in the Church today is theological, for she’s entrusted with declaring what one ought to believe.  To Ratzinger, “To the substance of the faith belongs the fact that we look upon Christ as the living, incarnate Son of God made man; that because of him we believe in God, the triune God, the Creator of heaven and earth; that we believe that this god bends so far down, can become so small, that he is concerned about man and has created history with man, a history whose vessel, whose privileged place of expression, is the Church” (p. 19).  In our day, especially in Europe, where the Church now represents a minority of the population, it takes courage to uphold the Faith in the face of mounting hostility.

Shifting from the discussion of Faith, Seewald asked Ratzinger a number of biographical questions.  (If one’s read the cardinal’s Milestones, much in this section is repetitious, though one certainly gets fresh perspectives as he answers questions.)  He acknowledges that he is something of a Platonist and is openly devoted to St. Augustine.  He also cites a turning point, for him personally, when Marxists suddenly gained power, especially in the universities, in the late ’60s.  He instantly knew that “Christians” trying to mix Marx with Jesus–flying the flag of  “progressivism”–would lose their integrity as Christians.  Since that time, “progressives” within the Catholic Church have sought to change her sexual standards, to install female priests, to make the Church something akin to themselves rather than Christ.

Obviously, Ratzinger noted, “not all who call themselves Christians really are Christians” (p. 220).  Real Christians seek to live out the Christ-like life divinely imparted to them.  They’re not intent on changing the world!   Indeed, as the 20th century demonstrates, “everything depends on man’s not doing everything of which he is capable–for he is capable of destroying himself and the world–but on knowing that what ‘should’ be done and what ‘may’ be done are the standard against which to measure what ‘can’ be done” (p. 230).  To give us direction we need spiritual renewal, not political revolution.  We need saints, not power-hungry protesters.  “What we really need,” says Ratzinger, echoing his words in The Ratzinger Report, “are people who are inwardly seized by Christianity, who experience it as joy and hope, who have thus become lovers.  And these we call saints” (p. 26).

* * * * * * * * * * * * * * * * * * * *

The third set of published interviews, God and the World:  A Conversation with Peter Seewald (San Francisco:  Ignatius Press, c. 2000), further enriches our understanding of Pope Benedict XVI.  By now the journalist Seewald had returned to the Faith and his questions are both more informed and sympathetic.  The conversations took place during three days in the abbey of Monte Cassino.  That a book of 460 pages, dealing expertly with the whole spectrum of Christianity, can be compiled in three days indicates something of the genius of Ratzinger!

Setting the stage in his preface, hinting at his own journey back to faith, Seewald wondered what to make of the fact that “Within a short period of time, something like a spiritual nuclear attack had befallen large sections of society, a sort of Big Bang of Christian culture that was our foundation” (p. 13).  In response, Ratzinger, “one of the Church’s great wise men,” as Seewald recalls, “. . . patiently recounted the gospel to me, the belief of Christendom from the beginning of the world to its end, then, day by day, something of the mystery that holds the world together from within became more tangible.  And fundamentally it is perhaps quite simple.  ‘Creation,’ said the scholar, ‘bears within itself an order.  We can work out from this the ideas of God–and even the right way for us to live'” (pp. 14-15).  Faith and love, rightly amalgamated, provide us that way.

Consequently, the Faith, rooted in the Truth of Revelation, cannot be compromised.  “I always recall the saying of Tertullian,” Ratzinger says, “that Christ never said ‘I am the custom’, but ‘I am the truth'” (p. 35).  Thus the task of the Church, in the words of Romano Guardini, is to:  “‘steadily hold out to man the final verities, the ultimate image of perfection, the most fundamental principles of value, and must not permit herself to be confused by any passion, by any alteration of sentiment, by any trick of self-seeking'” (p. 65).  To the cardinal:  “Christianity makes its appearance with the claim to tell us something about God and the world and ourselves–something that is true and that, in the crisis of an age in which we have a great mass of communications about truth in natural science, but with respect to the questions essential for man we are sidelined into subjectivism, what we need above all is to seek anew for truth, with a new courage to recognize truth.  In that way, this saying handed down from our origins, which I have chosen as my motto, defines something of the function of a priest and theologian, to wit, that he should, in all humility, and knowing his own fallibility, seek to be a co-worker of the truth” (p. 263).

Seeing the truth, discerning the Logos in creation, enables one to share Sir Isaac Newton’s conviction that “The wonderful arrangement and harmony of the universe can only have come into being in accordance with the plans of an omniscient and all-powerful Being.  That is, and remains, my most important finding” (p. 47).  The clear mathematical structure of the cosmos reveals its Logos.  With equal rationality, one discerns moral truths that are as objective and inflexible as mathematical formulae.  The Ten Commandments, explained by Ratzinger as “commandments of love” (p. 180), are always and everywhere valid because they tell us the truth about God and ourselves.  Thus it follows, he says, that “Setting moral standards is in fact the most prominent work of mercy” (p. 317).

Since Seewald guides Ratzinger through the major themes of the Catechism, God and the World is a rather handy, informal primer for the Catholic faith.  Combined with The Ratzinger Report and Salt of the Earth, it provides valuable insight into the personality and theology of the new pontiff.

* * * * * * * * * * * * * * * * * * * *

Ratzinger played a significant role, as a young theologian, in the deliberations of the Second Vatican Council.  Soon thereafter, in 1967, he gave a series of lectures at Tubingen that were published as Introduction to Christianity (New York:  The Seabury Press, c. 1969).  Here we find the scholar, citing current theologians, documenting positions, doing intellectual work of the highest order.  The foundation for much of what he has said in the 40 years since was laid in this book.

He first addressed “belief in the world of today,” noting the widespread disbelief that challenges the Church.  Multitudes cannot believe in anything intangible.  Even within the Church, many find themselves troubled with doubts of all kinds.  Ratzinger sheds light on the problem by tracing its history, especially evident in the shift from the Ancient and Medieval position that “Verum est ens” to the view of Giambattista Vico, that “Verum quia factum.”  Following Vico, thinkers like Hegel and Marx reduce all questions to historical issues and consider what man makes, not what God has made.  In time (initially among the intelligentsia but now everywhere) belief in God slowly eroded away.

To introduce modern man to Christianity, then, Ratzinger proposed that Christians primarily declare “I believe in You,” meaning Jesus Christ, and encourage him to find “God in the countenance of the man Jesus of Nazareth” (p. 48).  When one recites the Apostles’ Creed, one certainly assents to its propositions, but more importantly he gives witness to his conversion–his “about-turn”–to the way of Christ.  Having heard the Word, one takes it in from an outside source.  Faith comes to us from God.  We do not construct a worldview of some sort and then decide to live accordingly.  Rather, we take, as a gift, what is revealed to us.  “Christian belief is not an idea but life; it is not mind existing for itself, but incarnation, mind in the body of history and its ‘We’.  It is not the mysticism of the self-identification of the mind with God but obedience and service” (p. 64).

To believe in One God involves not merely monotheism but–unlike the tendency to construct localized or tribal deities–the acknowledgement that the “God of our fathers” is “not the god of a place, but the god of men:  the God of Abraham, Isaac and Jacob.  He is therefore not bound to one spot, but present and powerful wherever man is” (p. 83).  This God simply Is.  He’s not in the process of becoming something, as is the world around us.  This “God who ‘is’ is at the same time he who is with us; he is not just God in himself, but our God, the ‘God of our fathers’” (p. 88).  The God of our fathers is also “our Father.”  Ratzinger says:  “By calling God simultaneously ‘Father’ and ‘Almighty’ the Creed has joined together a family concept and the concept of cosmic power in the description of the one God.  It thereby expresses accurately the whole point of the Christian image of God:  the tension between absolute power and absolute love, absolute distance and absolute proximity, between absolute Being and a direct affinity with the most human side of humanity” (p. 104).

Creation bears witness to its Maker.  “Einstein said once that in the laws of nature ‘an intelligence so superior is revealed that in comparison all the significance of human thinking and human arrangements is a completely worthless reflection'” (p. 106).  To Ratzinger, this means that we merely re-think “what in reality has already been thought out beforehand” (p. 106).  There is a Logos giving rational structure to all that is.  Rejecting the notion that the world is a purely random collection of material things, Christians marvel at it as the artistry of a divine Mind.  This God has revealed himself, preeminently in the Christ who referred to both His Father and the Spirit.  “God is as he shows himself; God does not show himself in a way in which he is not.  On this assertion rests the Christian relation with God; in it is grounded the doctrine of the Trinity; indeed, it is this doctrine” (p. 117).  History reveals how easily we err in trying to rationally explain this doctrine–sliding into monarchianism or subordinationism.  The Church, wisely, has insisted we be “content with a mystery which cannot be plumbed by man” (p. 118).

So too there is a mystery to Jesus Christ, the Word made flesh, reconciling the world to himself.  Fully aware of various theories concerning “the Jesus of History” and the “Christ of Faith,” Ratzinger finds “it preferable and easier to believe that God became man than that such a conglomeration of hypotheses represents the truth” (p. 159).  Thus the Virgin Birth reveals “how salvation comes to us; in the simplicity of acceptance, as the voluntary gift of the love that redeems the world” (pp. 210-211).

We’re saved by grace, given to us by a loving Father in the person and work of His Son.  To respond with faith and love makes us Christian.

165 Pragmatism’s Founders

American historians generally note that “pragmatism” is this nation’s only uniquely homespun philosophy.  Now almost all philosophical labels are to a degree misleading generalizations, and the men who crafted pragmatism were hardly of one mind.  Nevertheless, as Louis Menand shows in his marvelously informative The Metaphysical Club:  A Story of Ideas in America (New York:  Farrar, Straus, Giroux, c. 2001), a handful of New England Pragmatists (Oliver Wendell Holmes, Jr.; William James; Charles S. Peirce; John Dewey) shared certain perspectives and deeply shaped this nation.  These men’s biographies–including family roots, personal experiences, social connections, New England backgrounds, and the challenge of Darwinism–provide valuable contexts for understanding their thought.

Indeed:  “together they were more responsible than any other group for moving American thought into the modern world. . . .  Their ideas changed the way Americans thought–and continue to think–about education, democracy, liberty, justice, and tolerance.  And as a consequence, they changed the way Americans live–the way they learn, the way they express their views, the way they understand themselves, and the way they treat people who are different from themselves.  We are still living, to a great extent, in a country these thinkers helped to make” (pp. x-xi).  To distill their positions to a single sentence:  “they all believed that ideas were not ‘out there’ waiting to be discovered, but are tools–like forks and knives and microchips–that people devise to cope with the world in which they find themselves” (p. xi).

Menand first treats Oliver Wendell Holmes, Jr., who grew up in Boston, the son of an eminent physician and writer.  Unlike his famous father, he fully subscribed to the abolitionist agenda and enthusiastically marched off to battle when the Civil War began.  He proved to be a courageous, repeatedly wounded soldier.  But in the course of the war he lost his faith in both abolitionism and God.  Moral and metaphysical certainties of any stripe, he decided, lead to ghastly violence.  He simultaneously discovered and fully embraced the philosophical naturalism espoused by Charles Darwin in On The Origin of Species.   (Importantly, one of the constants in the Pragmatists’ story is the influence of Darwin’s theory of evolution through natural selection.  However one responds to the biological hypothesis, one cannot deny its pervasive philosophical and sociological consequences.)

Losing faith in God and social reform on the battlefield, Holmes substituted an admiration for his fellow soldiers and the ultimate prerogatives of power.   For the rest of his life he routinely recounted his involvement in battles and reminded folks of his wounds.  Though distressed by the war’s violence, he still seemed fixated on it.  And he clearly concluded that “might makes right” because there really isn’t any ultimate “right.”  Before Nietzsche uttered his oracles Holmes had settled into a Nietzschean nihilism.  “‘You respect the rights of man–,’ he wrote to Laski.  ‘I don’t, except those things a given crowd will fight for–which vary from religion to the price of a glass of beer.  I also would fight for some things–but instead of saying that they ought to be I merely say they are part of the kind of world that I like–or should like'” (p. 63).

Like Holmes, William James was sired by an illustrious father, Henry, who embodied both the enthusiasm and anarchical sectarianism of America’s Second Great Awakening.  Henry passed through a variety of intense religious experiences and even studied briefly at Princeton Theological Seminary.  In time he embraced Swedenborgianism, wherein he enjoyed the freedom to shape his own mystical religious convictions in accord with his own experiences.  No church ever suited him, so he became his own church.  Like many who inherit great wealth and never work to earn a living, he was fully fascinated with socialism, drinking draughts of Charles Fourier and fantasizing about “‘the realization of a perfect society, fellowship, or brotherhood among men’” (p. 85).

Young William, after traipsing about Europe and picking up a smattering of education from various tutors and schools, ultimately studied biology with the acclaimed Louis Agassiz at Harvard.  In time James embraced the very Darwinism that Agassiz rejected, though he was, of course, fully aware of its implicit, inescapable determinism.  For, Menand emphasizes:  “The purpose of On the Origin of Species was not to introduce the concept of evolution; it was to debunk the concept of supernatural intelligence–the idea that the universe is the result of an idea” (p. 121).  One may quite easily believe in a form of evolution under divine guidance.  “What was radical about On the Origin of Species was not its evolutionism, but its materialism” (p. 121).

But a materialist William James was not and could not be, so he struggled to carve out realms of personal freedom within the broader scope of biological necessity.  He could not abide Thomas Huxley’s conclusion that “We are conscious automata.”  Somehow the processes of natural selection had mysteriously spun out human beings who freely choose what to think and how to live.  “There is intelligence in the universe:  it is ours.  It was our good luck that, somewhere along the way, we acquired minds.  They released us from the prison of biology” (p. 146).  Thus the pragmatism James espoused was, he said, “the equivalent of the Protestant Reformation” (p. 88), a new faith for the new scientific world.  If believing in God and freedom enables one to live better, such beliefs are “true.”

Charles S. Peirce, like James and Holmes, was the son of a prominent Bostonian, Professor Benjamin Peirce.  His father taught mathematics at Harvard and was, in his own right, a highly significant intellectual.  He considered himself an “idealist,” for “he believed that the universe is knowable because our minds are designed to know it.  ‘In every form of material manifestation,’ he explained, ‘there is a corresponding form of human thought, so that the human mind is as wide in its range of thought as the physical universe which it thinks.  The two are wonderfully matched.’  Thought and matter obey the same laws because both have a common origin in the mind of a Creator.  This is why the truths of mathematical reasoning (as [Benjamin] Peirce often reminded his students) are God’s truths” (p. 156).

Young Charles Peirce was precociously brilliant–and almost equally eccentric.  Thus he never settled into an established career.  He earned a living, primarily, as an employee of a federal bureaucracy, thanks to his father’s influence.  And he wrote reams of material never published in his lifetime.  Ironically, perhaps the most brilliant of the “pragmatists” was not, in many significant ways, a Pragmatist!  He did, however, deal with significant issues, such as statistics and probability theory, and in these areas insisted on a form of pragmatic epistemology.  Furthermore, like Holmes and James, he addressed the philosophical implications of Darwinism, wondering how we can know anything if the world is merely the product of chance and necessity.  He decided that “chance variation could explain evolution adequately–[but] he thought God’s love must play a more important role, a theory he called ‘agapism,’ derived in part from the Swedenborgian writings of Henry James, Sr.–and he could not imagine a universe devoid of ultimate meaning” (p. 365).  He also argued that great scientists, like Kepler, came to their conclusions through a “kind of guessing Peirce called ‘abduction’; he thought that it was a method integral to scientific progress, and that it pointed to an underlying affinity between the mind and the universe” (p. 367).

The fourth thinker Menand studies, John Dewey, grew up in Vermont and studied at the state’s university in Burlington.  He earned a Ph.D. at The Johns Hopkins University, and then successively taught philosophy at the University of Michigan, the University of Chicago, and Columbia University.  He moved intellectually as well as geographically, shifting (in the 1890s) from Hegelian idealism to a form of Pragmatism (often called Instrumentalism) that he espoused thenceforth.  We learn, he decided, almost exclusively by doing.  So schools should be places where we learn to cook, sew, and construct things; they should be small shops where we work together and solve very practical problems.  Math and science, geography and psychology–whatever’s worth knowing–should be discovered by students engaged in activities of some sort.  In John Dewey progressive educators had their American guru!  Progressive politicians had their guide!  Progressive churchmen had a new Moses!

Menand’s genius is to weave together four men’s biographies and make a tapestry of the times.  His research, evident in both the notes and his obvious familiarity with the subject, bears witness to his patient plowing through archives as well as publications.  His synthesis, making the book much more than a series of biographical vignettes, reveals the fundamental issues and lasting legacy of the men studied.  The style, crisp and alluring, draws the reader into an exciting intellectual adventure.  The Metaphysical Club is certainly one of the finest works of intellectual history of recent decades.

* * * * * * * * * * * * * * * * * * * *

Menand considers Holmes a pragmatist, and inasmuch as he was an ethical consequentialist, he fits into that tradition.  But Albert W. Alschuler, a Professor of Law at the University of Chicago, portrays him as more properly a nihilistic existentialist, much akin to Nietzsche.  He was likewise a Social Darwinist, fully imbued with that bleakly naturalistic philosophy.  Because of Holmes and his followers, “the central lyric of twentieth-century American jurisprudence” is summed up by Perry Farrell, the lead singer of Porno for Pyros, who decreed:  “‘Ain’t no wrong, ain’t no right, only pleasure and pain’” (pp. 189-190).  Deeply displeased with such developments, Alschuler incisively critiques Holmes in Law Without Values:  The Life, Work, and Legacy of Justice Holmes (Chicago:  The University of Chicago Press, c. 2000).  Holmes’s philosophy, rightly examined, is as irrationally adolescent and wrong as Perry Farrell’s song.

Without doubt Justice Holmes, Alschuler argues, “more than any other individual, shaped the law of the twentieth century” (p. 1).  And he cast it in utterly amoral terms, contending “that moral preferences are ‘more or less arbitrary . . . .  Do you like sugar in your coffee or don’t you? . . .  So as to truth'” (p. 1).  In the deepest sense, Holmes rejected “objective concepts of right and wrong” and set a “downward” trajectory that explains the moral vacuity of many recent court decisions.   When we wonder about Supreme Court decisions–involving cases concerning the Ten Commandments, homosexual rights, property rights, partial birth abortion, etc.–we do well to trace their philosophical roots to Oliver Wendell Holmes, Jr.

Holmes was, in many ways, a thoroughgoing skeptic, much like the ancient Sophists such as Thrasymachus (portrayed in Plato’s Republic as Socrates’ amoral antagonist).  And Holmes’s followers, in legal circles today, are legion and sophistic.  Relativism reigns.  Moral truth is whatever the largest or most vociferous or politically correct crowd desires.  Vices and virtues are merely words indicating personal preferences.  “All these American scholars,” Alschuler says, “have tilted from Socrates on the issue that marks the largest and most persistent divide in all jurisprudence.  In ancient Athens, the philosopher Thrasymachus anticipated Holmes by 2,300 years when he said, ‘Justice is nothing else than the interest of the stronger.’

Rejecting this position, Socrates replied that justice was not the enacted will of the powerful but ‘the excellence of the soul.’  He argued that justice was unlike medical treatment (a means to an end) or an amusing game (which had no end beyond itself).  Justice was a good of the highest order–an end and a means, a good to be valued for itself and for its consequences.  In Rome four hundred years later, Cicero described justice as ‘right reason in agreement with nature’” (p. 8).  Cicero and Socrates helped shape the “natural law” or “moral realist” tradition so evident in the founding documents of the United States.  “We hold these truths to be self-evident,” said Jefferson, in a succinct declaration of the natural law, “that all men are created equal and are endowed by their Creator with certain unalienable rights.”  This nation’s Constitution and the laws implementing it were shaped by Locke and Blackstone, then stamped with an American imprint by Madison, Marshall, Story, and Lincoln.  The years from 1776 to 1860 were the “golden age” of American law.

Oliver Wendell Holmes and his followers, however, rejected any “natural” or “divine” law and imposed what Pope Benedict XVI recently described as a “dictatorship of relativism.”  Contrary to Jefferson’s “Declaration of Independence,” Holmes saw “no reason for attributing to a man a significance different in kind from that which belongs to a baboon or to a grain of sand” (p. 23).  “‘All my life,’ said the architect of 20th century American jurisprudence, ‘I have sneered at the natural rights of man’” (p. 26).  Instead, he propounded “a power-focused philosophy,” says Alschuler.  As Thrasymachus asserted, “might makes right.”  Thus, while sitting on the United States Supreme Court, Holmes wrote:  “‘I have said to my brethren many times that I hate justice, which means that I know that if a man begins to talk about that, for one reason or another he is shirking thinking in legal terms’” (p. 89).

On a personal level, biographers agree, Holmes cared little for anyone other than himself.  Revealingly, when he died he left his entire estate to the federal government!  After spending 15 years preparing an authorized biography he never published, Grant Gilmore said:  “‘The real Holmes was savage, harsh, and cruel, a bitter and lifelong pessimist who saw in the course of human life nothing but a continuing struggle in which the rich and powerful impose their will on the poor and weak’” (pp. 31-32).   Which is as it should be because it simply is what is!  In his support of eugenics, for example, he revealed a thinly disguised contempt for the weak and unfit, delighting to uphold laws sterilizing imbeciles and “writing approvingly of killing ‘everyone below standard’ and ‘putting to death infants that didn’t pass the examination’” (p. 29).

In his enthusiasm for military valor and virtues, in his celebrated will-to-power nominalism, he clearly resembled Nietzsche.  The two “were born three years apart and had much in common.  Both viewed life as a struggle for power; both were antireligious . . .; both saw ethics as lacking any external foundation; both could fairly be regarded as existentialists; both saw the suffering and exploitation of some as necessary to the creative work of others; both were personally ambitious and had a strong work ethic; both had a strong sense of personal destiny; . . . both often seemed indifferent to the feelings of those around them; both found in their wartime experiences a metaphor for the universe at large; and both had military-style moustaches” (p. 19).

As a legal scholar, Alschuler gives meticulous attention to Holmes’s writings.  Though they enjoy something of a hallowed place in legal circles, Alschuler finds them sorely deficient in many ways.  He regards The Common Law, the treatise that established Holmes’s reputation in the 1880s, as a “clear failure” (p. 125).  Only the first paragraph–the lines recited by most scholars–proves memorable.  Likewise, Holmes’s 1897 article, “The Path of the Law,” considered by Richard Posner “‘the best article-length work on law ever written’” (p. 132), cannot withstand careful scrutiny.  Written, Holmes said, to “‘dispel a confusion between morality and law’” (p. 150), and committed to the proposition that “‘All law means that I will kill you if necessary to make you conform to my requirements’” (p. 144), the article reveals his amoral positivism.   “In The Path of the Law,” Alschuler says, “Holmes listed five words to illustrate the sort of moral terminology he proposed to banish from law–rights, duties, malice, intent, and negligence” (p. 172).  He particularly despised “duty.”  Laws are issued by whoever is in power, and one obeys them because he must do so.  But there is no inner “ought,” no moral imperative, no reason to do what’s honorable.

“The Path of the Law,” writes Alschuler, “has molded American legal consciousness for more than a century, and lawyers now carry gallons of cynical acid to pour over words like duty, obligation, rights, and justice” (p. 176).  It has quite recently been described by legal scholars “as an ‘acknowledged masterpiece in jurisprudence,’ ‘the single most important essay ever written by an American on the law,’ and perhaps ‘the best article-length work on law ever written’” (p. 180).  But Alschuler insists that Holmes’s “theory of contracts,” set forth to replace the natural law position, is “a hopeless jumble of ill-considered prescriptive and descriptive ideas” (p. 176).  Despite its influence and renown, the essay is in many ways quite “incoherent” (p. 135), and its very incoherence reflects Holmes’s Nietzschean, Darwinian worldview where nothing much makes sense.

Alschuler ends his critique of Holmes with a chapter entitled “Ending the Slide from Socrates and Climbing Back.”  The chapter’s first two paragraphs deserve quoting, for they sum up the case against one of the most powerful 20th century intellectual currents:  “The current ethical skepticism of American law schools (in both its utilitarian and law-as-power varieties) mirrors the skepticism of the academy as a whole.  Some twentieth-century pragmatists, extending their incredulity further than Holmes, have abandoned the idea that human beings can perceive external reality–not only right, wrong, and God (issues on which Holmes took a skeptical stance) but also gravity, suffering, and even chairs (issues on which Holmes was a realist).  These pragmatists maintain that the only test of truth is what works, and a century of pragmatic experimentation has given that question a clear answer:  Pragmatism and moral skepticism don’t work; they are much more conducive to despair than to flourishing.  They fail their own test of truth.  We have walked Holmes’s path and have lost our way” (p. 187).

Given their demonstrable failure, we need to find a better way.

# # #

164 Rodney Stark’s Church History

For several decades Rodney Stark, currently a sociology professor at Baylor University, devoted himself to the sociology of religion.  But he was always “a history buff,” and 20 years ago, he read Wayne Meeks’s The First Urban Christians.  Thus began, somewhat as an avocation, his wide reading in Church history.  With an academic outsider’s perspective, he began asking different questions and taking different approaches to the subject, leading to the publication of highly readable and scintillating works such as The Rise of Christianity:  How the Obscure, Marginal Jesus Movement Became the Dominant Religious Force in the Western World in a Few Centuries (Princeton: Princeton University Press, c. 1996; San Francisco:  Harper San Francisco reprint, 1997).  In general, the book seeks “to reconstruct the rise of Christianity in order to explain why it happened” (p. 3).

But the book is not a sustained chronological narrative.  Rather, each of its 10 chapters stands alone–a collection of essays providing an analysis of something that strikes Stark as significant.  In chapter one he considers “Conversion and Christian Growth,” seeking to understand how the 120 Christians at Pentecost launched a movement that literally won the world for Christ.  The data indicate the Early Church grew at the rate of “40 percent per decade” for several centuries (p. 6).  This is virtually the same growth rate enjoyed by the Mormons for the past century, and at that rate there would have been only 7,530 Christians by the year 100 A.D., and some 40,000 by 150.  Thereafter, as anyone familiar with compound interest understands, the numbers dramatically increased and the Roman Empire was “Christian” mid-way through the fourth century.
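
The compounding itself is easy to check.  A minimal worked projection, assuming (as Stark's own table appears to) a base of roughly 1,000 believers around A.D. 40, with the Pentecost 120 marking only the narrative starting point, runs:

\[
1{,}000 \times 1.4^{6} \approx 7{,}530 \ (\text{A.D. } 100), \qquad 1{,}000 \times 1.4^{11} \approx 40{,}000 \ (\text{A.D. } 150), \qquad 1{,}000 \times 1.4^{31} \approx 34{,}000{,}000 \ (\text{A.D. } 350).
\]

On this reckoning the numbers stay modest for two centuries and then explode, which is why steady growth of 40 percent per decade, with no mass conversions required, yields a largely Christian empire by the mid-fourth century.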

In chapter two Stark discounts the popular notion that “Christianity was a movement of the dispossessed.”  Friedrich Engels championed this view, arguing that it was a “‘religion of slaves and emancipated slaves, of poor people deprived of all rights'” (p. 29).  Many historians–and the multitudes of scholars influenced by Ernst Troeltsch–widely embraced such Marxist thinking.  “By the 1930s this view of Christian origins was largely unchallenged” (p. 29), and legions of professors still repeat the litany.  But, Stark insists, it must be discarded because it’s utterly untrue.  Today “a consensus has developed among New Testament historians that Christianity was based in the middle and upper classes” (p. 31).  Aristocrats and wealthy believers, scholars and highly educated folks, were quite prominent in the Early Church.  This squares with current sociological evidence regarding religious sects and cults, which almost never thrive among the poor and dispossessed.  Movements such as the Mormons and Moonies appeal to the well educated and prosperous.  So, Stark argues, it makes sense to envision the Early Christians as appealing to the same social strata.

In the next chapter Stark argues that converted Jews were an enduring Christian constituency.  He thinks that  “not only was it the Jews of the diaspora who provided the initial basis for church growth during the first and early second centuries, but that Jews continued as a significant source of Christian converts until at least as late as the fourth century and that Jewish Christianity was still significant in the fifth century” (p. 49).   There were, of course, far more Jews of the diaspora than Jews living in Palestine.  Many of these Jews had so lost their Hebrew roots that a Greek translation of the Scriptures (the Septuagint) had become necessary.  Among these Hellenized Jews the Christians found a fertile field for the Gospel.  “If we examine the marginality of the Hellenized Jews, torn between two cultures, we may note how Christianity offered to retain much of the religious content of both cultures and to resolve the contradictions between them” (p. 59).

Further contributing to Church growth were epidemics, cited by Church Fathers such as Cyprian and Eusebius as factors in drawing converts to a community that cared for the sick and dying as well as offered the promise of resurrection and eternal life.   From 165-180 A.D., an epidemic (probably smallpox) decimated the Roman Empire, reducing the population by at least one-fourth.  In 251 another empire-wide epidemic (probably measles) raged.  Such horrendous crises precipitate religious questioning–as the response of American Indians to similar catastrophes documents.  Importantly, for the Church in the ancient world, while pagans fled the scene of suffering Christians came alongside those who were ill, choosing to risk death rather than desert those in need.  They also cared for the poor, the widows and orphans, demonstrating a qualitatively different kind of religious faith.  Consequently, multitudes of disillusioned pagans turned to the Christian way.

Women too were drawn to the Early Church, where they were more highly revered than in the Greco-Roman world.  This has long been recognized, but Stark seeks “to link the increased power and privilege of Christian women to a very major shift in sex ratios.  I demonstrate that an initial shift in sex ratios resulted from Christian doctrines prohibiting infanticide and abortion; I then show how the initial shift would have been amplified by a subsequent tendency to over recruit women” (p. 95).  Due to the exposure of female babies–in many families only one baby girl was allowed to live–there were significantly more men than women in the first and second centuries.  This meant, of course, a dramatic depopulation trend!  Christians, conversely, with their high view of marriage and fidelity, considered children a blessing and encouraged large families, and they opposed abortion (thus saving the lives of many women who would have died as a result of this dangerous procedure).  Contrary to some feminist readings of the documents, Stark insists that women were drawn to the Church not because it offered them places of political status and power but because Christians insisted that marriage is sacred, life is sacred, and children are to be treasured.

In his final chapter Stark proposes a thesis that explains why women and others were drawn to the Church: “Central doctrines of Christianity prompted and sustained attractive, liberating, and effective social relations and organizations” (p. 211).  Love and mercy, rooted in the Christian understanding of God as revealed in Jesus Christ, were not celebrated by pagans, but they formed the foundations for Christianity.  Nor did pagans endorse the sanctity of life.  But “above all else, Christianity brought a new conception of humanity to a world saturated with capricious cruelty and the vicarious love of death” (p. 214).

* * * * * * * * * * * * * * * * * *

In One True God: Historical Consequences of Monotheism (Princeton: Princeton University Press, c. 2001), Stark pursues the thesis that monotheism is the most important “innovation” in history.  He makes a clear distinction between “godless religions,” such as Buddhism and Taoism, and “godly religions” such as Judaism and Christianity.  “Godless religions” may assume a distant, unknowable deity of some sort, but they appeal to an intellectual elite of monks and philosophers.  “I am comfortable,” he says, “with the claim that Taoism, for example, is a religion, but it seems unwise to identify the Tao as a God.  Indeed, for centuries sophisticated devotees of Buddhism, Taoism, and Confucianism have claimed that theirs are Godless religions.  I agree” (p. 10).  Remarkably different, however, are the “godly religions,” which proclaim the reality of the “one true God” who has revealed Himself and has a clear plan for mankind, and they have proved historically momentous.

Primitive cultures–as ably documented by Andrew Lang, Paul Radin and Wilhelm Schmidt–often believed in “High Gods” remarkably akin to monotheism, but only Judaism, Christianity, and Islam embraced a coherent vision of God’s nature and of His will for the “chosen” people.  Monotheists call for conversion, and “to convert is to newly form an exclusive commitment to a God” (p. 50).  Monotheists, uniquely, were missionaries.  Though no longer so, Judaism, in the Ancient World, was known as a “missionizing faith” (p. 52), a fact noted by Max Weber, who credited the success of Jewish proselytism to “‘the purity of the ethic and the power of the conception of God’” (p. 59).  In turn, Christians so successfully spread their faith that within 300 years “more than half of the population of the empire (perhaps as many as thirty-three million people)” had become Christians.  More recently, of course, missionaries have taken the Gospel almost literally to the uttermost parts of the earth.

By definition missionaries are true believers!  Clergy in established churches easily lose their evangelistic zeal, and broad-minded “liberals” during the past century (with their focus on tolerance and pluralism) quickly abandoned evangelism of any sort.  Indeed, skeptical churchmen who no longer believed “in anything more Godly than an essence, began to express doubts as to whether there was any theological or moral basis for attempting to convert non-Christians” (p. 99).  Following WWI, American liberals rejected the notion of God “as an aware, conscious, concerned, active being” and anticipated Paul Tillich’s hypothetical “God as a psychological construct, the ‘ground of our being’” (p. 100).  Rather than seeking “converts,” liberal Christians engaged in various forms of humanitarian “service,” endeavors which enlist few life-long vocations and attract few converts.

True believers seeking converts cannot but engage in religious conflicts, because “particularism, the belief that a given religion is the only true religion, is inherent in monotheism” (p. 116), and Stark details some of the darker moments of monotheism–persecution of the Jews by both Muslims and Christians at various times, the Crusades, the Thirty Years’ War.  This same particularism also explains the powerful persistence of the monotheistic religions.  Amazingly, however, as the last chapter–”God’s Grace: Pluralism and Civility”–shows, believers have lately learned to peacefully co-exist.  “Adam Smith’s great insight about social life is that cooperative and socially beneficial outcomes can result from each individual human’s acting to maximize his or her selfish interests” (p. 221).  Such seems true in today’s religious world.  Catholics and Protestants, Christians and Jews, have found “that people can both make common cause within the conventions of religious civility and retain full commitment to a particularistic umbrella” (p. 248).

Ironically, persecution and intolerance now distinguish secularists rather than religionists!  Deistic clergy fulminate against despised “fundamentalists,” and “a new study has demonstrated that the only significant form of religious prejudice in America is ‘Anti-Fundamentalism,’ and it is concentrated among highly educated people without an active religious affiliation” (p. 256).

* * * * * * * * * * * * * * * * * *

In a companion volume to One True God, Rodney Stark has written For the Glory of God: How Monotheism Led to Reformations, Science, Witch-Hunts, and the End of Slavery (Princeton: Princeton University Press, c. 2003).  Four lengthy chapters (each nearly 100 pp. long) focus on the four topics listed in the book’s subtitle.  And in each one Stark tries to rectify the historical record, duly crediting Christians for their contributions to Western Civilization.  Though he acknowledges his debt to historians’ researches, he admits to being disillusioned by their biases.  He was startled by many of their anti-Christian and (especially) anti-Catholic comments.  “Far more pernicious, however,” he says, “are the many silences and omissions that distort scholarly comprehension of important matters” (p. 13).  To shed light on what really happened motivates this study.

For 2000 years the Christian Church has been renewed by continuous reformations, though Stark focuses almost exclusively upon the 15th and 16th centuries. Many of these movements were “sectarian” in nature and, like the early Christians, led by “privileged” rebels such as Peter Waldo, John Wyclif, Jan Hus, and Martin Luther.  Almost never were they the “revolts of the poor” so lionized by Marxist propagandists.  Reformers sincerely sought “God’s Truth.”  Theology, not economics, motivated them, though the success of their movements was powerfully shaped by various cultural factors.  Those that truly mattered, Stark says, were three: 1) Catholic weaknesses in lands that turned Protestant; 2) government response–autocratic regimes sustained Catholicism in countries like Spain; 3) monarchs’ “self-interest,” obviously determinative in Luther’s Saxony and Henry VIII’s England, but crucial wherever Protestantism prevailed.

Stark begins his second chapter, “God’s Handiwork: The Religious Origins of Science,” with a long quotation from Andrew Dickson White’s two-volume A History of the Warfare of Science with Theology in Christendom, the popular source of much misinformation such as the “fact” that Columbus “discovered” that the earth is spherical.  White, like fellow atheists such as Carl Sagan and Richard Dawkins, simply falsifies the historical record so as to advance a philosophical agenda.  In fact, Stark argues “not only that there is no inherent conflict between religion and science, but that Christian theology was essential for the rise of science” (p. 123).  This is not “news” to those acquainted with the work of Stanley Jaki and Alfred North Whitehead (who in 1925 stated that science developed in tandem with Medieval theology), but it certainly challenges conventional textbook presentations.  It cannot be too strongly stated that Christianity uniquely nourished science, whereas neither the Greeks nor the Chinese, neither the Maya nor the Muslims encouraged scientific development.  And many scientists today are strong Christians!  Indeed, “professional scientists have remained about as religious as most everyone else, and far more religious than their academic colleagues in the arts and social sciences” (p. 124).  Stark illustrates his case with an impressive list of great Christian scientists, past and present.

Still more:  neither the “Dark Ages” nor the “Scientific Revolution” is a historically accurate label, for the latter was very much a continuation of the former.  Significant scholarly, and scientific, work took place during the Medieval era, a time of “‘precise definition and meticulous reasoning, that is to say, clarity’” as Alfred Crosby insisted (p. 135).  “Christianity depicted God as a rational, responsive, dependable, and omnipotent being and the universe as his personal creation, thus having a rational, lawful, stable structure, awaiting human comprehension” (p. 147).  Thus St. Albert the Great was a great scientist in the 13th century–and probably a much better thinker than was Nicholas Copernicus in the 16th!  The greatest scientist of his age, Isaac Newton, devoted inordinate time (and a million written words) to biblical study and speculation.  His private letters “ridiculed the idea that the world could be explained in impersonal, mechanical terms” (p. 168).  According to John Maynard Keynes, who purchased a collection of his manuscripts, Newton “‘regarded the universe as a cryptogram set by the Almighty’” (p. 172).

Newton’s approach to the universe, however, was scuttled by Charles Darwin and his epigones.  In Stark’s judgment, “the battle over evolution is not an example of how ‘heroic’ scientists have withstood the relentless persecuting of religious ‘fanatics.’  Rather, from the very start it has primarily been an attack on religion by militant atheists who wrap themselves in the mantle of science in an effort to refute all religious claims concerning a Creator–an effort that has also often attempted to suppress all scientific criticism of Darwin’s work” (p. 176).  The theory of evolution through natural selection has not really explained the origin of species, though a great deal of rhetorical disingenuity disguises that fact.  For example:  millions of fossils have been unearthed during the past century, “but the facts are unchanged.  The links are still missing; species appear suddenly and then remain relatively unchanged” (p. 180).  Thus great thinkers, such as Karl Popper, have “suggested that the standard version of evolution even falls short of being a scientific theory” (p. 191).

Yet the Darwinian faithful retained their fervor and Popper was assailed for his obtuseness!  What Stark labels “the Darwinian Crusade” has been propelled by “activists on behalf of socialism and atheism” (p. 185).  Alfred Russel Wallace shared Darwin’s evolutionary hypothesis and declared that it unveiled “the coming of that biological paragon of selflessness, ‘socialist man’” (p. 186).  In Darwin’s library one finds “a first edition of Das Kapital, inscribed to ‘Mr. Charles Darwin.  On the part of his sincere admirer, Karl Marx, London 16 June 1873.’  More than a decade before, when he read The Origin, Marx wrote to Engels that Darwin had provided the necessary biological basis for socialism” (p. 186).  Thomas Henry Huxley’s passionate commitment to Darwinian evolution was deeply rooted in his anti-Christian hostility.  Ideology and emotion, not objectivity, dominate Darwinism!

Stark’s third chapter is entitled:  “God’s Enemies: Explaining the European Witch-Hunts.”  Careful calculations, he insists, indicate that during the witch-hunting era (1450-1750), “in the whole of Europe it is very unlikely that more than 100,000 people died as ‘witches’” (p. 203).  Though radical feminists and anti-Christian historians often toss around numbers in the millions, they are simply venting their feelings and prejudices rather than dealing with the evidence.  Stark suggests that the confluence of satanism, magic, and political developments in the Protestant Reformation best explains the outbreak of witch-hunts.  They rarely occurred in Catholic lands, and they abruptly ended in the 18th century.  Stark’s meticulous research, and his country-by-country tabulations, persuasively discount many of the irresponsible textbook generalizations without defending the irrational frenzy underlying the killing.

“God’s Justice: The Sin of Slavery,” the book’s last chapter, argues that Christians, virtually alone among earth’s peoples, have condemned and eliminated a universal practice (evident in American Indian and African tribal societies as well as Greece and Rome, endorsed by Mohammed as well as Aristotle).  Apart from the Christian world, slavery has been taken for granted, much like the stars’ placement in the heavens.  But, Stark says, “Just as science arose only once, so, too, did effective moral opposition to slavery.  Christian theology was essential to both” (p. 291).  Certainly early Christians, such as St. Paul, “condoned slavery,” but “only in Christianity did the idea develop that slavery was sinful and must be abolished” (p. 291).  Rather than being abruptly abolished, however, it simply faded away, so that in the Medieval World it had disappeared and Thomas Aquinas branded it sinful in the 13th century.

The conquest and colonization of the New World, of course, revived the institution of chattel slavery.  But in Catholic lands it was mitigated somewhat by theological constraints, as is evident in the career of Bartholomew de Las Casas.  And in Protestant lands, by the 18th century, abolitionists began to question its legitimacy.   Revivalists such as John Wesley in the 18th and Charles G. Finney in the 19th century (not “Enlightenment” philosophers such as John Locke and Voltaire) opposed it.  And ultimately, as Robert William Fogel put it so well, the death of slavery was “a political execution of an immoral system at its peak of economic success, incited by [people] ablaze with moral fervor” (p. 365).

“Precisely!” Stark says, in his final sentence.  “Moral fervor is the fundamental topic of this entire book: the potent capacity of monotheism, and especially Christianity, to activate extraordinary episodes of faith that have shaped Western civilization” (p. 365).

# # #

163 Churchill & Reagan Speak

Few historians today question the significance of Winston Churchill and Ronald Reagan.  Both men led their nations through challenging times, maintained a singular commitment to their core values, and helped shape the contours of the 20th century.  Both were gifted orators, and their recorded speeches and archived papers increasingly reveal the quality of their thought.  Furthermore, both illustrate how statesmen rightly respond to crises and conflicts such as America’s current war with Islamic terrorists.

Winston Churchill’s grandson, Winston S. Churchill, has collected and published Never Give In!  The Best of Winston Churchill’s Speeches (New York: Hyperion, c. 2003), a 500 page treasury that documents his views from 1897-1963.  In his Preface, the editor expresses admiration for his grandfather and indicates his rationale for publishing the collection–a small portion of the five million words in Churchill’s complete speeches.  Amazingly, Churchill never used a speech writer!  His words are truly his words!  And he worked hard to craft them well.  One of his wartime secretaries, Sir John Colville, told the editor: “In the case of his great wartime speeches, delivered in the House of Commons or broadcast to the nation, your grandfather would invest approximately one hour of preparation for every minute of delivery” (p. xxv).  The speeches are presented in chronological order and divided into five periods, though several themes characterized Churchill: 1) the virtue and necessity of courage, both political and military; 2) opposition to Socialism, both Bolshevism in Russia and the softer version of Britain’s Labor Party; 3) adamant opposition to Nazism, demanding armed response to Hitler’s aggression; 4) the goodness of the “property-owning democracy” that explained England’s greatness; 5) the correctness of Conservatism, as he defined it, upholding the grandeur of the Christian tradition and of Western Civilization in general.

The first section, entitled “Young Statesman, 1897-1915,” introduces the reader to a young politician finding himself and his political principles.  Churchill launched his political career in 1900 at the age of 25, and would serve in Parliament (with one brief absence) until 1964.  Elected as a Conservative in 1900, he broke with his party in 1904, “crossed the floor” and joined the Liberals, primarily because of his commitment to free trade.  Subsequently he rapidly rose through the ranks of the British government, becoming First Lord of the Admiralty in 1911, just as the early tremors of WWI rippled across Europe.  When the war began he declared, with words he would repeat 25 years later: “We did not enter upon the war with the hope of easy victory; we did not enter upon it in any desire to extend our territory, or to advance and increase our position in the world” (p. 59).   Unlike many, he believed: “The war will be long and somber” (p. 59), and it would prove difficult for Churchill himself, for he was forced to resign his cabinet position when his plan to attack Germany from the east, through the Dardanelles, misfired.

This led to the second period of his career, “Oblivion and Redemption, 1916-29.”  Following his Dardanelles disgrace, Churchill left the House of Commons and served as a soldier on the front lines in Flanders.  But he returned to the House when David Lloyd George asked him to serve in the cabinet, and in 1924 he would serve as Chancellor of the Exchequer under the newly elected Conservative Prime Minister.  When the Bolsheviks seized control of Russia in 1917, Churchill immediately declared: “Tyranny presents itself in many forms.  The British nation is the foe of tyranny in every form.  That is why we fought Kaiserism and that is why we would fight it again.  That is why we are opposing Bolshevism.  Of all tyrannies in history the Bolshevist tyranny is the worst, the most destructive, and the most degrading.  It is sheer humbug to pretend that it is not far worse than German militarism” (p. 77).  By 1921, he recognized that:  “The lesson from Russia, writ in glaring letters, is the utter failure of this Socialistic and Communistic theory, and the ruin which it brings to those subjected to its cruel yoke” (p. 81).  Churchill also opposed those in Britain’s Labor Party who wanted to install Socialism, asserting that they were “corrupting and perverting great masses of our fellow-countrymen with their absurd foreign-imported doctrines” (p. 89).  “They borrow all their ideas from Russia and Germany,” he said.  “They always sit adulating every foreign rascal and assassin who springs up for the moment.  All their economics are taken from Karl Marx and all their politics from the actions of Lenin” (p. 89).

From 1930-1939 Churchill endured his “Wilderness Years,” lonely and ridiculed as he opposed Hitler and those who appeased him.  England’s “difficulties,” he said, “come from the mood of unwarrantable self-abasement into which we have been cast by a powerful section of our own intellectuals” (p. 104).  Politicians joined them in offering “a vague internationalism, a squalid materialism, and the promise of impossible Utopias” (p. 104).  Clergymen were particularly reprehensible insofar as they sought “to dissuade the youth of this country from joining its defensive forces, and seek to impede and discourage the military preparations which the state of the world forces upon us” (p. 155).  While pacifists talked peace Hitler armed for war.  In the midst of WWII, Churchill remembered:  “For the best part of twenty years the youth of Britain and America have been taught that war is evil, which is true, and that it would never come again, which has been proved false.”  During that time dictators armed their regimes. “We have performed the duties and tasks of peace.  They have plotted and planned for war” (p. 318).  They illustrated the fact that “The whole history of the world is summed up in the fact that when nations are strong they are not always just, and when they wish to be just they are often no longer strong” (pp. 132-133).  To be both strong and just was Churchill’s goal.  That required military strength and the willingness to use it to prevent Hitler’s ambitions.  Rifles and battleships, not rhetoric and resolutions, could deter war.

“For five years,” he said in 1938, “I have talked to the House on these matters–not with very great success.  I have watched this famous island descending incontinently, fecklessly, the stairway which leads to a dark gulf.  It is a fine broad stairway at the beginning, but after a bit the carpet ends.  A little farther on there are only flagstones, and a little farther on still these break beneath your feet” (p. 166).  To Churchill, Prime Minister Chamberlain’s 1938 pact with Hitler in Munich was a “total and unmitigated defeat” (p. 172).  Churchill feared that it would prove to be “only the first sip, the first foretaste of a bitter cup which will be proffered to us year by year unless, by a supreme recovery of moral health and martial vigour, we arise again and take our stand for freedom as in the olden time” (p. 182).

Few heeded Churchill’s words until Hitler actually moved, invading Poland in 1939 and attacking France the following spring.  Then Churchill was called upon to lead his nation through the throes of WWII.  These were, his grandson says, “The Glory Years, 1939-45.”  The war was not simply against Germany, he insisted:  “We are fighting to save the whole world from the pestilence of Nazi tyranny and in defense of all that is most sacred to man” (p. 198).  It was a war to restore “the rights of the individual, and it is a war to establish and revive the stature of man” (p. 198).  It was a war of words–and Churchill empowered his people with words.  He made memorable speeches during these years; offering nothing “but blood, toil, tears and sweat,” he elicited courageous resolve.  His policy, he said in 1940, was:  “to wage war, by sea, land and air, with all our might and with all the strength that God can give us; to wage war against a monstrous tyranny, never surpassed in the dark, lamentable catalogue of human crime” (p. 206).  And there was only one goal: victory!   Still more, addressing his alma mater, Harrow School, in 1941, he said: “surely from this period of ten months this is the lesson: never give in, never give in, never, never, never, never–in nothing great or small, large or petty–never give in except to convictions of honour and good sense.  Never yield to force; never yield to the apparently overwhelming might of the enemy” (p. 307).

As the Battle of Britain began, Churchill declared:  “Hitler knows that he will have to break us in this Island or lose the war.  If we can stand up to him, all Europe may be free” and the world saved.  “But if we fail, then the whole world, including the United States . . . will sink into the abyss of a new Dark Age. . . .  Let us therefore brace ourselves to our duties, and so bear ourselves that, if the British Empire and its Commonwealth last for a thousand years, men will still say, ‘This was their finest hour’” (p. 229).  And, indeed, it was.  The RAF defeated the Luftwaffe in the skies over England, and “Never in the field of human conflict was so much owed by so many to so few” (p. 245).   However dark the prospects, Churchill ever insisted that “these are great days” (p. 308) and that the courageous would prevail.

In due time, with the help of the United States and the Soviet Union, the war was won.  Churchill urged his colleagues “to offer thanks to Almighty God, to the Great Power which seems to shape and design the fortunes of nations and the destiny of man” (p. 390).  Addressing the nation on 8 May 1945, the day the war in Europe ended, he said “that in the long years to come not only will the people of this island but of the world, whenever the bird of freedom chirps in human hearts, look back to what we’ve done and they will say ‘do not despair, do not yield to violence and tyranny, march straight forward and die if need be–unconquered'” (p. 391).

He fully intended to finish the war against Japan, but England’s Socialists (the Labor Party) turned against him as soon as the war in Europe ceased.  They demanded a general election in July, 1945, and Churchill found himself battling to maintain his position as Prime Minister.  Feeling betrayed, he strongly denounced Socialism as “abhorrent to the British ideas of freedom.”  Though Labor Party leaders portrayed their positions as indigenously English, “there can be no doubt that Socialism is inseparably interwoven with Totalitarianism and the abject worship of the State.  It is not alone that property, in all its forms is struck at, but that liberty, in all its forms, is challenged by the fundamental conceptions of Socialism” (p. 396).  Desiring to control every aspect of life, Socialists sought to establish “one State to which all are to be obedient in every act of their lives.  This State is to be the arch-employer, the arch-planner, the arch-administrator and ruler, and the arch-caucus-boss” (p. 397).  But the English voters apparently wanted such a system–as well as escape from the burdens of war–and Churchill’s Conservative Party lost the election.

The next era, “The Sunset Years 1945-63,” witnessed Churchill leading the opposition to the Labor Party of Clement Attlee, speaking out against Russia’s aggression, and returning to power in 1951 before retiring in 1955, soon after his 80th birthday.  He still spoke prophetically, especially at Westminster College in Fulton, Missouri, in 1946, where, in the presence of President Harry Truman, he warned that we must ever oppose “war and tyranny.”  To do so “we must never cease to proclaim in fearless tones the great principles of freedom and the rights of man which are the joint inheritance of the English-speaking world and which through Magna Carta, the Bill of Rights, the Habeas Corpus, trial by jury, and the English common law find their most famous expression in the American Declaration of Independence” (p. 417).  Such goods were endangered, however, because “From Stettin in the Baltic to Trieste in the Adriatic, an iron curtain has descended across the Continent” (p. 420).  This was not the “Liberated Europe we fought to build up” (p. 421).  Soviet aggression threatened the world’s peace, and Stalin was as much a threat in the late ’40s as Hitler had been in the late ’30s.  Thus began the “cold war.”

Elected Prime Minister again in 1951, he sought to reverse the corrosive Socialist policies established by the Labor Government.  He stood for “a property-owning democracy” and detested “the philosophy of failure, the creed of ignorance and the gospel of envy” basic to Socialism.  High taxes and petty regulations had betrayed the traditions of the nation, he believed.  “In our view the strong should help the weak.  In the Socialist view the strong should be kept down to the level of the weak in order to have equal shares for all.  How small the share is does not matter so much, in their opinion, so long as it is equal” (p. 457).  “Socialists pretend they give the lower income groups, and all others in the country, cheaper food through their system of rationing and food subsidies.  To do it they have to take the money from their pockets first and circulate it back to them after heavy charges for great numbers of officials administering the system of rationing” (p. 460).

Few single volumes better illustrate the great issues of the 20th century, for no one I can think of stood for so long at the center of world events.  Churchill’s wisdom and courage, anchored to the realities of the political world, still command respect and (far better than many theoretical treatises) provide direction for making decisions today.

* * * * * * * * * * * * * * * * * *

Three Ronald Reagan scholars–Kiron K. Skinner, Annelise Anderson, and Martin Anderson–have edited Reagan In His Own Hand: The Writings of Ronald Reagan That Reveal His Revolutionary Vision for America (New York: The Free Press, c. 2001).  During the late 1970s, Reagan broadcast some 1,000 weekly radio talks.  His wife, Nancy, says he wrote these messages at his desk at home, rooting his ideas in “voracious” reading.  He rarely watched TV but devoted himself to reading, thinking, speaking.  Though he often appeared to speak spontaneously, in fact he carefully prepared and followed his written texts.  In his Foreword, George Shultz remembers his close association with Reagan: “I was always struck by his ability to work an issue in his mind and to find its essence, and by his depth of conviction. . . . [and] his intense interest and fondness for the spoken word, for caring very deeply about how to convey his thoughts and ideas to people” (p. ix).

The editors have collected Reagan’s radio talks into four categories, meticulously retaining the spelling and revisions in his handwritten texts.  In the first section, “Reagan’s Philosophy,” the reader discovers the bedrock principles that guided him.  Looking back, in 1989, he noted: “We meant to change a nation, and instead, we changed a world” (p. 4).  This resulted, in part, from the lessons he learned from WWII–lessons Churchill tried unsuccessfully to teach his countrymen in the 1930s.  Military weakness encourages aggression.  One month before Pearl Harbor, Reagan noted, the U.S. “Congress came within a single vote of abolishing the draft & sending the bulk of our army home” (p. 8).  The Japanese doubted America’s resolve and thus dared attack her.  We learned, Reagan said, citing an academic study, “that ‘to abdicate power is to abdicate the right to maintain peace'” (p. 8).  This extended to opposing the USSR and Communism in the Post-WWII era.  To those who argued “better red than dead,” he replied:  “Back in the ’20s, Will Rogers had an answer for those who believed that strength invited war.  He said, ‘I’ve never seen anyone insult Jack Dempsey [then the heavyweight boxing champion]'” (p. 480).

Reagan also believed, in accord with Churchill, that the subtly socialistic views of American liberals threatened disaster for the nation.  America’s strength resided in her confidence in “the individual genius of man” (p. 13).  Liberating the individual from government was a major Reagan theme, one he tirelessly repeated.  Summing up his position, he said:  “The choice we face between continuing the policies of the last 40 yrs. that have led to bigger & bigger govt, less & less liberty, redistribution of earnings through confiscatory taxation or trying to get back on the original course set for us by the Founding Fathers.  Will we choose fiscal responsibility, limited govt, and freedom of choice for all our people?  Or will we let an irresponsible Congress set us on the road our English cousins have already taken?  The road to ec. ruin and state control of our very lives?” (p. 10).

Thus he applauded the stance of Britain’s Margaret Thatcher.  Having visited her before she became Prime Minister in 1979, he predicted she would “do some moving & shaking of Englands once proud industrial capacity UNDER WHICH THE LABOR PARTY has bees running downhill for a long time.  Productivity levels in some industrial fields are lower than they were 40 yrs. ago.  Output per man hour in many trades is only a third of what it was in the 1930’s.  Bricklayers for example laid 1000 bricks a day in 1937–today they lay 300.  I think ‘Maggie’–bless her soul, will do something about that” (p. 47).  Indeed she did!  And she and Reagan became the staunchest of allies once he became President in 1981.

A commitment to freedom stood rooted in a belief in God, who had providentially guided this nation.  Reagan shared and often repeated Thomas Jefferson’s view that: “‘The God who gave us life gave us liberty–can the liberties of a Nat. be secure when we have removed a conviction that these liberties are the gift of God'” (p. 14). Still more:  America has a mission to spread the blessings of freedom.  Indeed, as Pope Pius XII said soon after the end of WWII:   “‘America has a genius for great and unselfish deeds.  Into the hands of Am. God has placed the destiny of an afflicted mankind.’  I don’t think God has given us a job we cant handle” (p. 16).

Such convictions shaped Reagan’s “Foreign Policy,” the second section in the book, summed up by these words: “We want to avoid a war and that is better achieved by being so strong that a potential enemy is not tempted to go adventuring” (p. 21).  Since these radio talks were given while Jimmy Carter was President, there was much for Reagan to criticize.  He discussed, insightfully, developments in Cambodia, Vietnam, Taiwan, Korea, Chile, Panama, Palestine, the USSR and Cuba (rebuking the tyrant Castro as a “liar”).  By contrast, Senator George McGovern, visiting Castro, “found the Cuban dictator to be a charming, friendly WELL INFORMED fellow.  It sort of reminds you of how we discovered Joseph Stalin was good old Uncle Joe, shortly before he stole among other things our nuclear secrets” (p. 183).  Everywhere, he argued, the U.S. should support freedom, especially regarding religious expression and private property rights, and he endorsed “Somerset Maughams admonition: ‘If a nation values anything more than freedom, it will lose it’s freedom; and the irony of it is, that if it’s comfort of money THAT it values more, it will lose that too'” (p. 85).

Part Three of the book, “Domestic and Economic Policy,” delineates what came to be called “Reaganism” in the ’80s.  Personal freedom, limited government, and minimal taxation anchored Reagan’s positions.  To reduce the rightful role of government to a brief sentence, he said: “Govt. exists to protect us from each other” (p. 288).   He favorably cited the words of an English editorialist:  “‘What the world needs now is more Americans.  The U.S. is the 1st nation on earth deliberately dedicated to letting people choose what they want & giving them a chance to get it.  For all it’s terrible faults, in one sense Am. still is the last, best hope of mankind, because it spells out so vividly the kind of happiness which most people actually want, regardless of what they are told they ought to want'” (p. 227).

Reading Reagan, in his own hand, reveals a thoughtful man cruelly maligned by his critics as an ignorant actor.  He routinely refers to the books and articles he was reading, carefully credits his quotations, and blends (with that justly renowned Reaganesque touch) human interest stories into his talks.

162 Global Warming Forest

                My father worked as a meteorologist for the United States Weather Bureau.  He occasionally joked that it helped, now and then, when compiling a weather report, to look out the window rather than stay buried in the papers cranked out by various machines.  Somewhat the same goes for the current concern for global warming.  That the globe is dramatically warming is an article of faith for most environmentalists and many politicians.  But scores of those who best understand what’s actually happening–looking at the evidence rather than computer projections–urge us to disregard TV snippets or Greenpeace press releases and study the facts.  So argues Patrick J. Michaels, research professor of environmental sciences at the University of Virginia and past president of the American Association of State Climatologists, in Meltdown:  The Predictable Distortion of Global Warming by Scientists, Politicians, and the Media (Washington, D.C.:  Cato Institute, c. 2004).  We can either rely on computer projections or factual observations, simulated scenarios or substantiated facts. 

            Truly the globe has warmed slightly during the past few decades.  But it has, in the more distant past, been considerably warmer, and there is no reason to think the current warming is caused by human activity.  Such warming will change some things, but the changes will be modest–nothing remotely like that described by alarmists who control the major media.  For example:  an article in Nature magazine  (one of the most prestigious scientific journals) recently predicted that a single degree (C) increase would destroy 15 percent of all species on earth.  But the earth warmed by more than a degree (C) a century ago and the planet’s species fared quite nicely!  Summarizing his position, Michaels insists:  “Global warming is real, and human beings have something to do with it.  We don’t have everything to do with it; but we can’t stop it, and we couldn’t even slow it down enough to measure our efforts if we tried” (p. 9). 

            NASA’s James Hansen, whose 1988 congressional testimony launched public concerns for global warming, recently (2001) noted that we can now more accurately assess the threat, and in the next 50 years the earth will probably warm up by less than one degree (C).  Hansen’s projection “is about four times less than the lurid top figure widely trumpeted by the United Nations in its 2001 compendium on climate change and repeated ad infinitum in the press” (p. 20).  In part this is because only one-third of the members of the U.N.’s Intergovernmental Panel on Climate Change (IPCC) are climate scientists.  Even worse, its publications are not peer reviewed.  The IPCC’s influential 1996 Assessment relied on ground-measured temperatures, but utterly ignored the highly significant satellite data (which indicate absolutely no global warming of the atmosphere!).  Many ground temperature measurements are taken in areas that have experienced dramatic urban sprawl during the past century.  Thus, for example, Washington D.C. is significantly warmer than it was 50 years ago, but a measuring station in rural Virginia shows no increase at all.  It’s quite possible that much “warming” is simply the warming of areas adjacent to large cities, whose artificial environment (heat absorbing asphalt and heat producing factories and homes) falsifies the picture.  It’s even possible that the surface-warming trend will be reversed within a few decades.

            The public knows little of this truth because the press is uncritical (or scientifically illiterate) and many scientists are locked into a funding network that discourages dissent.  Consider, for example, “the truth about icecaps.”  In 2001 a Washington Post headline screamed:  “The End is Near.”  Rising water would soon flood seaside cabins on the Chesapeake Bay, the story declared.  Senator Joseph Lieberman, in that year, repeated the alarmist mantra that melting polar icecaps would raise sea levels by “35 feet, submerging millions of homes under our present-day oceans” (p. 33).  If he wasn’t reading the Post, he could easily have taken this scenario from a similar article in the New York Times based upon “the observations of two passengers on a Russian cruise ship” that sailed through the Arctic Ocean.  The passengers took pictures of the ice-free water and wondered if “‘anybody in history ever got to 90 degrees north to be greeted by water, not ice'” (p. 43).  Though a bit of fact checking by the Times would have demonstrated the normality of this, headlines favor the abnormal, and the story fueled the political agenda favored and financed by environmentalists.  But the truth is, as Michaels shows–with numerous graphs and scholarly citations–Greenland’s icecap is growing and there’s no cause for alarm.  And there’s especially no cause for alarm regarding the higher ocean levels predicted by Senator Lieberman!  “In fact, the North Polar icecap is a floating mass, and melting that will have absolutely no effect on sea level; a glass of ice water does not rise when the cubes have melted.  With regard to that other polar ice–Antarctica–most climate models predict little or no change” (p. 203).

            Michaels applies the same scrutiny to allegations of species extinction.  Alarmist studies of butterflies (one of which was the foundation for the Kyoto Accord so sacred to politicians like Al Gore), toads, penguins, and polar bears simply do not survive careful scrutiny.  Though earth’s surface temperatures have slightly increased, there is no evidence that such warming has led to species’ extinctions.  So too with hurricanes and tornadoes, droughts and floods, disease and death:  though often attributed to global warming by impulsive journalists such as Dan Rather, they simply cannot have been caused by it.  A case study for alarmism is the island of Tuvalu.  For years environmentalists like Lester Brown and local officials of this tiny Pacific island nation have been issuing warnings, asking for “environmental refugee status in New Zealand” for its 11,000 people (p. 203).  Tuvalu’s prime minister declared (a decade ago) that “the greenhouse effect and sea-level rise threaten the very heart of our existence” (p. 204).  London’s Guardian, just in time for an important United Nations conference, certified such fears.  “In fact,” Michaels says, the “sea level in Tuvalu has been falling–and precipitously so–for decades” (p. 204).  Not to be bothered by the facts, however, the Washington Post irresponsibly spread the word that melting ice caps in polar regions would soon engulf the island nation!

            Perhaps more distressing than Michaels’s factual presentation is his critique of the scientific community responsible for promoting the myth of impending doom.  Just in time for the 2000 election, the U.S. National Assessment of the Potential Consequences of Climate Variability and Change was published.  Guided to publication by President Clinton’s Assistant for Science and Technology, John Gibbons, the report was carefully wrapped with all the ribbons of solid science.  “Gibbons was a popular speaker on the university circuit, lecturing on the evils of rapid population growth, resource depletion, environmental degradation and, of course, global warming.  His visual aids included outdated population and resource projections from Paul Ehrlich in which ‘affluence’ was presented as the cause of environmental degradation, a notion that has been discredited for decades” (p. 207).  Equally dated were his data on climate change!

            Gibbons guided the various bureaucratic committees that led to the publication of the influential National Assessment.  These committees, “larded with political appointees,” were designed to deliver a document satisfactory to Vice President Al Gore, gearing up for his presidential campaign.  “The resultant document was so subject to political pressure that it broke the cardinal ethic of science:  that hypotheses must be consistent with facts” (p. 208).  The National Assessment embraced the most extreme computer projections regarding global warming–one of which would have erred by 300 percent if applied to the past century!  It ignored the fact that most of the past century’s warming took place in the U.S. before there was any significant accumulation of greenhouse gases in the atmosphere.  It even endorsed a Canadian study that predicted temperatures in the Southeast would soar to 120 degrees (F) by the year 2100–a totally ludicrous notion that could only occur if the Gulf of Mexico (and its moderating influence) evaporated!

            Michaels’s final chapter, “The Predictable Distortion of Global Warming,” alerts us to the insidious role played by popular theoretical paradigms and the lure of federal funding in shaping contemporary science.  Today’s climatological paradigm reigns in powerful centers and encourages alarmist studies.  It’s the paradigm underlying various laws, for legislators quickly trumpet what’s taken to be conventional wisdom and public concern.  It’s also responsible for the fact that scientific journals rarely consider the possibility that “global warming is exaggerated” (p. 228).  Add dollars to the equation–a grand total of $20 billion granted scientists since 1990 to “research” global warming–and you begin to understand why it’s promoted!  The most prestigious journals–Science, Scientific American, Nature–simply will not tell the “obvious truth” that only minimal global warming is at all possible during the century to come!  Sad to say, money talks in the allegedly “objective” scientific community as surely as it does in politics!

            What Michaels hopes, writing this book, is that a small but courageous coterie of scientists and journalists will begin to challenge the dominant paradigm.  And certainly this book takes a step in that direction.

* * * * * * * * * * * * * * * * * *

            For a fictional version of Meltdown, pick up a copy of Michael Crichton’s recent thriller, State of Fear (New York:  HarperCollins, 2004).  I’ve never before read a novel that has scores of graphs, footnotes to scholarly articles, and a 10-page annotated bibliography at the end!  But they’re here, and it’s obvious he wanted to write more than a popular novel, which he’s certainly done many times!  The novel pits a handful of dedicated, scientifically informed heroes, struggling to save the earth, against the machinations of fanatical environmentalists–nominally led by a Hollywood actor but manipulated by professionals more concerned with money and power than environmental integrity.  The environmentalists plan to trigger various global disasters and attribute them to global warming, coldly indifferent to their catastrophic results.  There’s riveting action and snatches of romance.  The plot’s suspenseful, and the pages turn quickly as one sinks into the story.  And along with the dialogue and adventure, there’s the message!

            So read the story and enjoy it.  Then think about the book’s message, which was summed up by Crichton in a speech he gave in San Francisco in 2004 wherein he decried “the disinformation age” that results from “a secular society in which many people–the best people, the most enlightened people–do not believe in any religion” and embrace environmentalism.  They cite their scriptures (environmental classics by Aldo Leopold, Rachel Carson and Paul Ehrlich), recite their creeds (mantras regarding the plight of the planet and the evils of capitalism), join their cults (Sierra Club, Greenpeace, Earth First!), and denounce any challenges to their faith, especially including questions concerning global warming.   But they are–like the Hollywood character in State of Fear–misinformed at best and Machiavellian at worst. 

            In the “author’s message” at the end of the book, Crichton says he spent three years reading environmental texts before writing the novel.  What astonished him was how little we actually know about the state of the world.  Some things are obvious:  carbon dioxide in the atmosphere and the surface temperatures have both increased.  But, “Nobody knows how much of the present warming trend might be man-made” (p. 569).  The computer models generally cited in global warming scenarios vary enormously, and the best estimates suggest it will take 100 years to increase one degree centigrade.  He believes things will be much better for earth’s inhabitants in 2100, and he thinks “that most environmental ‘principles’ (such as sustainable development . . . ) have the effect of preserving the economic advantages of the West and thus constitute modern imperialism towards the developing world” (p. 571). 

            Environmental activists–Sierra Club and Environmental Defense League types–generally promote the antiquated scientific views of their youth.  Dramatic breakthroughs, such as nonlinear dynamics, chaos theory, and catastrophe theory, have fundamentally changed science without sinking into “the thinking of environmental activists” (p. 571).  He finds the ideas of “wilderness advocates” routinely spurious, declaring them no better than those propounded by “developers and strip miners” (p. 572).  What’s desperately needed is “more people working in the field . . . and fewer people behind computer screens” (p. 572).  And we need honest, independent scientists whose research isn’t funded by special interests, especially environmental organizations and bureaucracies such as the EPA! 

            Read in conjunction with Meltdown, Crichton’s novel effectively quiets (or at least permits questioning) some of the fears fanned by fanatical environmentalists.

* * * * * * * * * * * * * * * * * * * *

            In the summer of 2003, the Hayman Fire, the largest in Colorado history, started about 10 miles west of my summer home in the mountains.  Subsequently it crept five miles closer.  We were evacuated from our place for two weeks, and suddenly “forest fire” took on a whole different meaning!  The Hayman blaze was started by a Forest Service employee (now in prison) who apparently wanted to gain fame by first reporting and then extinguishing it!  The fire burned so voraciously, many analysts believed, because of forest service policies which make such fires inevitable.  Thus I was drawn to read Robert H. Nelson’s A Burning Issue:  A Case for Abolishing the U.S. Forest Service (Boulder:  Rowman & Littlefield Publishers, Inc., c. 2000).  Nelson worked in the Department of the Interior for 15 years and knows the way Washington works!  He is now professor of environmental policy in the School of Public Affairs at the University of Maryland and has earlier written Public Lands and Private Rights:  The Failure of Scientific Management.

            The Forest Service has evolved in accord with the various political agendas that shaped it.  During the first half of the 20th century, it mainly worked with timber companies to extract lumber from the nation’s forests.  This fit the philosophy of Gifford Pinchot and the Progressives who constructed the “administrative state” (p. 2).  The service also worked to suppress fires, to save the trees for harvesting.  Lumber companies cut roads and cleared sections of the forest, helping to limit the expanse of fires that erupted.  Cutting trees for lumber saved trees from burning.  More recently, especially following the Wilderness Act of 1964, recreation has assumed a major role in shaping forest policy, and “preserving wilderness areas” has been promoted.  There are now some 100 million acres reserved as national wilderness, and various kinds of preserved lands have expanded “from 51 million acres in 1964 to 271 million acres in 1993” (p. 9).  To keep increasingly large areas “untrammeled by man” has become the objective of powerful interests, and there have, consequently, been “sharp declines in timber harvesting, mining, and other traditional uses of the national forests.  The Clinton administration has actively sought to instill this ethos as the new core value defining the institutional culture of the Forest Service” (p. xiv).  Millions of acres, unless mechanically harvested, will vanish when “catastrophic fires” ignite them. 

In the midst of these policy shifts, the Forest Service steadfastly fought fires and allowed “the buildup of brush and dense thickets of smaller trees in many forests” that became powder kegs awaiting a spark to explode (p. 6).  State and private forests have not suffered similarly, for they “have in general been more intensively managed, involving higher timber harvest levels per acre and greater application of labor and capital for thinning, disease control, reforestation, and other purposes.  Yet, contrary to a common public impression, the more intensively managed state and private forests ‘appear to be healthier than [the] unmanaged forests,’ mostly in the national forest system” (p. 19). 

            The national forests, however, were increasingly “preserved.”  By 1996, the Sierra Club had moved to a radical position, pushing for a ban on all timber harvesting in national forests, even if it removed excess fuels in an effort to reduce forest fires!  Thus the radical reduction in timber harvest results not from a wood shortage but from “changing environmental values and shifting government policies” dictated by fervent environmentalists (p. 58).  These values, Nelson argues, are primarily religious.  “Environmentalism now claims in effect a unique access to the mind of God, reflecting the common substitution in the modern age of scientific truth for the earlier role of Judeo-Christian religion” (p. 131).  President Clinton’s Interior Secretary, Bruce Babbitt, said “‘we need not sacrifice the integrity of God’s creation on the altar of commercial timber production'” (p. 67).  He follows the lead of secular prophets, a long list headed by Gifford Pinchot, who envisioned conservation as a means of realizing “‘the Kingdom of God on earth'” (p. 69).  Today’s faithful tend to see preservation as a means of regaining the Garden of Eden!  Wilderness areas are frequently referred to as “‘cathedrals,’ ‘sacred places,’ and other religious terms” (p. 73).  Thus the Wilderness Society motto is a Thoreau declaration:  “in Wildness is the preservation of the world.”  Consequently, Thomas Bonnicksen, a respected professor at Texas A&M University, perceives that “zealots within the agencies, encouraged by some preservationist groups and ideologues in universities, have taken over our National Park and Wilderness areas and converted them into their own quasi-religious temples.”  They have renounced the “original purpose of providing for ‘the enjoyment of the people'” and instead aim to “satisfy the spiritual needs of a small but influential subculture” (p. 129).

            To deliver the nation’s forests from this influential subculture, Nelson suggests, we should abolish the U.S. Forest Service and decentralize its functions.  To allow states and smaller communities to decide what to do with forested lands would lead, he thinks, to better management, restored lumber harvests, healthier trees, and less destructive fires.  It would also diminish the influence of powerful environmental groups, with their Washington, D.C. headquarters, that lobby politicians and finance “scientific studies” to sustain their power.

161 War Against the Weak

Terri Schiavo’s recent death illustrates the continuation of a process detailed in Edwin Black’s War Against the Weak:  Eugenics and America’s Campaign to Create a Master Race (New York:  Four Walls Eight Windows, c. 2003).   The author, assisted by some 50 researchers combing various archives, links eugenics enthusiasts in the United States a century ago (who were primarily concerned with sterilizing the “unfit” and breeding a better species) with the Nazis who vigorously implemented their ideas a generation later.  “National Socialism,” Black says, “transduced America’s quest for a ‘superior Nordic race’ into Hitler’s drive for an ‘Aryan master race.’  The Nazis were fond of saying ‘National Socialism is nothing but applied biology,’ and in 1934 the Richmond Times-Dispatch quoted a prominent American eugenicist as saying, ‘The Germans are beating us at our own game'” (pp. xvi-xvii).

To accomplish this, eugenicists in both Germany and America had to defy and destroy a deeply engrained principle in Western Civilization:  the sanctity of life.  As President George W. Bush recently said, on the day Terri Schiavo died in Florida, a good civilization is distinguished by its care for its weakest, most vulnerable persons.  But challenging that position is a worldview rooted in Darwinian biology that insists a species evolves (and thus improves) as its fittest individuals survive.  Charles Darwin’s cousin, Francis J. Galton, wrote Hereditary Genius six years after the 1859 publication of On the Origin of Species and coined the word “eugenics” two decades later.  Galton is widely considered the “father” of that alleged scientific discipline.

Eugenic ideas quickly gained a favorable hearing in the United States.  In the same year Galton published Hereditary Genius, 1865, “the utopian Oneida Community in upstate New York [best known for its “free love” experiments under the guidance of John Humphrey Noyes] declared in its newspaper that, ‘Human breeding should be one of the foremost questions of the age. . . .’  A few years later, with freshly expounded Galtonian notions crossing the Atlantic, the Oneida commune began its first selective human breeding experiment with fifty-three female and thirty-eight male volunteers” (p. 21).  Though Galton himself disavowed such breeding endeavors, a small cadre of Americans envisioned infinite progress through racial purification.  One of the trustees of the American Museum of Natural History, Madison Grant, the author of The Passing of the Great Race, made the goal clear, “writing that Nordics ‘were the white man par excellence'” (p. 29).  Grant opposed all interracial unions, asserting that the inferior race degraded the “superior” race.  Thus the children of an Iroquois and a German were Iroquois.  Somehow the blood of “inferior” races dominates the genetic development of mixed-blood offspring!  To some eugenicists, one drop of a “mongrel’s” blood makes one a mongrel!

Genetic research in America thrived in large part because wealthy individuals (Mary Harriman) and foundations (Rockefeller, Carnegie) supported it.  In 1904 the Carnegie Institution’s Station for Experimental Evolution opened at Cold Spring Harbor, on Long Island, NY.  The station’s director, Charles Davenport, thenceforth played a crucial role, for nearly 40 years, as the eugenicists’ prime proponent.  Davenport began by gathering a sizeable research library and lab animals.  He effectively enlisted professors from the nation’s most prestigious universities as “associates” at Cold Spring Harbor.  From its inception, Black says:  “Eugenics was nothing less than an alliance between biological racism and mighty American power, position and wealth against the most vulnerable, the most marginal and least empowered in the nation” (p. 57).

One of the powerful men aligned with eugenics was Oliver Wendell Holmes, a justice of the United States Supreme Court, who wrote nearly 1,000 opinions in 30 years on the bench.  He interpreted the country’s Constitution as a “living” document, changing with the decades in accord with new experiences and convictions.  Wounded at Chancellorsville in the Civil War, he read Herbert Spencer’s Social Statics while recovering.  Converted to Spencer’s version of Social Darwinism, Holmes subsequently rejected, as “inherently absurd,” the notion that all men “are endowed by their Creator with certain inalienable rights.”  Truth to tell, he decided:  might makes right.  “‘Truth,’ he declared, ‘is the majority vote of that nation that could lick all others'” (p. 119).  Humanitarians, do-gooders, religionists, sentimentalists–all equally evoked Holmes’ disdain.

Before Justice Holmes’ court came the case of Carrie Buck, a Virginia woman declared “unfit” to bear children.  Under the laws of her state, crafted by eugenicists citing the “research” of Davenport and his colleagues at Cold Spring Harbor, defectives should be sterilized in order to improve the biological basis of society.  Since Carrie Buck was anything but demonstrably defective, her case wound its way to the nation’s highest court.  Writing for the majority of his colleagues in 1927, Holmes declared that Carrie Buck was a “feeble minded” woman who should be sterilized.  “‘It is better for all the world, if instead of waiting to execute degenerate offspring for crime, or to let them starve for their imbecility, society can prevent those who are manifestly unfit from continuing their kind.'” The state had the necessary power.  And, he concluded:  “‘Three generations of imbeciles are enough'” (p. 121).

Having demonstrated the power of eugenics in America, Black turns to its demonic results abroad.  Though folks like H.G. Wells in England insisted that the movement focus upon restricting the births of undesirables, growing numbers of scientists in northwest Europe, and especially in Germany, subtly advocated a more active approach:  “eugenicide.”  A report from the American Breeders Association toyed with the notion of “‘painless killing’ of people deemed unworthy of life.  The method most whispered about, and publicly denied, but never out of mind, was a ‘lethal chamber'” (p. 247).  Lethal chambers were widely used to eliminate unwanted pets in England, and a few eugenicists suggested that “idiots” and “imbeciles” could be similarly destroyed.  Unfit children, the American urologist William Robinson declared, should be chloroformed or given potassium cyanide.  And Madison Grant, in The Passing of the Great Race, summed it up:  “Mistaken regard for what are believed to be divine laws and a sentimental belief in the sanctity of human life tend to prevent both the elimination of defective infants and the sterilization of such adults as are themselves of no value to the community.  The laws of nature require the obliteration of the unfit and human life is valuable only when it is of use to the community or race” (p. 259).

Grant received a letter commending his book from an obscure German politician who referred to it as “his Bible.”  The politician was, of course, Adolf Hitler, whose “eugenic writings resembled passages from Grant’s The Passing of the Great Race” (p. 274).  And he would, in time, implement the eugenic policies so favored by Americans such as Grant.  Still more:  the Nazi legislation followed precedents already laid down by laws in America.  Naturally, Hitler derived his views from German eugenicists as well.  And during the 1920s there were vigorous proponents of race purification and perfection whose ideas bolstered his visions of a “master race” imposing its will upon the world.  To get such a race, “breeding facilities” were needed to “mass-produce perfect Aryan babies.”  They would fulfill Nietzsche’s aspiration for Ubermenschen–”taller, stronger and in many ways disease-resistant” (p. 367).  As defined by Hitler’s influential associate, Rudolf Hess, “National Socialism is nothing but applied biology” (p. 270).

To do this, professors, such as Ernst Rudin, were recruited and scholarly institutes, preeminently the Kaiser Wilhelm Institute, established.  Maps were drawn to indicate concentrations of “defective” and “half-breed” populations.  Such folks were dismissed as “worthless eaters” who led a “life unworthy of life.”  Fortunately for the professors, money flowed freely from the Rockefeller Foundation to various “researchers” in Europe.  A new punch card system, perfected and exported by IBM, facilitated such research.  Consequently, in the ’40s, “thousands of Germans taken from old age homes, mental institutions and other custodial facilities were systematically gassed.  Between 50,000 and 100,000 were eventually killed.  Psychiatrists, steeped in eugenics, selected the victims after a momentary review of their records, jotted their destinies with a pen stroke, and then personally supervised their exterminations” (p. 317).

More horrendous developments followed:  Buchenwald; Auschwitz; the Holocaust.  At Buchenwald, Dr. Edwin Katzen-Ellenbogen, who had spent many years in America and “served as chief eugenicist of New Jersey under then-Governor Woodrow Wilson” (p. 320), carried out experiments and supervised the killing of thousands of inmates.  He was, ironically, a Jew who was arrested by the Nazis in 1940.  At Auschwitz, another eugenicist, Josef Mengele–the “Angel of Death”–conducted scientific experiments and orchestrated the killing of thousands.  He was particularly interested in the study of twins, following the lead of Francis Galton, to determine precisely how genetics affected a person’s response to various experiments.  “Twins were the perfect control group for experimentation” (p. 348).

With the collapse of the Third Reich and the world’s horrified reaction to its eugenic aspirations, following WWII the movement’s leaders relabeled it and refurbished themselves.  Much of the alleged “scientific evidence” accumulated by eugenicists was discarded as spurious.  But the basic commitment remained.  The American Society of Human Genetics was established and elected as its president an American, Joseph Muller, who had worked at the Kaiser Wilhelm Institute in the ’30s.  German eugenicists who were not implicated in Hitler’s policies settled quickly into research positions in America and Germany.  James Watson, the American famed for co-discovering the structure of DNA, was deeply involved with the Cold Spring Harbor Laboratory for 40 years.  The term “eugenics” was replaced with a more respectable term, “genetics,” and given fresh energy by environmentalists who focused upon the dangers associated with the world’s “population explosion.”

And we have, today, thousands of scientists clamoring for what Black labels “newgenics.”  To eliminate birth defects, to design a perfect baby, to make ourselves masters of our racial development, are goals widely embraced by many geneticists.  And some propose more radical steps.  “Mass social engineering is still being advocated by eminent voices in the genetics community” such as James Watson, who in 2003 labeled the “lower 10 percent” of the population “stupid” and confessed:  “So I’d like to get rid of that, to help the lower 10 per cent” (p. 442).  Others propose cloning as a means whereby we can perfect our species.  Brave New World approaches!

Black’s book is packed with carefully documented information–60 pages of dense, double-column footnotes.  His research, in various languages, clearly demonstrates the close (if unintended) ties between American eugenics and Nazi genocide.  Though the links may be more correlative than causative, they certainly indicate influence.  There are, as one might expect, certain blind spots in Black’s vision.  Strangely enough, he indignantly condemns Margaret Sanger for supporting sterilization for eugenic purposes while endorsing her stance on abortion.  To oppose the sterilization of a retarded person while allowing the killing of unborn children seems morally confused at best!  But the very title of the book, War Against the Weak, rightly alerts us to the unending struggle at the heart of our culture.


As a companion volume, Richard Weikart’s From Darwin to Hitler:  Evolutionary Ethics, Eugenics, and Racism in Germany (New York:  Palgrave Macmillan, c. 2004) deserves careful study.  Weikart is an associate professor at California State University, Stanislaus, who previously published Socialist Darwinism:  Evolution in German Socialist Thought from Marx to Bernstein.  He does meticulous research and reaches cautious conclusions.  But his message is akin to Black’s:  certain ideas, rooted in Darwinian biology, were brutally implemented in Hitler’s Germany.

Darwin himself, in his Autobiography, noted that a person such as himself, having discounted if not rejected the reality of God and immortality, “can have for his rule of life, as far as I can see, only to follow those impulses and instincts which are the strongest or which seem to him the best one” (p. 21).  Herbert Spencer and Leslie Stephen, almost immediately, developed this position into a purely naturalistic, evolutionary ethics frequently labeled Social Darwinism.  T.H. Huxley–”Darwin’s bulldog”–interpreted Darwin to conclude that “only from death on a genocidal scale could the few progress” (p. 74).  David Friedrich Strauss, famed for his radical portrayal of the purely human “historical Jesus” in The Life of Jesus, published an equally shocking book, The Old Faith and the New in 1872, urging that the old Christian faith be replaced by a naturalistic “worldview containing large doses of Darwinian science” (p. 33).  Though Darwinians today generally demand that the biological theory be separated from its social implications, Spencer and Huxley rightly recognized the inescapable link between them.

The Nazis, six decades later, stepped through Darwin’s door and openly rejected Judeo-Christian morality, seeking to establish a new ethic rooted in both Darwin and Friedrich Nietzsche, whose philosophy was a “direct response to evolutionary ethics” (p. 46).  Nietzsche, for example, encouraged suicide and euthanasia, eliminating disabled children and incurably ill adults.  Many Nazis drank deeply of Nietzsche, and they took seriously the call to revalue all values, to construct a new morality more attuned to natural science.  Indeed, Professor Weikart’s research shows that certain unintended consequences seem inevitable when certain ideas are enthroned in ideological movements.  No one doubts that Darwin would have personally detested the Nazis’ concentration camps, but they are perhaps inevitable consequences of the philosophy he advocated.

Weikart discovered and documents the fact “that many Darwinists believed that Darwinism had revolutionary implications for ethics and morality, providing a new foundation for ethics and overturning traditional moral codes” (p. ix).  Eminent thinkers, such as Darwin’s Cambridge mentor, Adam Sedgwick, immediately recognized this in 1859.  Writing his former student, Sedgwick protested that Darwin ignored the moral or metaphysical aspects of human nature.  Doing so would gravely harm mankind, reducing it “into a lower grade of degradation than” ever recorded by historians (p. 1).  Before Darwin, Weikart repeatedly emphasizes, the “sanctity of life” was an intact, governing principle throughout Western Civilization.  Murder, abortion, infanticide, suicide, and euthanasia were both condemned and illegal.   Though the “sanctity of life” ethic was deeply rooted in Christianity, even the manifestly humanistic ideals of the French Revolution–Liberty, Equality, Fraternity–were repudiated by Darwinists, who celebrated instead “determinism, inequality, and selection” (p. 89).

A tight nucleus of positions and policies appeared wherever Darwinian evolution gained currency.  Ernst Haeckel, Darwin’s most aggressive and influential disciple in Germany, envisioned a radically different world, reconfigured by Darwin’s theory of natural selection.  Freed from centuries of Judeo-Christian tradition, Haeckel and his associates “denied any possibility of divine intervention, heaped scorn on mind-body dualism, and rejected free will in favor of complete determinism” (p. 13).  Darwinism, Haeckel argued, has inescapable ethical implications:  “(1) Darwinism undermines mind-body dualism and renders superfluous the idea of a human soul distinct from the physical body.  (2) Darwinism implies determinism, since it explains human psychology entirely in terms of the laws of nature.  (3) Darwinism implies moral relativism, since morality changes over time and a variety of moral standards exist even within the human species.  (4) Human behavior and thus moral character are, at least in part, hereditary.  (5) Natural selection (in particular, group selection) is the driving force producing altruism and morality” (p. 25).

Consequently, Haeckel advocated killing the “unfit” through abortion and infanticide and euthanizing the mentally ill as well as incurable cancer patients and lepers.  All such steps were a “logical consequence of his Darwinian monistic worldview” (p. 146).   Evaluating Haeckel, Weikart notes that “it is striking that the vast majority of those who did press for abortion, infanticide and euthanasia in the late nineteenth and early twentieth centuries were fervent proponents of a naturalistic Darwinian worldview.  Some did not overtly link their views on killing the feeble to Darwinism, though many did” (p. 149).  Anticipating the chorus of criticism, always issued by Darwinists who claim that biology has no social aspects, Weikart sums up his argument:  “First, before the rise of Darwinism, there was no debate on these issues, as there was almost universal agreement in Europe that human life is sacred and that all innocent human lives should be protected.  Second, the earliest advocates of involuntary euthanasia, infanticide, and abortion in Germany were devoted to a Darwinian worldview.  Third, Haeckel, the most famous Darwinist in Germany, promoted these ideas in some of his best-selling books, so these ideas reached a wide audience, especially among those receptive to Darwinism.  Finally, Haeckel and other Darwinists and eugenicists grounded their views on death and killing in their naturalist interpretation of Darwinism” (pp. 160-161).

This was also evident in the militarism so pronounced in WWI.  Darwinists frequently celebrated war as an effective means of natural selection.  The evolution of the human species advanced as “inferior races” were eliminated in combat.  The strongest rightly destroy the weak, and racial progress follows, as Darwin himself said.  In his Descent of Man, for example, Darwin predicted that “savage races” would be replaced by “civilized races,”  as was then taking place in New Zealand.  As H.G. Wells (whose simplistic evolutionary historical works were widely read) declaimed:  “‘there is only one sane and logical thing to be done with a really inferior race, and that is to exterminate it'” (p. 85).  To Franz Conrad von Hotzendorf, the Austrian chief of the general staff, naturalistic evolution explained everything.  He had read Darwin as a youngster, and thenceforth considered “‘the struggle for existence as the fundamental principle of all earthly events [and] the only real and rational foundation of any policy.’  History, he thought, was a continual ‘rape of the weak by the strong,’ a violent contest decided by bloodshed” (p. 173).  To the extent he had a moral code it was that of the ancient Sophist, Thrasymachus:  “‘”Right” is what the stronger wills'” (p. 173).

Weikart persuasively documents the degree to which such views permeated German society in the first three decades of the 20th century.  Such positions were by no means espoused by the majority of the people.  But highly aggressive and influential people did so.  Consequently, when Hitler seized power there were significant numbers of people prepared to support his racial and eugenic notions.  Most of them took their beliefs from popular writers, but those writers were generally disseminating positions espoused by eminent scientists like Haeckel.  With Haeckel–and then Hitler–they had rejected the Judeo-Christian morality as antiquated and unscientific.  With them, they believed the “Aryan race” was the “master race” and entitled to rule “inferior” peoples.  When necessary, certain “inferior” people should simply be eliminated.  Old-fashioned advocates of the sanctity of life could simply be ignored because Darwinism more accurately described the true reality of things.  “In Hitler’s mind Darwinism provided the moral justification for infanticide, euthanasia, genocide, and other policies that had been (and thankfully still are) considered immoral by more conventional moral standards.  Evolution provided the ultimate goals of his policy:   the biological improvement of the human species” (p. 215).

# # #

160 Schroeder’s Cosmology


                Tremors cascaded through the philosophical community when Anthony Flew, perhaps the world’s most famous atheist, recently announced that accumulating evidence pointing to a deistic Designer had persuaded him to believe in a Creator-God.  Flew has certainly not embraced theism, but he now seems aligned with thinkers such as Thomas Jefferson.  Explaining his new position, he named Gerald L. Schroeder as one of two thinkers who had most influenced him.  (The second man mentioned is Roy Varghese, the author of The Wonder of the World:  A Journey from Modern Science to the Mind of God.)  Following his undergraduate and graduate studies at the Massachusetts Institute of Technology and some years in America’s atomic energy establishment, Schroeder has worked at several research institutes in Israel.  In the midst of his scientific work, he also reared children, and their questions regarding Scripture and science prodded him to write three engaging and persuasive books:  Genesis and the Big Bang:  The Discovery of Harmony Between Modern Science and the Bible (New York:  Bantam Books, c. 1990); The Science of God:  The Convergence of Scientific and Biblical Wisdom (New York:  Broadway Books, c. 1997); and The Hidden Face of God:  Science Reveals the Ultimate Truth (New York:  Simon & Schuster, c. 2001).

            Since the same basic themes weave their way through all three books, I’ll focus on seven of what I take to be Schroeder’s central theses rather than dealing with each book independently.  Firstly, as a physicist seeking wisdom he acknowledges the ultimate importance of metaphysics.  Aristotle, in his great treatise, Metaphysics, noted that “all men suppose what is called Wisdom to deal with the first causes and the principles of all things.”  We cannot but wonder why there is anything rather than nothing.  “Why is there an ‘is’?” (Face, 1).  This is the truly amazing question underlying all Schroeder’s books.  For to understand what Aristotle called the universal “being qua being” is the greatest of all intellectual challenges, and Schroeder believes the evidence (both ancient and modern) points to “a metaphysical Force that brought the physical universe into being.  The universe is the physical expression of the metaphysical” (Face, 1). 

Summing up this conviction in his latest book, he says:  “The physical system we refer to as our universe is not closed to the nonphysical.  It cannot be closed.  Its total beginning required a nonphysical act.  Call it the big bang.  Call it creation.  Let the creating force be a potential field if the idea of God is bothersome to you, but realize the fact that the nonphysical gave rise to the physical.  Unless the vast amounts of scientific data and conclusions drawn by atheistic as well as devout scientists are in extreme error, our universe had a metaphysical beginning.  The existence–if the word existence applies to that which precedes our universe–of the eternal metaphysical is a scientific reality.  That single exotic fact changes the rules of the game.  In fact, it establishes the rules of the game.  The metaphysical has at least once interacted with the physical.  Our universe is not a closed system” (Face, 186). 

Given the importance of metaphysics, Schroeder’s second theme is the intricate relationship between mind, energy, and matter.  The universe, in its most ultimate and important aspect, is mental rather than physical.  The scientific materialism that has so dominated modern science–as well as certain currents of Greco-Roman philosophy–has dramatically lost its allure for 20th century physicists.  They virtually all now agree that “energy is the basis of matter” (Science, xii).  What’s increasingly clear, Schroeder argues, is that “wisdom and knowledge are the basis of, and can actually create, energy which in turn creates matter” (Science, xii).  This view is as controversial today as were Einstein’s theorems a century ago, but most everything points to that conclusion.  John Archibald Wheeler, a renowned Princeton University professor of physics, has “likened what underlies all existence to an idea, the ‘bit’ (the binary digit) of information that gives rise to the ‘it,’ the substance of matter” (Face, 8).   This is a truly “profound” position, for it means “that information is the actual basis from which all energy is formed and all matter constructed” (Face, 154).   What’s really Real, as Plato discerned, are the ideas, the forms, that shape matter.  However offensive this may be to philosophical materialists, it is no more strange than the widely-accepted notion that “the massless, zero-weight photon,” the most elementary of elements, “gives rise to the massive weight of the universe” (Face, 154). 

What scientists like Wheeler now recognize as information or ideas the ancient Hebrew Bible calls wisdom.  This leads to Schroeder’s third main emphasis:  the harmony between science and Scripture.  Today’s scientists, who have discovered the “underlying unity of the physical world,” are “on the brink of discovering an even more sensational reality, one predicted almost three thousand years ago, that wisdom is the basis of all existence.  ‘With the word of God the heavens were made’ (Ps. 33:6).  ‘With wisdom God founded the earth’ (Prov. 3:19)” (Face, 88).  The Bible ( rightly read and interpreted by great exegetes like Maimonides and Nahmanides, allowing for both the literal and symbolic aspects of Revelation) illuminates and harmonizes the most recent scientific discoveries.  Only the Bible, of all the ancient religious texts, generates serious interest in sophisticated scientific circles.  “It alone records a sequence of events that approaches the scientific account of our cosmic origins” (Science, 80).  Indeed, “The parallel between the opinion of present-day cosmological theory and the biblical tradition that predates it by over a thousand years is striking, almost unnerving” (Genesis, 67).  The ancient words of  “Divine revelation” accurately portray the creative process.  But “the words are only a part of the message.  The other part is placed within nature, the wisdoms inherent in the Creation.  Only when we understand those hidden wisdoms will we be able to read between the prophetic lines and fully understand the message.  With the help of science, we are learning to read between the lines” (Face, p. 173). 

            Fourthly, we now read creation in the awesome light of the Big Bang.  Fifty years ago, many physicists, such as Fred Hoyle, still held to the “steady state” cosmos, an eternally existent material world.  Christian thinkers (such as Athanasius in the fourth century and Thomas Aquinas in the 13th) who insisted on creation ex nihilo did so purely on the basis of Revelation, understanding that the word barah “is the only word in the Hebrew language that means the creation of something from nothing” (Genesis, 62).   It cannot be emphasized too strongly that creation ex nihilo “is at the root of biblical faith” (Genesis, 62).  Amazingly, like a relentless blizzard, recent cosmological data regarding the formation of the universe points to precisely such a singular event–an explosion of being–some 15 billion years ago, when literally everything was a super-concentrated bundle of energy “the size of a speck of dust.  It would have taken a microscope to study it” (Genesis, 65).  With incredible power, this concentrated bit of energy expanded and quickly transitioned into matter, taking form in accord with the four basic laws of the universe.  Though these laws largely explain the ways the universe thenceforward developed, they do not, in any way, explain why it came into being.  Neither physics nor any other branch of science can explain being.  So metaphysics, marvelously revealed in the Bible, is needed. 

            Granting the reality of the Big Bang and its explicit confirmation of Einstein’s theories, we move to Schroeder’s fifth point:  time is relative to the speed of light and is thus, like space and matter, hardly constant.  “It is highly significant that light was the first creation of the universe.  Light, existing outside of time and space, is the metaphysical link between the timeless eternity that preceded our universe and the world of time and space and matter within which we live” (Science, 165).  While we cannot fully fathom it, Einstein’s equations indicate that “tomorrow and next year can exist simultaneously with today and yesterday.  But at the speed of light they actually and rigorously do.  Time does not pass” (Science, 164).  A Rolex watch on the moon runs more rapidly than one on earth because it’s subject to less gravity.   Accelerated so as to approach the speed of light, it would virtually stop ticking but still work perfectly.  Depending on one’s reference point, “when a single event is viewed from two frames of reference, a thousand or even a billion years in one can indeed pass for days in the other” (Genesis, 34).  It is, thus, quite correct to insist that the Bible’s six day creation account and a 15 billion year-old cosmos are identical:  “Deep within Psalm 90, there is the truth of a physical reality:  the six days of Genesis actually did contain the billions of years of the cosmos even while the days remained twenty-four-hour days” (Science, 43).  However illogical it may seem, Schroeder insists that this is literally true:  “Six 24-hour days elapsed between ‘the beginning,’ that speck of time at the start of the Big Bang, and the appearance of mankind, and simultaneously, it took some 15 billion, 365-day years to get from ‘the beginning,’ or the Big Bang as astrophysicists call it, to mankind” (Genesis, 29).  From God’s standpoint–and that’s what’s recorded in the Genesis account–creation took six days.  But His days are the same as 15 billion years from our perspective, as we weigh the scientific data.

            To cite Schroeder more fully on this critical point, he says:  “To measure the age of the universe, we look back in time.  From our perspective using Earth-based clocks running at a rate determined by the conditions of today’s Earth, we measure a fifteen-billion-year age.  And that is correct for our local view.  The Bible adopts this Earthly perspective, but only for times after Adam.  The Bible’s clock before Adam is not a clock tied to any one location.  It is a clock that looks forward in time from the creation, encompassing the entire universe, a universal clock tuned to the cosmic radiation at the moment when matter was formed.  That cosmic timepiece, as observed today, ticks a million million times more slowly than at its inception” (Science, 58).  Amazingly, “This cosmic clock records the passage of one minute while we on Earth experience a million million minutes.  The dinosaurs ruled the Earth for 120 million years, as measured by our perception of time.  These clocks are set by the decay of radioactive nuclides here on Earth and they are correct for our earthly system.  But to know the cosmic time we must divide earth time by a million million.  At this million-million-to-one ratio those 120 million Earth years lasted a mere hour” (Science, 58).  Summing up what this all means, Schroeder says:  “In terms of days and years and millennia, this stretching of the cosmic perception of time by a factor of a million million, the division of fifteen billion years by a million million reduces those fifteen billion years to six days!” (Science, 58).  Though such statements may utterly perplex those of us mystified by the mysteries of modern physics, Schroeder’s tantalizing suggestions certainly open one’s mind to time dilation and the implications it has for understanding the “days” of creation.
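
The arithmetic itself is easy to check.  Taking Schroeder’s stretch factor of a million million ($10^{12}$) at face value,

$$\frac{15 \times 10^{9}\ \text{years}}{10^{12}} \approx 0.015\ \text{years} \approx 5.5\ \text{days}, \qquad \frac{120 \times 10^{6}\ \text{years}}{10^{12}} \approx 1.2 \times 10^{-4}\ \text{years} \approx 1\ \text{hour}.$$

Whether that factor of a million million is physically warranted is, of course, the substance of his argument; the division simply confirms that, granted the factor, fifteen billion years compress to roughly six days and the dinosaurs’ 120 million years to about an hour.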

            Periodically in his presentations Schroeder points out some of the glaring flaws of evolutionary theory, especially regarding the beginning of the universe and the origin of living organisms–what seems to me his sixth distinct emphasis.  His criticism comes in three categories:  1) the mathematical improbability of aimless evolution; 2) the glaring gaps in the fossil record; and 3) the willful deceptions advanced by some of evolutionary science’s premier advocates.

Schroeder, like most physicists (and unlike many biologists), finds the cosmos’ structures deeply, indeed astoundingly, mathematical.  But, strangely enough, “many of the texts on evolution eschew any semblance of a mathematical analysis of the theories that random reactions produced this ordered, information-rich complexity” (Face, 120).  When they do, they encounter difficulties.  For example, two evolutionary biologists invited a renowned chemist, Henry Schaffer, to calculate the mathematical probabilities underlying their presentation, Population Genetics and Evolution, and his calculations undermined their case, showing that “evolution via random mutations has a very weak chance of producing significant changes in morphology” (Face, 120).  Most biologists just ignore the mathematical problems entailed in their presentations.  Yet to Schroeder, alleged “scientists” who blithely dismiss the mathematical improbabilities of evolution betray the fundamental nature of the reality they claim to explain.  He cites favorably the 1968 work of a Yale University physicist, Harold Morowitz, who carefully calculated the probability of earthly life evolving by random selection (as most evolutionists insist).  Five billion years, he showed, affords too little time “for random chemical reactions to form a bacterium–not an organism as complex as a human, not even a flower, just a simple, single-celled bacterium.  Basing his calculations on optimistically rapid rates of reactions, the calculated time for the bacterium to form exceeds not only the 4.5-billion-year age of the Earth, but also the entire 15-billion-year age of the universe” (Genesis, 111).  “In short, life could not have started by chance” (Science, 85).

            Nor does the fossil record validate the theory of gradual, mindless, random evolution.  “Macro-evolution, the evolution of one body plan into another–a worm or insect or mollusk evolving into a fish, for example–finds no support in the fossil record, in the lab, or in the Bible” (Science, 16).  More than a century after Darwin, Niles Eldredge, one of the world’s most distinguished paleontologists, admitted that the evidence for the gradual evolution of living creatures was still lacking.  “The fossil record of the late 1900s,” says Schroeder, “is as discontinuous as that of Darwin’s (and Wallace’s) time” (Genesis, 134).  Species–and intricate things like eyes and gills–simply appear in the fossil record, and “stasis, not change, is the trend with all species yet formed” (Genesis, 135).  During the five-million-year Cambrian explosion, all the major extant phyla suddenly appeared.  Neither before nor since has anything remotely similar transpired.  For example, five phyla appeared in the Cambrian Era with various kinds of visual systems.  But there is no “common ancestor” for these seeing creatures.  Indeed, “there is no animal, let alone an animal with a primitive eye, prior to these eye-bearing fossils.  Random reactions could never have reproduced this complex physiological gene twice over, let alone five times independently.  Somehow it was preprogrammed” (Face, 121).  The suddenness of life’s appearance was definitively demonstrated by Harvard’s Elso Barghoorn, who studied the oldest fossil-bearing rocks available.  He discovered “fully developed bacteria” in rocks some 3.6 billion years old.  Living cells were, in fact, present at virtually the same time “liquid water first formed on Earth” (Face, 51).  “Overnight, the fantasy of billions of years of random reactions in warm little ponds brimming with fecund chemicals leading to life evaporated.  Elso Barghoorn had discovered a most perplexing fact:  life, the most complexly organized system of atoms known in the universe, popped into being in the blink of a geological eye” (Face, 51).

            Finally, Schroeder condemns some eminent evolutionists for their calculating deceits.  To win their case in the court of public opinion, learned scientists have fudged the evidence.  Darwin himself–seven times in the Origin of Species–”implored his readers to ignore the evidence of the fossil record as a refutation of his concept of evolution or to ‘use imagination to fill in its gaps'” (Science, 31).  Shortly thereafter, Charles D. Walcott, the director of the Smithsonian Institution, collected 60,000 Cambrian fossils from Canada’s Burgess Pass.  Hauling them back to Washington, D.C., he stacked them away in laboratory drawers.  Perhaps the richest fossil collection dealing with the appearance of life on earth was deliberately shelved.  Walcott devoutly believed in Darwinian evolution, and the fossils he collected challenged that belief.  If the evidence challenges the theory, hide the evidence!  So Walcott simply kept the fossils out of sight.  Rediscovered in the 1980s, these fossils have played a major role in making clear “Evolution’s Big Bang” in the Cambrian Era.

More recently, Harvard’s Nobel Prize-winning Professor George Wald declared (in Scientific American) that life had necessarily arisen from random chemical reactions.  To Schroeder, Wald’s remarks illustrate the fact that such views were “often based on poorly researched science presented as fact by one or a few noted personalities” (p. 110).  Wald was in fact so wrong that the magazine printed a retraction of his earlier declaration–an unheard-of correction of a Nobel laureate.  Further illustrating the deceitful strategies of eminent evolutionists, Schroeder shows how another Harvard biologist, Stephen Jay Gould, concluded an essay with a quotation from Darwin’s Origin of Species that (in Darwin’s text) left open the possibility that God might have played a role in creation.  Gould, however, deleted some of Darwin’s words and then capitalized a word in the middle of one of Darwin’s sentences to suggest the beginning of a new one, deftly altering the text’s original message.  This subtle but significant move shows how commitment to a theory can lead to a deliberate distortion of the truth.  Still more:  prestigious museum displays often maximize deception.  The London Museum of Natural History, setting forth a massive “demonstration” of evolution, managed only to portray “pink daisies evolving into blue daisies, little dogs evolving into big dogs, a few dozen species of cichlid fish evolving into hundreds of species of–you guessed it–cichlid fish.  They could not come up with a single major morphological change clearly recorded in the fossil record” (Face, 91).  How revealing that evolutionists routinely cite the “evolution” of fruit flies–produced by intensive genetic manipulation in laboratories–which yields nothing more than bewildering varieties of fruit flies!

Turning to the computer-generated models that allegedly prove evolution, Schroeder scoffs at the poor mathematical work of celebrated biologists like Richard Dawkins, whose work “proves only that his computer is working correctly!” (Science, 108).   Quite simply:  “As a way of supporting arguments for evolution, computer programs blithely show the transition of outer body forms from amoeba to fish to amphibian to reptile to mammal and human.  The electronic displays deliberately ignore the intricacy of the molecular functions of each cell and serve only to repress the impossibility of randomness as the driving force behind these life processes.  They are in fact an exercise in deception, an insult to adult intelligence” (Science, 189). 

Seventhly, Schroeder regularly discusses human beings as the crowning work of creation, freely responding to the Creator throughout history.  We’re uniquely conscious of ourselves and God.  Amazingly, “If the universe is indeed the expression of an idea, the brain may be the sole antenna with circuitry tuned to pick up the signal of that idea” (Face, 105).  Designed with the ability to discern God’s work and presence in His world, with eyes that are external extensions of the brain, we have, since Adam, been making civilization, preeminently through the use of language, oral and written.  With a brain capable of housing “the information contained in a fifty-million-volume encyclopedia, we ought to be sufficiently wise to succeed at the task” (Science, 170), though we have no idea precisely how non-material words get embedded in and then recalled from a material brain.  Indeed, “the brain is amazing.  The mind is even more so” (Face, 147).  Just as a radio pulls in music from radio waves, so the mind gleans data from the brain.  Without a radio there’s no music; without a brain there’s no thinking.  But a dead radio no more destroys the music than a damaged brain destroys the mind. 

“‘With wisdom’ God created the heavens and the earth” is one way of translating the familiar first verse of Genesis.  Wisely thinking about it all, Schroeder says:  “Life, and certainly conscious life, is no more apparent in a slurry of rocks and water, or in the primordial ball of energy produced in the creation, than are the words of Shakespeare apparent in a jumble of letters shaken in a bag.  The information stored in the genetic code common to all of life, DNA, is not implied by the biological building blocks of DNA, neither in the nucleotide letters nor in the phospho-diester bonds along which those letters are strung.  Nor is consciousness implied in the structure of the brain.  All three imply a wisdom that precedes matter and energy” (Face, 178).

To gain such wisdom, fully informed by the best understanding of Bible and science, is our privilege. 

159 Secularizing America


SECULARIZING AMERICA 

            In the 1960s, there was significant cultural ferment as Pope John XXIII “opened the windows of the church” to the modern world and orchestrated the Second Vatican Council, alleged devotees of Dietrich Bonhoeffer (no doubt misrepresenting him) called for a “religionless” Christianity fully attuned to the needs of the world, and Harvey Cox celebrated the “Secular City” as the only forum for a vibrant church.  Perhaps unconsciously, these religionists joined hands with various secularists who, for a century, had emerged as the architects and arbiters of culture.  Traditional, orthodox Christianity, especially, was portrayed as a remnant of the Middle Ages, when faith, not reason, prevailed, and it was widely assumed that truly enlightened, reasonable people would become secular humanists.

That picture must be re-evaluated in accord with a recent publication by Christian Smith, a sociology professor at the University of North Carolina, who has edited a valuable collection of essays, The Secular Revolution:  Power, Interests, and Conflict in the Secularization of American Public Life (Los Angeles:  University of California Press, c. 2003), that illuminates one of the singular developments of the past century.  Secularism has indeed triumphed in many sectors.  But we’ve been misled, says Smith, about the process that facilitated this triumph.  Secularism displaced religion–helped along by some within the evangelical establishment–as the entrenched worldview of the nation’s elite institutions, not because it is a necessary component of a technologically sophisticated and progressive people.  Rather, it was brought about by a committed band of skeptical, agnostic, liberal rebels, who plotted to re-make America in their own image.  They gained control of the “knowledge-production occupations” and promoted “materialism, naturalism, positivism, and the privatization or extinction of religion” (p. 1).

            Until the last quarter of the 19th century, Scriptural Protestantism, rooted in a “Scottish Common Sense Realist epistemology and Baconian philosophy of science” (p. 25), largely shaped America.  “On the broadest scale, the Protestant establishment maintained that Christian virtue, free market capitalism, and civic republicanism were working together to beget a civilization higher than humans had ever known–begetting perhaps the kingdom of God” (p. 26).  By 1880, however, a new view had gained a foothold.  Generally “progressive” and “secular” in nature, champions of this position wielded “science” as a club with which to drive religion into carefully circumscribed ghettos.  By the 1930s, “the Protestant establishment was in shambles” (p. 28).  Ordinary believers still populated church pews.  But the universities, newspapers, and judiciary had largely turned secular.

            This took place, Smith argues, because a self-anointed elite seized control of the nation’s power centers.  William James, speaking to the alumnae of Radcliffe College in 1907, summed up the attitude:  “‘We alumni and alumnae of the colleges are the only permanent presence [in America] that corresponds to the aristocracy in older countries.  We have continuous traditions, as they have; our motto, too, is noblesse oblige; and, unlike them, we stand for ideal interests solely, for we have no corporate selfishness and wield no powers of corruption.  We ought to have our own class-consciousness.  “Les Intellectuels!”  What prouder clubname could there be than this one?'” (p. 41).  William James and John Dewey, with their respective versions of pragmatism, hugely influenced 20th-century America–James reducing religion to psychology and Dewey driving it from the schools.

            The secular elite embraced Kant’s revolutionary Enlightenment dictum:  “Dare to think!”  By which they meant:  think for yourself without restraint.  They particularly embraced the positivism of Auguste Comte, with his “religion of humanity” and sociological “science.”  Herbert Spencer’s naturalistic Social Darwinism also attracted Americans such as William Graham Sumner and Andrew Carnegie, whose philanthropy often included blatantly anti-Christian provisos.  Indeed, the more pernicious aspects of evolution were social rather than biological in nature.  Scores of young Americans studied in German universities, absorbing their “historicism, idealism, theological liberalism, higher biblical criticism, and the ideal of academic freedom as autonomous rationality” (p. 57).  Introducing these ideas into America, establishing their dominance everywhere, became the driving motivation of the secularists–Les Intellectuels!

            Smith traces this story in his essay, “Secularizing American Higher Education.”  For roughly 700 years higher education had flourished, in the West, under the auspices of Christianity.  America’s colleges, prior to the Civil War, clearly continued this tradition, providing a solid education for their constituents.  Radicals intent on transforming them infiltrated the colleges, however, and by 1900 a new zeitgeist reigned.  Rather than respecting the religious commitments responsible for founding the colleges, the new class sought to remove religion from the curriculum and relegate it to increasingly rare chapel services.  Prestigious presidents–Charles Eliot at Harvard, Andrew Dickson White at Cornell–pushed the levers of power to secularize their institutions.  A new academic discipline, Smith’s own sociology, was particularly committed to this process, with virtually all the leading professors quite hostile to religion. 

            Generally speaking, these secularists worked subtly within religious institutions to slowly subvert them.  As Smith shows, they camouflaged their endeavors, seeking to beguile the religious folk who supported the colleges.  What they said in public was often quite different from what they said in their “scholarly” writings, lectures, and work within the schools.  As one of the leaders of the discipline of sociology, Edward Ross, said, “‘The secret order is not to be bawled from every housetop.  The wise sociologist will show religion a consideration . . . [and] will venerate a moral system too much to uncover its nakedness'” (p. 125).  Yet of the 170 general sociology textbooks published between 1880 and 1930, an overwhelming majority clearly sought to destroy “one moral and epistemological order” (p. 115), Christianity, and establish a secular counterpart.  The texts routinely embraced the views of Comte and Spencer, Darwin and Nietzsche.  Religion, as a purely personal emotional endeavor, might be tolerated.  But it could make no claims to knowing objective truth about anything important.

            Kraig Beyerlein focuses on the role of the National Education Association in his fine essay, “Educational Elites and the Movement to Secularize Public Education.”  We who routinely observe the NEA in action at the Democratic National Conventions, as well as in local electoral campaigns, should not be surprised at its strongly secularist stance.  Established early on (as the National Teachers Association in 1857) to promote “common Christianity” in the schools, the NEA had changed by 1880, moving within a decade to prevent the schools from conducting religious activities.  This turn followed an intense power struggle within the association, as those responsible for its founding sought to maintain its pro-Christian stance.  Beyerlein quotes eminent educators–many of them clergymen–in the 1870s and 1880s to show how thoroughly Christian were the nation’s public schools.  But as these men aged and were replaced by younger secularists, the NEA changed rapidly and a “new orthodoxy” was installed.  “Only vestiges of the old religious system remained” (p. 193).

            Eva Marie Garroutte has a short essay, “The Positivist Attack on Baconian Science and Religious Knowledge in the 1870s.”  She carefully studied Popular Science Monthly and Scientific American, journals for the intelligentsia in that era.  Before 1880, America’s leading scientists, such as Yale’s Benjamin Silliman, were frequently devout Christians who approached their studies from a Baconian and Scottish Common Sense perspective, envisioning their work as a “sacred science,” marvelously compatible with the Scriptures.  They worked inductively, eschewing unwarranted hypotheses, committed to drawing universal truths from particular facts.  Ultimately, there were “laws discovered by induction [that] were understood teleologically as descriptions of the mediate intervention of the divine in the world” (p. 198).  The Bible too, said Charles Hodge, was a compendium of facts that, inductively studied, led to an accurate understanding of God.

            This approach to science was overwhelmed by the “positivist attack” Garroutte describes.  Promoting Spencer, Darwin, and Huxley, the new class of intellectuals worked to spread “the gospel of naturalism.”  Cornell University’s Andrew Dickson White helped lead the assault, with his famous (if deeply flawed) History of the Warfare of Science with Theology in Christendom.  The positivists shrewdly attacked language, insisting (in the nominalist mode) that there is no real bond between words and the realities they describe.  Horace Bushnell had earlier anticipated this cleavage, arguing that religious language could never depict religious truths with any precision.  One could never take anything in the Bible “literally.”  Thus as the Darwinian controversy escalated, the positivists insisted that the Bible’s version of creation was purely poetic.  Consequently, the “religion-friendly science” of the 19th century was replaced by the religion-hostile science of the 20th.

            Reducing religion to psychology, a la William James, was a powerful part of the secularization process.  As the astute Yale historian H. Richard Niebuhr noted, reducing Christianity to psychology was a “‘sterile union’ resulting from the revolution introduced by William James and his followers” (p. 270).  In Keith Meador’s judgment, James’ religious teaching “is finally a threadbare version of pious Christianity, for James was, as he aptly characterized himself to a friend, ‘a Methodist minus a Savior'” (p. 292).  Though James’s Varieties of Religious Experience is warmly supportive of the phenomenon, he basically espoused a pious humanitarianism devoid of any doctrinal content.  The reality of sin never troubled him.  We are, by nature, in need of healing, not forgiveness.  To the extent preachers assure us that “‘God is well, and so are you’” (p. 292), all is well!  To move from James to the “I’m OK, You’re OK” pabulum of the ’60s–or the “self-esteem” educators’ frenzy of the past two decades–easily illustrates the power of his thought.

One of the men supporting this move was Charles Clayton Morrison, long-term owner and editor of The Christian Century.  His story is evaluated by Keith G. Meador in “‘My Own Salvation’:  The Christian Century and Psychology’s Secularizing of American Protestantism.”  Sadly enough, by 1939 Morrison had come to lament this process, writing an essay, “How My Mind Has Changed,” and declaring:  “I had baptized the whole Christian tradition in the waters of psychological empiricism, and was vaguely awakening to the fact that, after this procedure, what I had left was hardly more than a moralistic ghost of the distinctive Christian Reality.  It was as if the baptismal waters of the empirical stream had been mixed with some acid which ate away the historical significance, the objectivity and the particularity of the Christian revelation, and left me in complete subjectivity to work out my own salvation in terms of social service and an ‘integrated personality‘” (p. 269).  Few statements that I’ve read more powerfully depict the trajectory of mainline Protestantism in the 20th century!

            Morrison was reared in the revivalistic atmosphere of his father’s preaching, within the Disciples of Christ denomination.  After alternately pastoring and attending college, Morrison graduated from Drake College in 1898 and headed for the University of Chicago to do graduate studies in philosophy under John Dewey and George Herbert Mead.  Along the way he embraced higher criticism, evolution, and the Social Gospel, which proved to be central tenets of his creed.  Darwin, he thought, was the 19th-century mental giant who had freed man from ignorance.  Thus he wrote, in 1910:  “If there is any word which is especially dear to all modern liberals it is the word evolution.  We have never seen a liberal who is not an evolutionist” (p. 280).  This is important, because “it is Darwin’s theory of evolution that undergirds almost every psychological theory since the nineteenth century” (p. 284).  G. Stanley Hall, along with his teacher, William James, one of the most influential psychologists of that era, noted that as a student he discovered Darwin’s theory and rejoiced to know that “‘all worlds and all in them had developed very gradually and by an inner and unremitting impulsion from cosmic mist and nebulae . . . while all religions, gods, heavens, immortalities, were made by mansoul'” (p. 284).

So theology must adjust to the truth of evolution, which for Morrison meant discarding anything supernatural in its claims.  Thus it is Jesus the very human teacher, not the eternally-begotten Incarnate Son of God, that matters.  Religious dogma must be tossed in the trash barrel of history.  Only empirically based, scientific studies–especially in the realm of psychology–can be trusted as authorities for living well.  As James declared in his influential The Principles of Psychology, “Psychology is the science of the mental life” (p. 283).  Self-realization, mental health, social service and genial good cheer should be the true Christian traits.  G. Stanley Hall wrote a two-volume study of Jesus, titled Jesus, The Christ, in the Light of Psychology, wherein he basically reworked “Ernest Renan’s The Life of Jesus, so that Jesus is not the one in whom Christians believe, but the one with whom Christians believe” (p. 288).  In accord with F.D.E. Schleiermacher’s Liberalism, Hall adjusted attractive elements of the Christian message to his psychological world view.

Now enters John Dewey!  Influenced by Hall and James, Dewey in turn left a deep mark on America’s educators–including religious educators!  Pastoral care in the 20th century, ironically, was deeply shaped by Dewey’s 1916 classic, Democracy and Education.  No longer, said Dewey’s acolytes within the seminaries and churches, should we worry ourselves with questions such as the existence of God.  What matters is how our idea of God affects our lives.  If our idea of God enables us to live more productively, it’s true enough for us.  Indeed “‘that conception of God is truest which aids most in guiding, ennobling, comforting, and strengthening man in his devotion to moral ends'” (p. 295).  Theology becomes psychology, preaching becomes therapy, pastoral care becomes counseling.

All of these trends were celebrated by C.C. Morrison in The Christian Century.  Psychological texts, the works of James, Hall, Dewey, et al., were regularly reviewed and promoted in its pages.  In the midst of it all, Morrison seemed strangely oblivious to what was really happening.  The allegedly Christian publication seemed remarkably akin to various secular magazines.  By 1939 he realized that much that constitutes Christianity had been deleted from his journal’s pages.  The Social Gospel he’d energetically promoted between WWI and WWII–espoused by the likes of Jane Addams and Harry Emerson Fosdick, featuring the routine support of labor unions, women’s suffrage, prohibition, and pacifism–seemed increasingly vacuous.  A decade later, Morrison declared that the secularization process he’d promoted throughout most of his life had robbed the Church of her riches.  Embracing the “modern culture” had been the “undoing of Protestantism.”  Rather than energetically critiquing the secular society, liberalism had critiqued Christianity!  Morrison concluded, sadly enough, that by embracing “Dewey’s language of ‘adjustment,’ the Darwin-inspired language of ‘progress,'” Protestant Liberalism had resulted in “an accommodation to psychology whose final result was the secularization of American public life” (p. 303).

            The law, too, was deliberately secularized, as David Sikkink makes clear in “From Christian Civilization to Individual Liberties:  Framing Religion in the Legal Field.”  Following the Civil War, under the influence of law school professors, the legal profession moved away from its traditional grounding in the natural law.  Earlier judges had freely utilized biblical sources and moral reasoning in deciding cases.  A new elite insisted on a “case law” approach more “scientific” in nature, and religion was often labeled “sectarian” and thus somewhat irrelevant in the courtroom.  Then, as the 20th century dawned, this approach faded and there was a rapid shift “from general religion to civil liberties.”   Judges like Felix Frankfurter and Oliver Wendell Holmes increasingly insisted on “making” rather than merely “interpreting” the law. 

            Holmes was especially influential in this transition.  While a student at Harvard, he embraced science (without any God) as his religion.  Darwin’s theory especially affected Holmes as a student.  He also served valiantly as a soldier and saw some of the worst fighting in the Civil War, becoming quite cynical regarding mankind.  Back in Boston, he joined an elite intellectual circle–the “Metaphysical Club”–that included Charles Peirce and William James.  Embracing their pragmatism, Holmes published The Common Law in 1881 and quickly moved up through the judicial system, ultimately becoming a United States Supreme Court justice.  Holding that the law is whatever the courts decide it to be, without any higher grounding, he rigorously abolished many of the legal traditions that had guided jurists for more than a century.  The popular notion–often labeled “sociological jurisprudence”–that the Constitution is a “living document,” constantly changing as judges apply it to new situations, comes directly from Holmes.

            In these essays, and others that I’ve not discussed, there is great attention to detail.  Hundreds of footnotes and bibliographic citations make this an eminently scholarly collection.  Basically written for sociologists–and occasionally getting overly absorbed in intra-mural quibbling about various theories of secularization–the book nevertheless rewards careful reading.  What’s clear is that a self-anointed elite has waged an intense war for more than a century, determined to take charge of the most powerful cultural institutions in America.  By changing the very nature of education, religion, science, and law, they have greatly changed American society.  But, these essays make clear, they won their victories neither through evidence nor argument.  They basically worked within various institutions until they had the power to impose their views.  And thus they have secularized America. 

# # #