248 “The Rising Tyranny of Ecology”

In 1973, Alston Chase abandoned his academic career—as a tenured philosophy professor with degrees from Harvard, Oxford and Princeton—and “returned to nature” in Montana’s mountains.  At the time he considered himself an “environmentalist” and sought to live accordingly.  Successfully relocated, he undertook a writing project to document the development of Yellowstone National Park under the reigning “ecosystems management” principle adopted by park managers.  What he discovered—and detailed in Playing God in Yellowstone (Boston:  The Atlantic Monthly Press, c. 1986)—was the destructiveness of misguided good intentions, leaving the park significantly degraded and endangered.  Particularly informative is his philosophically nuanced treatment of “the environmentalists” who naively assumed and promoted “the subverted science” articulated by Rachel Carson, whose deeply flawed Silent Spring (labeled “the Uncle Tom’s Cabin of modern environmentalism”) recruited so many of them to the movement.  Such environmentalists include “the new pantheists” who believe, with Thoreau, that:  “In wildness is the preservation of the world.”  They’re often enamored with “California Cosmologists” such as Theodore Roszak, Alan Watts, Fritjof Capra, and assorted Native American shamans, whose thoroughly Gnostic notions (e.g. panpsychism) promise a mystical union with Nature conducive to the inner bliss of self-realization.  And they follow an assortment of “hubris commandos”—well-heeled urban elites with political connections who want to reserve the wilderness for backpackers.

His work on Yellowstone piqued Chase’s concern for broader environmental policies impacting the nation, so he researched and wrote In A Dark Wood:  The Fight over Forests and the Rising Tyranny of Ecology (New York:  Houghton Mifflin Company, c. 1995), one of the most probing and important ecological studies in print.  Combining a detailed narrative of events with a penetrating analysis of deeper issues, In A Dark Wood effectively exposes the heart of America’s confusion regarding how to live rightly with the natural world.  Anyone concerned with the health of the natural world—and of the cultural and political world—should read and reflect on Chase’s work.  Evaluating his conclusions at the beginning of his treatise, Chase confesses they “were far more disturbing than I had anticipated.  An ancient political and philosophical notion, ecosystems ecology, masquerades as a modern scientific theory.  Embraced by a generation of college students during the campus revolutions of the 1960s, it had become a cultural icon by the 1980s.  Today, not only does it infuse all environmental law and policy, but its influence is also quietly changing the very character of government.  Yet, as I shall show, it is false, and its implementation has been a calamity for nature and society” (p. xiii).  Those collegians—drinking deeply from Rachel Carson’s Silent Spring and adopting the pantheism of Emerson and John Muir—began an effective long march through the nation’s institutions and transformed environmentalism into a religious faith with a radical political agenda.

That process stands revealed in their determination to preserve the forests of Washington, Oregon, and California.  Rejecting the traditional notion that loggers with their sawmills could wisely manage the forests and provide for their sustained rejuvenation, the activists demanded the forests be quarantined in what was imagined to be a primitive paradise and allowed to flourish free from human contamination.  Chase shows how timber “harvests soared during the postwar building boom” and “trees grew faster than they were being cut” (p. 73).  This was especially true of California’s redwood forests.  The trees were healthy and provided a healthy income for thousands of folks throughout the Pacific Northwest.  Admittedly, “old growth” forests were declining, but they had been effectively replaced by faster growing, younger, healthier trees.  Still more:  those “old growth” forests were largely a figment of the activists’ imagination!  The best historical evidence shows that pre-Columbian Indians had carefully set and controlled fires, and the Pacific Northwest forests two centuries ago were “‘more open than they are now,’” containing “‘islands of even-aged conifers, bounded by prairies, savannas, groves of oak, meadows, ponds, thickets and berry patches.’”  Largely due to broadcast Indian burning, they “‘were virtually free of underbrush and coarse woody debris that has been commonplace in forests for most of this century’” (p. 404).

But the defenders of the mythical “old growth” forests, draped in the mantle of “ecology” and taking “the balance of nature” as axiomatic, believe “nature knows best” and requires us to promote her sustainability.  This position is less a scientific schema than an ancient ideological stance shaped by evolutionists such as Ernst Haeckel (who coined the word ecology in 1866) and Aldo Leopold (whose Sand County Almanac became an instructional manual for environmental activists).  It is less an agenda fueled by evidence than a faith founded in improbable historical and metaphysical assumptions.  The Ecosystem became God!  Enamored of “deep ecology,” environmentalists “unwittingly embraced ideas that synthesized an American religion of nature with German metaphysics:  a holism that supposed individuals were only parts of a larger system and had no independent standing; antipathy to Judaic and Christian values of humanism, capitalism, materialism, private property, technology, consumerism, and urban living; reverence for nature; belief in the spiritual superiority of primitive culture; a desire to go ‘back to the land’; support for animal rights; faith in organic farming; and a program to create nature reserves” (p. 129).  Fortuitously for them, the Endangered Species Act both enacted their aspirations and opened legal portals whereby they could effectively attain their goal of transforming America.

The ecological activists, looking for an opportunity to kill the logging industry with its “clear-cutting” and access roads, chanced on an “endangered species” in the Pacific Northwest—the spotted owl.  Only a few birds (14 in the first important study) were found, and they appeared to prefer “old growth” forests.  To preserve these owls’ “ecosystem” a massive effort was almost immediately launched to halt all activities in the forests that might endanger it, though in fact “spotted owl policy would be built on the thin air of uneducated guesswork” (p. 251).  Well-funded by environmental organizations such as the Nature Conservancy and the Sierra Club, and by wealthy eastern aristocrats such as Teresa Heinz Kerry who finance foundations (e.g. Pew, Rockefeller, Heinz and the Tides), the activists successfully manipulated the media, academia, and the judiciary to preserve the extensive lands allegedly needed for the spotted owl to flourish.  They deliberately ignored accumulating scientific studies finding ever-more spotted owls thriving throughout the region, especially in recently harvested private timber properties.  “By 1993 six thousand to nine thousand would be estimated to live in northern California alone, and perhaps an equal number in Oregon and Washington.  Yet federal demographic studies continued to claim that the species remained in deep decline” (p. 365).  True believers cannot be deterred by the facts!

So they successfully pursued their agenda, primarily through the courts, and managed to earmark large sections of the Pacific Northwest as “old growth” forests, forever inviolable and off-limits to cutting of any sort.  Timber harvests in California dropped by 40% within two decades.  Loggers lost their jobs, sawmills closed, and small towns shriveled.  Unlike the urban environmentalists (burnishing their self-esteem by supporting the Sierra Club, which paid skilled lawyers to pursue their agenda through the courts), the working folks in the forests lacked both the money and organizational skills with which to resist the lock-down of their region.  “Saving the owl had effectively shut down an area larger than Massachusetts, Vermont, New Hampshire, and Connecticut combined, costing the economy tens of billions of dollars and casting tens of thousands out of work” (p. 398).  When grass-roots groups (identified as the Wise Use Movement) tried to rally and defend themselves and their livelihood, environmentalists invested millions of dollars to discredit them, calling them “a mob” bankrolled by the evil timber industry.  Environmentalists orchestrated meetings with President Bill Clinton and his Vice President Al Gore, who then packed the President’s Council on Sustainable Development with leaders of various well-heeled environmental organizations.  Remarkably, within three decades the ecological “movement became a war launched by the haves against the have-nots.  It is a situation analogous to what the late Christopher Lasch has called ‘the revolt of the elites’ whereby ‘upper-middle-class liberals have mounted a crusade to sanitize American society.’  Indeed, Lasch could have been thinking of environmentalists when he added that ‘when confronted with resistance, members of today’s elite betray the venomous hatred that lies not far beneath the smiling face of upper-middle-class benevolence’” (p. 415).

Tragically, the natural world would suffer harm along with the loggers and the small town economies they sustained.  “The great effort to save old growth would eventually destroy the very landscapes it was intended to preserve.  For it demonstrated an important principle:  that seeking to halt change merely accelerates it.  Nothing more clearly revealed this truth than the rising threat of wildfire” (p. 400).  Allegedly “saving” forests and wildlife, the preservationists paved the way for the fires we now witness throughout the West.  Trees that could have been logged and provided a living for thousands of people now burn, for wherever old trees die and underbrush thrives the potential for massive fires increases.  For instance:  “Officials in southern California, following the 1993 firestorm, attributed the lack of prescribed burning that could have reduced or eliminated much of the destruction to public opposition, some of which was based on concern for the habitat of the Stephens kangaroo rat and the gnatcatcher” (p. 401).  The raging fires should awaken us to the truth of Dante’s words that provide Chase the title for his book.  In The Divine Comedy the great poet said:  “I went astray / from the straight road and woke to find myself / alone in a dark wood.  How shall I say / what wood that was!  I never saw so drear, / so rank, so arduous a wilderness.”  Today we’re in a “dark wood” that results from our captivity to an ancient philosophical error:  identifying the good with what is “natural,” imagining the “state of nature” as ideal for man, formulating “new values based on systems ecology, which from the beginning was less a preservation science than a program for social control.  Supposing that protecting ecosystems was the highest imperative for government, it increasingly viewed the exercise of individual liberty as a threat” (p. 413).  

* * * * * * * * * * * * * * * * * *

Nothing better illustrates the “rising tyranny of ecology” than Elizabeth Nickson’s Eco-Fascists:  How Radical Conservationists Are Destroying Our Natural Heritage (New York:  HarperCollins Publishers/Broadside Books, c. 2012).  She claims to “walk the green walk more than anyone I’ve met” and lives on 16 acres immersed in older-growth forest in a geothermal-heated house on a Canadian island (Salt Spring) off the coast of British Columbia.  There she witnessed and recorded, in fascinating detail, how we have been misled by “a corrupt idea—that an ecosystem has to be in balance, with all its members present in the correct proportion, to be considered healthy” (p. 6).  According to the litany:  “Nature knows best.  Man is a virus and a despoiler and must be controlled” (p. 18).  Though touted as “conservation biology” it is a demonstrably “bad science” that is doing great harm.  “Evil may be too strong a word for us modernists to use comfortably, but what else do you call an idea that ruins everything it touches?” (p. 173).  “In just thirty-five years, conservation biology has created one disaster after another, in something that observers are now calling an error cascade.  Tens of millions have been removed from their beloved lands.  Immensely valuable natural resources have been declared off-limits to the most desperate in the developing world” (p. 200).  Consequently:  “Range, forest, and farm are dying; water systems have been destroyed.  Conservation biology has created desert and triggered the dying of entire cultures” (p. 200).

A seasoned journalist who had worked in various parts of the world as a reporter for Time magazine, Nickson went home to care for her dying father and remained on the land because she learned to love it.  She “built a cottage at the top of my hill” and “resolved to stay” (p. 12).  Whenever she tried to do literally anything on the land she owned and sought to improve, however, she encountered the front line of a totalitarian movement—“an uncompromising and rigid bureaucratic command-and-control structure, which is creating yet another hybrid of the totalitarian state” (Kindle #82)—that saddled her with a series of irrational and onerous restrictions, leaving her not only angry at the fanatical environmentalists on her island and the senseless bureaucracy but concerned for the future of our world.  In the process she effectively “lost all but 4 acres of the original 28.  I still pay taxes on the 16.5 acres I supposedly have left, but I’m lucky I am allowed to walk on it” (p. 177).  She had to deal with “grim zealots [many of them angry, wealthy, divorced women] seeking to remake the world” in accord with their mantra of “sustainability,” who find allies in affluent NGOs and “fervent true believers in federal and state agencies” such as the EPA (#98).  She discovered a “labyrinthine public planning process” aptly described by historian David Clary as “the eternal generation of turgid documents to be reviewed and revised forever.”

True to their Leftist principles, environmental zealots follow the utopian visions formulated by Rousseau and Marx, validating the oft-uttered generalization that “when the Iron Curtain fell, fellow travelers migrated to the environmental movement.  And when they arrived to transform the rural world—a world few of us visit except on vacation, when no one is paying attention—they brought their planning with them” (p. 45).  Consequently:  “There is no starker way to describe what is taking place right now in the country than as the full flourishing of the bureaucratic state.  Private property rights have been largely removed, the culture is dying, but the state, consisting of federal, state, and local ministries and departments, has bulked out so that a giant superstructure of bureaucrats with rulebooks piled high around their desks flourishes, grows, and feeds on ever-diminishing wealth” (p. 47).  

Facilitating this process (and feathering their own nests while granting rare privileges to their wealthy political friends such as Harry Reid) are powerful organizations such as the Nature Conservancy (TNC), the world’s 10th largest NGO, “the biggest of the big dogs, the mythic wolf-king of the forest primeval” (p. 61).  In a complicated, convoluted and surreptitious process, The Nature Conservancy works with “the nation’s richest individuals, like Ted Turner, David Letterman, the Rockefellers, and the DuPonts.  Basically, TNC is acting as agent for the wealthiest among us, acquiring enormous tracts of land, using $25 donations from its 1.3-million-strong membership and $100 million in annual government money, and then selling that land at a discount to the very rich, who in effect receive a substantial tax discount as well as an extremely beautiful place in which to establish a country estate” (p. 68).  The good folks living on the land distrust and fear TNC, so it generally “operates through a proliferation of ‘partner’ land trusts, conservancies, and operatives.  TNC’s sending polite, fresh-faced kids into the middle of nowhere to start local actions for waterbirds or watersheds or ancient forests was the trigger that started the landslide collapse in rural America” (p. 72).  Environmentalists have created legions of smaller foundations, now run “by a subset of grim zealots seeking to remake the world” (#96).  The feared “robber barons” of yore have been replaced by equally pitiless celebrities such as Tom Cruise, Teresa Heinz, and Robert Redford!  Readers such as I (for many years a member and admirer of The Nature Conservancy) will never forget Nickson’s meticulous deconstruction of TNC—and by inference the Sierra Club, the Wilderness Society, etc.  

Her personal frustration led to an investigation—including an extensive journey throughout the rural West (marked by an in-depth interview with Alston Chase in Montana) as well as plowing through mounds of written materials—that resulted in the publication of Eco-Fascists.  She explored the forests where logging and sawmills once sustained a vibrant culture and the open range backcountry where cattle once ranged.  There she found:  “Deserted lands, mile after mile after mile.  No one on the highways, not even trucks.  One broken little hamlet after another.  . . . .  What I was looking at was death.  Death not just of the little towns but death of millions of acres of rangeland.  . . . it was like driving through Ghost World, with wraiths drifting across the fields whispering of what was once all fecundity and life” (p. 235).  She came to believe that there has been a well-orchestrated war on rural America, where folks earn their living from the land rather than preserving it as sanctuary for either veneration or vacation retreats.  Enamored with their own purity, environmentalists have effectively sequestered nearly 700 million acres of land, 30 percent of the nation’s land base.  “The amount of land classified as wilderness has grown tenfold since ecosystem theory took flight, growing from 9 million acres in 1964 to 110 million acres today” (p. 96).  Amazingly, as a result of crusades to create parks and “wilderness areas,” nearly half (40%) of New York state, “almost 14 million acres—is in the process of being rewilded, turned back, in all essentials, to Iroquoia” (p. 40).  Worldwide the same process proceeds apace—as a result of the creation of parks and refuges “more than 10 percent of the developing world’s landmass has been placed under strict conservation—11.75 million square miles, more than the entire continent of Africa” (p. 38).  In the process, “more than 14 million indigenous people have been cleared from their ancestral lands by conservationists” (p. 36).
 

With the support of the Clinton Administration in the 1990s and the Obama Administration today, environmental activists have banned logging in millions of acres in the national forests.  However well-intended such bans, the meticulous study of Holly Fretwell, Who Is Minding the Federal Estate?—“the most important analysis of the effects of environmental activism on rural America to date” in Nickson’s judgment (p. 129)—shows “that everything, everything, we have been doing was wrong” (p. 130).  Wildfires vividly illustrate this, for nothing—neither timber harvesting nor road building—“can compare with the damage that wildfires inflict on” the forests (p. 130).  The fires resulting from environmental policies consume vastly more timber than “evil corporations” could possibly have done, and the devastation inflicted on spruce and pine trees by the pine beetle and budworm could have been controlled by rapid cutting had not environmentalists insisted the bugs be allowed to pursue their destructive ways.

Nickson admits:  “The title of this book is harsh, particularly when used with regard to environmentalists, whom most people view as virtuous at best, foolish at worst.  But I do not use this term lightly, nor as a banner to grab attention.  My father landed on D-day and, at the end of the war, was put in charge of a Nazi camp and told to ‘sort those people out.’”  He was thus highly sensitive to the fact “that man defaults to tyranny over and over again, and while the tyranny of the environmental movement in rural America has not reached what its own policy documents say is its ultimate goal—radical population reduction—we cannot any longer ignore that goal and its implications” (#132).  And she believes there is in fact an answer:  “The Gordian knot of the countryside mess can be solved with one swift blow of the sword.  Property rights must be restored to the individuals who are willing to work their lives away tending that land.  The people, the individuals and families, in other words, who want it.  Confiscation by government, multinationals, and agents of the megarich—the foundations and land trusts—must be reversed.  Otherwise devastation beckons” (p. 314).

247 How Liberalism Became Our State Religion

Barack Obama’s 2008 presidential campaign and election clearly appealed to and elicited a strongly religious fervor.  Devotees fainted at his rallies, messianic claims were attached to his agenda, and Obama promised a fundamental “transformation” of America.  Celebrating his election, he grandiosely declared that people henceforth would see that “this was the moment when the rise of the oceans began to slow and the planet began to heal.”  Consequently, actor Jamie Foxx urged fans to “give an honor to God and our lord and savior Barack Obama.”  MSNBC commentator Chris Matthews enthused:  “This is the New Testament” and “I feel this thrill going up my leg.”  Louis Farrakhan, closely aligned with Jeremiah Wright, Obama’s Chicago pastor, told his Nation of Islam disciples:  “When the Messiah speaks, the youth will hear, and the Messiah is absolutely speaking.”  And there’s even The Gospel According to Apostle Barack by Barbara Thompson.  Though previous presidents—notably John F. Kennedy—elicited something of the same enthusiasm, Obama is nearly unique in America.  But he is not at all unusual when placed against the backdrop of human history, when again and again “charismatic” leaders have claimed and been endowed with supernatural powers.

Thus there is good reason to seriously ponder Benjamin Wiker’s Worshipping the State:  How Liberalism Became Our State Religion (Washington:  Regnery Publishing, Inc., c. 2013).  He prefaces his treatise with a typically prescient statement by G. K. Chesterton:  “‘It is only by believing in God that we can ever criticize the Government.  Once abolish . . . God, and the Government becomes the God.  That fact is written all across human history . . . .  The truth is that Irreligion is the opium of the people.  Wherever the people do not believe in something beyond the world, they will worship the world.  But, above all, they will worship the strongest thing in the world’” (p. 1).   And inasmuch as the State has (during the past five centuries) gradually expanded its powers, there is a natural tendency to worship it.    

Though secular liberals have frequently touted their “tolerance” and commitment to “diversity,” there is a totalitarian shadow—an irreligious dogmatism—evident in their many current anti-Christian endeavors:  the “war on Christmas” with efforts to enshrine alternatives such as “Winter Solstice;” the cleansing of any Christian content from public school curricula (while simultaneously promoting Islam); the dogmatic support of naturalistic evolution rather than any form of intelligent design in the universities; the removal of crosses or nativity scenes on public lands; the desecration of Christian symbols by “artists” of various sorts; the assault on Christian ethics through court decisions (e.g. Roe v. Wade) and programs such as the abortifacient provisions in Obamacare.  Systematically imposed by the federal courts (following the crucial 1947 Everson v. Board of Education Supreme Court decision), “the federal government has acted as an instrument of secularization, that is, of disestablishing Christianity from American culture, and establishing in its place a different worldview” (p. 11).

Lest we restrict this process to America, however, we must chart some powerful historical developments in Western Civilization that have been unfolding for half-a-millennium.  To Wiker, the triumph of Liberalism in these centuries enabled growing numbers of folks to liberate themselves from the “curse” of Christianity and replace the Church with an enlightened and nurturing State.  “The founders of liberalism believed that Christianity was a huge historical mistake, and therefore they reached back again to the pagans for help in loosening the Christian hold on the world, and quite often adopted precisely those things in paganism that Christianity had rejected” (p. 22).  Consequently, “Christians today find themselves in a largely secularized society” quite akin to the ancient world with an easy-going sexual ethos; “it is as if Christianity is being erased from history, and things were being turned back to the cultural status quo of two thousand years ago” (p. 37).  

Christianity, of course, emerged within a pagan world wherein the state (Egyptian pharaohs, the Athenian polis, Imperial Rome) had been routinely idolized.  Following Christ’s wonderful prescription—“render unto Caesar the things that are Caesar’s and to God the things that are God’s”—his followers established the “two cities” approach definitively set forth by St Augustine.  Priests and kings are to preside over different, though not totally isolated, realms.  Clearly delineated in the Bible, “The distinction between church and state, religious and political power, is peculiar to Christianity, and the church invented it” (p. 44).  Of ultimate importance to early Christians was doctrinal Truth, an uncompromising insistence on the singular claims of Christ Jesus, the LORD of His heavenly kingdom.  Christians should not deify the state, and no king should defy God’s Law!  Though routinely blurred in practice and often resembling a wrestling match requiring energetic corrections (such as the Cluniac reforms in the 10th and 11th centuries), the separation of church and state provided the key to much that’s distinctive in Western Civilization by preventing the “Church from becoming a department of the state.”  Prescriptively, in 494 A.D. Pope Gelasius wrote a famously important letter to the Eastern Emperor Anastasius, insisting on a clear separation between “the sacred authority of the priesthood and the royal power.”  (In the East, by contrast, a “Caesaropapism” developed, reducing the Church to an arm of the Byzantine Empire).  Thenceforth, uniquely in the West, two realms were established with neither dominating the other.

During the past 500 years, however, this balance slowly shifted and secular powers have imposed their way on the churches.  Wiker describes it as “the rise of liberalism and the re-paganization of the state.”  Fundamental to this progression was Niccolo Machiavelli, who wrote The Prince in 1513 and “invented the absolute separation of church and state that is the hallmark of liberalism.”  The Church had drawn lines between religious and political powers, “but Machiavelli built the first wall between them.  In fact, his primary purpose in inventing the state was to exclude the church from any cultural, moral, or political power or influence—to render it an entirely harmless subordinate instrument of the political sovereign” (p. 104).  An effective ruler—the strong-armed prince—must ignore religious and moral prescriptions, following a “might-makes-right” formula.  Machiavellian secularism now appears in both the “soft liberalism” designed to satisfy our physical needs and the “hard liberalism” of fascist states.  To Machiavelli, the prince should appease the ignorant masses and “‘appear all mercy, all faith, all honesty, all humanity, all religion’” (p. 110).  Working surreptitiously, however, he should promote a “re-paganized” religion and state-controlled educational system.  “The current belief that the church must be separated from the state and walled off in private impotence—leaving, by its subtraction from the public square, the liberal secular state—all that is Machiavelli’s invention.  The playing out of this principle in our courts today is in keeping with his goal of creating a state liberated from the Christian worldview” (p. 119).  Machiavelli’s moral nihilism also fit nicely with newly-empowered nation-states which followed the cuius regio, eius religio (“whose realm, his religion”) compromise negotiated in 1555 at the Peace of Augsburg and quickly moved to control the churches.

England’s King Henry VIII—guided by Thomas Cromwell, who had studied Machiavelli’s teachings at the University of Padua—brutally illustrated this trend by establishing the Church of England.  He and his successors placed themselves directly under God, controlling both church and state.  Henry supervised the publication of the 1539 Great Bible, featuring an engraving of himself handing copies of it to both the Archbishop of Canterbury (Thomas Cranmer) and his Lord Chancellor (Cromwell).  A century later, “England gave the world the immensely influential political philosopher Thomas Hobbes, author of the Leviathan, who constructed an entirely secular foundation for Henry’s church, and therefore gave us the first truly secular established church in a major modern state—more exactly, an absolutist, autocratic version” (p. 122).  To accomplish this, he first reduced all reality to the material realm, subtly denying the non-materiality of both God and the soul and eliminating any objective, absolute moral standard.  In Hobbes’ world, lacking both Natural and Divine Law, good and evil are mere labels attached to feelings that either please or displease us.  Thus, Hobbes famously said, in our natural state we are at war with everyone and, consequently, our lives are “‘solitary, poor, nasty, brutish, and short’” (p. 130).  To corral our nastiness, a Leviathan—an all-powerful Government—must rule.  We need an absolute Sovereign to grant and protect our “rights.”  As with Machiavelli, Hobbes knew the masses needed religion, and he simply insisted the Sovereign had the right to prescribe and enforce it through the Church of England.  His “church is entirely a state church, completely under the king’s power” (p. 134).

Liberalism, similarly, insists the Church must accommodate the state, and “Liberal Christianity is the form that the established religion of the state takes—perhaps not its final form, but its most visible, obvious form” (p. 120).  To accomplish this, liberal thinkers during the Age of Reason determined to destroy the authority of Scripture, and the “demotion of the Bible from revealed truth to mere myth is the result” (p. 58).  To attain this end Benedict Spinoza marshaled his formidable intellect, garnering credit for being the “father” of both “modern liberal democracy” and “modern Scripture scholarship.”  More blatantly materialistic than either Machiavelli or Hobbes (declaring God is Nature), he adumbrated a might-makes-right political philosophy that flowered in Hegel, “who declared that the Prussian state was the fullest manifestation of the immanentized ‘spirit’ of God” (p. 145).  In a state thus deified, of course, Scripture must be displaced, so Spinoza simply denied any supernatural dimension to the written Word.  By definition, miracles—especially miracles such as the Incarnation and Resurrection—cannot occur, so “higher critics” cavalierly dismissed all such accounts.  To the extent the Bible has merit, its message was reduced “to one simple platitude:  ‘obedience toward God consists only in love of neighbor’” (p. 154).

Within the next two centuries this same secularizing process wormed its way into the churches.  As a result of the “higher criticism” launched by Spinoza, a “secularizing approach to Scripture was deeply entrenched among the intelligentsia [such as David Friedrich Strauss, a disciple of Hegel, who wrote The Life of Jesus Critically Examined] and had made great headway in European universities.  The aim was ‘de-mythologizing,’ removing from the Biblical text (just as Spinoza had dictated) all of the miracles, and hence undermining all the doctrinal claims related to Christ’s divinity, so that readers were left with, at best, Jesus the very admirable moral man who was misunderstood to be divine by his wishful disciples.  Christianity—so the Scripture scholarship seemed to establish—was built upon a case of mistaken identity.  But the moral core could be salvaged” (p. 240).  Still more:  through the evolutionary processes (both natural and societal) we humans can deify ourselves!  We should worship Man rather than God, the creature rather than the Creator!  

Sharing Spinoza’s intolerance for intolerance, John Locke proposed a softer (“classic”) form of liberalism, though he fully supported its secular essence and proposed a “reasonable” rather than traditionally orthodox form of Christianity.  Eschewing doctrine to emphasize morality, Locke promoted a “mild Deism” that proved quite influential in 18th century England and America.  Personally pious—and the author of many biblical commentaries especially popular in America—Locke was (many thought) sufficiently “Christian” to embrace philosophically.  Concerned to preserve permanent moral standards, he espoused a version of the Natural Law, but he displaced the Church as a mediating institution and left the individual “facing the state alone” (p. 228).  A privatized religious faith is fine, he thought, so long as it makes no claims to shape public policies.   And his “classical” liberalism was (notably in post-WWII America) progressively folded into the more radical forms attuned to Hobbes and Spinoza. 

In today’s churches, Wiker laments, Spinoza’s “materialistic mindset has increasingly taken hold, and the church has become correspondingly anemic.  The church thus weakened by unbelief in the supernatural is what we call the mainline or liberal Christian church.  That church has total faith in materialist science, fully embraces the ‘scientific’ study of Scripture fathered by Spinoza, and professes a completely de-supernaturalized form of Christianity that is entirely at home in this world and only vaguely and non-threateningly associated with the next” (p. 153).  Thus we are confronted, as H. Richard Niebuhr famously said, with theologians teaching that:  “‘A God without wrath brought men without sin into a kingdom without judgment through the ministrations of a Christ without a cross’” (p. 153).  With this Spinoza would be pleased!  “To sum up Spinoza’s kind of Christianity:  You don’t need the Nicene Creed if you’re nice.  People who fight over inconsequential dogmas are not nice.  They’re intolerant” (p. 155).  

Furthering the “liberal” agenda of the European Enlightenment, Jean-Jacques Rousseau envisioned a secular “civil religion” (outlined in his Social Contract) replacing Christianity.  His agenda was implemented by men such as Robespierre (“radical liberals”) in the French Revolution and still exerts enormous influence in our world.  Rousseau propounded his own purely naturalistic version of Genesis, imagining how things were in a pure “state of nature.”  Noble savages, free from the constraints of Judeo-Christian morality, followed their passions and enjoyed the good life.  All were equal and none permanently possessed anything.  To regain that lost estate, a “liberal state” is needed—one that “does not define law in terms of the promotion of virtue and the prohibition of vice, but in terms of the protection and promotion of individuals’ private pleasures, which—since all such pleasures are natural—are declared to be rights.  Any limitation of these ‘rights’ is considered unjust; that is, justice is redefined to mean everyone getting as much of whatever he or she wants as long as he or she doesn’t infringe on anyone else’s pursuit of pleasure” (p. 172).  

Having carefully explained the views of secular liberalism’s architects, Wiker shows how Leftists of various sorts implemented it in the centuries following the French Revolution, for “as the first attempt to incarnate the new liberal political order in a great state, the French Revolution is iconic for liberalism” (p. 200).  Importantly, a purely naturalistic worldview must be crafted and imposed.  We must be persuaded that “we live in a purposeless universe, so that each person has just as much right as anyone else to pursue his or her arbitrarily defined goals or ends” (p. 187).  Liberals triumphantly cite the Darwinian doctrine of biological evolution to prove “that the development of life is driven by entirely random, material processes,” that man “is an accident,” and that we are not made in God’s image but the product of a “meandering and mindless” natural process (p. 194).  Each person freely fabricates and follows whatever moral standards he desires, relaxing into a hedonistic utilitarianism calculated to enjoy the greatest good for the greatest number.  In effect, this has led to a resurgence of a pagan ethos at ease with abortion, euthanasia, promiscuity, sodomy and pedophilia.  

To accomplish this, liberals determined to deprive the Christian religion of any real power.  In late 19th century France this became clear as officials swept away crucifixes and saints’ statues in public places, outlawed religious processions, closed religious schools, and renamed city streets after Enlightenment heroes rather than saints.  More importantly, they seized control of the educational system, making it an agency of the state.  Secularists in America sought the same ends.  To Wiker:  “One cannot overestimate how significant it was in France (and is in America) for liberals to have gained complete state control of education, and for that education to be mandatory.”  This precipitated “a top-down revolution wherein a relatively small minority may impose its worldview upon the entire population using state power.  And the education establishment in our own country, as was the case in France, is dominated by radicals and socialists from the Left, from the universities right down to the elementary schools” (p. 216).  

Thus Liberalism came to America’s shores, first in the form of Locke and later under the auspices of “higher critics” and socialists of various hues.  In a sense, Wiker argues:  “America had a kind of Jacobin class bubbling away underneath its Protestant surface, plotting its own version of a radical cultural revolution” (p. 263).  Thomas Paine, one of the more influential publicists during the American Revolution, represents this phenomenon.  The author of Common Sense, promoting independence from England, he also wrote The Age of Reason, promoting Deism and anti-Christian prejudices.  Thomas Jefferson avidly embraced both Locke (e.g. The Declaration of Independence) and Paine (e.g. The Life and Morals of Jesus of Nazareth), laying the groundwork for the famous “wall of separation between church and state” in a letter he sent to the Danbury Baptist Association in 1802.  Not until after WWII, however, did the Supreme Court enshrine this Jeffersonian comment as a reason to exclude religion from the public square.  Though Jefferson represented only a small minority of America’s Founders, his anti-Christian secularism slowly spread through the body politic as the decades passed.  

To a degree this Jeffersonian secularism prevailed in America, Alexis de Tocqueville said, because on a practical level 19th century Americans were notably materialistic—seeking comfort and prosperity without compunction.  They were thus quite “inclined to follow Locke, both in theory and in practice, and hence already well on our way to allowing the soul and heaven to fade away.  Christianity was often quite fervent in America, but it was subtly reconstructed to be compatible with passionately this-worldly material pursuits.  It was not a Christianity that could produce martyrs or even severe judges of the fallen secular order” (p. 268).  By the end of the century, then, the nation was unfortunately vulnerable to the radicals at the universities who determined to transform it.  Scores of young scholars, following the Civil War, sailed to Europe (especially German universities) and returned with Spinoza and Rousseau, Darwin and Spencer, Strauss and Marx, entrenched in their minds.  They then either established or controlled the nation’s preeminent universities, which (given the largess of state and federal governments) began to shape the cultural life of America.  New academic disciplines—including sociology and psychology—insisted that trained “experts” do for the people what they could not do on their own.  And the newly-minted law schools, personified by Oliver Wendell Holmes, systematically sought to impose a secular agenda on the land.  Progressive politicians, including Theodore Roosevelt, Woodrow Wilson, and Franklin Roosevelt, heeded their admonitions and implemented their goals.  

The time has come, Wiker argues, to disestablish the secular humanism now ruling America under the guise of “progressivism.”  Like scores of other political ideologies, it’s as clearly a religion (ironically, an unbelief established as a belief) with its own dogmas regarding creation, man’s nature and purpose in life, sin and salvation, good and evil, right and wrong, church and state, death, immortality and life everlasting.  “Liberalism once appeared to be about freeing everyone, believers and non-believers alike, from government-imposed religion and morality, but it has shown that that was just a ruse for establishing its own particular worldview, one that is fundamentally antagonistic to Christianity” (p. 312).  To mount the counterrevolution, believers must first target the nation’s universities—primarily by teaching truthful history—first stemming and then reversing the currents of liberal orthodoxy.  

# # #

246 Christ the King

For most of his life N.T. Wright—one of the world’s most distinguished biblical scholars as well as an active churchman and bishop in the Church of England—has pondered various questions regarding Jesus and His people.  In Simply Jesus:  A New Vision of Who He Was, What He Did, and Why He Matters (New York:  HarperOne, c. 2011), he sets forth some definitive answers to those questions.  “This book is written,” he declares, “in the belief that the question of Jesus—who he really was, what he really did, what it means, and why it matters—remains hugely important in every area, not only in personal life, but also in political life, not only in ‘religion’ or ‘spirituality,’ but also in such spheres of human endeavor as worldview, culture, justice, beauty, ecology, friendship, scholarship, and sex” (p. 5).  He also endeavors to move beyond the “conservative vs. liberal,” or “personal salvation vs. social gospel,” divisions by subsuming them all beneath his thesis regarding the neglected Truth declaring Christ’s Kingship.  

To find who Jesus really was requires serious historical research, seeking to understand His milieu rather than re-shaping Him to fit ours.  First, that means using the proper sources—primarily the four canonical gospels.  It also means understanding the ancient milieu within which they were written, when a powerful religious movement (labeled a “philosophy” by the Jewish historian Josephus) reflected an expectation of the coming Messiah and insisted “that it was time for God alone to be king” (p. 41).  Over the centuries Israel’s prophets, reflecting on crucial events such as the Exodus and Exile (cf. Ezekiel 34), had envisioned a time when God fully manifested his royal authority on earth as well as heaven, working through purified hearts rather than foods and rituals.  Israel’s poets (cf. Psalm 2) expected that YHWH, working through His anointed Son, would “establish his own rule over the rest of the world from his throne in Zion” (p. 50).  Jesus, drawing on such passages from the Psalms and Isaiah, portrayed Himself as the “suffering servant” expected by some first century Jews, but “Nobody, so far as we know, had dreamed of combining these ideas in this way before.  Nor had anyone suggested that when the prophet spoke of ‘the arm of YHWH’ (53:1)—YHWH himself rolling up his sleeves, as it were to come to the rescue—this personification might actually refer to the same person, to the wounded and bleeding servant” (p. 173).  

That is precisely what happened in Jesus, his disciples insisted!  “Within a few years of his death, the first followers of Jesus of Nazareth were speaking and writing about him, and indeed singing about him, not just as a great teacher and healer, not just as a great spiritual leader and holy man, but as a strange combination:  both the Davidic king and the returning God.  He was, they said, the anointed one, the one who had been empowered and equipped by God’s Spirit to do all kinds of things, which somehow meant that he was God’s anointed, the Messiah, the coming king.  He was the one who had been exalted after his suffering and now occupied the throne beside the throne of God himself” (p. 54).  God’s plan was fulfilled, Luke declared, when Jesus ascended the Cross rather than a throne—“or, rather, as all four gospel writers insist, a cross that is to be seen as a throne.  This, they all say, is how Jesus is enthroned as ‘King of the Jews.’  Jesus’ vocation to be Israel’s Messiah and his vocation to suffer and die belong intimately together” (p. 173).  Jesus and His disciples saw the Cross as “the shocking answer to the prayer that God’s kingdom would come on earth as in heaven” (p. 185).  

Consequently, God Himself is in charge of His Kingdom, ruling through his Son Christ Jesus.  “It was this new world in which God was in charge at last, on earth as in heaven.  God was fixing things, mending things, mending people, making new life happen.  This was the new world in which the promises were coming true, in which new creation was happening, in which a real ‘return from exile’ was taking place in the hearts and minds and lives both of notorious sinners and of people long crippled by disease” (p. 91).  Inevitably this provoked animosity from the principalities and powers determined to replace YHWH!  As is revealed in Jesus’ wilderness temptations, the LORD battles Satan and his earthly satraps—a battle finished on the Cross, where Jesus forever defeated the powers of darkness.  

That Christ is King explains the frequent NT references to Jesus forgiving sins and replacing the Temple (where sins were normally forgiven).  The Temple was YHWH’s dwelling, the sacred site where His Shekinah glory declared His presence.  It was literally the center of the world “where heaven and earth met” (p. 132).  When Jesus dramatically cleansed the Temple He “was staking an implicitly royal claim:  it was kings, real or aspiring, who had authority over the Temple” (p. 127).  By this action Jesus declared “that the Temple was under God’s judgment and would, before too long, be destroyed forever” (p. 129).  Indeed, He Himself would become the Temple!  Still more:  He became the new Sabbath and Jubilee!  Time and space are transformed in the new creation wherein He now rules.  

Emblematic of the new creation is the Passover meal Jesus celebrated with His disciples.   It was a traditional Jewish ceremony, but it was radically new.  “Like everything else Jesus did,” Wright says, “he filled the old vessels so full that they overflowed.  He transformed the old mosaics into a new, three-dimensional design.  Instead of Passover pointing backward to the great sacrifice by which God had rescued his people from slavery in Egypt, this meal pointed forward to the great sacrifice by which God was to rescue his people from their ultimate slavery, from death itself and all that contributed to it (evil, corruption, and sin).  This would be the real Exodus, the real ‘return from exile.’  This would be the establishment of the ‘new covenant’ spoken of by Jeremiah (31:31).  This would be the means by which ‘sins would be forgiven’—in other words, the means by which God would deal with the sin that had caused Israel’s exile and shame and, beyond that, the sin because of which the whole world was under the power of death.  This would be the great jubilee moment, completing the achievement outlined in Nazareth” (p. 180).  

“The gifts of bread and wine,” Wright continues, “already heavy with symbolic meaning, acquire a new density:  this is how the presence of Jesus is to be known among his followers.  Sacrifice and presence.  This is the new Temple, this strange gathering around a quasi-Passover table.  Think through the Exodus themes once more.  The tyrant is to be defeated:  not Rome, now, but the dark power that stands behind that great, cruel empire.  God’s people are to be liberated:  not Israel as it stands, with its corrupt, money-hungry leaders and its people bent on violence, but the reconstituted Israel for whom the Twelve are the founding symbol” (p. 180).  The Last Supper, of course, set the stage for Jesus’s crucifixion and Resurrection; thereafter his followers—His reconstituted Israel—quickly spread around the world declaring “Jesus is Lord, and He is risen.”  The Risen Lord unveiled “the beginning of the new world that Israel’s God had always intended to make” (p. 191), and in His post-Resurrection appearances He materialized as “a person who is equally at home ‘on earth’ and ‘in heaven’” (p. 192).  After 40 days, He ascended into heaven.  But His heaven permeates the earth—Jesus is in “heaven” but He is everywhere present on earth as well.  “If Easter is about Jesus as the prototype of the new creation, his ascension is about his enthronement as the one who is now in charge.  Easter tells us that Jesus is himself the first part of new creation; his ascension tells us that he is now running it” (p. 195).  

And in time He will fully assert His rule.  He’s coming again!  “Heaven is God’s space, God’s dimension of present reality, so that to think of Jesus ‘returning’ is actually, as both Paul and John say in the passages just quoted, to think of him presently invisible, but one day reappearing” (p. 202).  The new world envisioned in Romans 8 and Revelation 21-22 will be a place under Christ’s control, “administering God’s just, wise, and healing rule” (p. 202).  “The second coming is all about Jesus as the coming Lord and judge who will transform the entire creation.  And, in between resurrection and ascension, on the one hand, and the second coming, on the other, Jesus is the one who sends the holy Spirit, his own Spirit, into the lives of his followers, so that he himself is powerfully present with them and in them, guiding them, directing them, and above all enabling them to bear witness to him as the world’s true Lord and work to make that sovereign rule a reality” (p. 203).  

We Christians (his Christ-bearers, His followers) are assigned a vital role in the Kingdom, for God ever intended to rule earth through human beings.  Jesus redeemed us on the Cross in order for us to join him, ruling the world in accord with His design.  “In God’s kingdom, humans get to reflect God at last into the world, in the way they were meant to.  They become more fully what humans were meant to be.  That is how God becomes king.  That is how Jesus goes to work in the present time.  Exactly as he always did” (p. 213).  And He established His Church (His Body), wherein we work to accomplish His ends.  Our work (as concisely outlined in the Beatitudes) is to bear witness to His way in His world.  

* * * * * * * * * * * * * * * * * * * *

In How God Became King:  Getting to the Heart of the Gospels (New York:  HarperOne, c. 2012) Tom (a.k.a. N.T.) Wright continues to develop the thesis earlier enunciated in Simply Jesus.  He thinks we have lost touch with the canonical gospels, using them as props or tools to further our own agendas rather than as sources demanding our prayerful attention and implementation.  He acknowledges that for 20 centuries numerous scholars have devoted much time and intellectual firepower to the task of understanding them, but he thinks they have, by and large, failed to rightly discern and declare their real message.  (There is, of course, more than a little hubris in any declaration such as Wright’s that he alone has at last found The Truth—but that is something of a scholarly virus, an occupational hazard, frequently found in brilliant folks such as he.)  

In Wright’s reading of Church history, orthodox theologians and preachers have (rather narrowly following St. Paul or Luther or Calvin) reduced “the gospel” to the historic creeds—i.e. Apostles’ and Nicene—and neglected if not totally bypassed the Gospels.  “The great creeds, when they refer to Jesus, pass directly from his virgin birth to his suffering and death,” whereas the four Gospel writers “tell us a great deal about what Jesus did between the time of his birth and the time of his death.  In particular, they tell us about what we might call his kingdom-inaugurating work:  the deeds and words that declared that God’s kingdom was coming then and there, in some sense or other, on earth as in heaven.  They tell us a great deal about that; but the great creeds don’t” (p. 11).  “The gospels were all about God becoming king, but the creeds are focused on Jesus being God” (p. 20).  The creeds are not wrong, Wright insists, in what they affirm!  But when the Faith is reduced to creedal verities the Jesus Message gets lost.  

The Message got lost early on as misinterpretations came to dominate the Church!  For 1500 years or so Christians have seemed to deny “that the Jewish context of Jesus’ public career was playing any role in theological or pastoral reflection,” and He became the “founder” of the faith, “with the type of Christianity varying according to the predilections of the preacher or teacher” (p. 110).   Three centuries ago Enlightenment thinkers, reviving the hedonistic materialism of Epicurus and Lucretius and heeding biblical critics such as H.S. Reimarus and Baruch Spinoza, embarked on a quest for the “historical Jesus” that refused to see Him as the Gospels reveal Him.  As variously portrayed by multitudes of liberal professors and preachers, poets and songsters, Jesus appears as “a revolutionary, hoping to overthrow the Romans by military violence and establish a new Jewish state.  Or he’s a wild-eyed apocalyptic visionary, expecting the end of the world.  Or he’s a mild-mannered teacher of sweet reasonableness, of the fatherhood of God and the brotherhood of ‘man.’  Or perhaps he’s some combination of the above” (p. 26).  Indeed, according to Rudolf Bultmann and his 20th century epigones, details regarding His life have no bearing on much of anything, for the Gospels (in their view) are not bona fide biographies conveying truthful details.  They were fanciful projections, written long after the events described, of an evolving community looking for illustrations to justify their “faith.”  

As a result of skeptical scholarship, “there seems a ready market right across the Western world for books that say that Jesus was just a good Jewish boy who would have been horrified to see a ‘church’ set up in his name, who didn’t think of himself as ‘God’ or even the ‘Son of God’, and who had no intention of dying for anyone’s sins—the church has got it all wrong” (p. 27).  To Wright, such a reading of the Gospels clearly ignores their obvious content.  Markedly deistic, Enlightenment thinkers wanted nothing to do with a God who intervenes on earth, much less actually rules anything.  They rejected both earthly kings and the heavenly King come to earth in Jesus.  “But the whole point of the gospels is to tell the story of how God became king, on earth as in heaven.  They were written to stake the very specific claim towards which the eighteenth-century movements of philosophy and culture, and particularly politics, were reacting with such hostility” (p. 34).  The Deism promoted by Voltaire and Thomas Paine removed God from His world.  “The divine right of kings went out with the guillotine, and the new slogan vox populi vox Dei (‘The voice of the people is the voice of God’) was truncated; God was away with the fairies doing his own thing, and vox pop, by itself, was all that was now needed” (p. 164).  

It’s now time to escape the intellectual shackles of the eighteenth century!  It’s time to read the Gospels with 20-20 vision, taking them as trustworthy sources regarding who Jesus was and what He declared.  For, Wright incessantly repeats, they give us a largely forgotten narrative, “the story of how Israel’s God became king” (p. 38).  As the Messiah—a Jewish Messiah fulfilling the Hebrew Scriptures—Jesus came not so much to provide a pathway to an ethereal heaven removed from the earth as to establish an outpost of heaven on earth.  “Jesus was announcing that a whole new world was being born and he was ‘teaching’ people how to live within that whole new world” (p. 47).  To rightly hear the Gospel we must turn down the volume of skeptical scholars and moralistic reformers and hear the annunciation of Jesus “as the climax of the story of Israel” (p. 65).  As Matthew insists, Jesus consummates the history of Israel initiated by father Abraham and “will save his people from their sins” (Mt 1:21).  But Jesus came to save more than the children of Israel, and “the reason Israel’s story matters is that the creator of the world has chosen and called Israel to be the people through whom he will redeem the world.  The call of Abraham is the answer to the sin of Adam.  Israel’s story is thus the microcosm and beating heart of the world’s story, but also its ultimate saving energy.  What God does for Israel is what God is doing in relation to the whole world.  That is what it meant to be Israel, to be the people who, for better and worse, carried the destiny of the world on their shoulders.  Grasp that, and you have a pathway into the heart of the New Testament” (p. 73).  

Mark’s gospel begins with Jesus’ baptism, where He is anointed with the Spirit and declared God’s Son by the Father.  He then selected His 12 disciples, symbolizing the 12 tribes of Israel, and did dramatic things, such as calming the storm on the Sea of Galilee, that illustrated what He was doing as God rescuing His people.  Toward the end of Mark’s Gospel, we encounter a Roman centurion who declared the crucified Christ to be truly the Son of God.  Given his background, we assume the centurion didn’t fully understand what transpired on Golgotha.  “For him, the phrase ‘God’s son’ would normally have meant one person and one person only:  Tiberius Caesar, son of the ‘divine’ Augustus” (p. 94).  Yet the centurion tacitly recognized a larger truth:  in Jesus God regained His rightful throne as earth’s real Ruler.  

This too John makes clear in the Prologue to his Gospel, where he “takes us back to the first books of the Bible, to Genesis and Exodus.  He frames his simple, profound opening statement with echoes of the creation story (‘In the beginning . . .’, leading up to the creation of humans in God’s image) and echoes of the climax of the Exodus (‘The Word became flesh, and lived among us,’ 1.14, where the word ‘lived’ is literally ‘tabernacled’, ‘pitched his tent’, as in the construction of the tabernacle for God’s glory in the wilderness).  This, in other words, is where Israel’s history and with it world history reached their moment of destiny” (p. 77).  John’s Jesus “is a combination of the living Word of the Old Testament, the Shekinah of Jewish hope (God’s tabernacling presence in the Temple), and ‘wisdom’, which in some key Jewish writings was the personal self-expression of the creator God, coming to dwell with humans and particularly with Israel (see Wisdom 7; Sirach 24)” (p. 103).  

Climaxing his story with Jesus on the Cross, John portrays Him as “enthroned,” truly the King of Kings.   It was a new kind of Kingdom, one of Love and Truth rather than Power, as he explained to Pilate, and the Roman Procurator acted more presciently than he imagined when he had a sign (a typical public notice called a titulus) in Hebrew, Greek, and Latin—“JESUS OF NAZARETH, THE KING OF THE JEWS”—affixed to the Cross.  “The cross in John, which we already know to be the fullest unveiling of God’s, and Jesus’, love (13:1), is also the moment when God takes his power and reigns over Caesar” (p. 146).  Cross and Kingdom, like hand and glove, go together.  “Jesus, John is saying, is the true king whose kingdom comes in a totally unexpected fashion, folly to the Roman governor and a scandal to the Jewish leaders” (p. 220).  “Part of John’s meaning of the cross, then, is that it is not only what happens, purely pragmatically, when God’s kingdom challenges Caesar’s kingdom.  It is also what has to happen if God’s kingdom, which makes its way (as Jesus insists) by non-violence rather than by violence, is to win the day.  This is the ‘truth’ to which Jesus has come to bear witness, the ‘truth’ for which Pilate’s world-view has no possible space (18:38)” (p. 230).  

Following the Crucifixion, of course, we read of the Resurrection and Ascension, fully affirming Jesus’ mission.  “It is the resurrection that declares that the cross was a victory, not a defeat.  It therefore announces that God has indeed become king on earth as in heaven” (p. 246).  Then comes Pentecost, when the Spirit fully enters Jesus’ disciples, enabling them to “be for the world what Jesus was for Israel” (p. 119).  And just as Jesus battled satanic powers and tackled worldly tyrants, so too His followers continue that work.  The clash of kingdoms foreseen by Daniel and Isaiah and dramatically evident in the life of Jesus continues today.   As with Pilate, the paramount issue is Truth, to which Jesus and His followers bear witness.  This “truth is what happens when humans use words to reflect God’s wise ordering of the world and so shine light into its dark corners, bringing judgment and mercy where it is badly needed” (p. 145).  

Jewish prophets predicted the Messiah would inaugurate a theocracy—the righteous reign of God, ruling through human beings, stewards of His creation.  “Those who are put right with God through the cross are to be putting-right people for the world” (p. 244).  In the Temple—“the fulcrum of ancient Jewish theocracy”—priests and kings had joined to do God’s work in His world, with priests leading worship and kings establishing justice.  Jesus Himself is the new temple, which, “like the wilderness tabernacle, is a temple on the move, as Jesus’ people go out, in the energy of the spirit, to be the dwelling of God in each place, to anticipate that eventual promise by their common and cross-shaped life and work” (p. 239).  

# # #


245 Refuting Relativism

While the recently installed Pope Francis urges empathy with the poor, he also laments the spiritual poverty of those in bondage to what Benedict XVI called “the tyranny of relativism.”  He certainly follows St. Francis of Assisi, urging us to be peacemakers—“But there is no true peace without truth!  There cannot be true peace if everyone is his own criterion, if everyone can always claim exclusively his own rights, without at the same time caring for the good of others, of everyone, on the basis of the nature that unites every human being on this earth.”  His papal predecessor, Benedict XVI, had warned:  “We are building a dictatorship of relativism that does not recognize anything as definitive and whose ultimate goal consists solely of one’s own ego and desires.”  While acknowledging that fanatics too easily assert their confidence in various “truths,” we should not cease discerning and proclaiming with certainty self-evident and trustworthy insights and convictions.  “That is why,” he said, “we must have the courage to dare to say:  ‘Yes, man must seek the truth; he is capable of truth.’”  

Benedict’s admonitions would not have surprised Allan Bloom, who in 1987 wrote The Closing of the American Mind as “a meditation on the state of our souls, particularly those of the young, and their education” (p. 19).  Youngsters need teachers to serve as midwives—above all helping them deal with “the question, ‘What is man?’ in relation to his highest aspirations as opposed to his low and common needs” (p. 21).   University students, Bloom said, were “pleasant, friendly and, if not great-souled, at least not particularly mean-spirited.  Their primary preoccupation is themselves, understood in the narrowest sense” (p. 83), preoccupied with personal feelings and frustrations.  Not “what is man” but “who am I” is the question!  They illustrate “the truth of Tocqueville’s dictum that ‘in democratic societies, each citizen is habitually busy with the contemplation of a very petty object, which is himself’” (p. 86).  

     This preoccupation with self-discovery and self-esteem, Bloom believed, flowers easily in today’s relativism, a philosophical dogma espoused by virtually everyone coming to or prowling about the university.  Under the flag of “openness” and “tolerance,” no “truths” are acknowledged and everyone freely follows his own feelings.  So even the brightest of our young people know little about history, literature, or theology, for such knowledge resides in books, which remain largely unread, even in the universities.  Minds shaped by films, rock music and television have little depth, and “the failure to read good books both enfeebles the vision and strengthens our most fatal tendency—the belief that the here and now is all there is” (p. 64).  Deepening his analysis in a section titled “Nihilism, American Style,” Bloom traced the philosophical roots of today’s educational malaise preeminently to Nietzsche, Freud, and Heidegger.  An enormous intellectual earthquake has shaken our culture to its foundations.  It is “the most important and most astonishing phenomenon of our time,” the “attempt to get ‘beyond good and evil’” by substituting “value relativism” for Judeo-Christian absolutism (p. 141).  

* * * * * * * * * * * * * * * * * * * * * * * * *

What concerned Bloom and the popes at the turning of the millennium was perceptively examined half-a-century earlier by C.S. Lewis in one of his finest books, The Abolition of Man (New York:  Macmillan, 1947).  First presented during WWII as a series of lectures, the book opens with Lewis’s careful examination of an elementary English textbook he dubbed The Green Book.  While allegedly designed to help students read literature, the text was inadvertently a philosophical justification for relativism, promoting the notion that all values, whether aesthetic or ethical, are subjective and ultimately indefensible.  However, Lewis said:  “Until quite modern times all teachers and even all men believed the universe to be such that certain emotional reactions on our part could be either congruous or incongruous to it—believed, in fact, that objects did not merely receive, but could merit, our approval or disapproval, our reverence or our contempt.  The reason why Coleridge agreed with the tourist who called the cataract sublime and disagreed with the one who called it pretty was of course that he believed inanimate nature to be such that certain responses could be more ‘just’ or ‘ordinate’ or ‘appropriate’ to it than others.  And he believed (correctly) that the tourists thought the same.  The man who called the cataract sublime was not intending simply to describe his own emotions about it:  he was also claiming that the object was one which merited those emotions” (#148 in Kindle).  

Coleridge and others who believed in objective Truth (and truths) generally appealed to what Chinese thinkers referred to as “the Tao.  It is the reality beyond all predicates, the abyss that was before the Creator Himself.  It is Nature, it is the Way, the Road.  It is the Way in which the universe goes on, the Way in which things everlastingly emerge, stilly and tranquilly, into space and time.  It is also the Way which every man should tread in imitation of that cosmic and supercosmic progression, conforming all activities to that great exemplar” (#107).  We instantly recognize—through theoretical reason—certain laws of thought (e.g. the law of non-contradiction) or geometry (e.g. a line is the shortest distance between two points); we also know—through practical reason—certain permanent moral maxims (e.g. murder is wrong).  Any effort to reduce universal values to personal feelings inevitably founders in nihilistic confusion.  

In truth:  “All the practical principles behind the Innovator’s case for posterity, or society, or the species, are there from time immemorial in the Tao.  But they are nowhere else.  Unless you accept these without question as being to the world of action what axioms are to the world of theory, you can have no practical principles whatever” (#358).  “The human mind has no more power of inventing a new value than of imagining a new primary colour, or, indeed, of creating a new sun and a new sky for it to move in” (#398).  By disregarding the Tao, advocates of any new morality sink into a “void” without a  pattern to follow, a nihilistic abyss promoting “the abolition of Man” (#556).  

* * * * * * * * * * * * * * * * * * * * * * * * 

In The Book of Absolutes:  A Critique of Relativism and a Defense of Universals (Montreal:  McGill-Queen’s University Press, c. 2008), William D. Gairdner updates and amplifies an ancient and perennial proposition.  A distinguished Canadian Olympic athlete with degrees from Stanford University, Gairdner has effectively influenced the resurgence of conservatism in his native land.  Though he acknowledges the present power and pervasiveness of relativism, he finds it “a confused and false conception of reality that produces a great deal of unnecessary anxiety and uncertainty, both for individuals and for society as a whole” (#71).  To rectify this problem he wrote “a book to restore human confidence by presenting the truth about the permanent things of this world and of human existence” (#74).  

The current notion—that truth varies from person to person (or group to group), that all perspectives must be tolerated, that moral judgments must be suspended—has an ancient history which Gairdner explores, noting that earlier generations generally judged it “preposterous.  The ancient Greeks actually coined the word idiotes (one we now apply to crazy people) to describe anyone who insisted on seeing the world in a purely personal and private way” (#88).  There were, of course, Greek Sophists such as Protagoras who declared:  “As each thing appears to me, so it is for me, and as it appears to you, so it is for you.”  But their views withered under the relentless refutations of Socrates, Plato, and Aristotle—all defending objective truth and perennial principles—followed by Medieval philosophers such as Thomas Aquinas and Enlightenment scientists such as Isaac Newton.  

Dissenting from the traditional, absolutist position were thinkers such as Thomas Hobbes, who rejected any rooting of moral principles in a higher law, declaring in The Leviathan that we label “good” whatever pleases us.  Indeed, the words good and evil “are ever used with relation to the person that useth them:  there being nothing simply and absolutely so.”  A century later Hobbes’ subjectivism would be enshrined by a thinker markedly different from him in many respects, Immanuel Kant, “the most coolly influential modern philosopher to have pushed us toward all sorts of relativist conclusions” (p. 14).  Building on Kant’s position, Friedrich Nietzsche formulated the relativist slogan:  “there are no facts, only interpretations.”  American pragmatists and cultural anthropologists, European existentialists and deconstructionists took up the catchphrase, and today we live in a postmodern culture deeply shaped by epistemological skepticism and moral relativism.  

Sadly, this intellectual shift was bolstered by a profound popular misunderstanding and misrepresentation of one of the monumental scientific discoveries of all time, Einstein’s “Special Theory of Relativity.”  As historian Paul Johnson explains, “‘the belief began to circulate, for the first time at a popular level, that there were no longer any absolutes:   of time and space, of good and evil, of knowledge, and above all of value.  Mistakenly but perhaps inevitably, relativity became confused with relativism.’  And, he adds, ‘no one was more distressed than Einstein by this public misapprehension’” (p. 17).  Nevertheless, it served as a scalpel that helped “‘cut society adrift from its traditional moorings in the faith and morals of Judeo-Christian culture’” (p. 18).   

After explaining various forms of relativism—noting that its moral and cultural forms are most prevalent and pernicious—Gairdner registers some objections to it.  It is, importantly, “self-refuting,” basing its entire case upon the absolute assertion that there are no absolutes.  Thus it cannot withstand Aristotle’s powerful argument, set forth in his Metaphysics, showing how it violates the law of non-contradiction.  That various persons or cultures claim different “truths” hardly affects the fact that a great many beliefs are manifestly wrong, whereas others (e.g. the earth is spherical) are demonstrably right.  The fact that some groups of people (“sick societies”) have condoned human sacrifice or infanticide hardly justifies these practices.  Admittedly, some truths—both scientific (a heliocentric solar system) and moral (slavery is wrong)—become clear only after considerable time or laborious investigation, but that only strengthens their certainty.  Thus:  “Neither believing nor doing makes a thing right or wrong” (p. 39).  

Challenging relativism, Gairdner invokes scholars such as Donald E. Brown, a professor of anthropology, whose 1991 publication, Human Universals, effectively refutes many sophomoric relativistic mantras by listing more than 300 human universals.  For example:  “All humans use language as their principal medium of communication, and all human languages have the same underlying architecture, built of the same basic units and arranged according to implicit rules of grammar and other common features.  All people classify themselves in terms of status, class, roles, and kinship, and all practice division of labour by age and gender.  . . .  All have poetry, figurative speech, symbolic speech, ceremonial speech, metaphor, and the like.  All use logic, reckon time, distinguish past, present, and future, think in causal terms, recognize the concept and possibility of cheating and lying, and strive to protect themselves from the same” (p. 64).  Everywhere and at all times we humans have acknowledged such universal realities.  

There are indubitable, demonstrable constants (laws) throughout the natural world—notably the law of gravity and Einstein’s famous equation, E=mc².  Material things, moving through space, continually change; but laws remain the same.  As David Berlinski puts it, “‘the laws of nature by which nature is explained are not themselves a part of nature.  No physical theory predicts their existence nor explains their power.  They exist beyond space and time, they gain purchase by an act of the imagination and not observation, they are the tantalizing traces in matter of an intelligence that has so far hidden itself in symbols’” (p. 76).  Still more, says Berlinski:  “‘We are acquainted with gravity through its effects; we understand gravity by means of its mathematical form.  Beyond this, we understand nothing’” (p. 78).  

Analogously, as human beings we are, by nature, “hardwired” with important “universals.”  The “blank-slate” notion, promulgated by empiricists such as John Locke and B.F. Skinner, cannot withstand the accumulating genetic and cognitive evidence concerning our species.  Beyond using the same nucleic acids and DNA basic to all living organisms, we do a great number of remarkable things:  negotiating contracts; acting altruistically and even sacrificially; establishing elaborate kinship ties; walking erectly; engaging in year-round sexual encounters; recognizing ineradicable male-female differences; manifesting an inexplicably miraculous intelligence, reason, and free will.  Gairdner patiently compiles and evaluates the massive evidence available to show that a multitude of “universals” do in fact define us as human beings.  “Contrary to the claims of relativists everywhere that nothing of an essential or innate human nature exists, we find that there is indeed a basic universal, biologically rooted human nature, in which we all share.  This is so from DNA to the smiling instinct.  It is a human nature that runs broad and deep, and nothing about it is socially constructed or invented” (p. 162).  This is markedly evident in the “number of universals of human language” (p. 217) discovered by scholarly linguists such as Noam Chomsky, who insists “‘there is only one human language,’ which he and his followers later labeled ‘UG,’ or ‘Universal Grammar’” (p. 229).  As an English professor, Gairdner devotes many pages, in several chapters, to an explanation and analysis of recent developments in the study of language, truly one of the defining human characteristics.  Importantly, it can “be seen as a mirror of the internal workings of the mind rather than as a mirror of the external workings of culture or society” (p. 291).  

Embedded within this human nature we find a natural law prescribing moral norms.  “The traditional natural law is therefore based on four assumptions:  ‘1.  There are universal and eternally valid criteria and principles on the basis of which ordinary human law can be justified (or criticized).  2.  These principles are grounded both in nature (all beings share certain qualities and circumstances) and in human nature.  3.   Human beings can discover these criteria and principles by the use of right reason.  4.  Human law is morally binding only if it satisfies these criteria and principles’” (p. 164).  It is a hallmark of the philosophia perennis articulated by classical (Plato; Aristotle; Cicero) and Christian (Aquinas; Leibniz; C.S. Lewis) thinkers and noted for its common sense notions regarding God, man, and virtuous behavior.  

Espoused by some of the great architects of Western Civilization—including Aristotle and Cicero,  Aquinas and Calvin, Sir William Blackstone and the American Founders, the Nuremberg judges and Martin Luther King—the natural law tradition has provided the foundation for “the rule of law” so basic to all our rights and liberties.   As defined by Cicero:  “‘true law is right reason in agreement with nature, universal, consistent, everlasting, whose nature is to advocate duty by prescription and to deter wrongdoing by prohibition.’  He further stated that ‘we do not need to look outside ourselves for an expounder or interpreter of this law.’  God, he said, is the author and promulgator and enforcer of this law, and whoever tries to escape it ‘is trying to escape himself and his nature as a human being’” (p. 183).  

So, Gairdner explains:  “The precepts of natural law for rational human creatures are, then, rational directives of logic and morality aimed at the common good for humanity and at avoidance of everything destructive of the good.  This means that human rational fulfillment may be found in such things as preserving the existence of ourselves and others by begetting and protecting children, by avoiding dangers to life, by defending ourselves and our loved ones, by hewing to family and friends, and of course, by hewing to reason itself.  We know many such standards in religion as commandments.  In daily life we know them as natural commands and prohibitions:  love others, do unto them as you would have them do unto you, be fair, do not steal, do not lie, uphold justice, respect property, and so on” (p. 189).  Such precepts, as Aquinas insisted, are intuitively known, per se nota; they are as self-evident as the geometric axioms of Euclid or the North Star’s fixed location in the night sky.  Thus murder and lying and theft and rape are rightly recognized as intrinsically evil.  Honoring one’s parents, respecting the dead, valuing knowledge, and acting courageously are rightly deemed good.  

During the past century, as relativism has flourished, the natural law tradition was widely attacked and abandoned.  Strangely enough, most relativists grant the existence of “an extremely mysterious law of gravity that controls all matter but that is not itself a part of matter, but they will not consent to other types of laws that govern or guide human behaviour, such as a natural moral law” (p. 210).    Apparently, Gairdner says, one of the reasons “for the decline of natural law during the rise of the modern state is that just about every law school, every granting institution, every legal journal, and every public court and tribunal is largely funded by the state.  And no modern state wants to be told by ordinary citizens that any of its laws are not morally binding.  That is why Lord Acton referred to natural law as ‘a revolution in permanence.’  He meant a revolution by those who cherish a traditional society and a morality rooted in human nature against all those who attempt to uproot, reorder, and deny or replace these realities” (p. 182).  

The modern repudiation of absolutes followed developments in 19th and 20th century German philosophy, evident in Hegel, Nietzsche and Heidegger, reaching their apex in Nazi Germany.  Classical and Christian advocates of transcendent metaphysical principles, such as Plato and Aquinas, were discarded by a corps of “existentialists” determined to move “beyond good and evil” and devise a purely secular, humanistic ethos.  French intellectuals, following the lead of Jacques Derrida and Michel Foucault, imported Nietzsche and Heidegger, setting forth the currents of “deconstruction” and “postmodernism” so triumphant in contemporary universities and media centers.  “It was all an echo of Nietzsche’s ultra-relativist claim (later elaborated by Heidegger) that ‘there are no facts, only interpretations’” (p. 252).  

Ironically, Derrida himself, toward the end of his life, announced an important “Turn” in his thought.  He acknowledged “the logical as well as spiritual need for a foundation of some kind.”  And out it came, as quite a shock to his adamantly relativist followers:  “‘I believe in justice.  Justice itself, if there is any, is not deconstructible’” (p. 266).  Derrida simply illustrates the perennial power of the natural law—there are some things we just can’t not know!  Derrida’s “Turn” underscores what Gairdner endeavors to do in this book:  “to expose the intellectual weakness of the relativism that pervades modern—especially postmodern—thought and also to offer a basic overview of the absolutes, constants, and universals that constitute the substance of the many fields explored here.  We have seen them at work in culture through human universals, in physics via the constants of nature, in moral and legal thought via the natural law, and in the human body via our hardwired biology.  And not least, of course, in view of its close approximation to human thought itself, we have looked at the constants and universals of human language” (p. 308).  He persuasively demonstrates that “we do not live in a foundationless or relativistic world in which reality and meaning, or what is true and false, are simply made up as we go along and according to personal perceptions.  On the contrary, we live in a world in which every serious field of human thought and activity is permeated by fundamentals of one kind or another, by absolutes, constants, and universals, as the case may be, of nature and of human nature” (p. 308).  

# # # 

244 Fewer . . . and Fewer of Us

  Among the handful of must-read 20th century dystopian novels—Aldous Huxley’s Brave New World; George Orwell’s 1984; C.S. Lewis’s That Hideous Strength—is P.D. James’s The Children of Men (New York:  Penguin Books, c. 1992), which prods both our imagination and reason by envisioning the potential consequences of demographic trends.  James, a distinguished novelist known mainly for her riveting (and philosophically nuanced) mystery stories, portrays the world in 2021, twenty-six years after the last baby was born, dooming the race to extinction.  She challenged, in a powerful artistic way, one of the prevailing orthodoxies of our day—the threat of overpopulation.  The Children of Men boldly countered the message of Paul Ehrlich’s 1968 best-selling Population Bomb (one of the most egregiously misguided books published during that pivotal decade), which fueled the mounting fears of ecological catastrophe then gripping the environmental community.  Because earth’s resources are finite, Ehrlich predicted:  “In the 1970s the world will undergo famines—hundreds of millions of people are going to starve to death.”  He was duly lauded by the academic community (after all he was a certified member of the elite, a professor at Stanford University with an enviable reputation as an entomologist) and courted by the complacent media (Johnny Carson gushing over him for his prescience).  

One of the few journalists willing to risk censure by differing with Ehrlich was Ben J. Wattenberg, who warned of actual population implosion rather than explosion.  Two decades later he wrote The Birth Dearth, examining the “Total Fertility Rate” which was falling around the globe.  He vainly hoped to stimulate a national conversation on the subject, but few (alas) recognized the reality of birth dearth.  Returning to his concern in Fewer:  How the New Demography of Depopulation Will Shape Our Future (Chicago:  Ivan R. Dee, c. 2004), he argued that “never have birth and fertility rates fallen so far, so fast, so low, for so long, in so many places, so surprisingly” (p. 5).  “For at least 650 years,” he says, “the total number of people on earth has headed in only one direction:  up.  But soon—probably within a few decades—global population will level off and then likely fall for a protracted period of time” (p. 5).  

European, Russian and Japanese populations are virtually in free fall, with a Total Fertility Rate (TFR) significantly less than requisite replacement levels (2.1 per woman).  Europe’s population will likely shrink from 728 million in 2000 to 632 million in 2050.  To replace lost babies, Europe would need to take in nearly two million (rather than the current 376,000) immigrants each year.  Russia has a TFR of 1.14 and will lose 30 percent of its population by mid-century.  Not only are folks having fewer children—they want fewer!  And lest we think this is true only of prosperous, highly industrialized nations, it also applies to Less Developed Countries, where a dramatic reduction in population growth has occurred within the past few decades.  China, for example, had a TFR of 6.06 forty years ago; after instituting a “one child only” policy, by the beginning of the millennium the TFR fell to 1.8!  Similarly, South Korea’s 2005 rate fell to 1.17.  Brazil and Mexico reveal the same depopulating trajectory.  In fact, few nations are repopulating themselves.  America, however, has proved somewhat exceptional, sustaining a replacement level fertility rate—in part through welcoming immigrants who frequently have large families.  

To explain this phenomenon, Wattenberg points first to increased urbanization, where children are something of a liability rather than an asset.  Secondly, as women pursue higher education and careers—and as couples earn more money—they have proportionally fewer children.  “More education, more work, lower fertility” (p. 96).  Thirdly, abortion disposes of 45 million developing children every year.  Fourthly, divorce lowers fertility as single women welcome fewer children than their married counterparts.  Fifthly, contraception (exploding since the ‘60s) empowers couples to enjoy sexual pleasure without undesired consequences.  And sixthly, since couples marry later in life they inevitably have fewer offspring.  The ominous consequences of this depopulation cannot be ignored, because the welfare states established in virtually all modern countries simply cannot support growing numbers of elderly retirees funded by dwindling numbers of younger workers.  Successful businesses thrive by employing creative young workers and selling goods to expanding numbers of consumers—essential factors inevitably absent as populations decline.  Nations—and civilizations—will lose power and influence as their numbers decline.  Less free, less enlightened dictatorial successors may very well replace them.  The world, quite simply, will be a radically different place within a century.  

* * * * * * * * * * * * * * * * * * * * * * 

In What to Expect When No One’s Expecting:  America’s Coming Demographic Disaster (New York:  Encounter Books, c. 2013) Jonathan V. Last details the latest data regarding population prospects.  The book’s title reveals its thesis:  no one’s expecting these days—and paradoxically, as P.J. O’Rourke quipped, “the only thing worse than having children is not having them.”  Failing to heed the Bible’s first injunction—“be fruitful, and multiply, and replenish the earth”—modern man faces an uncertain prospect bereft of children, coddling pets as their “fuzzy, low-maintenance replacements” (p. 3).  The earth, it seems, will grow emptier.  Indeed, “only 3 percent of the world’s population lives in a country whose fertility rate is not declining” (p. 92).  We are moving from the “First Demographic Transition,” wherein children took center-stage and politicians built careers on looking out for them, to the “Second Demographic Transition,” wherein individual adults shun both families and children to pursue their own careers and pleasures.  “Middle-class Americans don’t have very many babies these days.  In case you’re wondering, the American fertility rate currently sits at 1.93,” significantly below the requisite replacement level (p. 4).  At the moment, the deficit is rectified by Hispanic women, who average 2.35 babies, but they too are rapidly choosing to have fewer and fewer.  For example, the once-plenteous supply of Puerto Rican immigrants to New York has collapsed as the island’s birthrate shrank in 50 years from 4.97 to 1.64.  “Some day,” Last says, “all of Latin America will probably have a fertility rate like Puerto Rico’s.  And that day is coming sooner than you think” (p. 113).  Labor shortages in Latin countries will eliminate the need to emigrate and the U.S. population picture will quickly resemble Europe’s.  

Glancing abroad, by 2050 Greece may lose 20 percent of its people; Latvia has, since “1989 lost 13 percent” and “Germany is shedding 100,000 people a year” (p. 25).  Spain registers barely one baby per woman.  Japan’s population is shrinking and abandoned rural villages bear witness to the trend.  It’s the same in Russia:  “In 1995, Russia had 149.6 million people.  Today, Russia is home to 138 million.  By 2050, its population will be nearly a third smaller than it is today” (p. 25).  Consequently, Vladimir Putin has zealously promoted a variety of failing schemes designed to encourage women to have more children.  But they choose not to!  Other things seem more important.  “Divorce has skyrocketed—Russia has the world’s highest divorce rate.  Abortion is rampant, with 13 abortions performed for every 10 live births.  Consider that for a moment:  Russians are so despondent about the future that they have 30 percent more abortions than births.  This might be the most grisly statistic the world has ever seen.  It suggests a society that no longer has a will to live” (p. 137).  

Portents of things to come stand revealed in Hoyerswerda, a German city near the Polish border.  In 1980 the town had 75,000 residents and “the highest birth rate in East Germany” (p. 98).  With the collapse of the Soviet Union, the folks there (and, more broadly, throughout the former East Germany) simply stopped procreating.  The fertility rate abruptly plunged to 0.8 and within three decades the town lost half of its residents.  Hoyerswerda “began to close up shop” (p. 98).  Buildings, businesses, and homes stood vacant.  Similar developments across the country dictated a significant shift from “urban planning” to expand and develop infrastructures and suburbs to devising ways “to shrink cities” (p. 98).  Parks now proliferate, replacing factories and schools.  The wolf population is actually resurgent, with wolf-packs prowling around dwindling settlements.  

Mirroring these European trends is Old Town Alexandria (the D.C. suburb where Last and his wife lived for a while)—a “glorious preserve of eco-conscious yoga and free range coffee.  My neighbors had wonderfully comfortable lives in part because they didn’t take on the expense of raising children” (p. 25).  As a portent of things to come, white, college-educated women, shopping in Alexandria’s stylish boutiques and devotedly determined to pursue a career, have a fertility rate of 1.6—barely more than Chinese women after decades of that nation’s recently-suspended “one-child” policy.  In 1970 the average Chinese woman bore six children and the Communist regime envisioned multiple problems with the ticking population bomb.  Energetic policies were implemented until quite recently, when the rulers realized the implications of population implosion.  But a trajectory has been set and within forty years “the age structure in China will be such that there are only two workers to support each retiree” (p. 13).  

Looking to explain this world-wide pattern, the author lists a variety of “factors, operating independently, with both foreseeable and unintended consequences.  From big things (like the decline in church attendance and the increase of women in the workforce) to little things (like a law mandating car seats in Tennessee or the reform of divorce statutes in California), our modern world has evolved in such a way as to subtly discourage childbearing” (p. 16).  Certainly there are good reasons not to procreate.  Heading the list is money.  Rearing a child may very well cost parents a million dollars!  Financially, it’s the worst investment possible!  “It is commonly said that buying a house is the biggest purchase most Americans will ever make.  Well, having a baby is like buying six houses, all at once.  Except you can’t sell your children, they never appreciate in value, and there’s a good chance that, somewhere around age 16, they’ll announce:  ‘I hate you’” (p. 43).  

Complicating this are welfare state structures such as Medicare and Social Security that “actually shift economic incentives away from having children” (p. 46).  Though Texas Governor Rick Perry was ridiculed for suggesting it, Social Security really is a “Ponzi scheme” that will only work “so long as the intake of new participants continues to increase” (p. 107).  In 1950 three million folks were getting Social Security checks; thirty years later there were 35 million retirees expecting monthly payments; by 2005 nearly 50 million were on the rolls, taking $546 billion a year from taxpayers still working.  In its initial (New Deal) phase, Social Security only exacted one percent of a worker’s paycheck; 30 years later (under LBJ’s Great Society) the rate inched up to three percent; by 1971 it jumped to 4.6 percent; and today (shielded from any adjustments by Barack Obama) it amounts to 6.2 percent.  The sky, you might say, is the limit as an endless line of elders look to their shrinking numbers of children to pay the bills.  The same goes for Medicare—except the prognosis is worse by far!  It simply cannot survive in its present form, given the realities of a shrinking population.  Both programs “were conceived in an era of high fertility.  It was only after our fertility rate collapsed that the economics of the programs became dysfunctional” (p. 109).  

Yet looming above all else is “the exodus of religion from the public square” (p. 84).  Devout Catholics and Protestants have more kids.  They shun the behaviors facilitating population decline—contraception, abortion, cohabitation, delayed marriage, divorce—and church-going couples fully enjoy marriage in ways unavailable to their secular counterparts.  Practicing Protestants increasingly resemble practicing Catholics, procreating more than enough youngsters to support population growth.  But non-religious women, according to a 2002 survey, had a fertility rate of only 1.8, whereas women who rated religion as “not very important” clocked in at 1.3.  Ultimately “there’s only one good reason to go through the trouble” of rearing children:  “Because you believe, in some sense, that God wants you to” (p. 170).  For this reason our government especially should craft family-friendly, child-friendly policies—repudiating the feminist and homosexual ideologies shaping the laws and judicial decrees of the past half-century.   

* * * * * * * * * * * * * * * * * * * * * * 

Columnist Mark Steyn, whether writing or speaking, is justly renowned for his infectious humor and verbal dexterity, bringing to his discussions of serious subjects a sustained note of good cheer.  There is little to cheer about, however, in Steyn’s America Alone:  The End of the World as We Know It (Washington:  Regnery Publishing, Inc., c. 2006), wherein he casts a gloomy look at demographic realities and predicts “the Western world will not survive the twenty-first century, and much of it will effectively disappear within our lifetimes, including many if not most European countries” (p. xiii).  Within 40 years “60 percent of Italians [once lionized for their large and boisterous families] will have no brothers, no sisters, no cousins, no aunts, no uncles” (p. xvii).  Declining populations will leave welfare states unsustainable and civilization unfeasible.  “Civilizations,” said Arnold J. Toynbee, in A Study of History, “die from suicide, not murder,” and Western Civilization is in the midst of self-inflicted mortal wounds.  “We are,” Steyn says, “living through a rare moment:  the self-extinction of the civilization which, for good or ill, shaped the age we live in” (p. 3).  

Though we’re tempted to think such things have never happened before, Steyn jolts us with a quotation from Polybius (c. 150 B.C.), one of the greatest ancient historians, who said:  “In our own time the whole of Greece has been subject to a low birth rate and a general decrease of the population, owing to which cities have become deserted and the land has ceased to yield fruit, although there have neither been continuous wars nor epidemics. . . .  For as men had fallen into such a state of pretentiousness, avarice, and indolence that they did not wish to marry, or if they married to rear the children born to them, or at the most as a rule but one or two of them, so as to leave these in affluence and bring them up to waste their substance, the evils rapidly and insensibly grew” (The Histories, XXXVI).  

Basic to demographic decay, as both Polybius and Steyn argue, is an apparently irreversible moral and spiritual decay leaving increasing numbers of people listless.  Irreligious folks inevitably lose faith not only in an invisible God but in equally invisible ethical principles and reasons to live hopefully for the future.  Thus Europe’s population has plunged like a raft going over a waterfall in the wake of the de-Christianization of the continent.  Childless places like Japan and Singapore and Albania have little religious vitality.  Standing alone in the midst of all this is the United States, which still enjoys a modest population growth.  True to form, the U.S. is the one extraordinary Western nation still featuring robust religious activity.  Unfortunately this may not long persist, since “most mainline Protestant churches are as wedded to the platitudes du jour as the laziest politician” (p. 98).  They “are to one degree or another, post-Christian.  If they no longer seem disposed to converting the unbelieving to Christ, they can at least convert them to the boggiest of soft-left political clichés, on the grounds that if Jesus were alive today he’d most likely be a gay Anglican bishop in a committed relationship driving around in an environmentally friendly car with an ‘Arms Are for Hugging’ sticker on the way to an interfaith dialogue with a Wiccan and a couple of Wahhabi imams” (p. 100).  Without a resurgence of orthodox, muscular Christianity, Steyn thinks, America too will soon choose the childless path to historical oblivion.  

In addition to population implosion, Steyn devotes much attention in America Alone to the threat of Islamic imperialism, facilitated by the growing passivity—the unwillingness to resist terrorism—throughout much of what was once labeled “Western Civilization.”  Indicative of the trend was “the decision of the Catholic high school in San Juan Capistrano to change the name of its football team from the Crusaders to the less culturally insensitive Lions” (p. 158).  (Simultaneously, 75 miles to the south, lock-stepping with the culture, Point Loma Nazarene University—while I was on the faculty—similarly changed its mascot from Crusaders to Sea Lions.)  This loss of a masculine will-to-fight, as well as the will-to-procreate, signifies a collapsing culture.  Indeed, the chief characteristic of our age is “‘deferred adulthood’” (p. 191).  And it takes strong adults to create and sustain a culture.  

* * * * * * * * * * * * * * * * * * * 

However realistically we appraise the threat of Islamic Jihadism, demographic realities foretell coming calamities in Muslim lands during the next half-century.  This prospect becomes clear in David P. Goldman’s How Civilizations Die (And Why Islam Is Dying Too) (Washington:  Regnery Publishing, Inc., c. 2011).  Growing numbers of us are aware of the “birth dearth” haunting much of the world, but because it’s underreported, few know that within four decades “the belt of Muslim countries from Morocco to Iran will become as gray as depopulating Europe” (p. x).  For example, females in Iran, though now surrounded by half-a-dozen of their siblings, will themselves “bear only one or two children during their lifetimes” (p. x).  “The fastest demographic decline ever registered in recorded history is taking place today in Muslim countries” (p. xv).  

Along with his description of demographic patterns in Muslim nations, Goldman’s discussion of “four great extinctions” makes his book worth pondering.  The first extinction took place a millennium before Christ, with the passing of the Bronze Age and the disappearance of cities such as Troy, Mycenae, and Jericho.  The second extinction, two hundred years before Christ, enervated the Hellenic civilization once centered in cities such as Athens and Sparta.  Aristotle says Sparta shrank within a century from 10,000 to 1,000 citizens.  Increasingly large landed estates, run for the benefit of an ever-diminishing aristocracy less and less concerned with rearing children and increasingly indulgent of sexual perversions such as pederasty, left Sparta bereft of people and militarily listless.  The city was, he said, “ruined for want of men” (Politics, II, ix).  

The third extinction marked the end of Rome’s power and grandeur in the fourth and fifth centuries A.D.  Even in the glory days of the Empire, when Augustus Caesar reigned, “there was probably a decline” in the empire’s population due to “the deliberate control of family numbers through contraception, infanticide and child exposure” (p. 131).  Augustus himself decreed punishments for “childlessness, divorce, and adultery among the Roman nobility” (p. 131), but nothing worked, and the empire’s needed laborers and soldiers were necessarily drawn from defeated (or volunteering) barbarians from the North.  We are now in the midst of the fourth extinction, when civilizations (East and West) are beginning to show symptoms of rigor mortis.  This extinction began in many ways with the French Revolution in 1789, the “world’s first attempt to found a society upon reason rather than religion” (p. 134), followed by subsequent waves of revolutionary activity that transformed Europe into a bastion of atheistic and anti-natal secularism.  

Though Islam seems to be a vibrant religion, currently regaining its virility through movements such as the Muslim Brotherhood, Goldman thinks it is in fact violently (and vainly) reacting to the global secularism fully evident in the dramatic decline of population throughout the Islamic world.  Joining “Western Civilization,” Islam is another dying culture!  So fewer and fewer of us will inherit the earth.  

243 Scared to Death

 Though I routinely recommend various books hoping they will be widely read, I occasionally finish one wishing everyone fully knew its contents, for, as the prophet Hosea said, “My people are destroyed for lack of knowledge” (4:6).  Scared to Death:  From BSE to Global Warming:  Why Scares Are Costing Us the Earth (New York:  Continuum, c. 2007; 2009 reprint), by two British journalists, Christopher Booker and Richard North, is one of those books.  In brief, they document Shakespeare’s insight in A Midsummer Night’s Dream (“In the night, imagining some fear, how easy is a bush supposed a bear”), showing how a succession of unfounded fears has panicked and harmed millions of people.  Each panic followed a “common pattern,” beginning with alleged scientific data portending a catastrophe in the making.  “Each has inspired obsessive coverage in the media.  Each has then provoked a massive response from politicians and officials, imposing new laws that inflicted enormous economic and social damage.  But eventually the scientific reasoning on which the panic was based has been found to be fundamentally flawed” (p. ix).  Though differing in details, they all resemble the “millennium bug” that so exercised millions of folks as January 2000 approached.  Eminent authorities warned of “potentially disastrous global consequences to both business and government” as computers were predicted to malfunction.  Scores of institutions invested millions of dollars preparing for the crisis.  But absolutely nothing happened!  

The first part of the book delves into a litany of “food scares” that profoundly affected Great Britain.  Beginning in 1985, a few cattle died as a result of a brain infection—known as “cattle scrapie” and ultimately dubbed “Mad Cow Disease.”  At the same time scattered salmonella and listeria outbreaks led anxious experts to blame eggs and cheese as the culprits.  Government scientists and bureaucrats leapt into action, persuaded they needed to protect the public, decreeing the slaughter of herds and flocks.  Flooded with sensational statements in the newspapers and on TV, people around the world suddenly shunned British beef and eggs, bankrupting scores of small farmers.  Hygiene became a pressing and paramount issue, though food poisoning incidents “remained curiously stable” (p. 76).  Absolutely no evidence existed linking brain encephalopathies in livestock to human beings, yet nothing deterred government spokesmen and journalists from hyping the threat.  When the “Mad Cow disease” scare was finally laid to rest, more than 8,000,000 cattle and sheep had been destroyed at a total cost of more than three billion pounds.  Comprehensively calculated, the panic cost twice that.  “Without question it was the most expensive food scare the world has ever seen” (p. 126).

Having examined, in detail, health-related scares in Britain, Booker and North devote the second part of Scared to Death to “general scares” that duplicate the same pattern.  “In many ways the first truly modern ‘scare’ was one that began in America” following the publication of Rachel Carson’s Silent Spring in 1962 (p. 167).  She blamed DDT, a powerful insecticide widely used following WWII, for poisoning the environment and causing cancer.  Though it had been remarkably successful—reducing malaria mortality rates by 95 per cent—fervent environmentalists quickly crusaded to ban DDT.  Greenpeace and the World Wildlife Fund effectively pushed for a world-wide ban on the substance, despite the fact that, as Michael Crichton said:  “‘Since the ban two million people a year have died unnecessarily from malaria, mostly children.  All together, the ban has caused more than fifty million needless deaths.  Banning DDT killed more people than Hitler’” (p. 170).  No solid studies have found DDT remotely responsible for cancer in human beings.  Indeed its worst consequence seems to be the thinning of eggshells for birds of prey.  

An examination of “The Modern Witch Craze” documents the incredible claims of Satanic ritualistic abuse of children enkindled in the 1980s.  Beginning with allegations brought by a California mother who believed her son had been abused in the McMartin Infant School, and sustained by a corps of social workers and counselors who insisted children’s stories could not be doubted, dozens of innocent people were brought to trial (in Britain as well as America) and imprisoned before mounting evidence demonstrated the folly of it all.  Some of the accused committed suicide.  We now know that social workers (armed with state authority) separated the children from their parents for weeks or even months at a time to interview them.  The children “were repeatedly plied with leading questions of a type which would never have been allowed in a courtroom” (p. 191).  Their outlandish stories, venturing into the fantastical, were taken literally by the psychological “experts” (often claiming to help children recover repressed memories) and all too frequently trusted by prosecutors.  In time most of the adult “culprits” were vindicated, and we now know how untrustworthy both children’s stories and social workers’ constructions can actually be.  But the actual pain and suffering resulting from the witch craze can hardly be calculated.  

Few of us filling our gas tanks with unleaded fuel realize the high price we pay results, in part, from the billions of dollars wasted through the “lead scare” that mandated it.  Concentrated doses of lead (e.g. in ancient Roman water pipes) can certainly be toxic, and its presence in gasoline helped pollute the air.  But lead is a “miraculous” additive to gasoline, significantly improving engine efficiency, and there was absolutely no evidence that leaded gas residue was any threat to public health.  However, a single, scientifically dubious study (by Herbert Needleman, a child psychologist from the University of Pittsburgh) alleging harmful effects on children’s intellectual development was manipulated by politicians and the Environmental Protection Agency to mandate unleaded gasoline and justify a massive social change.  Yet Needleman was acclaimed for his work, even receiving the Rachel Carson Award for Integrity in Science in 2004.  According to EPA administrator Carol Browner:  “‘The elimination of lead from gas is one of the great environmental achievements of all time’” (p. 234).  If so, one must wonder precisely what was actually achieved apart from fuzzy feelings about helping the children!  

While no one today doubts the lethal effects of smoking cigarettes, the threat of “passive smoking” can hardly be proven.  Smokers harm themselves but not “innocent” bystanders.  Yet during the past several decades activists have successfully campaigned to require, at considerable cost, a “smoke-free” environment virtually everywhere.  Thus for 20 years it has been illegal to smoke in California “workplaces, bars and restaurants, but also within a yard and a half of any public building and on its famous beaches” (p. 254).  Allegations that thousands of people die each year due to “passive smoking” quite simply lack any statistical or factual basis.  Non-smokers may be offended by the smell of tobacco smoke, but they suffer no real harm.  A massive research project, commissioned by the World Health Organization and conducted by 27 esteemed epidemiologists and cancer specialists, demonstrated this.  “Across the board and in all seven countries, their conclusions were consistent.  They found no evidence that there was any ‘statistically significant’ additional risk from passive exposure to smoke, either at home or in the workplace” (p. 256).  

Another study, “the longest and most comprehensive scientific study ever carried out into the effects of passive smoking anywhere in the world,” commissioned by the American Cancer Society, similarly concluded that “there was no ‘causal relation between environmental tobacco smoke and tobacco-related mortality’” (p. 261).  One would think such evidence would lead to retractions and shifts in public pronouncements and policy.  Wrong!  Anti-tobacco fanatics facilely disregarded the evidence, sought to suppress scholarly papers and silence dissenters, and linked arms with politicians such as New York’s Mayor Bloomberg on their mission to purify the air of all taints of the hated weed!  Booker and North conclude:  “The triumph of the campaign against passive smoking had provided one of the most dramatic examples in history of how science can be bent and distorted for ideological reasons, to come up with findings that the evidence did not support, and which were in many ways the reverse of the truth.  In this respect, it provided one of the most vivid examples in modern times of the psychological power of the scare” (p. 270).  

Add to the fear of tobacco smoke the fear of asbestos, one of the world’s most wonderful fire-resistant minerals, widely used in water pipes, brake linings, and building materials.  It can be woven into cloth-like products or mixed with plaster and cement as a reinforcement stronger than steel.  As with tobacco, however, some forms of asbestos can prove deadly when inhaled and absorbed by the lungs.  This is true of only one kind of it!  The more common “white asbestos” is “by far the most widely used” and “poses no measurable risk to human health” (p. 276).  Fully 90 percent of the asbestos found in America’s buildings was benign.  But the limited numbers of workers dying of cancer due to intensive exposures to the deadly form of asbestos enabled scaremongering lawyers and contractors, buoyed by EPA edicts, to pounce on people’s ill-informed fear of any exposure to any kind of it.  Buildings of all sorts (churches, schools, factories, homes) must be cleansed!  Companies must be punished through lawsuits—and, in time, great corporations such as Johns-Manville were destroyed and even Lloyd’s of London nearly collapsed.  Cunning lawyers extracted billions of dollars from beleaguered asbestos suppliers.  Legislative efforts to curtail the proliferating lawsuits were blocked “by a caucus of Democrat senators [e.g. Joe Biden; Edward Kennedy; John Kerry; Hillary Clinton; John Edwards] who had each received huge sums in campaign funding from law firms” (p. 322).  Ultimately, says Professor Lester Brickman:  “‘Asbestos litigation has come to consist, mainly, of non-sick people . . . claiming compensation for non-existent injuries, often testifying according to prepared scripts with perjurious contents, and often supported by specious medical evidence . . . it is . . . a massively fraudulent enterprise that can rightly take its place among the pantheon of . . . great American swindles’” (p. 273).  

Even more devastating is the irrational fear of global warming, “the new secular religion,” which is now fraudulently branded “climate change” since the evidence for actual warming has faded.  Objective historians have long noted significant climate changes—a “pre-Roman Cold” period (700-400 B.C.), a “Roman Warming” time (200 B.C.-500 A.D.), a cold era during the “Dark Ages” (500-900 A.D.), a “Medieval Warming” time (900-1300 A.D.), a “Little Ice Age” (1300-1800), and the “Modern Warming” era we’re now in.  It has been much warmer, and much colder, in the past two millennia.  But hugely influential and well-funded scientists such as Michael Mann have distorted the record with sensational “evidence,” including his spurious “hockey stick” graph that flattened out both the Medieval Warming and the Little Ice Age.  Alarmists such as Mann seek to demonstrate temperature change with data from a single “1993 study of one group of trees in one untypical corner of the USA” (p. 359) and an “unqualified acceptance of the recent temperature readings given by hundreds of weather stations across the earth’s surface” (p. 359).  Ignored is evidence from weather satellites or the probable contamination of weather stations near urban centers.  

Bolstered by suspicious scientific pronouncements, activists such as Al Gore and Barack Obama have ignited widespread fears and orchestrated policies designed to fundamentally alter human behavior on earth through such things as the 1997 Kyoto Protocol.  Though Gore’s documentary—“An Inconvenient Truth”—won awards in various quarters, it was perceptively labeled, by an Australian paleoclimatologist, as “‘a propaganda crusade’” largely “‘based on junk science.  His arguments are so weak that they are pathetic.  It is incredible that they and his film are commanding public attention’” (p. 378).  Calling for a curtailment on burning fossil fuels while opposing the development of nuclear energy (by far the best solution to the problem of greenhouse gases), Gore and his green corps demand the development of various forms of “green energy.”  Interestingly enough, environmentalist rhetoric has subtly shifted from warnings regarding “global warming” to admonitions for “green energy”!  Yet to this point highly-touted “green alternatives” such as wind turbines make little dent in the production of carbon dioxide—e.g. the 1,200 turbines built in Britain that produce only one-eighth of the electricity supplied by one coal-fired plant in Yorkshire!  

In fact, “climate change” is most likely driven by solar activity and clouds, with only minimal impact attributable to human activities.  “In many respects, however, the alarm over global warming was only the most extreme example of all the scares described in this book.  Yet again it had followed the same familiar pattern:  the conjuring up of some great threat to human welfare, which had then been exaggerated far beyond the scientific evidence; the collaboration of the media in egging on the scare; the tipping point when the politicians marshaled all the machinery of government in support of the scare; and finally the wholly disproportionate regulatory response inflicting immense economic and social damage for a highly questionable benefit” (p. 403).  

* * * * * * * * * * * * * * * * * * * * * * * * * * *

Oklahoma Senator James Inhofe’s The Greatest Hoax:  How the Global Warming Conspiracy Threatens Your Future (Los Angeles:  WND Books, c. 2012) seeks to counter the positions promoted by Al Gore and environmental alarmists.  Throughout the book Senator Inhofe pillories Gore, oft-times portrayed by the media as a “climate prophet” or “Goricle.”  Indeed, “Katie Couric famously said that Gore was a ‘Secular Saint,’ and Oprah Winfrey said that he was the ‘Noah’ of our time” (Kindle #1182).  Obviously Gore and his fellow environmentalists have embraced, and now promote, a religion rather than a scientific position.  Thus dissenters from the environmentalist dogma like Inhofe are treated as heretics akin to “Holocaust deniers”!  To Robert F. Kennedy Jr., those who dare differ with Gore are traitors!  To deal with them, one journalist “called for Nuremberg-style trials for climate skeptics” (#1372)!  Their research must be proscribed, their publications censored!  

Folks such as Couric and Kennedy are, manifestly, full-fledged true believers who revel in hysterical rhetoric.  Folks like Senator Inhofe, in opposition, join a distinguished minority of highly-informed people who question the devotees of “climate change.”  Thus they find credible scientists such as Dr. Claude Allegre, a noted French geophysicist, “a former French Socialist Party leader, a member of both the French and U.S. Academies of Science, and one of the first scientists to sound the global warming alarm—who changed around 2006 from being a believer to a skeptic” (#1903).  Joining Allegre, Dr. David Bellamy, a highly acclaimed figure in the UK, was “also converted into a skeptic after reviewing the science.”  Bellamy said that “global warming is largely a natural phenomenon” and that catastrophic fears were “poppycock”:  “The world is wasting stupendous amounts of money on trying to fix something that can’t be fixed,” and “climate-change people have no proof for their claims.  They have computer models which do not prove anything” (#1919).  

Sitting on the Senate Environment and Public Works Committee, Inhofe has political acumen and access to substantive scientific studies.  Consequently he played a critical role in defeating President Obama’s “cap and trade” proposals.  (He was, importantly, working at the same time to pass the Clear Skies Act, designed to improve air quality, so he can hardly be dismissed as an enemy of environmental health.)  He proudly labels himself “a one man truth squad” on the global warming issue and includes a great deal of personal detail regarding his background and his concerns regarding the state of the American Union.  Consequently:  “This book constitutes the wake-up call for America—the first and only complete history of the Greatest Hoax, who is behind it, the true motives, how we can defeat it—and what will happen if we don’t” (#88).  He knows, for example, according to the testimony of EPA Administrator Lisa Jackson, that even if the U.S. enacted the most stringent policies designed to reduce carbon levels in the atmosphere “it would only reduce global temperatures by 0.06 degrees Celsius by 2050.  Such a small amount is hardly even measurable” (#140).  Still more:  “‘No study to date has positively attributed all or part [of the climate change observed] to anthropogenic causes’” (#706).  

So what’s actually taking place within the global warming scaremongering?  “Looking back, it is clear that the global warming debate was never really about saving the world; it was about controlling the lives of every American.  MIT climate scientist Richard Lindzen summed it up perfectly in March 2007 when he said ‘Controlling carbon is a bureaucrat’s dream.  If you control carbon, you control life’” (#440).  There’s no question that “progressives” from Woodrow Wilson to Barack Obama have striven to take control of our lives, purportedly to maximize pleasure and minimize pain for the public.  More broadly, to Vaclav Klaus, President of the Czech Republic:  “The global warming religion is an aggressive attempt to use the climate for suppressing our freedom and diminishing our prosperity.”  It is a “totally erroneous doctrine which has no solid relation to the science of climatology but is about power in the hands of unelected global warming activists” (#19).  Klaus writes with an understanding of what European leaders such as French President Jacques Chirac envision when they tout the Kyoto treaty as “‘the first component of an authentic global governance’” (#553).  Equally perceptive, Canada’s Prime Minister Stephen Harper “called Kyoto a ‘socialist scheme’” (#561).  Consequently, Inhofe concludes:  “it is crystal clear that this debate was never about saving the world from man-made global warming; it was always about how we live our lives.  It was about whether we wanted the United Nations to ‘level the playing field worldwide’ and ‘redistribute the wealth.’  It was about government deciding what forms of energy we could use” (#3280).  

Senator Inhofe’s book takes its title from his “Greatest Hoax” Senate speech, and he is deeply convinced that “global warming” or “climate change” is indeed a bogus scenario manufactured by liberal elites who “truly believe that they know how to run things better than any individual country ever could.  In this way they are like ‘super-liberals’ on an international scale.  On one of its websites, the UN even claims that its ‘moral authority’ is one of its ‘best properties’” (#653).  This moral authority apparently resides in the UN’s self-righteous commitment “to the utopian ideals of global socialism” (#653), frequently promoted as necessary for “sustainable development.”  The spurious nature of this Hoax became clear when “Climategate, the greatest scientific scandal of our time,” broke (#2319).  A careful reading of the emails between scientists in the UK and US (reprinted in considerable detail as an appendix to this treatise) reveals, in the words of Clive Crook:  “‘The closed-mindedness of these supposed men of science, their willingness to go to any lengths to defend a preconceived message, is surprising even to me.  The stink of intellectual corruption is overpowering’” (#2359).  

The Greatest Hoax is important primarily because of its author’s position in government.  Inhofe has, to the degree possible for a busy politician, studied the evidence, assembled the data, and come to a reasoned conclusion regarding one of the most momentous issues of our day.  If the global warming alarmists are wrong, following their admonitions could irreparably harm not only this nation but the world, plunging us into a cataclysmic economic and social black hole.

# # # 

242 Nancy Pearcey’s Apologetics

 As a restless, questing college student immersed in the relativism and subjectivism of her milieu, Nancy Pearcey found her way to Francis Schaeffer’s L’Abri Fellowship in Switzerland in 1971.  Here, for the first time, she found Christians (many of them long-haired hippies) seriously discussing and providing answers to the “Big Questions” she was asking.  Though reared in a Christian home and nurtured in a Lutheran church, she lacked the coherent, in-depth understanding of the faith Schaeffer set forth.  In subsequent decades, through advanced academic work, personal study and reflection, she established herself through a variety of publications as a thoughtful exponent of an orthodox evangelical worldview, writing for a popular audience.  She skillfully laces her discussions with quotations and illustrations, both personal and historical, making them accessible to all thoughtful readers.  Though at times overly simplistic (too easily reducing all issues to a “two-storey” graphic) and superficial (sharing Schaeffer’s distaste for significant Catholic thinkers), she still provides helpful guidance in charting a meaningful framework for understanding and responding to our world.  Her most recent treatise, Saving Leonardo:  A Call to Resist the Secular Assault on Mind, Morals, & Meaning (Nashville:  B&H Publishing Group, c. 2010), continues her helpful endeavor to engage the culture from a Christian perspective.  

She begins by evaluating the everywhere-evident “threat of global secularism,” a massive cultural current transforming our world, primarily through our educational and artistic milieu.  Though often oblivious to its subtlety and power, we Christians must awaken to its threat.  Following the example of Early Church thinkers, we must “address, critique, adapt, and overcome the dominant ideologies of our day” (p. 14), bearing in mind J. Gresham Machen’s maxim:  “‘False ideas are the greatest obstacle to the reception of the gospel’” (p. 15).  To Pearcey—as to John Henry Newman—the false idea of modern secularism is its reduction of all truth (including metaphysical, theological and moral truth) to personal preference.  “Whatever works for you” goes the modern mantra!  To which Christians must respond:  Absolutely Not!  Rightly grasped, Christianity is not primarily a personal perspective nor an inner feeling of peace and optimism, but a trustworthy knowledge of what really IS.  This means we cannot accept the fact/value distinction that frequently dominates worldview discussions.  Many modern thinkers insist that whereas they deal objectively with scientific “facts,” all ethical “values” are subjective.  Unfortunately numbers of “Christian” thinkers embrace this disjunction.  Following Schaeffer’s lead, however, Pearcey insists this two-storey view cannot but fail anyone seeking an integrated philosophy.  What must be recovered, she says, is a pre-Enlightenment perspective, seeing an interrelated symbiosis of natural and spiritual realities equally authored by an all-wise Creator.  

Having alerted us to the secularist threat, Pearcey gives us a “crash course on art and worldview”—nicely (if not lavishly) illustrated with scores of reprints in this well-appointed volume—that helps explain how secularism emerged during the past two centuries.  Throughout the ancient, medieval, and Renaissance-Reformation eras, styles changed but the underlying purpose endured:  highlighting beauty that reveals truth about God, man and nature, both visible and invisible.  As Walker Percy, one of America’s finest 20th century novelists, declared, “art is ‘a serious instrument for the exploration of reality.’  It is ‘as scientific and cognitive as, say, Galileo’s telescope or Wilson’s cloud chamber’” (p. 99).  

Enlightenment intellectuals, however, restricted “truth” to natural science.  Consequently, “art is merely decorative.  Ornamental.  Entertaining.  Isaac Newton called poetry ‘ingenious nonsense.’  . . . .  Hume denounced poets as ‘liars by profession.’  Philosopher Jeremy Bentham agreed:  ‘All poetry is misrepresentation’” (p. 98).  It might soothe one’s inner turmoil or exalt one’s expectations, but it reveals nothing about anything ultimately real.  So too, many thought, for religion.  But revolutionary 18th century developments sparked not only political upheavals such as the French Revolution but artistic celebrations of highly individualistic and Romantic perspectives.  While scientists may well weigh and measure the external world of nature (how things are), artists insisted on freely imagining how things might or ought to be.  Thus, by the end of the 19th century, movements such as “impressionism” and “cubism” flourished as monuments to this disconnect between art and objective reality.  Rather than representing Reality like a photograph, Romantic art serves as a film projector in a theater, casting images on the screen, and throughout the 20th century, as Pearcey persuasively illustrates, this conviction intensified.  

As an antidote to these developments, Pearcey recommends a recovery of great Christian artistic works—including the music of Bach, the “fifth gospel,” which is, amazingly, quite popular in Japan.  Through the work of Masaaki Suzuki, a famous conductor, thousands of Japanese have learned to play and appreciate the work of the gifted Baroque composer.  “‘Bach works as a missionary among our people,’ Suzuki said in an interview.  ‘After each concert people crowd the podium wishing to talk to me about topics that are normally taboo in our society—death, for example.  Then they inevitably ask me what “hope” means to Christians.’  He concluded:  ‘I believe that Bach has already converted tens of thousands of Japanese to the Christian faith’” (p. 267).   And along with recovering great art we need to cultivate the high-quality art absent from popular culture.  In our churches and homes we need to powerfully represent the Gospel story, shunning the “spiritual junk food” and “sentimentalism” that so frequently masquerade as Christian “music” and “art.”    

To R. Albert Mohler, Jr., President of the Southern Baptist Theological Seminary, “Nancy Pearcey helps a new generation of evangelicals to understand the worldview challenges we now face and to develop an intelligent and articulate Christian understanding . . . Saving Leonardo should be put in the hands of all those who should always be ready to give an answer—and that means all of us.”  

* * * * * * * * * * * * * * * * * * *

During the past century, the cultural consequences of taking natural science as the sole guide to truth have been increasingly (indeed alarmingly) evident.  Illustrating this trend, Eric Temple Bell, a professor at the California Institute of Technology and former president of the Mathematical Association of America, declared that modern (i.e. non-Euclidean) geometry makes mathematics and logic, as well as metaphysics and ethics, endlessly tentative, asserting, in The Search for Truth, that there is no such thing as “Truth.”  Trashing Euclid and Plato, Aristotle and Aquinas—all of whom “forged the chains with which human reason was bound for 2,300 years”—Bell celebrated the brave new world of modernity freed from any illusions regarding absolutes of any sort.  Consequently, as Pope Benedict XVI noted in his inaugural message, we struggle with the “dictatorship of relativism” that renders all certainties suspect.  

Rightly responding to such “modern” views, Nancy Pearcey supported, in Total Truth:  Liberating Christianity from Its Cultural Captivity (Wheaton:  Crossway Books, 2003), Francis Schaeffer’s position as articulated in his 1981 address at the University of Notre Dame:  “‘Christianity is not a series of truths in the plural, but rather truth spelled with a capital “T.”  Truth about the total reality, not just about religious things.  Biblical Christianity is Truth concerning total reality—and the intellectual holding of that total Truth and then living in the light of that Truth’” (p. 15).  Most needed, Schaeffer and Pearcey insist, is a Christian “worldview” that fundamentally shapes the lives of millions of ordinary believers, thus transforming their culture.  “A worldview is like a mental map that tells us how to navigate the world effectively,” Pearcey says.  “It is the imprint of God’s objective truth on our inner life” (p. 23).  

Unfortunately, too many Christians (Evangelicals included) have reduced their faith to a purely internal, “spiritual” realm disconnected from the physical and social worlds.  In the opinion of Sidney Mead, a distinguished historian:  “‘This internalization or privatization of religion is one of the most momentous changes that has ever taken place in Christendom’” (p. 35).  Moreover, as Charles Malik noted, we need “‘not only to win souls but to save minds.  If you win the whole world and lose the mind of the world, you will soon discover you have not won the world’” (p. 63).  Thus pastors and teachers should do apologetics as well as preach salvation.  “Every time a minister introduces a biblical teaching, he should also instruct the congregation in ways to defend it against the major objections they are likely to encounter.  A religion that avoids the intellectual task and retreats to the therapeutic realm of personal relationships and feelings will not survive in today’s spiritual battlefield” (p. 127).  

Despite their many positive accomplishments—and Pearcey generously notes them—American Evangelicals may not survive today’s battles unless they take ideas seriously.  Though there was a scholarly (largely Calvinistic) dimension to 19th century Evangelicalism, the movement became, in revivalists’ hands, inordinately populist, more concerned with converting the masses than cultivating their minds.  Believers were thus unprepared to resist and refute powerful anti-Christian ideas propounded by Darwin, Marx, Freud, et al.  “The overall pattern of evangelicalism’s history is summarized brilliantly by Richard Hofstadter in a single sentence.  To a large extent, he writes, ‘the churches withdrew from intellectual encounters with the secular world, gave up the idea that religion is a part of the whole life of intellectual experience, and often abandoned the field of rational studies on the assumption that they were the natural province of science alone’” (p. 323).  

So it’s now time, Pearcey declares, to regain lost ground, to reestablish a Christian perspective, to redeem minds as well as hearts.  To do so, Christians need to structure their worldview in accord with three guiding certainties:  creation; fall; redemption.  An originally good creation has been scarred by sin, but God’s gracious redemptive work in Christ has (to a degree) restored His original intent.  Every worldview requires a creation story.  Materialists, ancient and modern, explain the world in terms of mindless matter-in-motion, and pantheists, whether Stoics or “process” thinkers, endow Nature with divine attributes.  Every worldview includes an explanation for the evil surrounding us—inadequately evolved species or demonic social institutions or malignant genes.  And every worldview promises redemption, whether through scientific breakthroughs or political revolutions or inner enlightenment.  Thus, as John Milton wrote, “the goal of learning ‘is to repair the ruins of our first parents’” (p. 129).  

To make a Christian case for Creation, Pearcey says we must deal cogently with Darwinism, stressing that it is, as Huston Smith said, “‘supported more by atheistic philosophical assumptions than by scientific evidence’” (p. 153).  Excluding any possibility of God, as Carl Sagan declared, “Nature . . . is all that IS, or WAS, or EVER WILL BE!”  Adamantly upholding this assumption, naturalistic Darwinism has become a “universal acid” eating away many of the most fundamental cultural certainties basic to Western Civilization.  “Half a century ago G.K. Chesterton was already warning that scientific materialism had become the dominant ‘creed’ in Western culture—one that ‘began with Evolution and has ended in Eugenics.’  Far from being merely a scientific theory, he noted, materialism ‘is really our established Church’” (p. 157).  Consequently:  “‘The so-called warfare between science and religion,’ wrote historian Jacques Barzun, should really ‘be seen as the warfare between two philosophies and perhaps two faiths.’  The battle over evolution is merely one incident ‘in the dispute between the believers in consciousness and the believers in mechanical action; the believers in purpose and the believers in pure chance’” (p. 173).  

The deleterious and far-reaching cultural impact of Darwinism stands illustrated by Joseph Stalin who, as a seminary student, lost his faith in God after encountering Darwin’s theory.  Subsequently he imposed his murderous form of atheism upon a large swathe of the world.  Less murderously, American thinkers—particularly pragmatists such as John Dewey and Oliver Wendell Holmes, Jr.—launched an effective assault on many of the traditions vital to this nation.  To them, all “truths” evolve in accord with naturalistic evolution and thus no permanent standards of right and wrong, in any area of life, actually exist.   Everyone “constructs” his own reality and writes his own rules.  Darwin himself realized, and feared, this logical outgrowth of his theory, confessing, in a letter:  “‘With me, the horrid doubt always arises whether the convictions of man’s mind, which has been developed from the mind of the lower animals, are of any value or at all trustworthy’” (p. 243).  

To refute Darwinism, Pearcey follows the lead of Phillip Johnson, the U.C. Berkeley law professor who wrote Darwin on Trial and Reason in the Balance and helped launch the “intelligent design” movement.  Rather than bog down in secondary details regarding the age of the earth or the reality of microevolution, Christians need to “focus on the crucial point of whether there is evidence for Intelligent Design in the universe” (p. 174).  Importantly, we must grasp the significance of recent scientific insights, summed up by John Wheeler, a noted physicist, who said:  “‘When I first started studying, I saw the world as composed of particles.  Looking more deeply I discovered waves.  Now after a lifetime of study, it appears that all existence is the expression of information’” (p. 179).  The cosmos, it seems, is not ultimately mindless matter-in-motion; it is, rather, imprinted with an immaterial pattern, bearing information, or (as Christians have always believed) a Logos responsible for the design everywhere evident.  Just ponder for a moment the widely-heralded fact that every cell in your body contains as much information as 30 volumes of the Encyclopedia Britannica!  This Logos-structured world is increasingly evident as we begin to grasp the majesty of DNA, aptly defined by President Bill Clinton as “‘the language in which God created life’” (p. 191).   

Darwin himself recognized this manifest design but sought to discount it as only “apparent,” not real.  Similarly, his modern apostle, Richard Dawkins, admits (in The Blind Watchmaker) that “‘Biology is the study of complicated things that give the appearance of having been designed for a purpose’” (p. 183).   But be not deceived, he insists:  it’s all an illusion spun by the random machinations of natural selection.  Such statements, Pearcey shows, pervade evolutionary literature, imploring us all to ignore common sense and believe the sacrosanct theory launched by Darwin.  Outside the faithful community of naturalistic evolution, however, we find alternatives expressed by thinkers such as Nobel Prize-winner Arno Penzias, who says:  “‘Astronomy leads us to a unique event, a universe which was created out of nothing, one with the very delicate balance needed to provide exactly the conditions required to permit life, and one which has an underlying (one might say “supernatural”) plan.’  In fact, he says, ‘The best data we have are exactly what I would have predicted, had I nothing to go on but the five books of Moses, the Psalms, the Bible as a whole’” (p. 189).  

Thus, Pearcey argues, the Bible contains the necessary ingredients for a coherent worldview.  Taking it seriously, living in accord with its precepts, gives us the basis for cultural activities.  Interested readers may begin this endeavor by consulting the annotated “recommended reading” list she supplies.  

                                                           * * * * * * * * * * * * * * 

For several years Nancy Pearcey worked with the late Chuck Colson, doing much of the research underlying his BreakPoint radio program and coauthoring his monthly column in Christianity Today.   Re-examining their coauthored How Now Shall We Live (Wheaton:  Tyndale House Publishers, Inc., c. 1999), one suspects that Pearcey did the bulk of the research and preliminary writing with Colson adding his personal touch (scores of personal anecdotes, mostly taken from his prison ministry) and imprimatur.  This is especially evident in the book’s structure (basically duplicated, with added scholarly references, in Pearcey’s subsequent Total Truth), stressing four themes:  Creation (where did I come from?); Fall (what’s wrong with me and the world?); Redemption (is there any hope?); and Restoration (how can we help repair what’s wrong?).  A probing query from Ezekiel, struggling in Babylon during the exile, sets the stage:  “How should we then live?” (33:10, KJV).  

We should live, Colson and Pearcey answer, by crafting and following a biblical worldview, empowered by the realization, as St. Hippolytus said in the third century, that when Jesus ascended “‘His divine spirit gave life and strength to the tottering world, and the whole universe became stable once more, as if the stretching out, the agony of the Cross, had in some way gotten into everything’” (p. 13).  Thus everything, rightly understood, points to and leads to the Christ.  All truth is God’s truth!  As the great astronomer Johannes Kepler declared:  “‘The chief aim of all investigations of the external world should be to discover the rational order and harmony which has been imposed on it by God’” (p. 51).  

Our challenge, as Christians, is to both discover and proclaim this truth in a world increasingly skeptical of all truth claims, frequently claiming to find neither rational order nor harmony anywhere.  Such skepticism grows logically from the philosophical naturalism dominant in our culture and on display in school textbooks, PBS programs, and the EPCOT Center in Disney World.  In accord with the “just so” naturalistic story, for billions of years things just happened without design or purpose.  Lifeless chemicals somehow ignited biological beings (proteins and molecules) mysteriously structured by DNA strands.  And we human beings are nothing more than enlivened chemicals, inexplicably endowed with consciousness, self-awareness, and conscience.  This is the materialist creed; consequently, as C.S. Lewis said:  “‘The Christian and the materialist hold different beliefs about the universe.  They can’t both be right.  The one who is wrong will act in a way which simply doesn’t fit the real universe’” (p. 307).  

To Colson and Pearcey, the Christian worldview enables us to live wisely and well in the real universe, for “Christianity gives the only viable foundation for intellectual understanding and practical living” (p. 489).  Whatever our gifts, whatever our vocation, we may play a vital role in God’s work so long as we do it Soli Deo Gloria—only to the glory of God!  We especially need to give attention to our homes, churches, neighborhoods, and schools, bearing in mind the words of the tempter in C.S. Lewis’s The Screwtape Letters:  “‘What I want to fix your attention on is the vast overall movement towards the discrediting, and finally the elimination of every kind of human excellence—moral, cultural, social, or intellectual’” (p. 331).  Our children and friends, as well as the world, need godly (i.e. good) music and literature, art and philosophy, films and TV.  Rather than bring Rock & Roll into the sanctuary we need to take Bach into the marketplace!  Rather than giving children video games, offer them Lewis’s Narnian chronicles and J.R.R. Tolkien’s Lord of the Rings trilogy.  

People also need to live in neighborhoods where broken windows are repaired and playgrounds are safe, where laws are enforced and elders respected.  So well-informed political action (e.g. voting!), particularly on a local level, should be part of the Christian vocation.  Rather than trying to ape (and somehow Christianize) the world we should seek to transform it with divinely-rooted truths.  Thereby we will implement the vision of Francis Schaeffer, to whom this treatise is dedicated, volunteering for service in Christ’s corps of intellectual warriors, contending for the Faith once delivered to the saints.  

241 “Dupes” and “The Communist”

In Witness, Whittaker Chambers—one of the most celebrated 20th century repentant Communists, who helped expose Alger Hiss and other Soviet spies—reflected on his years working within the Party:  “While Communists make full use of liberals and their solicitudes, and sometimes flatter them to their faces, in private they treat them with the sneering contempt that the strong and predatory almost invariably feel for the victims who volunteer to help in their own victimization.”  Such malleable liberals are analyzed in depth by Cold War historian Paul Kengor, in Dupes:  How America’s Adversaries Have Manipulated Progressives for a Century (Wilmington:  ISI Books, c. 2010).  He skillfully documents the prescience of Norman Thomas, the perennially nominated presidential candidate of the Socialist Party, who said Americans could not be persuaded to candidly establish “socialism” but under the aegis of “liberalism” would gradually put it in place “‘without ever knowing how it happened’” (p. 479).  

Due to the collapse of the USSR, historians such as Kengor have profited from “the massive declassification of once-closed Cold War archives, from Moscow to Eastern Europe to the United States” (p. 3), and the archival materials depicting the Communist Party USA are especially illuminating.  Though American Progressives were not Communists, they fully supported the Communist wish list:  “workers’ rights, the redistribution of wealth, an expansive federal government, a favoring of the public sector over the private sector, class-based rhetoric (often demagoguery) toward the wealthy, progressively high tax rates, and a cynicism toward business and capitalism, to name a few.  The differences were typically matters of degree rather than principle” (p. 4).  To Kengor, however, such degrees of difference really matter, for by assisting the Communist movement they “knowingly or unknowingly contributed to the most destructive ideology in the history of humanity.  This is no small malfeasance” (p. 11).  

Following the Bolsheviks’ triumph in Russia, Communists organized two Chicago subsidiaries which soon merged into the Communist Party USA (CPUSA) that for 50 years dutifully followed edicts from Moscow.  To bring about revolution in America, however, Communists needed to strategically misrepresent themselves and subtly subvert the nation’s social and economic structures.  Thus they encouraged and showcased “Potemkin Progressives” such as Corliss Lamont—an atheistic “humanist” professor at Columbia University who helped lead the Friends of the Soviet Union and in 1933 wrote Russia Day by Day to celebrate the glories of the Soviet Union.  They also promoted the agenda of another Columbia professor, John Dewey—the renowned pragmatist philosopher who largely shaped the progressive educational agenda that has dominated America’s public schools for nearly a century.  His educational ideas incorporated significant swathes of Marxism and were actually implemented in Russia in the 1920s before being adopted by American schools.    

Invited to visit the USSR in 1928, Dewey was given the standard Potemkin village tour touting the grandeurs of Communism.  Returning home, he wrote glowing reports of what he’d seen, affirming that the Bolshevik agenda was “a great success.”  He especially endorsed the Soviet schools, the “ideological arm of the Revolution,” which would lead to the success of “The Great Experiment and the Future” in Russia (pp. 90-99).  Dewey was not uninformed of the brutalities accompanying this great experiment and acknowledged its “secret police, inquisitions, arrests and deportations” (p. 100).  But he glibly rationalized them as necessary for the regime to prosper.  He would thus head a corps of influential intellectuals urging President Franklin D. Roosevelt to formally recognize and establish diplomatic relations with the USSR.  Soon thereafter, however, faced with mounting evidence regarding Stalin’s Great Purge and its massive bloodletting, Dewey bravely retracted his endorsement of the Stalinist regime.  He was especially distressed by Stalin’s attack on Leon Trotsky and joined a commission to Mexico to defend him.  For his apostasy the once-acclaimed philosopher was vilified and branded by Stalinists “an enemy of the people.”  

Though Stalinists distrusted and denounced President Roosevelt, they diligently sought to infiltrate his administration in the 1930s.  Because of his prominence and influence, Harry Hopkins—appointed by FDR to head the WPA, serving as his “right-hand man during World War II,” living in the White House and accompanying Roosevelt to the major WWII conferences—doubtlessly “stands as the most sensational case among the potential Soviet agents” (p. 124).  Newly available archival evidence demonstrates that Hopkins was in fact a Soviet spy who effectively duped the president on behalf of his “buddy” Uncle Joe, and manipulated programs such as Lend-Lease to benefit Russia.  Concurrently, FDR rejected warnings regarding Stalin and followed his “hunch” that the despot could be trusted.  Relying on Hopkins’ advice toward the end of the war, he believed Stalin wanted nothing “‘but security for his country, and I think that if I give him everything I possibly can and ask nothing from him in return, noblesse oblige, he won’t try to annex anything and will work with me for a world of democracy and peace’” (p. 165).  We now know, of course, that Stalin rapidly occupied Eastern Europe after WWII and provoked a Cold War that endured for 40 years, enabling the slaughter of millions of people and devastating the economies of dozens of countries.  FDR died ignorant of the havoc resulting from the fact that “from the start to the finish of his administration, the great New Dealer was greatly trashed, hated, and duped by Communists at home and abroad” (p. 181).  

Communists always recognized the power of propaganda, so using the arts—and especially the cinema, as Lenin stressed—was imperative.  Accordingly, the shaping of Hollywood became a central objective for Soviet agents in America, and they found dozens of easily duped liberal “stars.”  Playwrights such as Lillian Hellman, Dashiell Hammett, and Arthur Miller (whose The Crucible amply illustrates the process) provided the scripts.  Singers including Paul Robeson and actors such as Katharine Hepburn, Lucille Ball, Gregory Peck and Humphrey Bogart were easily enlisted in the “progressive” (i.e. communist) cause.  Consequently, in 1947 the House Committee on Un-American Activities summoned a number of Hollywood celebrities to testify.  Some, including Ronald Reagan (head of the Screen Actors Guild and “hero” of the hearings) and Gary Cooper, were “friendly witnesses” who documented and denounced Communist activities.  Unfriendly witnesses—most notably the “Hollywood Ten” (four of whom we now know were dedicated Communists)—parroted the Party line, labeling their critics as fascists on a witch-hunt and insisting there wasn’t a trace of communism in Hollywood.  Benefiting from the support of the press and powerful Democrat politicians (e.g. Claude “Red” Pepper of Florida), the propagandized public soon believed that it was the House Committee on Un-American Activities, rather than Communists in Hollywood, that was the real threat to the nation!

When the Korean War erupted, Corliss Lamont and Lillian Hellman defended the North Koreans.  When the United States joined the conflict in Vietnam, protesters such as Tom Hayden (perennially elected from his Santa Monica base to assorted offices in California), Dr. Benjamin Spock (author of a fabulously successful book on child-rearing), Arthur “Pinch” Sulzberger Jr. (in due time publisher of the New York Times), and John Kerry (the Democrat Senator and nominee for President in 2004) easily absorbed and articulated the Communist agenda.  Violent revolutionaries such as Bill Ayers worked for the defeat of the “American Empire” and the total transformation of America.  Many of these anti-war radicals of the ‘60s ultimately infiltrated and significantly shaped the Democrat Party, where they still exert influence. 

Resisting such radicals stood Ronald Reagan, fully aware of Communist strategies since his days as head of the Screen Actors Guild.   “For years as a private citizen and candidate, Reagan had fiercely opposed the accommodationist policy of detente and spoken frankly about the true nature of Soviet Communism” (p. 366).  To him, the USSR was an “evil empire” to be confronted and defeated.  Consequently, scores of “dupes” sought to deride and destroy him.  For example, Henry Steele Commager, an influential historian, called the President’s “evil empire” speech “‘the worst presidential speech in American history, and I’ve read them all.’  Why?  Because, said Commager, of “Reagan’s ‘gross appeal to religious prejudice’” (p. 393).  Senator Ted Kennedy chastised Reagan “for ‘misleading Red-scare tactics and reckless Star Wars schemes’” (p. 403).  Yet, says Kengor:  “As we now know from a highly sensitive KGB document, the liberal icon [Kennedy], arguably the most important Democrat in the country at the time, so opposed Ronald Reagan and his policies that the senator approached Soviet dictator Yuri Andropov, proposing to work together to undercut the American president” (p. 407).  

With the demise of the USSR and the emergence of radical Islam as the great threat to America, progressive “dupes” shifted gears while preserving their fundamental ideology.   Thus they opposed President George W. Bush’s Iraq policies.   Senate Democratic leader Harry Reid called him a “loser” and prominent politicians routinely labeled him a “liar.”  “Leftists in media and academia joined politicians like [Edward] Kennedy in attacking the White House” (p. 432).  Eric Foner—a Columbia University historian, avowed socialist, and former president of the American Historical Association—declared:  “‘I’m not sure which is more frightening:  the horror that engulfed New York City or the apocalyptic rhetoric emanating daily from the White House’” (p. 432).  

The unexpected emergence of Barack Obama was quickly promoted by “Progressives for Obama,” spearheaded by Tom Hayden, the leader of the Students for a Democratic Society in the ‘60s who had recruited a number of fellow-travelers such as Daniel Ellsberg and Jane Fonda (whom he married).  Admittedly, Obama is a “progressive liberal” rather than a SDS-style Marxist.   Yet “Hayden saw in Obama a long-awaited vehicle for ‘economic democracy,’ an instrument to channel an equal distribution of wealth—‘economic justice,’ or ‘redistributive change,’ as Obama himself once put it.  Hayden said that, ‘win or lose, the Obama movement will shape progressive politics . . . for a generation to come’” (p. 469).    Though Hayden has successfully operated within the Democratic Party in California, other ‘60s radicals (Bill Ayers and his wife, Bernardine Dohrn, Mark Rudd and Michael Klonsky) promote the cause within higher education.  Education, Ayers says, “‘is the motor-force of revolution’” (p. 475).  Ayers and Barack Obama worked together in Chicago to funnel money into the city’s schools so as to advance the cause of “social justice.”  Klonsky and Ayers co-authored an article that “raved about Arne Duncan, longtime head of Chicago public schools, whom the pair described as ‘the brightest and most dedicated schools leader Chicago has had in memory.’  Today Duncan is President Obama’s secretary of education” (p. 472).  

One of the aging radicals most thrilled with Obama’s election was a physician, Quentin Young, a long-term advocate of socialized medicine who had helped launch Obama’s career “in the living room of Bill Ayers and Bernardine Dohrn” (p. 477).  “Young noted that Obama, as a state senator in Illinois, had supported a ‘single-payer universal healthcare system’” that could be implemented when Democrats took complete control of Congress and the White House (p. 477).  Evaluating the 2008 election, Pravda declared “‘that like the breaking of a great dam, the American descent into Marxism is happening with breathtaking speed, against the backdrop of a passive, hapless sheeple.’  That ‘final collapse,’ said the pages of the chief party organ of the former USSR, ‘has come with the election of Barack Obama’” (p. 478).  

For nearly a century communists have patiently worked behind the scenes, promoting their cause through progressive dupes.  Now, amazingly enough, in 2008 “Americans had voted CPUSA’s way:  the party could not contain its excitement over Obama’s victory.  The election of Barack Obama was the chance for a wish list to come true—a potential host of nationalizations, from the auto industry to financial services to health care, beginning with more modest steps like establishing the ‘public option’ in health-care reform, plus massive government ‘stimulus’ packages, more public-sector unionization and control, more redistribution of wealth, more collectivization.  ‘All these—and many other things—are within our reach now!’ celebrated Sam Webb in his keynote speech for the New York City banquet of People’s Weekly World, the official newspaper of CPUSA, which reprinted the speech under the headline ‘A New Era Begins.’  With the election of Obama, said Webb, the ‘impossible’ had become ‘possible’” (p. 478).  A “century of dupery” has succeeded!  

* * * * * * * * * * * * * * * * * * * 

In The Communist:  Frank Marshall Davis:  The Untold Story of Barack Obama’s Mentor (New York:  Threshold Editions, c. 2012), Paul Kengor moves from the general story told in Dupes to a very particular case of a little-known, card-carrying Communist who significantly influenced our nation by helping shape the young Barack Obama while he was growing up in Hawaii.  Kengor’s purpose, however, is not to question Obama’s ideology or agenda.  “My purpose is to show that Frank Marshall Davis—who clearly influenced Obama—was a communist and closet member of CPUSA with private loyalties to Mother Russia” (p. 18).  The story can now be more clearly told thanks to the treasure trove of documents made available following the collapse of the USSR.  Though Davis’s associates and pro-Soviet journalistic pieces elicited the attention of the House Committee on Un-American Activities following WWII, he and his defenders always denied he was actually a Communist.  But the propriety of the Committee’s concern has been validated by Davis’s recently-revealed admission that sometime during the war, he had “‘joined the Communist party’” (p. 92).  

Kengor tells the story of Davis, from his Arkansas City, Kansas, roots to his Chicago involvement (as a journalist) in Communist causes to his final days in Hawaii.  Three pivotal years in Atlanta in the early ’30s, witnessing the notorious Scottsboro Boys trial (nine black boys were accused of raping two white girls), exacerbated his anger at racism and his receptivity to the CPUSA’s lavishly-funded Scottsboro propaganda campaign.  Returning to Chicago and initially working for the Associated Negro Press, he dove quickly into the intellectual waters colored by the views of “dupes” such as John Dewey and Margaret Sanger.  He also interacted with both secret members of the Party (such as the singer Paul Robeson) and more open devotees, including the celebrated writers Langston Hughes and Richard Wright (who later resigned from and lamented his support for the Party).  Following the CPUSA line, they supported movements such as the American Peace Mobilization and promoted “progressive” causes of various hues.  (Though self-consciously communists, they invariably insisted on using the term “progressive” to define both themselves and their “social justice” objectives.)  Davis also worked with prominent “progressive” black leaders in Chicago, including Robert Taylor and Vernon Jarrett (one the maternal grandfather, the other the father-in-law of Valerie Jarrett, widely considered President Obama’s closest friend and adviser).  And he joined Harry Canter (following Canter’s years in Moscow) and his son, David, working with newspapers to advance workers’ unions; in due time the younger Canter mentored David Axelrod, who became Barack Obama’s political guru.  

In Chicago, “Frank Marshall Davis was increasingly involved in events sponsored or covertly organized by the communist left” (p. 88), teaching a History of Jazz for the Abraham Lincoln School (widely labeled the “little Red school house”) and joining assorted communist fronts.  Fortuitously, he was enabled to freely “uncork his opinions” in the pages of “a full-blown pro-CPUSA newspaper” under his own lead and editorship:  the Chicago Star (p. 104).  He especially vilified anti-Communist statesmen such as Winston Churchill and Harry Truman, closely following the “Party line, not questioning Stalin” (p. 103).  Recruited to write for the Star were communist luminaries such as Howard Fast, the Hollywood writer who was awarded the Stalin Prize in 1953.  Florida’s leftist Senator Claude “Red” Pepper also graced the paper’s pages, promoting his favorite cause:  socialized medicine.  Pepper’s chief-of-staff, Charles Kramer, whom we now know was a Soviet agent, both “handed over important information to the USSR” and wrote a bill to create a National Health Program (p. 121).  And Lee Pressman, another Soviet agent and “close colleague” of both Kramer and Alger Hiss, added his weight to the Star’s roster of writers.  These writers, of course, “co-opted the ‘progressive’ label, claiming to be merry liberals” simply devoted to fulfilling the American dream of “social justice” (p. 125). 

Then Davis abruptly left his beloved Chicago in 1948, moving to Hawaii, where he wrote a weekly column for the Honolulu Record, a Communist paper, and worked closely with Harry Bridges, the “progressive” leader of the International Longshoremen’s and Warehousemen’s Union (ILWU).  Though oft-celebrated by the likes of Nancy Pelosi, the ILWU was “one of the most communist-penetrated and -controlled unions of the time” (p. 145).  While Davis claimed to receive no money from the Record, there is every reason to believe he was generously subsidized by the CPUSA in Hawaii, where it was hoped a “mass revolutionary movement” would establish “a satellite in the Soviet orbit” (p. 150).  With his pen Davis was strategically placed to assume a pivotal role in Stalin’s strategy in the Pacific.  So he consistently attacked President Truman, the Marshall Plan, and America’s military excursions in Asia (Korea, Vietnam).  A recently opened 600-page FBI file on Davis reveals that he also took numerous telephoto pictures of Hawaii’s shoreline.  Consequently, he was listed as a security threat on the government’s Security Index, joining a select group of folks deemed highly dangerous to the nation.  

Little actually came of the CPUSA plan for Hawaii, and an aging Frank Davis slipped into the obscurity of retirement.  Yet though he accomplished little as a journalist, he left a larger imprint on the world through his acquaintance with Stanley Dunham, Barack Obama’s grandfather, with whom he enjoyed drinking and playing poker.  As is clear in Dreams from My Father, wherein “Frank” frequently appears, young Barack Obama (desperate for male guidance) easily slipped within Davis’s sphere of influence as he sought to define himself.  “‘Away from my mother, away from my grandparents, I was engaged in a fitful struggle.  I was trying to raise myself to be a black man in America, and beyond the given of my appearance, no one seemed to know exactly what that means’” (p. 233).  But Frank Davis provided some clues—and a reading list of radicals such as Frantz Fanon.  Consequently, Kengor says:  “Frank remained a thread in the life and mind of Obama” (p. 237).  Thus, when Obama arrived in California as a college student at Occidental, he was considered a “fellow believer” by one of his then-Marxist friends, John Drew, who in a recent interview with Kengor recalled that “‘Obama was definitely a Marxist and that it was very unusual for a sophomore at Occidental to be as radical or as ideologically attuned as young Barack Obama was’” (p. 251).  

In sum:  “The people who influence our presidents matter” (p. 298).  To understand President Obama we need to weigh the role of his “mentor,” Frank Marshall Davis, in his formation.  The Communist thus provides essential information in evaluating him.  

240 Logic of Liberty

Michael Polanyi (1891-1976), a distinguished Hungarian chemist who immigrated to England in the 1930s and devoted his mature years to philosophical inquiry, was one of the premier thinkers of the 20th century.  His Personal Knowledge remains an epistemological classic, especially appealing to philosophical scientists such as John Polkinghorne.  Similarly significant is his The Logic of Liberty:  Reflections and Rejoinders (Chicago:  The University of Chicago Press, c. 1951), an eloquent defense of both personal and political freedom as necessary for reason and social well-being.  Appalled by the claims of reductionistic materialists—for whom thought is nothing more than mechanical or chemical reactions in the brain—he proposed, in a series of essays, to demonstrate that the freedom requisite in scientific inquiry is equally needed in other realms.  

Unfortunately, the advance of science during the past two centuries was too frequently accompanied by a philosophical Positivism that “believed implicitly in the mechanical explanation of all phenomena” (p. 11) and denied any non-empirically demonstrable truths.  Thus virtues such as courage and wisdom and standards regarding beauty and goodness were considered purely subjective preferences or social conventions.  Rightly understood, however:  “Science or scholarship can never be more than an affirmation of the things we believe in.  These beliefs will, by their very nature, be of a normative character, claiming universal validity; they must also be responsible beliefs, held in due consideration of evidence and of the fallibility of all beliefs; but eventually they are ultimate commitments, issued under the seal of our personal judgment.  To all further critical scruples we must at some point finally reply:  ‘for I believe so’” (p. 31).  Choosing what we believe to be true necessarily follows a process of free inquiry best illustrated by the scientific community.  One person seeks to solve a problem or explain a phenomenon and, having satisfied his own mind, informs others of his discovery.  His thesis is then tested by qualified scholars and, if found true, accepted by them.  A creative thinker like Einstein set forth novel ideas—but he needed the assent of his peers to establish the truth of his position.  “This unity between personal creative passion and willingness to submit to tradition and discipline is a necessary consequence of the spiritual reality of science” (p. 40).  Only free thinkers, working within a free society, can effectively advance science, as its still-birth or demise in totalitarian societies clearly reveals.  

The same is true in economics, wherein it is clear that “the central planning of production” so celebrated in socialist circles “is strictly impossible” (p. 111).  This was evident when Soviet planners tried to collectivize Russian agriculture, dictating “the scope and the kind of cultivation to be practiced on every one of the twenty-five millions of peasant farms” (p. 131).  Disastrous consequences followed—famines, rebellions, and the slaughter of recalcitrant Ukrainian kulaks.  “Lenin’s attempt to replace the functions of the market by a centrally directed economic system caused far greater devastation than the worst forms of laissez faire ever did” (p. 169).  There is in fact a “spontaneous order based on persuasion” basic to both scientific and economic development; any effort to dictate truths or policies cannot but fail in markedly destructive ways.  Setting forth sophisticated mathematical reasons, especially emphasizing the importance of the “polycentricity” profoundly evident in the “postural reflexes which keep us in equilibrium while sitting, standing or walking,” Polanyi shows why multitudes of free persons making decisions are always better than bureaucrats dictating policies (p. 176).  This truth ultimately dawned on Trotsky, one of the leaders of the Bolsheviks, who “declared that it would require a Universal Mind as conceived by Laplace to make a success of such a system” (p. 126).  

Still more:  Polanyi was distressed by two 20th century developments ultimately destructive to the good society—skepticism and utopianism, wherein “an utter disbelief in the spirit of man is coupled with extravagant moral demands” (p. 4).  Europe was ravaged by nihilistic, revolutionary humanitarians because a deep-seated “skepticism had destroyed popular belief in the reality of justice and reason” (p. 5).  Compassion became a political platform (“social justice”) rather than an individual virtue, and it “was turned into merciless hatred and the desire for brotherhood into deadly class-war” (p. 5).  This was evident in super-planners like Friedrich Engels, who declared “that men ‘with full consciousness will fashion their own history’ and ‘leap from the realm of necessity into the realm of freedom,’” thereby showing “the megalomania of a mind rendered unimaginative by abandoning all faith in God.  When such men are eventually granted power to control the ultimate destinies of their fellow men, they reduce them to mere fodder for their unbridled enterprises” (p. 199).  

* * * * * * * * * * * * * * * * * * * * * 

Though Friedrich A. Hayek is better known for his classic anti-socialist manifesto, The Road to Serfdom, his magnum opus is doubtless The Constitution of Liberty (Chicago:  The University of Chicago Press, c. 1960), judged by Henry Hazlitt in a Newsweek review as “one of the great political works of our time . . . the twentieth century successor to John Stuart Mill’s essay, ‘On Liberty.’”  While recognized and awarded a Nobel Prize as an economist, Hayek sought in The Constitution of Liberty to address “the pressing social questions of our time” and provide “a comprehensive restatement of the basic principles of a philosophy of freedom” (p. 3).  Citing John Stuart Mill, he wrote with this conviction:  “It is impossible to study history without becoming aware of ‘the lesson given to mankind by every age, and always disregarded—that speculative philosophy, which to the superficial appears a thing so remote from the business of life and the outward interest of men, is in reality the thing on earth which most influences them, and in the long run overbears any influences save those it must itself obey’” (p. 113).  

To provide a philosophical foundation for liberty is imperative, for “liberty is not merely one particular value but . . . the source and condition of most moral values” (p. 6).  He wrote with the tragic awareness that for nearly a century people around the world had been embracing “western ideals at a time when the West had become unsure of itself and had largely lost faith in the traditions that have made it what it is.  This was a time when the intellectuals of the West had to a great extent abandoned the very belief in freedom which, by enabling the West to make full use of those forces that are responsible for the growth of all civilization, had made its unprecedented quick growth possible” (p. 21).   

In the first of the book’s three parts, Hayek asserts “the value of freedom.”  Rightly defined, freedom is neither political nor metaphysical, neither the power to do things celebrated by political progressives nor the inner freedom of the will noted by theologians.  It is, quite simply, the “independence of the arbitrary will of another” (p. 12).  Individuals freely thinking and making decisions, freely cooperating with other individuals doing the same, enable civilization to develop and thrive.  Every individual is fallible and limited, so no one has the knowledge and wisdom necessary to dictate policies, but thinking and acting together we accomplish what is best for mankind.  “The argument for liberty is not an argument against organization, which is one of the most powerful means that human reason can employ, but an argument against all exclusive, privileged, monopolistic organization, against the use of coercion to prevent others from trying to do better” (p. 37).  It was aptly summarized by Cato, who, Cicero says, believed “the Roman constitution was superior to that of other states because it ‘was based upon the genius, not of one man, but of many; it was founded, not in one generation, but in a long period of several centuries and many ages of men.  For, said he, there never has lived a man possessed of so great a genius that nothing could escape him, nor could the combined powers of all men living at one time possibly make all the necessary provisions of the future without the aid of actual experience and the test of time’” (p. 57).  

The liberty Cato celebrated and Hayek defends developed in 18th century England, rooted in “an interpretation of traditions and institutions which had spontaneously grown up and were but imperfectly understood” (p. 54).  Unlike the utopianism evident in Rousseau and spawned by the French Revolution, culminating in various Sparta-style totalitarian democracies, British thinkers such as Adam Smith, Edmund Burke, and William Paley took an empirical, historical approach, locating “‘the essence of freedom in spontaneity and the absence of coercion’” rather than “‘in the pursuit and attainment of an absolute collective purpose’” (p. 56).  Whereas the French approach, implementing Rousseau’s reliance on the “general will,” promoted popular sovereignty and even declared that “the voice of the people is the voice of God,” British thinkers (and Americans such as Madison) sought to limit the power of passing majorities.  Thus, as Burke continually insisted, a respect for tradition safeguards freedom’s flourishing.  Unfortunately, during the 20th century “the French tradition has everywhere progressively displaced the English” (p. 55).  Even in 19th century England, as Benthamite Philosophical Radicals displaced the Whigs in shaping liberalism, the ground was prepared for the democratic (Fabian) socialism installed by Clement Attlee in 1945.  

In socialist systems individuals cede to the state the responsibility for many things (employment, education, health care, etc.), whereas in free societies they take responsibility for their actions.  “Liberty and responsibility are inseparable” (p. 71).  Basic to liberty is “finding a sphere of usefulness, an appropriate job”—perhaps “the hardest discipline that a free society imposes on us.  It is, however, inseparable from freedom, since nobody can assure each man that his gifts will be properly used” (p. 80), and it is up to the individual person to discern and develop his talents.  Nothing we do outweighs the importance of finding and following one’s vocation, playing a productive role in the world.  Importantly:  “In a free society a man’s talents do not ‘entitle’ him to any particular position” (p. 82).  Nor does a free society insure him against failure or distress.  “All that a free society has to offer is an opportunity of searching for a suitable position, with all the attendant risk and uncertainty which such a search for a market for one’s gifts must involve” (p. 82).  

Taking responsibility for themselves, lovers of liberty resist any form of “equality” other than “equality before the law” (p. 85).  We are, as human beings, remarkably different in physique, intelligence, aptitude, ambition, inheritance, etc.  Endeavoring to eliminate such distinctions in the guise of establishing equality violates the natural order of things, for the “boundless variety of human nature—the wide range of differences in individual capacities and potentialities—is one of the most distinctive facts about the human species” (p. 86).  Egalitarians committed to abolishing such differences inevitably propose leveling everyone, through governmental mandates, to a common plane by redistributing the wealth, regulating behaviors, subsidizing failures, providing for old age, and excusing assorted deviancies.  “The modern tendency to gratify this passion by disguising it in the respectable garment of social justice is developing into a serious threat to freedom” (p. 93), whereas a bona fide understanding of justice restricts it to the virtue of giving others what is due them.  “It is,” Hayek says, “one of the great tragedies of our time that the masses have come to believe that they have reached their high standard of material welfare as a result of having pulled down the wealthy, and to fear that the preservation or emergence of such a class would deprive them of something they would otherwise get and which they regard as their due” (p. 130).  

In part two of the book, “freedom and the law,” Hayek critiques all kinds of coercion.  “True coercion occurs when armed bands of conquerors make the subject people toil for them, when organized gangsters extort a levy for ‘protection,’ when the knower of an evil secret blackmails his victim, and, of course, when the state threatens to inflict punishment and employ physical force to make us obey its commands” (p. 137).  Notably, there is a difference between commands and laws!  Whereas commands negate personal freedom, good laws preserve and enable it to thrive.  Commands (whether issued by Czars in Russia or the White House) privilege cronies and impair adversaries; laws (enforced by judges) insure the even-handed enforcement of policies and adjudication of disputes.  In his Second Treatise on Government John Locke said:  “The end of the law is, not to abolish or restrain, but to preserve and enlarge freedom.  For in all the states of created beings capable of laws, where there is no law there is no freedom.  For liberty is to be free from restraint and violence from others; which cannot be where there is no law:  and is not, as we are told, a liberty for every man to do what he lists.”  Embracing Locke’s understanding, Hayek says:  “The conception of freedom under the law that is the chief concern of this book rests on the contention that when we obey laws, in the sense of general abstract rules laid down irrespective of their application to us, we are not subject to another man’s will and are therefore free.  It is because the lawgiver does not know the particular cases to which his rules will apply, and it is because the judge who applies them has no choice in drawing the conclusions that follow from the existing body of rules and the particular facts of the case, that it can be said that laws and not men rule” (p. 153).  

Lex, Rex!  Law is King!  Locke and other English (Whig) thinkers insisted that the rule of law liberates individuals.  Still more:  a written constitution and the separation of powers guarantees their freedom.  Both distinguished the United States of America at its inception.  In his History of Freedom Lord Acton noted:  “Europe seemed incapable of becoming the home of free States.  It was from America that the plain ideas that men ought to mind their own business, and that the nation is responsible to Heaven for the acts of State . . . burst forth like a conqueror upon the world they were destined to transform, under the title of the Rights of Man.”  Unlike the English, the Revolutionary colonists acknowledged that we are “endowed by our Creator with certain unalienable rights” and inscribed their convictions on paper.  Thus no one (presidents and judges and passing congressional majorities included) can arbitrarily force free Americans to submit to governmental mandates.  No one reading America’s founding documents can avoid concluding that limiting governmental power was their objective.  “Thus the great discovery was made of which Lord Acton later said:  ‘Of all checks on democracy, federalism has been the most efficacious and the most congenial. . . .  The Federal system limits and restrains sovereign power by dividing it, and by assigning to Government only certain defined rights.  It is the only method of curbing not only the majority but the power of the whole people, and it affords the strongest basis for a second chamber, which has been found essential security for freedom in every genuine democracy’” (p. 184).  The Ninth and Tenth Amendments to the Constitution accentuated this commitment.  

These Amendments were blatantly ignored by the progressive architects of the New Freedom, New Deal, Fair Deal, and Great Society in the 20th century.  In 1933, an “extraordinary” man, FDR—equipped with an “attractive voice and limited mind”—believed he knew what the country needed.  He “conceived it as the function of democracy in times of crisis to give unlimited powers to the man it trusted, even if this meant that it thereby ‘forged new instruments of power which in some hands would be dangerous’” (p. 190).  Constitutional principles were cast aside in order to “get the country moving.”  The “liberalism” of the Founding Fathers, which called for limited government, became the coercive democratic “Liberalism” of the Democrat Party.  Largely lost in that process was the liberty won in the American Revolution.  Thus 20th century America slowly moved in the direction of the European Continent, where “absolute” governments in France and Prussia had “destroyed the traditions of liberty” (p. 193) in favor of an administratively imposed socialistic “equality.” 

This process has been facilitated by intellectual developments designed to “discredit the rule of law” and support “a revival of arbitrary government” (p. 233).  Arguing the importance of “social” or “distributive” justice, posing as defenders of the “poor” and “disadvantaged,” progressives scoffed (in the words of Anatole France) “at ‘the majestic equality of the law that forbids the rich as well as the poor to sleep under bridges, to beg in the streets and to steal bread’” (p. 235).  Influential jurists, especially in post-WWI Germany, crafted a “legal positivism” that supplanted the Natural Law (dismissed as a “metaphysical superstition”) and attributed individual rights to government rather than God.  The government can, for instance, either grant or withdraw the “right to life” from selected groups such as the unborn or disabled or unproductive.  The government can either guarantee or abolish property rights.  Bolsheviks, Fascists, and Nazis all embraced and implemented this “legal positivism.”  Less brutally, socialists in England and progressives in America promulgated the same philosophy, replacing the rule of law with majority rule.  Thus Dean Roscoe Pound, one of America’s finest legal scholars, noted the “paternalistic policies of the New Deal” and declared:  “‘Even if quite unintended, the majority are moving in the line of administrative absolutism which is a phase of the rising absolutism throughout the world’” (p. 247).  

Such absolutism cannot but characterize welfare states, leading Hayek to title part three of the book “Freedom in the Welfare State.”  Without question, social reformers throughout the West since 1848 have promoted various versions of socialism, which for a century or so generally “meant common ownership of the means of production and their ‘employment for use, not for profit’” (p. 254).  Discredited by disillusioning developments in Russia and Germany, however, reformers committed to “social justice” rephrased and redirected their agenda to establishing a “welfare state” that redistributes income.  Personal liberties must be restricted in order to promote the general welfare, and government must use “its coercive powers to insure that men are given what some expert thinks they need,” providing for their “health, employment, housing, and provision for old age” (p. 261).  Especially by monopolizing social security and medical care—highly effective means of income redistribution—welfare states become dictatorial.  

Redistribution through taxation (e.g. the graduated income tax proposed by Marx and Engels in 1848) further typifies welfare states and is “universally accepted as just” (p. 306).  Earlier thinkers, such as J.R. McCulloch, had warned:  “‘The moment you abandon the cardinal principle of exacting from all individuals the same proportion of their income or of their property, you are at sea without rudder or compass, and there is no amount of injustice and folly you may not commit’” (p. 308).  But zealous reformers saw it as a singularly effective means to achieve their vaunted “equality,” and by the dawn of the 20th century most nations had sanctioned it.  Though allegedly a way to make the wealthy bear their “fair share,” in fact it makes the “masses . . . accept a much heavier load than they would have done otherwise” (p. 311).  It effectively increases the power of the state, which is the real goal of socialists of all stripes.  

Inevitably these welfare state policies and monopolies prove inefficient and ease the slide toward bankruptcy, but that never deters the utopians promoting them.  Unfortunately, as Justice Louis Brandeis warned, dissenting in Olmstead v. United States in 1928:  “Experience should teach us to be most on our guard to protect liberty when the Government’s purposes are beneficent.  Men born to freedom are naturally alert to repel invasion of their liberty by evil-minded rulers.  The greatest dangers to liberty lurk in insidious encroachment by men of zeal, well meaning but without understanding.”  To understand Brandeis’ concern and to fully appreciate the importance of liberty as both a crucial ingredient of human nature and an ultimate good for human society, no better treatise exists than The Constitution of Liberty. 

239 “The Spirit of Munich”–Assessing Appeasement

In The Gathering Storm (volume one of Winston Churchill’s account of the Second World War) the former prime minister penned these memorable words:  “It is my purpose, as one who lived and acted in these days, to show how easily the tragedy of the Second World War could have been prevented; how the malice of the wicked was reinforced by the weakness of the virtuous; how the structure and habits of democratic States, unless they are welded into larger organisms, lack those elements of persistence and conviction which can alone give security to humble masses; how, even in matters of self-preservation, no policy is pursued for even ten or fifteen years at a time.  We shall see how the counsels of prudence and restraint may become the prime agents of mortal danger; how the middle course adopted from desires for safety and a quiet life may be found to lead direct to the bull’s-eye of disaster.”  And his martyred contemporary, Dietrich Bonhoeffer, further declared:  “Silence in the face of evil is itself evil.  God will not hold us guiltless.  Not to speak is to speak.  Not to act is to act” (pp. 17-18).  

Few historical truths are more easily documented than this:  appeasing evil is evil!  So Bruce S. Thornton’s The Wages of Appeasement:  Ancient Athens, Munich, and Obama’s America (New York:  Encounter Books, c. 2011) deserves careful reading and reflection.  A professor of classics and humanities at California State University as well as a fellow at the Hoover Institution, the author brings both erudition and acumen to his text, laying the groundwork for his discussion with an account of ancient Athens’ failure to resist the aggressions of Philip II of Macedon.  The city’s civic virtues had decayed—as Demosthenes so eloquently declared—and with the loss of courage to defend themselves came the loss of freedom so celebrated by Pericles a century earlier.  So Athens slowly slid into obscurity and irrelevance.  

Two millennia later the Athenian approach marked Prime Minister Neville Chamberlain’s 1938 attempt in Munich to forge a “peace with honor” with Adolf Hitler.  For nearly two decades, disillusioned with the consequences of WWI, the resolve of men such as Chamberlain had softened as increasing numbers of jaded intellectuals (e.g. Bertrand Russell) and cheerful clerics (e.g. the “Red” Dean of Canterbury) embraced self-abasement, disarmament, pacifism, and a naive faith in the efficacy of internationalism.  “The ‘sniggering of the intellectuals at patriotism and physical courage,’ as [George] Orwell put it of prewar English anti-patriotism, ‘the persistent effort to chip away English morale and spread a hedonistic, what-do-I-get-out-of-it attitude to life, has done nothing but harm’” (p. 279).  One of the “sniggering intellectuals” Orwell condemned was the historian G. M. Trevelyan, who said, “‘Dictatorship and democracy must live side by side in peace, or civilization is doomed’” (p. 107).  George Lansbury, a Labour Party leader, actually admitted he would “‘close every recruiting station, disband the Army and disarm the Air Force.  I would abolish the whole dreadful equipment of war and say to the world “do your worst”’” (p. 94).  And the world did its worst in short order.  As Mussolini and Hitler flexed their muscles—invading Ethiopia and the Rhineland—few men of courage opposed them.  Sufficiently emboldened, Hitler pursued his designs, fearing little resistance from England and France.  “‘I saw them at Munich,’ he said.  ‘They are little worms’” (p. 118).  WWII, with all its horrors, inexorably followed.

“The spirit of Munich,” said Alexander Solzhenitsyn when accepting his Nobel Prize, “has by no means retreated into the past; it was not a brief episode.  I even venture to say that the spirit of Munich is dominant in the twentieth century.  The intimidated civilized world has found nothing to oppose the onslaught of suddenly resurgent fang-baring barbarism, except concessions and smiles.  The spirit of Munich is a disease of the will of prosperous people; it is the daily state of those who have given themselves over to a craving for prosperity in every way, to material well-being as the chief goal of life on earth.  Such people—and there are many of them in the world today—choose passivity and retreat, anything if only the life to which they are accustomed might go on, anything so as not to have to cross over to rough terrain today, because tomorrow, see, everything will be all right.  (But it never will!  The reckoning for cowardice will only be more cruel.  Courage and the power to overcome will be ours only when we dare to make sacrifices)” (pp. 24-25).    

To Solzhenitsyn, America’s retreat from Southeast Asia, abandoning Vietnam just as victory was imminent, revealed the collapse of courage most visible in this nation’s intellectual and political elite.  The spirit of Munich, Thornton says, spread throughout the burgeoning anti-war and anti-American community in the ‘60s and ‘70s, seriously compromising our intelligence agencies as well as demoralizing our armed forces.  It saturated the Carter Administration, whose “appeasing response to the Iranian crisis” in 1979 opened the gates to Islamic Jihad around the world.  The architect of the Iranian revolution, the Ayatollah Khomeini, had set forth his objective in 1942:  “‘Those who study jihad will understand why Islam wants to conquer the whole world.  All the countries conquered by Islam or to be conquered in the future will be marked for everlasting salvation’” (p. 166).  He pulled no punches, declaring:  “‘Those who know nothing of Islam pretend that Islam counsels against war.  Those [who say this] are witless.  Islam says:  Kill all the unbelievers just as they would kill you!’” (p. 166).  The Koran and the sword are welded together—the book guides the faithful; the sword slays the infidels.  When the Iranians took hostage 66 Americans and seized our Embassy, President Carter equivocated and pleaded, sending a “groveling” letter assuring Khomeini of his commitment to “good relations ‘based upon equality, mutual respect and friendship’” (p. 168).  Though the hostages were released when Ronald Reagan was elected, our world forever changed as radical Muslims embraced jihad.  Despite a series of attacks on American installations, neither Reagan nor Bill Clinton responded decisively, confirming Osama bin Laden’s estimation that U.S. power was “‘built on foundations of straw’” (p. 190).  Clinton, who regularly “wilted” when faced with politically risky decisions, responded to al Qaeda’s attack on the U.S.S. Cole by ordering “all U.S. Navy vessels to head for the safety of open waters and to avoid using the Suez Canal” (p. 197).  

Then came September 11, 2001!  A very different president, George W. Bush, responded quite differently, committing American troops to wars in Afghanistan and Iraq.  But despite initial support, as the wars dragged on President Bush had to deal with hordes of increasingly shrill critics—mainly Democrat luminaries such as Al Gore, Barbara Boxer, Jimmy Carter and Howard Dean, who in 2004 commandeered anti-war throngs by spouting “Marxist clichés about ‘imperialism’ and ‘colonialism’ and the evils of capitalism” (p. 203).  Joining the anti-war brigade, Barack Obama “campaigned on a foreign policy predicated on moralizing internationalism, a preference for diplomacy and transnational institutions, a focus on human rights and foreign development, and the assumption that the United States was flawed and in need of some humility after the reckless aggression and oppressive practices of the Bush administration” (p. 214), signaling “a return to the Carter philosophy that had helped put in power an Islamist regime in Iran and ignited a Soviet global expansion” around the world (p. 216).  “Thus we rationalize away the jihadists’ careful justifications for their violence in the theology of Islam and seek to ameliorate what we think are the true causes—poverty, lack of political participation, or historical injustices—rather than realizing that those who believe they are divinely sanctioned to kill others will not be talked or bribed out of their beliefs, but can only be destroyed” (p. 280).  

So there is every reason to acknowledge that Obama, committed as he is to an “outreach” to Muslims, personifies the “spirit of Munich.”  He appeases the Islamists just as Chamberlain appeased the Nazis, even insisting that administrative officials—and journalists—sanitize their language in order to falsely portray Muslims (at home and abroad) as “peace-loving” and jihad as “spiritual improvement rather than violence against the enemies of Islam” (p. 255).  In his repeated laments for the sins of the West, he simply ignores the historical evidence that “over the centuries Muslims have conquered, killed, ravaged, plundered, and enslaved Christians and occupied their lands in vastly greater numbers than all the dead resulting from European colonial incursions or America’s recent wars in Muslim lands put together” (p. 263).  Facing such a hostile world, Thornton insists, America must respond in ways atypical of the democracies led by men such as Chamberlain and Obama, which almost always make short-sighted, self-serving, emotionally-based decisions.  

* * * * * * * * * * * * * * 

Markedly different from the spirit of Munich so evident in today’s progressive intelligentsia was the “Iron Lady” Margaret Thatcher!  As Claire Berlinski makes clear in There is No Alternative:  Why Margaret Thatcher Matters (New York:  Basic Books, c. 2011), appeasement was not in her genes!  Rather, she demonstrated the kind of courage that only comes from deeply-held convictions.  For her, these were established in her early years as she attended a Methodist church.  In contrast to those unprincipled politicians who forever float with the winds of popular opinion, Lady Thatcher refused to bend when principles such as freedom and justice were at stake.  To her there were never two sides to an issue—there was only one, the right side!  In a remarkable statement, responding to those who urged compromise and “consensus,” she declared:  “To me consensus seems to be:  the process of abandoning all beliefs, principles, values and policies in search of something in which no one believes, but to which no one objects; the process of avoiding the very issues that have to be solved, merely because you cannot get agreement on the way ahead.  What great cause would have been fought and won under the banner ‘I stand for consensus’?” (Downing Street Years, p. 167).

In particular, she “was one of the most vigorous, determined, and successful enemies of socialism the world has known” (p. 5).  Living in a “weakly socialist nation” slowly shaped over the decades by the Labour Party following Fabian strategies, she smelled Britain’s festering decadence and refused to sanction the “basic immorality” of socialism, believing that “socialism itself—in all its incarnations, wherever and however it was applied—was morally corrupting.  Socialism turned good citizens into bad ones; it turned strong nations into weak ones; it promoted vice and discouraged virtue; and even when it did not lead directly to the Gulags, it transformed formerly hardworking and self-reliant men and women into whining, weak and flabby loafers.   Socialism was not a fine idea that had been misapplied; it was an inherently wicked idea” (pp. 7-8).  

She emerged as a political powerhouse with a speech to the Conservative Party in 1976, calling for a return to free enterprise economics, a repudiation of the Labour Party which was then “‘committed to a program which is frankly and unashamedly Marxist,’” a shift which would bring about “‘the rebirth of a nation, our nation—the British nation’” (p. 68).  She urged her party to launch a crusade by appealing “‘to all those men and women of goodwill who do not want a Marxist future for themselves or their children or their children’s children.  This is not just a fight about national solvency.  It is a fight about the very foundations of the social order.  It is a crusade not merely to put a temporary brake on socialism, but to stop its onward march once and for all’” (p. 69).  Citing Shakespeare’s rendition of King Henry V’s words before the pivotal Battle of Agincourt, she concluded:  “‘As was said before another famous battle:  “It is true that we are in great danger; the greater therefore should our courage be”’” (p. 69).  

And the courage of King Henry was needed when the Iron Lady came to power in 1979, for Britain was widely regarded as the Sick Man of Europe (a “disgrace,” in the opinion of Henry Kissinger), “sunk to begging, borrowing, stealing” (p. 10).  Little remained of the nation that a century earlier had proudly orchestrated the Pax Britannica, ruling one fourth of the world.  “It was the world’s undisputed premier naval power; it controlled the world’s raw materials and markets; it had long been the world’s leading scientific and intellectual power; it was the financial center of the world and the premier merchant carrier; it had invented the Common Law; it had invented modern parliamentary democracy” (p. 9).  Only faded remnants of such glory days remained.  To Thatcher, Britain’s decline was “a punishment for the sin of socialism” installed by her countrymen in 1945 when they elected Clement Attlee prime minister and established an enveloping welfare state.  Having defeated the Nazis in war, they surrendered to the socialists in peace!  She thought that “in 1945 the good and gifted men and women of Britain had chosen a wicked path.  They had ceased to be great because they had ceased to be virtuous.  In ridding Britain of socialism, she intimated, she would restore it to virtue.  She would make it once again worthy of greatness” (p. 13).  Consequently:  “Hatred of communism, hatred of Marxism, hatred of socialism—and an unflinching willingness to express that hatred in the clearest imaginable terms—was the core of Thatcherism” (p. 47).  

“Thatcherism,” Berlinski says, was rooted in the religiously devout, industrious, middle-class rearing that nurtured Margaret Thatcher.  Her father was a Wesleyan lay preacher whose influence and convictions shaped her.  As a child she began speaking publicly by reading passages in church, sharing in her father’s ministry and resolving to follow his example, frequently citing the Scriptures in her political pronouncements.  Accordingly:  “She did what was right, she did what was right, she did what was right.  She did it because her father told her to” (p. 21).  Her education (a chemistry degree from Oxford, supplemented by a law degree earned while working as a research chemist) equipped her.  Her marriage, to a prosperous businessman, Denis Thatcher, sustained her.  But no external factors fully explain her!  She was, quite simply, a remarkable woman with sharply-honed political skills who guided Britain through her “longest sustained period” of “economic expansion of the postwar era” (p. xix).  Using extensive interviews with her allies and enemies, Berlinski enables us to better appreciate Thatcher’s genius and success.  

Early on, as Prime Minister, she refused to back down to Argentina and successfully waged a 1982 war to maintain British control of the Falkland Islands.  “Without this victory, it is unlikely that the Thatcher Revolution could have occurred” (p. 158).  Standing firm and winning the war greatly enhanced the prestige of both Thatcher and her country, building the popular support she needed to win “a massive victory in the 1983 general election” (p. 179) and subsequently to deal with domestic issues.  “‘We had to fight the enemy without in the Falklands,’ she said.  ‘We always have to be aware of the enemy within, which is much more difficult to fight and more dangerous to liberty’” (p. 183).  The foremost “enemy within,” by all odds, were the thoroughly Marxist trade unions, effectively strangling the British economy.  So Thatcher defied and denatured them, despite a long and bloody coal miners’ strike.  She did so by “stockpiling coal, training the military to drive trains in the event of a sympathy strike by the railway workers, accelerating the development of nuclear power, importing electricity by cable from France, and refurbishing coal-fired power stations to permit them to run on oil” (p. 208).  

Thatcher’s opposition to Marxism included a deep hostility to the USSR, which, when she became Britain’s prime minister in 1979, “appeared to be not only invincible, but ascendant” (p. 270).  In her 1976 speech to her party, she boldly announced her convictions, saying:  “‘I stand before you tonight in my Red Star chiffon evening gown, my face softly made up and my fair hair gently waved, the Iron Lady of the Western World.  A Cold War warrior, an Amazon philistine, even a Peking plotter.  Am I any of these things?  Well, yes, if that’s how they wish to interpret my defense of values and freedoms fundamental to our way of life’” (p. 263).  Undeterred by her critics, impervious to their poison darts, she gladly joined Ronald Reagan in doing whatever possible to throttle Communism wherever it appeared in the 1980s.  “Publicly, Thatcher—and only Thatcher, among the leaders of the world—supported Reagan unwaveringly, despite massive domestic and international pressure to do otherwise” (p. 273).  By doing so, she earned Reagan’s enduring gratitude and friendship.  Together they successfully dealt with a new kind of Communist, Mikhail Gorbachev, whom they both liked, and in short order the Soviet Union collapsed.  

Reviewing the book, Peter Schweizer, the author of Reagan’s War, concluded:  “Finally the Iron Lady gets her due.  Claire Berlinski brilliantly lays out how Margaret Thatcher’s strength and conviction changed the world.  Without a Prime Minister Thatcher there might not have been a President Ronald Reagan.  And Berlinski reminds us how the whole world would benefit from a new Thatcher today.”  

* * * * * * * * * * * * * * * * * * * 

In Obama’s Globe:  A President’s Abandonment of U.S. Allies Around the World (New York:  Beaufort Books, c. 2012), Bruce Herschensohn places the nation’s foreign policy within the context of the past half-century, showing how the Carter and Obama administrations failed to rightly understand and handle challenges abroad.  Unlike FDR and JFK, Ronald Reagan and George W. Bush, all of whom boldly pressed for victory over our foes, presidents Carter and Obama tried to appease the nation’s enemies and in the process deserted our friends.  “Carter’s abandonment of El Salvador and Nicaragua,” for example, “ended with a fall of those two governments that cost over 70,000 deaths of Central Americans fighting against Soviet proxies who had taken advantage of the opportunity given to them” (p. 27).  He lacked what JFK called “the stuff of presidents,” envisioning himself as a purveyor of “human rights,” a peace-maker rather than a commander-in-chief.  

Following the Carter approach, President Obama annulled agreements with Poland and the Czech Republic in order to curry favor with Russia.  Distancing himself from Britain, Obama (through his Secretary of State Hillary Clinton) proclaimed America’s neutrality regarding the on-going dispute between Argentina and the United Kingdom over the Falkland Islands.  In Tunisia and Egypt, Obama shunned long-term allies, opening the gates for Islamists (notably the Muslim Brotherhood) to take control of Arab nations.  To Herschensohn:  “The celebration of Mubarak’s fall was much more reminiscent of 1979’s fall of the Shah of Iran and the welcome of the Ayatollah Khomeini” (p. 60).  And in Iran, when millions took to the streets demanding liberty from the oppression imposed by Khomeini and his heirs, Obama refused to give even verbal support!  So too in Syria—Obama has feebly protested the genocidal regime of Bashar al-Assad but done nothing of substance to overthrow him.  His policy—termed “leading from behind” by a White House advisor—illustrates timidity and constant concern for his own political position.  Obama’s treatment of Israel further illustrates his abandonment of American allies around the globe.  By consistently referring to the West Bank and Gaza Strip as land “occupied by Israel,” and by announcing, on May 19, 2011, his desire to return to the 1967 boundaries between Israel and Palestine (a non-existent state), making “one of the worst, if not the very worst statement made by any U.S. President regarding a friendly nation that won a war” (p. 89), the president has egregiously distanced himself from our only ally in the Middle East.  

Wherever Herschensohn looks (and he deals with many more areas than I’ve indicated), President Obama seems committed to appeasing our enemies and abandoning our allies, as if the world would be safer without American leadership.