268 Flight from the Absolute

Philosophers and educators routinely refer to the importance of one's Weltanschauung—the philosophy of life or worldview that provides meaning for life.  To be fully human means addressing the "big questions," wrestling with "ultimate concerns," finding a unifying belief system.  (Such was the argument of Viktor Frankl's classic Man's Search for Meaning.)  Though often assumed rather than embraced and frequently absorbed from dubious "authorities" or misperceptions, one's worldview significantly directs (and at times dictates) much that orders one's days.  Thus Paul Gosselin's two-volume set—Flight from the Absolute:  Cynical Observations on the Postmodern West (Samizdat, V. I, c. 2006, 2013; V. II, c. 2014)—provides much to ponder when evaluating current intellectual currents.  While he makes no personal professions, his frequent citations of C.S. Lewis and other Christian scholars indicate his ties to the Christian tradition, and his critique of naturalistic evolution carries with it a defense of Intelligent Design or Creationism (though not necessarily of the young earth variety).

Written originally in French by a Canadian scholar with a strong interest in anthropology, literature, music, science, popular culture and philosophy—who apparently works independently rather than within a university—these two volumes are more a series of tantalizing explorations (laced with interesting insights and quotations, reflecting considerable learning) than a systematic treatise.  Thus there are diversions and digressions, repetitions and ruminations that might have been screened out by careful editorial work.  Even the tables of contents (e.g. "Vivisecting the Patient," "The Phantom Creed," "Cannibals") reveal the impressionistic rather than systematic nature of the books.  But if read patiently—and with serious attention to the lengthy quotations and extensive footnotes—much can be learned from them.

Gosselin particularly stresses the significant changes, during the past century, wrought by the loss of Christianity's intellectual clout.  A materialistic secularism has come to dominate schools and media in the West.  This worldview emerged in the Enlightenment and (though kept at bay for a century or more) has gained control of the modern mind.  Portentously, several European countries recently refused even to recognize Christianity's historical role in shaping Europe!  More particularly, the materialist cosmology decreed by scientists now provides the underlying structure for almost every intellectual pursuit.  Though rarely acknowledged as such, this materialist worldview (best understood as an "ideology" of some sort) is as fully religious or mythical as the Christian faith it displaced.  (Importantly, "myth" to Gosselin means a story which may very well be absolutely true.)  Anthropologists generally define "religion" as the effort to answer questions regarding:  1) origins ("where do we come from?"); 2) anthropology ("who am I?"); 3) law ("why obey certain authorities, whether human or divine?"); and 4) purposes ("why live?"  "what are man's proper ends?").  Once the magisterial authority of Science was established and revered as the source of truth, a series of 18th and 19th century thinkers, culminating in Charles Darwin, discarded the Judeo-Christian cosmology which had shaped the West.

Consequently, by assuming and insisting that evolution is true, thinkers such as Jacques Monod and Richard Dawkins confidently reduce human beings to sophisticated animals interacting with an ever-changing physical world.  "Man is, at most, a garbage bag-full of pompous molecules interacting in a universe totally indifferent to his existence" (V. I, Kindle #302).  As one of the characters in Ray Bradbury's The Martian Chronicles said:  "'That's the mistake we made when Darwin showed up.  We embraced him and Huxley and Freud, all smiles.  And then we discovered that Darwin and our religions didn't mix.  Or at least we didn't think they did.  We were fools.  We tried to budge Darwin and Huxley and Freud.  They wouldn't move very well.  So, like idiots, we tried knocking down religion.  We succeeded pretty well.  We lost our faith and went around wondering what life was for.  If art was no more than a frustrated outfling of desire, if religion was no more than self-delusion, what good was life?  Faith had always given us answers to all things.  But it all went down the drain with Freud and Darwin.  We were and still are a lost people'" (#314).  So we find distinguished scientists, such as Steven Weinberg, declaring:  "'The more the universe seems comprehensible, the more it seems pointless'" (#1325).  To which Alfred North Whitehead's wise observation applies:  "'Scientists animated by the purpose of proving that they are purposeless constitute an interesting subject for study'" (#4898).

Then quite quickly, at the close of the 20th century, a repudiation of many key tenets of the Enlightenment project sallied forth under the banner of Postmodernism, which may be best understood as yet another religious endeavor (or, perhaps more correctly, a variety of religious expressions) most evident in popular culture and public education.  Turning away from collective social structures, Postmoderns focus on individuals "and ask:  'What's in it for me?'  Political parties have become self-serve institutions.  Postmodern religion is custom-fitted to the client's preferences" (#1363).  "The twentieth century buried the grand collective political projects.  All that remains is the individual and his sexual, artistic, ideological and professional impulses and ambitions.  His salvation is found in self-fulfillment.  Anything that constrains the individual finds itself opposed to the postmodern perspective.  This is the perfect worldview for eternal teenagers" (#2522).  But despite the clear differences between Postmodernism and Modernism one thing remains constant:  a dogmatic allegiance to Darwinian Evolution.  Postmoderns regularly "deconstruct" ethical and aesthetic standards, patriarchy and colonialism; they proudly dismiss all "truths" as mere opinions and weigh in regularly against all forms of "intolerance"; they reject all prescribed principles or traditional authorities; they consider male/female sexual distinctions "out-dated" and celebrate self-fulfilling same-sex unions; but they remain totally committed to "the West's own dominant metanarrative, the theory of evolution, as it constitutes the logical basis for postmodern relativism" (#2446).

Thus Gosselin regularly returns to his central theme:  evolution through natural selection lacks bona fide scientific standing and, as a worldview, has gravely harmed the world.  Rightly defined, "science deals with observable and reproducible processes.  The rest is outside the domain of science (or should be)" (#1524).  When dealing with past events (and especially the origins of the universe or life or human consciousness) we "have left the field of empirical science and have begun to navigate the wonderful world of myth and cosmology" (#1533).  Without empirical evidence, we have "little more than nice 'scientific' stories framed in the context of the dominant materialistic origins myth.  This is the best we can expect" (#1543).  Though Darwinists are determined to maintain their status as "scientists" and insist evolution through natural selection is a "fact" rather than a theory, they do so only by denying the proper constraints of their discipline.

Even more egregiously, they often don the mantles of religious prophets or wise men, as is evident in the works of Carl Sagan, Jacques Monod, E.O. Wilson, and Richard Dawkins, who provide reasons and recipes—worldviews and philosophies of life—allegedly rooted in their scientific knowledge.  Cornell University biology professor William Provine recently summed up the stark components of this evolutionary view:  "'Let me summarize my views on what modern evolutionary biology tells us loud and clear—and these are basically Darwin's views.  There are no gods, no purposes, and no goal directed forces of any kind.  There is no life after death.  When I die, I am absolutely certain that I am going to be dead.  That's the end of me.  There is no ultimate foundation for ethics, no ultimate meaning in life, and no free will for humans, either.  What an unintelligible idea'" (#5166).  Nota bene:  none of Provine's assertions are empirically evident—all are barefaced claims of a biologist pontificating on philosophical ideas.

Yet such efforts have generated a host of problems for their devotees, in part because "those who accept the materialist worldview must also accept that all their cultural and intellectual production is just as rigidly predetermined as the trajectory of a ball falling from the tower of Pisa.  And if that is the case, why should such works be considered significant or taken seriously?" (#1918).  Necessarily, materialists such as Professor Provine must insist that everything comes into being as a result of natural causes and that there is no such thing as free will (whereby man, particularly insofar as he thinks, stands apart from the purely material chain of events).  As C.S. Lewis pointed out quite clearly decades ago:  "'We do not need . . . to refute naturalism.  It refutes itself'" (#1928).  In more detail, Lewis further explained (in "They Asked for a Paper," a powerful passage I wish I had successfully instilled in every one of my students):  "'Long before I believed Theology to be true I had already decided that the popular scientific picture at any rate was false.  One absolutely central inconsistency ruins it; . . . .  The whole picture professes to depend on inferences from observed facts.  Unless inference is valid, the whole picture disappears.  Unless we can be sure that reality in the remotest nebula or the remotest past obeys the thought-laws of the human scientist here and now in his laboratory—in other words, unless Reason is an absolute—all is in ruins.  Yet those who ask me to believe this world picture also ask me to believe that Reason is simply the unforeseen and unintended by-product of mindless matter at one stage of its endless and aimless becoming.  Here is flat contradiction.  They ask me at the same moment to accept a conclusion and to discredit the only testimony on which that conclusion can be based.  The difficulty is to me a fatal one; and the fact that when you put it to many scientists, far from having an answer, they seem not even to understand what the difficulty is, assures me that I have not found a mare's nest but detected a radical disease in their whole mode of thought from the very beginning.  The man who has once understood the situation is compelled henceforth to regard the scientific cosmology as being, in principle, a myth; though no doubt a great many true particulars have been worked into it'" (#7956).

Equally impossible is the discovery of any moral code since "Evolutionary cosmology tells modern man:  'You are the culmination of processes that have taken place for billions of years.  Chance is your Father.  Chaos is your mother.  You are alone in the universe.  Your destiny is to establish order as you see fit'" (#3958).  Moral values cannot be scientifically proven, so following one's feelings (emotivism) is the only rationale for behavior.  If it feels good, do it!  Most materialistic secularists evade the nihilistic moral message embedded in their worldview, but the Marquis de Sade saw it clearly.  "'What is man and what difference is there between him and other plants, between him and all the other animals of the world?  None, obviously'" (#4009).  So it logically follows that killing a man is no worse than killing any other animal!  Whatever pleases you, whatever gives you pleasure, is allowed.  De Sade himself found pleasure in abusing women since men, the stronger sex, have the natural right to "'indiscriminately express our wishes to all women, . . . to compel their submission'" and force any available woman "'to yield to the flames of him who would have her; violence itself being one of that right's effects, he can employ it lawfully.  Indeed!  Has Nature not proven that we have the right by bestowing upon us the strength needed to bend women to our will?'" (#5001).

Though more mild-mannered and restrained in his rhetoric, Charles Darwin said much the same, lauding the triumph of Caucasians over "lower races" around the world, eliminating them in the "struggle for existence" (#4021).  We must never forget that the subtitle to Darwin's On the Origin of Species reads:  By Means of Natural Selection, or the Preservation of Favoured Races in the Struggle for Life.  A contemporary of Darwin, Dostoyevsky discerned the "difficulties involved with developing ethics in the context of a materialistic cosmology" (#4387), for empirical science inevitably proposes solutions "based on brute force" (e.g. the survival of the fittest).  So within decades of the publication of On the Origin of Species (1859) an elite corps of scholars had proposed ways to purify the race (eugenics) and justify ruthless behaviors (Social Darwinism).  Particularly in Germany, distinguished evolutionary scientists were proposing, by the turn of the century, ways to eliminate criminals and handicapped persons.  The path from Darwin's devotees (such as Ernst Haeckel) to the Nazis is well-marked and inescapable.

Darwinists such as Haeckel and Adolf Hitler denied any meaningful difference between human beings and other animals.  "Hitler was not crazy, a 'deranged' individual, but was rather the logical progeny of a dysfunctional civilization developed on the basis of a flawed cosmology" (#7327).  He simply worked out his most compelling conviction, recorded in his Tischgespräche (Table Talk):  "If I can accept a divine commandment, it's this one:  'Thou shalt preserve the species.'"  Haeckel and Hitler discarded the traditional Christian beliefs that man is a spiritual being, made in the image of God, possessing an immortal soul, free to embrace or reject personal responsibilities, to choose between good and evil.  "These men represented the theoretical, logical culmination of mankind's humanist rebellion against God.  They declared 'our innate moral consciousness' to be self-deception, noxious illusion, fiction—as demanded by a rationally ordered consciousness.  This century's totalitarianism, trampling the human personality and all its rights, rhinoceros-like, underfoot, is only the application of this theory to life, or humanism put into practice" (#6511).

From a strictly materialistic perspective, ethical and moral nihilism (denying there are any values) inevitably results.  As he does recurrently, Gosselin cites C.S. Lewis, who wrote, in The Abolition of Man:  "We have been trying, like Lear, to have it both ways; to lay down our human prerogative and yet at the same time to retain it.  It is impossible.  Either we are rational spirit obliged for ever to obey the absolute values of the Tao, or else we are mere nature to be kneaded and cut into new shapes for the pleasures of masters who must, by hypothesis, have no motive but their own 'natural impulses.'  Only the Tao provides a common human law of action which can over-reach rulers and ruled alike.  A dogmatic belief in objective value is necessary to the very idea of a rule which is not tyranny or an obedience which is not slavery."

Absent such an Absolute—the Natural Law or what Lewis calls the Tao—"ultimately the only real moral absolute left is mere survival" (V. II, #541).  Disciples of Darwinism, such as Stephen Jay Gould, try to evade the ethical implications of their cosmology—Gould simply posited two utterly separate realms of reality, the rough-hewn materialistic world revealed to scientists and the tender-hearted ethical world indwelt by kindly biologists such as himself who support the basic moral standards of Western Christian Culture.  But other biologists and philosophers (notably Peter Singer) are not quite so tender-hearted and scour for evidence of purely material bases for ethical instincts.  Animals, including man, they say, have an altruistic urge (as well as a sexual urge) simply because it enables them to survive.  They animalize humans and humanize animals—energetically defending animal rights (PETA spokesmen) or denying human dignity (pro-abortion or active euthanasia activists) as the moment requires, without pausing to consider the incoherence of their position.

Though deeply problematic as a basis for ethics, the theory of evolution enjoys an exalted status within the 21st century intelligentsia.  Both politicians and professors soon discover that opposing it can easily cost one his career.  Yet unlike the theory of gravity, which can be routinely demonstrated through observations and experiments, the theory of evolution, when applied to critical events such as the origin of life, requires considerable faith and imagination.  Properly understood, "science" deals only with the observable natural world.  Events in the distant past, whether the "Big Bang" or the origin of life on planet Earth, can never be observed or tested in a laboratory.  Incoherently, evolutionary theorists insist the process they revere is both unobservable and empirical!  It's like saying something is both invisible and visible!  And, indeed, lauded Nobel Prize winners and Harvard professors such as George Wald baldly declare "'that the spontaneous generation of a living organism is impossible.  Yet, here we are—as a result, I believe, of spontaneous generation'" (#1923).  Faith in spontaneous generation is endorsed while faith in any Designing Intelligence is rejected as unacceptable!

Interestingly enough, the evolutionary myth also invokes the "Peter Pan Effect."  Asked how he could fly so easily, Peter Pan declared:  "You just think wonderful thoughts and they lift you up in the air."  When the fossil record remains filled with "gaps," you simply invoke "Chance" (the materialistic deus ex machina that takes on the role of the theistic God who intervenes and intelligently fills in the gaps).  Or you insist that in time Science will explain everything (precisely as do theists when relying on an Omniscient Mind to unveil creation's inexplicable mysteries).  Acknowledging "the extreme rarity of transitional forms in the fossil record" as the "trade secret of paleontology," a discipline relying on data "so bad that we never see the very process we profess to study," Stephen Jay Gould momentarily wondered whether an indiscernible "punctuated equilibrium" process might explain things better than orthodox Darwinism.  And so it goes—endless "scientific" proposals designed to finally explain why Evolution is the "theory of everything" we long for.

Surely, Gosselin insists, there's a better way to conceptualize our cosmos.  Challenging and departing from the entrenched evolutionary paradigm demands courage and a willingness to suffer for the truth—but such has ever been the lot of dissidents and pathfinders.  Thus a distinguished American geneticist, Richard Sternberg, was demoted by the Smithsonian's National Museum of Natural History because he published an article questioning some evolutionary dogma in a journal he edited.  "Never forget," Malcolm Muggeridge said, "that only dead fish swim with the stream."  What's needed are thinkers such as George Orwell, whose prophetic 1984 warned against the "Newspeak" and "Crimestop" designed by "Big Brother" to stifle freedom of thought and speech.

Now there’s ample reason to begin seriously questioning the claims of naturalistic evolution if we begin to think rigorously and understand all the available evidence.  Much Darwinism is obviously ideological rather than scientific, mythical rather than empirical.  Many (if not most) scientists simply accept the theory of evolution because it’s what they’re taught rather than seriously thinking through its presuppositions and ramifications.   But in truth they are enraptured by a modern origins myth—providing the story primeval history that provides a raison d’être for all that is, telling them why we’re here, who we are, where we’re headed.   When compared with various other origins myths—carefully chronicled by astute anthropologists cited by Gosselin—Darwinism (or Evolution, personified and portrayed as an Agent working its will everywhere) certainly seems to be more mythical than scientific.  That’s because it:  “necessarily involves events in the past; . . . involves a story, a narrative; . . . provides modern man with an answer to the ‘why’ question; . . . deals with metamorphosis theme; . . . providing meaning for society and its activities at diverse levels” (#3209).  But just maybe there’s a better myth!  Just maybe the ancient Judeo-Christian cosmology is really true—and should re reestablished as our culture’s guiding light.

# # #

267 Love Is What We Need

For its first half-century the Church of the Nazarene's theology was significantly shaped and masterfully explained by H. Orton Wiley, a close friend and associate of the denomination's founder, Phineas F. Bresee.  Wiley's three-volume Systematic Theology was published in the early '40s and defined the church's teaching.  He relied on 19th century Methodist theologians, as well as the holiness prescriptions of Phoebe Palmer, to set forth the church's "cardinal doctrine," the call to "Christian perfection."  Wiley frequently cited Methodists who endorsed Palmer's insistence on a "crisis experience" wherein believers consecrate themselves completely to God, "place their all upon the altar," and take God at His Word by believing that the "altar sanctifies the gift."  Wiley especially emphasized the instantaneous nature of the "second work of grace" and tried to make sure that Nazarenes (unlike 19th century Methodists) would tenaciously proclaim it.  In 1928 he helped draft and fully endorsed the article on entire sanctification set forth in the "Articles of Faith," specifying it to be an "act of God subsequent to regeneration, by which believers are made free from original sin, or depravity, and brought into a state of entire devotement to God, and the holy obedience of love made perfect."  This language still stands in the church's most recent Manual.

Wiley died in 1961, and within a decade younger Nazarene theologians began to overhaul the church's "cardinal doctrine."  In 1973 Mildred Bangs Wynkoop's treatise, A Theology of Love, marked a turning point in the denomination's history, if not in its official declarations.  Nazarene Theological Seminary professor Paul Orjala labeled it "one of the most important books ever published" by the denomination, tagging it "the first modern theology of holiness."  Wynkoop emphasized the "credibility gap" between what preachers said and people experienced and demanded a comprehensive "restructuring of the conceptual framework within which holiness theologians had worked."  Quietly turning aside from Wiley and the American holiness tradition, she cited John Wesley to craft a "Wesleyan hermeneutic" with a different definition of human nature and sin.  A person is not, she suggested, by nature sinful, so sin is not a thing to be removed.  Rather, sin results from a fractured, dysfunctional relationship with God.  Restoring that relationship, therefore, solves the sin problem.  Nothing essential within one's soul is changed, bringing about a "state of grace," but a healthy relationship with God develops.  Holiness is interpersonal love—nothing more, nothing less.  So she minimized the need for a second, instantaneous work of grace.

Wynkoop’s position clearly informs Relational Holiness:  Responding to the Call of Love, (Kansas City:  Beacon Hill Press of Kansas City, c. 2005) by Thomas Jay Oord and Michael Lodahl, two of the denomination’s best and brightest theologians—who are winsome both in person and as authors.  Though written for the general public rather than the academy, it is endorsed by some of the church’s most distinguished theologians (H. Ray Dunning) and influential leaders (Charles Zink; Ron Benefiel; William Greathouse).  It thus may be taken to represent the current position of the Church of the Nazarene.  

Oord and Lodahl assert the church, in its presentation of holiness, faces more than the “credibility gap” noted by Wynkoop 40 years ago.  Indeed, unless it is explained in ways plausible to 21st century worldviews, the “doctrine” will simply vanish as an artifact of an ancient religious subculture.  To recast the doctrine in relational terms, however, will suit our “postmodern” world, with its sensitivity to environmental realities and to “individuals-in-relation” or “community-created-persons.”  Within this postmodern consciousness, God may be seen as the One who “acts as an ever-present, divine influence—a necessary cause—in everyone’s relational environment.  Just as people affect others through relations, God as the Maker and Sustainer of all things also affects all things, all people, all the time, everywhere.  There is no environment in which God is not related to others as a present, active, and loving agent” (Kindle #332).  God too “is open to and affected by others, because the Creator and the creatures enjoy mutual relations” (#359).  Interacting with the God Who Is Love, we engage in loving relationships with Him and His world, and consequently are, moment-by-moment, more-or-less holy. 

Amidst the variety of views regarding holiness set forth both in scripture and theological traditions, Oord and Lodahl seek to identify one absolutely essential "core" position.  While various alternatives have value, only love can be the "core" of holiness.  In the authors' words:  "To love is to act intentionally, in response to God and others, to promote well-being.  To say the same thing in other words, to love is to respond to the inspiration of others—especially God—and by that response effect genuine flourishing" (#862).  Embracing this understanding, life can become an adventure, following Jesus as Guide, responding rightly to the challenges and opportunities we encounter in life's journey.  Doing so enables us to actively participate in the loving fellowship of the Father, Son and Spirit—the Divine Trinity.  "The Spirit is the Breath in whose life and presence we actually share in the mutual life and love of the Father and Son."  Still more:  "Perhaps it is not too much to suggest that because the Father and the Son 'make room' for us in their common life of the Spirit, our common life together really makes a difference in God's own life and experience of the world" (#1149).  By "our common life together" they mean mainly the "visible, touchable, experiential" activities that ought to characterize "any and every congregation" (#1360).  Working within a loving relationship with God we rightly interact with others.  "God's love, then, is perceptible to our senses:  visible, touchable—or at least ought to be—in church communions where the word which you have heard from the beginning is heard and acted upon faithfully, boldly, and bodily" (#1366).

In the book’s final chapter, Oord and Lodahl call us to be “dancers, not dinosaurs.”  All of the Bible’s definitions of holiness—e.g. following the commandments, being pure, committed, set apart, Christlike and perfect—can be subsumed to and expressed by the “core” value of love.  Led by the “Master Dancer,” Christ Jesus, we can dance beautifully as long as we keep step with Him.  And so, they conclude:  “Let the dance begin!” (#1649).  

* * * * * * * * * * * * * * * * * * * 

Thomas J. Oord, a professor at Northwest Nazarene University, has long pondered and written about the importance of love.  In The Nature of Love:  A Theology (St. Louis:  Chalice Press, c. 2010)—a book dedicated to three Nazarene theologians (H. Ray Dunning, William Greathouse, and Mildred Wynkoop)—he explores the ultimate, greatest virtue of the Christian life.   With John Wesley, he thinks “‘Love is the end of the commandments of God.  Love is the end, the sole end, of every dispensation of God, from the beginning of the world to the consummation of all things’” (#204).  In light of the fact that the Bible repeatedly celebrates the importance of love and the best of Christian theologians for 20 centuries have stressed its import, it should certainly “be the orienting concern and continual focus for speaking systematically about theology.  We should discard ideas or theories that undermine love” (Kindle #191).  

Unfortunately, secondary concerns have often distracted Christians from their main message.  Hugely influential 20th century thinkers such as Paul Tillich and Karl Barth failed to give love its due.  Thinkers such as Martin Luther five centuries ago (and R.C. Sproul today) elevated faith alone to pride of place.  Calvin in the 16th century (and Millard Erickson in the 20th) developed an intensely logical system celebrating the sovereignty of God.  As Oord explores various theologians' failures to rightly stress love's importance, he admits that the word is notoriously hard to define.  So he offers this definition:  "To love is to act intentionally, in sympathetic/empathetic response to God and others, to promote overall well-being" (#489).  Carefully spelling out this definition, he grants that there are many valuable aspects to other (e.g. romantic or friendly) loves.  But when thinking theologically, he insists we abide by this definition.

Consequently he devotes appreciative sections to—and respectfully rejects the positions of—Anders Nygren, Augustine, and Clark Pinnock.  Anders Nygren wrote Agape and Eros and was “the most influential love theologian in the twentieth century” (#758).  To him only God’s agape qualifies as Christian love.  Following Luther, Nygren “rejected every idea of human merit” and located agape solely in God.  Totally depraved, we bask in God’s love but contribute nothing to our relationship with Him.  Augustine focused on love as desiring (benevolence), but not necessarily doing (benefaction) what is good.  Inasmuch as he insisted on God’s impassibility and timelessness, Augustine did not think He would interact with us in a give-and-take relationship.  Oord lauds Clark Pinnock’s “Open Theism,” which portrays God as a relational Being and grants the importance of love and of the freedom of both human and other kinds of beings to freely respond to Him.  Though sharing many of Pinnock’s positions, Oord faults him for failing to resolve the tension between love and power, especially when explaining the actuality of evil.  A major reason for this is Pinnock’s affirmation, in accord with the vast majority of Christian thinkers, of creatio ex nihilo—creation actually came to be without material antecedents.   Oord argues this notion is neither biblical nor suitable for a theology of love.   Rather than ex nihilo, he thinks God created by transforming the eternal “primordial chaos” into the world that now exists.  Neither time nor matter came into being—they were transformed by the creative act described in Genesis.  Thus evil may be attributed to a residue of the “primordial chaos” that ever abides alongside (and beyond the control of) the God who is absolutely loving.  Since “creatio ex nihilo undermines a coherent doctrine of divine love” we “should reject this nonbiblical idea to affirm consistently the biblical claim ‘God is love’” (#2232).  

Oord’s own position, deeply influenced, in my judgment, by process philosophy, is termed “Essential Kenosis.”  Oord calls it a “biblical theology of love” and believes it overcomes the problem of evil by depicting God as essentially—almost exclusively—love.  It affirms “miracles, the resurrection of Jesus, hope for a final victory at the end of history, and a biblically supported doctrine of creation” (#2098).  And it finds its final illustration on the Cross, where Jesus shows us the true nature of God—“one who experiences pain and joy, sorrow and happiness, life and death” (#2242).  Rightly understood, God does not voluntarily limit himself—He was (and is) involuntarily limited by the nature of His relationship with both the primordial chaos and creaturely freedom.  God loves because He cannot help loving; that is what He is.  He cannot destroy evil because a loving being cannot coerce anything.  But this loving God can court and woo His creatures and draw them into an ever-more holy relationship with Himself.  

* * * * * * * * * * * * * * * * * * 

Thomas A. Noble, a distinguished professor of theology at Nazarene Theological Seminary, recently delivered the Didsbury Lectures in Manchester, England.  The lectures, in print, are titled Holy Trinity:  Holy People:  The Theology of Christian Perfecting (Eugene, OR:  Cascade Books, c. 2013).  Targeting a scholarly audience, Noble endeavors to rightly follow his calling as a theologian, "not to perpetuate a Wesleyan 'distinctive' . . . but to persuade all Christians that this is the heritage of the one, holy, catholic, and apostolic church" (#188).  "Holiness," he says, "is one of the core concepts of the Christian faith" and deserves the serious, sustained reflection that only comes when one scours the Scripture and consults the long tradition of the Christian Church as well as those "holiness" churches that have more specifically stressed it.  His thoughtful, discriminating discussions of both Catholic and Protestant thinkers, rooting his presentation in the long conversation of great exegetes and theologians, make this work persuasive and helpful, one of the best expositions I've read.

Importantly, Noble insists, holiness is a perfecting process, not a perfect state of being.  Thus his subtitle urges a perfecting rather than a perfection of the soul.  None of the Church's greatest theologians "ever taught 'sinless perfection'—the idea that within this life, Christians could reach that final, absolute state of perfection where they were sinless and perfectly holy" (#804).  They taught, instead, a perfecting process whereby, above all, Christians more fully love as they ought.  At this point St Augustine, one of the greatest theologians of love, famously said, in The City of God:  "Two loves built two cities.  Love of self to the contempt of God built the earthly city:  love of God to the contempt of self, the heavenly."  We cannot but love, said Augustine—the question is what we will love!  The love of self (concupiscentia) must be tethered while love of God and others (caritas) needs stirring up!  In his wake, great thinkers such as Thomas Aquinas and John Wesley repeated and refined Augustine's insight.

Wesley, of course, gave considerable attention to holy living—"faith working by love."  Indeed, "as Mildred Bangs Wynkoop saw clearly," bona fide "Christian holiness was a 'theology of love'" (#2411).  In the 18th century renewal movement he led within the Church of England, Wesley urged believers (initially sanctified at their new birth) to seek ever-deeper experiential realities available through the gracious workings of God:  the "'gradual work' of sanctification that follows regeneration" (#2361).  Though at times he may have erroneously slipped into perfectionistic language, he typically acknowledged "the paradox of this 'imperfect perfection'" and would even have given guarded assent to Luther's dictum that the Christian is "simul justus et peccator, at once a sinner and justified" (#2563).  "As in Clement, Origen, the later Greek Fathers, and in Bernard and Aquinas (to name only some of those we selected from the great tradition), there is no thought here of easy, instant holiness.  Rather there is a concept of different levels or stages, of 'degrees of perfection'—rungs on the ladder" (#2608).

After devoting considerable attention to an account of past developments, Noble turns to the task of “reformulating Wesley’s doctrine today.”  Simply repeating an 18th century evangelist will not suffice!  But by carefully considering motivation and relationship we can construct a viable understanding of Christian holiness as “an inner revolution in our motivation as a consequence of a new relationship” (#3013).  Freedom from the bondage of sin—consistently defined as “the self-centered mindset”—is possible insofar as we maintain a sanctifying relationship with the Loving Lord who enables us to consistently “will one thing.”  This means the real focus of our attention should be on the Holy Trinity and the provisions made for our salvation through the atoning work of the Second Person, Jesus Christ.

Following Eusebius and a long line of thinkers, Noble emphasizes the Atonement as the key for us to understand Christian perfecting.  Ultimately, on the Cross, Christ, by dying to sin, opened the way for us to likewise die to sin and become all we're designed to be.  "Only by meditating on the doctrine of the cross can we be captivated by the love of God in such a way as to love him with that full and whole-hearted love of mind, soul, and strength, which is the essence of 'entire' sanctification" (#3723).  It's all about Him, not us!  Too often holiness has been discussed almost exclusively in terms of us—our sins, our needs, our potential, our fundamentally human ways of attaining sanctity—whereas it must be primarily rooted in the Person of God, Christ Himself!  "'Entire sanctification' is not a human possibility, nor is it activated by my total consecration as an individual.  It is God's gracious activity in the life of each believer, within the context of the Body of Christ, the church, made possible by God's once-for-all act of grace in the crucifixion of the old sinful humanity on the cross" (#4144).

In His Incarnation, as well as His atoning death, Christ provided for us salvation full and free.  The Early Church wrestled long and hard to rightly insist that Jesus was “fully God, fully man.”  Consequently, as Athanasius and others said, by assuming our human nature Christ redeemed and sanctified it.  Sinless Himself, he bore our sins.  Thus “Irenaeus writes of ‘the pure One opening purely that pure womb which regenerates humanity to God, and which he himself made pure.’  As the Symbol of Chalcedon (451) expressed it, he was ‘like us in everything except sin.’  Taking our sinfulness in no way polluted him.  Our debt was swallowed up in his riches, our pollution cleansed in his purity, our sin burned up in the fire of his holiness” (#4528).  Living among us (as well as dying for us) Christ showed us how to be holy persons.  

To be holy, then, is to be rightly rooted in Christ Himself.  "In him, 'the first-born from the dead' (Col 1:18; Rev 1:5), the old humanity has died and the new humanity, the new creation (2 Cor 5:17), has begun.  Christian holiness is founded upon—is a participation in—what he has done for us, once for all time in his death and resurrection.  It is a participation in him" (#4887).  Empowered by the Holy Spirit, believers may enjoy fellowship with God Himself and allow Him to inspire and work through them.  Inasmuch as God is Perfect Love, there is a perfecting process engaging believers in the ongoing work of sanctification.  At Pentecost, "the final event in the series of the mighty acts of God in Jesus," this Reality dawned for the infant Church.  Now, as then, Christians are "able—not merely by effort or moral energy or discipline alone, but by the grace or gift of God—to make his or her consecration fully actual, and to love God and his perfect will whole-heartedly.  While still in the fallen body as part of a fallen human race ('flesh') and liable therefore to daily temptation, this mature Christian is no longer a divided mind or heart" (#5207).  Rightly understood, then, Christian holiness is "always a prayer, never a claim" (#5256).

"Charles Wesley," Noble says, "expresses for us the constant daily prayer that Christ, who is Love incarnate, crucified, and ascended, may breathe into us too his own Spirit that we may be filled with his love:  'Love divine, all loves excelling, / Joy of heaven, to earth come down, / Fix in us thy humble dwelling, / All thy faithful mercies crown!  / Jesu, thou art all compassion, / Pure unbounded love thou art; / Visit us with thy salvation!  / Enter every trembling heart.  / Breathe, oh, breathe thy loving Spirit / Into every troubled breast!  / Let us all in thee inherit; / Let us find that second rest: / Take away the bent to sinning, / Alpha and Omega be, / End of faith as its beginning, / Set our hearts at liberty.  / Come, almighty to deliver, / Let us all thy grace receive; / Suddenly return, and never, / Never more thy temples leave. / Thee we would be always blessing, / Serve thee as thy hosts above, / Pray and praise thee without ceasing, / Glory in thy perfect love.  / Finish then thy new creation, / Pure and spotless let us be; / Let us see thy great salvation / Perfectly restored in thee; / Changed from glory into glory, / Till in heaven we take our place, / Till we cast our crowns before thee, / Lost in wonder, love, and praise'" (#5226).

Intent on loving God, we must seek to reflect Him rather than reflect on ourselves and should, with St Paul, seek to "know Christ" rather than worry excessively about personal purity.  (A self-centered religion is the worst manifestation of sinful self-centeredness!)  Above all we must discover that at the heart of the Trinity there is a loving fellowship between Father, Son, and Holy Spirit.  The "One in whom we live, and move, and have our being" is most deeply Love.  To say God is Holy is to say He is Love.

* * * * * * * * * * * * * * * *

In light of these recent presentations, this is clear:  if H. Orton Wiley and his followers were right, the Church of the Nazarene has abandoned its historic position; if, on the other hand, today's theologians (Oord, Lodahl, Noble) are right, Nazarenes were appreciably (and, everyone admits, sincerely) misled for half a century.  Then, perhaps, neither Wesley nor any of them is right—as Catholics and Calvinists and Pentecostals et al. insist!  Better minds than mine must sort out the answer!

266 The Never Enough Pity Party

In Never Enough: America's Limitless Welfare State (New York: Encounter Books, c. 2010), William Voegeli combines a somber history (the hundred years' war between successful liberals and retreating conservatives) with an acute analysis of the creeping Leviathan that's relentlessly assuming ever-more control of all aspects of American life. The book's title reflects a 1964 Nation editorial which declared, as Lyndon B. Johnson's Great Society unfolded promising to complete the New Deal, that whatever it proposed was "not enough" (#556). LBJ himself insisted, in 1964: "'We're in favor of a lot of things and we're against mighty few'" (#570). Whatever anyone wants Uncle Sam will provide! Underlying this attitude, as Steven F. Hayward writes in his perceptive Foreword, is this: "Liberalism's irrepressible drive for an ever larger welfare state without limit arises from at least two premises upon which the left no longer reflects: the elevation of compassion to a political principle (albeit with other people's money), and the erosion of meaningful constitutional limits on government on account of the imperatives of the idea of Progress" (Kindle #147).

Without doubt Progressives have successively enrolled a large majority of Americans in various programs, distributing benefits (from Social Security and Medicare to Food Stamps and college loans) that ensure the popularity (even among conservatives) of the Welfare State. "The defining victory of the New Deal," Voegeli thinks, "was not the individual programs it created, but the evisceration of the principle that government, especially the federal government, had no rightful business undertaking a whole range of social improvements, no matter how gratifying the beneficiaries might find them. Once this 'legitimacy barrier' was demolished, liberals could frame the politics of the welfare state as a contest between the compassionate party that wants the government to give things to people and do things for them, and the mean-spirited party that wants to deprive people of all those indispensable and beneficial things" (#601).

So long as "someone else" will pay for all these beneficial things, the "compassionate party" maintains its lock on a large percentage of the electorate. "As the British jurist A.V. Dicey wrote in 1914, 'The beneficial effect of State intervention, especially in the form of legislation, is direct, immediate, and so to speak, visible, whilst its evil effects are gradual and indirect, and lie out of sight. . . . Hence the majority of mankind must almost of necessity look with undue favor upon governmental intervention'" (#3482). Thus Voegeli, though himself a committed conservative, has some somber advice for his compatriots: accept what is and compromise! Forget about abolishing the Welfare State! Republicans as well as Democrats have generally funded it and enjoyed its popularity. Ronald Reagan merely tried to "curb" its growth and abjectly failed. His "'triumph' was to yield ground more slowly than any other political leader in the battle that conservatives consider their central mission" (#3360). The only workable strategy for those who fear its ultimate destructiveness is to point out its unworkability and support leaders such as Congressman Paul Ryan to carefully correct its abuses and prune away some of its worst excesses.

“Conservatives, in other words, need to take the position that America is going to have a welfare state, should have a welfare state, and it’s not part of the conservative project to bring about the disappearance of the welfare state, even in the distant future. The question is whether we are going to have a welfare state that uses its finite resources intelligently, concentrating on helping the people who need it most, or one that distributes benefits in an undisciplined and nearly random fashion” (#681).

Having announced his intent in writing the book, Voegeli describes the welfare state. Trying to get a handle on all of the assorted governmental programs (federal, state, and local) truly numbs the mind and tries the soul! Even with official numbers in hand (or in computer) it's equally hard to rightly interpret them! Limiting himself to federal programs, Voegeli calculates America's welfare state "was 472 times as big in 2007 as in 1940" (#813). We're spending 15 times as much on "human resources" programs (e.g. Social Security, Medicare, Education) as we did 60 years ago. Since recipients want "other people" to pay for it, the easiest solution, naturally, is to both inflate the currency and shift the ultimate accounting to coming generations through deficit spending. Progressives talk much about "giving things to people, while limiting the discussion about the corresponding enterprise of taking things away" (#2413).

Undergirding all this spending is a philosophical "rationale" carefully crafted by generations of progressives. Without the historical developments clearing the way for Europe's socialistic welfare states, Americans needed to be coaxed into accepting certain economic ideas that were foreign to their limited government, free enterprise traditions. So American Progressives (notably Woodrow Wilson) determined to fundamentally transform things. Rather than taking the Founders' views of the Constitution—inscribed in memorable documents such as The Federalist Papers—Wilson worked to adapt it to the modern, technological world. Rather than ground government in human nature, he turned to the ever-evolving history of the state, reflecting the influence of 19th century thinkers such as Hegel, Comte and Darwin.

Thus Wilson repudiated the Declaration of Independence's "self-evident" declaration that all men "are endowed by their Creator with certain unalienable rights," and declared them state-dispensed. Following President Wilson's example, Franklin Delano Roosevelt adeptly advanced the Progressive agenda. In a 1932 speech, setting forth the "manifesto" of his New Deal, he celebrated notions of historical progress, justice and equality quite detached from any mooring in the essential human nature assumed by the nation's Founders. What were once considered "natural rights" (life, liberty, property) by the likes of Thomas Jefferson were now to be re-defined "in terms of a changing and growing social order" (#1455). A "Second Bill of Rights" was now needed, and FDR spelled it out—the government should assure everyone "a useful and remunerative job," a living wage, a "decent home," good medical care, protection from "the economic fears of old age and sickness and accident and unemployment," and "a good education." FDR's expansive list of "rights" was, of course, easily lengthened as increasing numbers of individuals and groups invented them.

Under FDR's orchestration, the Chief Executive assumed powers formerly reserved to the Congress. He transformed the Supreme Court through intimidation and judicious appointments. Thus began, New Deal historian William Leuchtenburg says, "a revolution in jurisprudence that ended, apparently forever, the reign of laissez-faire and legitimated the arrival of the Leviathan State" (#1600). A "living constitution," yearly attuned to current conditions by the Court, replaced the one written by Madison et al. in 1787. In practice, this meant approving virtually all expansions of federal powers. Anything goes! "You've got a problem? We've got a program" (#2550). Indeed: "The New Deal changed America's Constitution from one where the powers of government were enumerated into one where they were innumerable" (#1772). Yet this progressive triumph poses a very real problem: no one really knows where we are going or how to scrupulously evaluate our success. Thus the "change" touted by Barack Obama proves difficult to either define or measure! Just as infinity is immeasurable, so too limitless "rights" cannot be constrained! Consider, for example, the continuously evolving notion of "civil rights." The Civil Rights Act of 1964, which strictly required everyone be treated equally, quickly sprouted into dicta mandating that some folks (through affirmative action) be given special treatment! So an intensely color-conscious rather than color-blind society quickly emerged.

Complicating the lack of coherent direction, the Progressive project has manifestly failed to fund itself. Liberals have sought to evade the inescapable truth that everything must, in some way in due time, be paid for, indulging in "a protracted exercise in intellectual dishonesty, borne of a conviction that the question doesn't need to be answered if it can be made to go away" (#2690). "Don't worry," we're told, "be happy!" Somehow things will all work out if we trust the social engineers in various branches of government. Voegeli carefully examines the recipes (e.g. John Maynard Keynes' economic theories and John Kenneth Galbraith's The Affluent Society) for perpetuating the welfare state and painlessly easing America into a European-style socialism. Yet, as Milton Friedman, among others, has insisted, "There is no such thing as a free lunch." But just try suggesting this to Barack Obama as he delivered his last State of the Union Address!

Progressives from Roosevelt to Obama inevitably promise to pay for the promised goods, easing entrance to the Promised Land, by taxing the "rich." (Exactly what makes one rich is yet another of those undefined and flexible standards that make any clear evaluation of Progressive rhetoric so frustrating!) What Obama fails to mention, however, is the utter impossibility of taking sufficient funds from the "rich" to pay for programs, which "cannot be realized merely by making the rich less rich. Enacting any significant portion of the liberal agenda will also require making the merely comfortable noticeably less comfortable—and liberals are terrified that imposing tax increases on upper middle-class voters will doom them when those voters go to the polls" (#2999). Lots of folks who never dreamed they were "rich" would suddenly awaken to the fact that the State officially defines them as such! To keep such voters happily supporting the Welfare State requires they be given much and taxed very little.

Yet another Progressive strategy is to call for taxing "corporations" rather than individuals. However, as Voegeli says: "The distinction between taxes paid by corporations and taxes paid by flesh-and-blood voters falls apart when analyzed" (#3211). When taxed, corporations immediately pass along their loss to consumers who thus pay the taxes by way of higher prices—a sales tax, to be precise. If the corporation cuts its profits in order to maintain prices, then investors (many of them moderate-income folks saving for retirement) pay for the welfare programs. Liberals surely know that taxing corporations is nothing more than passing along the tax burden to the people who buy their products without admitting to actually "taxing" the electorate. They also know the average voter fails to fully grasp this simple truth.

Despite the many problems evident when "never enough" sets the agenda, American conservatives have done little to effectively constrain the expanding Welfare State. So rather than trying to destroy or even significantly diminish it, conservatives should take a more modest and potentially useful approach. The voters, who now regard programs such as Social Security as their inalienable right, will not accept any curtailment of such entitlements. So let them be! Just try to find ways to make the entitlement programs more efficient, less abused, and financially sound. "Starving the beast" by making careful cuts in certain areas—locating duplicate or antiquated programs—will inject a bit of financial integrity into the system. "Means-testing" some programs—requiring recipients be clearly worthy of their benefits—is another technique capable of enlisting voters' support. Wisely pursued, such a modus operandi may even enlist the support of thoughtful liberals as well as voters. "If liberals and conservatives decide they can do business with each other it will be because conservatives accept they'll never sell voters on the huge benefit reductions they ultimately seek, and because liberals decide they'll never sell the huge tax increases they ultimately need" (#4230).

******************************************

William Voegeli has followed up his examination of "America's limitless welfare state" in Never Enough with The Pity Party: A Mean-Spirited Diatribe against Liberal Compassion (New York: HarperCollins, c. 2014). He has nothing but praise for the classic compassion found in Scripture or the moral philosophers of antiquity. To personally feel sorrow in the face of others' pain is always commendable. But today's "liberal compassion" is a new phenomenon. Anyone attentive to public life has easily noted the increasing attention given various kinds of victims and the "compassion" urged with regard to them. Bill Clinton's famous words, "I feel your pain," have become a formidable plank in various political platforms, and exit polls indicate Barack Obama won the 2012 election primarily because a majority of voters (who thought Romney would do better in many ways as Chief Executive) thought Obama better understood and identified with them. "Romney won clear victories among the three-fourths of the electorate who believed a presidential candidate's most important quality was whether his 'vision for the future' (54 percent to President Obama's 45 percent), whether he 'shares my values' (56 percent to 42 percent), or was 'a strong leader' (61 percent to 38 percent). Obama carried the one remaining category so decisively, however, as to win reelection. Of the 21 out of every 100 voters who believed the most important quality in a presidential candidate was that 'he cares about people like me,' 17 voted for Obama and 4 voted for Romney" (#2321).

Obama routinely reduces his political principles and objectives to kindness. Thus he appointed Supreme Court Justice Sonia Sotomayor not because she was a distinguished jurist but because she could empathize with people. Such empathy, said George Lakoff in defense of her nomination, "is at the heart of progressive thought. It is the capacity to put oneself in the shoes of others—not just individuals, but whole categories of people: one's countrymen, those of other countries, other living beings, especially those who are in some way oppressed, threatened, or harmed" (#368). Liberal politicians' claims—e.g. Al Gore sorrowing at his sister dying of tobacco-induced cancer and Obama lamenting his dying mother's problems with health care insurance—are frequently fudged (if not fabricated) to elicit maximum audience response. But such rhetorical indulgences guarantee votes, and the Democrat Party has, Voegeli insists, effectively turned into the Pity Party! Thus we're witnessing "the Oprahfication of America, evident in the way political conventions now aspire to be empathy-tests that can hold their own with daytime talk shows" (#2195). Sadly enough, "A nation increasingly dependent on heartrending anecdotes to focus and activate its sense of justice is one that's losing the capacity for moral and abstract reasoning" (#2220).

In a 2013 speech President Obama endorsed film critic Roger Ebert’s words (“Kindness covers all of my political beliefs”) and declared:  “when I think about what I’m fighting for, what gets me up every single day, that captures it just about as much as anything.  Kindness; empathy—that sense that I have a stake in your success; that I’m going to make sure, just because [my daughters] are doing well, that’s not enough—I want your kids to do well also” (#146).  Unfortunately, as C.S. Lewis presciently said, a tender kindness that wants others to be “happy” often lacks specificity.  “Kindness, merely as such,” wrote Lewis, “cares not whether its object becomes good or bad, provided only that it escapes suffering. . . .  It is for people whom we care nothing about that we demand happiness on any terms:  with our friends, our lovers, our children, we are exacting and would rather see them suffer much than be happy in contemptible and estranging modes” (#1374).

Since “compassion” is so widely touted as the core value for Progressives—Garrison Keillor defines his brand of liberalism as “the politics of kindness”—Voegeli insists we rightly define and understand the word.  “According to the Oxford English Dictionary, ‘compassion’ means, literally, ‘suffering together with another,’ and is also defined, more substantively, as the ‘feeling or emotion, when a person is moved by the suffering or distress of another, and by the desire to relieve it; pity that inclines one to spare or to succor.’  The OED notes a subtle but significant distinction between those two senses of the term:  the first is an emotion shared by ‘equals or fellow-sufferers,’ while the second ‘is shown toward a person in distress by one who is free from it, who is, in this respect, his superior’” (#275).

Thus feeling compassion marks one as a good person.  To support politicians and policies stamped compassionate enables one to join the righteous crowd.  “The term ‘compassion’—or ‘empathy,’ or even ‘kindness’—is routinely used not just to name a moral virtue, but to designate the pinnacle or even the entirety of moral excellence.  Precisely because this moral conviction is ambient, with so many Americans taking for granted that moral growth requires little else than feeling, acting, and being more compassionate, it’s an important yet difficult subject to analyze.  Compassion is the moral sea we swim in, which works against our awareness of it, much less efforts to chart its depths and currents” (#136).  Importantly, it’s feeling something rather than doing anything!  To “feel your pain” (as tearfully as possible) was sufficient for President Clinton!  Compassion is all about one’s own feelings, not about doing something to help someone—that would require personally doing acts of mercy or charity.  Wealthy liberals love to support taxes on others to help the poor while evading such taxes themselves through various loopholes.

This kind of compassion began with the birth of “modernity” in the 18th century.  Feeling good about our good feelings gained credence in the works of Jean-Jacques Rousseau, in many ways the architect of the French Revolution and many subsequent socio-political movements.  When he identified with someone else, he said, “and I feel that I am, so to speak, in him, it is in order not to suffer that I do not want him to suffer.  I am interested in him for love of myself” (#611).  Loving ourselves, we feel better about ourselves when we feel empathetic or compassionate for others.  If we give something to someone in need, it’s not to actually help him but to inflate our own self-esteem!  In the wake of Rousseau, says historian Michael Kazin, “liberal modernism” has boosted self-expression and self-discovery and self-esteem rather than self-sacrifice and self-denial:  it prescribes “the unchaining of sexual pleasure from procreation, the liberation of art and literature from the didactic imperative, empathy with ethnic and racial outsiders and an identification with the rougher aspects of life” (#707).

Voegeli charts this form of compassion as applied to such concerns as humanitarian aid, “higher patriotism,” immigration, poverty programs, race relations, etc.  Inevitably it promises more than it delivers, if one judges the actual assistance given needy people.  Since they “always want America to be more compassionate than it is” (#1532), the needy must be perpetually needy.  “Empathizers who get to feel like good people because of their empathy, however, may prefer to regard empathizees’ sufferings as chronic conditions to be managed rather than transitory ones to be solved.  ‘Pity is about how deeply I can feel,’ [Jean Bethke] Elshtain argued.  ‘And in order to feel this way, to experience the rush of my own pious reaction, I need victims the way an addict needs drugs’” (#1871).  This is especially true today when the plight of America’s blacks is considered.  However much “progress” may have occurred, white liberals feel guilt for the problems plaguing the black community.  Highly privileged themselves, they talk about abolishing privilege!  Since slavery is “America’s original sin,” all symptoms of its survival must be cut out from the human heart as well as various institutions.

So it seems, to Susan Sontag, that the great achievements of Western Civilization—Mozart’s music, Newton’s science—cannot “redeem what this particular civilization has wrought upon the world.”  Inasmuch as Westerners had abused non-Western cultures and the environment itself, the “white race is the cancer of human history” (#1814).  Cancer patients, of course, can do little to save themselves!  Seriously sick people can do little on their own to improve their lot.  Appeals to self-reliance or self-sacrifice are branded hard-hearted and lacking compassion.  Victims can neither be blamed for their status nor expected to escape it.  But in feeling pity for them the Susan Sontags of the world feel pleased with themselves!

265 Miracles and Miraculous Cloths

Eric Metaxas has garnered well-deserved acclaim and awards for his prize-winning biographies of William Wilberforce and Dietrich Bonhoeffer.  He has also emerged as an influential figure within a flourishing Christian community in New York City.  His most recent publication, Miracles:  What They Are, Why They Happen, And How They Can Change Your Life (New York:  Dutton, c. 2014), bears witness to both his roots in historic orthodoxy and contemporary Christian witness.  Thus he sets forth, in the book’s initial chapters, a philosophical case for the credibility of supernatural workings, followed by a much longer section detailing the stories of persons he knows and trusts who have experienced various kinds of miraculous events.  “To those who might think these stories merely subjective accounts and not objective evidence, it must be said that history comprises the subjective accounts of human beings:  and from these subjective accounts we arrive at an ‘objective’ truth—which is itself still somehow and to some extent subjective.  There can never be a question whether such things are subjective; the only real question can be whether those subjective accounts are reliable” (#98 in Kindle).  

Metaxas’ philosophical case for miracles relies heavily on arguments set forth by great 20th century apologists (e.g. G.K. Chesterton’s Orthodoxy and C.S. Lewis’ Miracles) as well as recent scholarly works such as Craig S. Keener’s 1200-page Miracles.  To believe in miracles first and foremost entails believing in God.  If one believes that God created, ex nihilo, all that exists, it hardly seems irrational to believe He could do miraculous things within His creation, including the many biblical interventions and (above all) the Resurrection of Christ.  In a remarkable conversation a century ago between Adolf von Harnack, the incarnation of Protestant Liberalism, and Adolf Schlatter, his orthodox counterpart on the Berlin theological faculty, Harnack said the two were basically in agreement except for one small matter:  miracles.  To which Schlatter replied:  “No, we are divided on the question of God, for what is at stake in the question of miracles is in fact whether God is God or merely a part of the realm of subjectivity.”  

As Augustine wisely said:  “Miracles are not in contradiction to nature.  They are only in contradiction with what we know of nature.”  And no one has put the case for the miraculous better than Chesterton, who said:  “my belief that miracles have happened in human history is not a mystical belief at all; I believe in them upon human evidences as I do in the discovery of America.”  Taking witnesses at their word is basic to historical inquiry and the judicial process.  Ironically, “believers in miracles accept them (rightly or wrongly) because they have evidence for them.  The disbelievers in miracles deny them (rightly or wrongly) because they have a doctrine against them.  The open, obvious, democratic thing is to believe an old apple-woman when she bears testimony to a miracle, just as you believe an old apple-woman when she bears testimony to a murder.  The plain, popular course is to trust the peasant’s word about the ghost exactly as far as you trust the peasant’s word about the landlord.  Being a peasant he will probably have a great deal of healthy agnosticism about both.  Still you could fill the British Museum with evidence uttered by the peasant, and given in favour of the ghost.  If it comes to human testimony there is a choking cataract of human testimony in favour of the supernatural.  If you reject it, you can only mean one of two things.  You reject the peasant’s story about the ghost either because the man is a peasant or because the story is a ghost story.  That is, you either deny the main principle of democracy, or you affirm the main principle of materialism—the abstract impossibility of miracle.  You have a perfect right to do so; but in that case you are the dogmatist.  It is we Christians who accept all actual evidence—it is you rationalists who refuse actual evidence being constrained to do so by your creed.  But I am not constrained by any creed in the matter, and looking impartially into certain miracles of mediaeval and modern times, I have come to the conclusion that they occurred.  All argument against these plain facts,” Chesterton continues, “is always argument in a circle.  If I say, ‘Mediaeval documents attest certain miracles as much as they attest certain battles,’ they answer, ‘But mediaevals were superstitious’; if I want to know in what they were superstitious, the only ultimate answer is that they believed in the miracles.  If I say ‘a peasant saw a ghost,’ I am told, ‘But peasants are so credulous.’  If I ask, ‘Why credulous?’ the only answer is—that they see ghosts.  Iceland is impossible because only stupid sailors have seen it; and the sailors are only stupid because they say they have seen Iceland.”  Circular arguments, naturally, go nowhere!  

“The Greek word for miracle,” Metaxas says, “is ‘semeion,’ which means ‘sign.’  Miracles are signs, and like all signs, they are never about themselves; they’re about whatever they are pointing toward.  Miracles point to something beyond themselves.  But to what?  To God himself.  That’s the point of miracles—to point us beyond our world to another world” (#289).  Rightly understood, the natural sciences can do no more than carefully describe the physical world.  To explain it, however, easily leads us to infer miraculous events—the improbable appearance of life on earth, the existence of our finely-tuned universe, why there is something rather than nothing.  “Reason and science compel us to see what previous generations could not:  that our existence is an outrageous and astonishing miracle, one so startlingly and perhaps so disturbingly miraculous that it makes any miracle like the parting of the Red Sea pale into such insignificance that it almost becomes unworthy of our consideration, as though it were something done easily by a small child, half-asleep.  It is something to which the most truly human response is some combination of terror and wonder, of ancient awe and childhood joy” (#853). 

Turning to the contemporary “miracle stories” narrated by individuals Metaxas knows and trusts, we’re first reminded of transforming prototypes—some (such as St. Paul’s and Chuck Colson’s) instantaneous and others gradual (e.g. William Wilberforce’s and C.S. Lewis’s).  Metaxas himself bears witness to God’s intervention in his life, through an inexplicable dream involving a golden fish (IXTHYS), changing literally everything for him.  In his student days at Yale he’d hungered for something to give life meaning, but God mercifully “had something more for me:  He gave me his son, a living person, Jesus Christ.  I realized in the dream that Jesus Christ was real and had come from the other side to me—to me—and now I was holding him there in the bright sunlight and I was flooded with joy at the thought of it.  At long last my search was over.  It was over.  And it was true.  There was a God and Jesus was God and he’d shown that to me in a way that only I could understand, in a way that utterly blew my mind.  God knew me infinitely better than I knew myself, had taken the trouble to speak to me in the most intimate language there was:  the secret language of my own heart.  That was that” (#2149).  Later on, another powerful dream led him rather specifically to write his book on Bonhoeffer.  Adding to his own story, he shares those of Frederica Mathewes-Green, a talented writer, as well as “Cisco,” a former drug dealer who now gives witness to the powerful change wrought in his life by his Lord and Savior, and Alice von Hildebrand, the widow of Dietrich von Hildebrand and herself a distinguished philosophy professor.  

Healing miracles abound throughout the history of the Church—and they continue today in New York City!  Indeed, writes Metaxas:  “They are more common than I ever thought” (#2371).  Cisco, the former drug dealer, prayed that an acquaintance be healed of AIDS—and he was!  One of Metaxas’ good friends, Christine, personally witnessed the dramatic healing of her grandfather, who had been unable to stand or walk for six months.  One of Christine’s aunts felt moved to pray for him and “put her hands on the grandfather’s legs and prayed a very powerful prayer that he be healed.  A moment after she had finished, the grandfather stood up and immediately started walking.  They were all stunned to witness it.  Christine said that even now, so many years after it happened, remembering it makes her very emotional.  She remembers thinking that she couldn’t believe it was possible for a miracle to happen right in front of her eyes, that in just a moment’s time God could wipe away so many months of misery and pain” (#2660).  

“Miracles of Inner Healing” also occur with regularity.  Paralyzing guilt disappears, dissolved by God’s forgiving power.  Broken marriages are re-knit, massaged by the Spirit’s reconciling energy.  “Angelic Miracles” recounted by several of the author’s informants point toward the continuous workings of divine messengers involved in earthly affairs.  Small events—such as finding keys, the inspiration to make phone calls, or words that touch the heart of a judge on behalf of an innocent cab driver—and near-death experiences may all rightly be considered miraculous, Metaxas says, given God’s intimate interest and involvement in every facet of our lives.  

* * * * * * * * * * * * * * * * * * *

Last summer, Dr. Rolf Enger, an Air Force Academy colleague of Dieter Rademacher (the pastor of the community church we attend in Lake George, Colorado), gave a fascinating presentation on the Shroud of Turin.  He had been involved, 30 years ago, with a 40-man scientific research team (the Shroud of Turin Research Project) granted access to the Shroud to carefully weigh all the evidence regarding its authenticity.  Members of the team were experts in various fields, representing diverse scientific disciplines, seeking the truth rather than trying either to debunk or to demonstrate the Shroud’s authenticity.  The team members were all volunteers and had no obligatory ties to the Catholic Church.  When I asked Dr. Enger to recommend a book detailing the investigation he pulled out a copy of Verdict on the Shroud (Wayne, PA:  Banbury Books, Inc., c. 1981) by Kenneth E. Stevenson (a scientist) and Gary R. Habermas (an historian).  I secured a copy of the book and found it quite well-done and persuasive.  In light of all the evidence, “the more we learn about the Shroud, the more likely it seems that the cloth is what it purports to be—the burial garment of Jesus Christ” (p. 5).  The image on the Shroud is of a bearded male, 5’11” in height, weighing around 175 pounds, well-built and muscular.  His “wounds in their entirety exactly match the wounds Christ received as recorded in the gospels” (p. 43).  

A chapter entitled “The Shroud and History” provides details regarding the cloth’s 14th century appearance in France with clues regarding its earlier history.  The image on the shroud resembles the face of Christ portrayed by Christian artists from the sixth century onward.  It is, in fact, “the standard face of Jesus in art” (p. 17).  Documents from the sixth century point toward the “image of Edessa, the ‘Holy Mandylion’”—a cloth thought by some to have been brought from Jerusalem to Edessa by Jesus’ disciple, Jude Thaddeus, in the first century and then found in one of the walls surrounding Edessa.  This cloth was taken to Constantinople in 944, where it was “revered as the true likeness of Christ” (p. 20).  Following the 1204 sack of Constantinople by European crusaders, the cloth disappeared.  How it arrived in France a century later no one knows, though some think the Knights Templar played a role in preserving it.  When it was presented to the public in 1357 it was believed by some to be authentic and by others to be a “pious fraud.”  There seemed to be no way of resolving the controversy until quite recently, when new technologies facilitated a critical appraisal of the Shroud.  

Modern interest in the Shroud began in 1898, when an Italian lawyer, Secondo Pia, took pictures of it when it was publicly displayed.  Developing his pictures in a dark room, Pia was astonished to see the form of a man clearly evident in the negatives.  One can barely detect the form of a man when looking at the Shroud itself.  But the negatives truly brought to light a remarkable figure!  Clearly “the Shroud was not an obvious forgery.  Why would a fourteenth-century forger have painted a negative image?” (p. 71).  Subsequent, more sophisticated photographs detected no traces of pigment on it, indicating it was not a painting.  After more than a century of ever-improving scientific techniques, the authors “conclude that the scientists’ work made a forgery virtually impossible” (p. 122).  Interestingly, there is today more serious interest in the Shroud than in earlier centuries, and our more sophisticated testing methodologies increase the likelihood that it was the cloth covering Jesus’ body in the tomb.  

Clearly, there is blood rather than paint on the cloth.  And the image actually seems to have been generated by a “scorch”—a mysterious emanation of heat, almost like radiation!  The cloth itself is similar to others dating from first century Palestine, and some of the tiny plant pollens and spores found on it (discovered by microscopic analysis) are unique to that era and region.  Other tests reveal “that the Shroud image contains three-dimensional data” that can only be explained by its being placed on a recently-deceased body.  “The three-dimensional picture of the head of the man in the Shroud also revealed another surprise:  small button-like objects had apparently been placed over his eyes” (p. 82).  Coins were frequently placed on corpses in Jesus’ day, and a knowledgeable numismatist concluded “that the coin over the right eye of the man in the Shroud was a lepton minted in the time of Pontius Pilate” (p. 82).  

Whether or not the image on the Shroud is that of Jesus can never be proven.  But the wounds on the man certainly match up with the Gospel accounts of His crucifixion.  He had been scourged, suffering some 220 wounds, with the Roman flagrum, the device used by Roman soldiers in the first century.  His head had been lacerated by a crown of thorns.  Bruises on his shoulder indicate he carried a heavy object.  He was nailed to the Cross with nails through his wrists, not his palms—something we now know was the Roman custom, though not known in the Medieval Era.  His legs were not broken, indicating he died on the Cross.  A wound on his side indicates he suffered a spear thrust as he expired.  “The evidence is consistent at every point.  The man of the Shroud suffered, died, and was buried the way the gospels say Jesus was” (p. 162).  A mathematician, collating all the data, estimated that “we have 1 chance in 82,944,000 that the man buried in the Shroud is not Jesus” (p. 167).  A rather strong probability!  

So what does it mean for us in the 21st Century?  It primarily means the Gospels can be trusted, down to their rather specific details.  The Shroud also strongly supports the Christian belief in Jesus’ Resurrection—the “scorch” on the cloth may have been caused by a burst of supernatural energy as He arose from the dead.  And finally, the Shroud reminds us that philosophical naturalism—including its dogmatic denial of miracles—cannot explain a multitude of things, including the Shroud of Turin.  

* * * * * * * * * * * * * * * * * * * *

Paul Badde is a diligent German journalist, a devout Catholic who has devoted many years to demonstrating the authenticity of the Shroud of Turin and other sacred artifacts.  In the amply, indeed lavishly, illustrated The True Icon:  From the Shroud of Turin to the Veil of Manoppello (San Francisco:  Ignatius Press, c. 2010), he provides summaries of the latest research that give reasons to believe in the supernatural origin of both cloths, both of which are well preserved in Italian churches.  “The shroud [of Turin] has long been the most thoroughly investigated piece of fabric in the world.  And after all that, the origin of the image that rests on its fibers remains utterly inexplicable” (Kindle #140).  

Badde revisits the history of the Shroud, including its probable journey from Jerusalem to Edessa to Constantinople and ultimately to France in the 14th century.  Having studied the documents and visited the sites, he asserts:  “Everywhere it was as if we were following the trail of a protective hand that again and again mysteriously rescued this cloth from a great number of dangers” (#647).  And he sums up the most recent scientific studies regarding its composition.  Beyond the Shroud, however, the Veil of Manoppello (the small sudarium or napkin thought to have been placed on Jesus’ face) has been little noticed or acclaimed—something Badde is determined to rectify.  “To this day the little burial cloth complements the large burial cloth and makes it accessible.  Together they fit into the Gospel of John [cf. Jn 20:7] like the last pieces of the puzzle” (#1143).  

Granted their authenticity, the Shroud and Veil are the earliest witnesses to the Gospel of Christ.  Written documents were composed two decades later.  But the Shroud, “with the traces of the Passion is the first page of the Gospels.  The delicate little napkin, which was revered for so long in Rome as ‘the veil of Veronica of Jerusalem’, is the second.  Both originate at the zero hour of Christianity.  Thus two images—and not any new scrolls—form the hot core of the Good News of Christendom.  The images were there when words failed—and the apostles were still speechless” (#1373).  To Badde, in “these two cloths the mystery of the Christian faith is presented as in no other document.  They marvelously fill up the brief text of the Gospel” (#1436).

* * * * * * * * * * * * * * * * * * * * * * *

In The Face of God:  The Rediscovery of the True Face of Jesus (San Francisco:  Ignatius Press, c. 2006), Paul Badde sets forth a deeply personal quest, involving scores of journeys and interviews, to validate the Volto Santo (the “Holy Face,” thought by many to be the legendary Veronica’s Veil) of Manoppello as the very cloth placed on Jesus’ face when He was buried.  Badde traces the Veil’s journey across the centuries until it was publicly displayed in Rome half a millennium ago.  He also studies a multitude of ancient and medieval artistic works, mostly in churches, depicting Jesus in accord with the face on the Veil.  Though far less renowned than the Shroud of Turin, it portrays the exact same image.  Indeed, Heinrich Pfeiffer, a learned Jesuit professor and highly regarded specialist, says there is a “complete correspondence that results when you place the Face from the Shroud of Turin on top of that of Manoppello.”  Thus we are driven to conclude “that the image on the sudarium and that on the Shroud originated at the same time” (#1299).  If indeed the two images are authentic, they dramatically reveal to us the deepest truth of the Christian faith, for as Cardinal Ratzinger declared:  “‘God,’ of whom there can be no images, nevertheless has a face and a name and is a person.  And salvation consists, not in being immersed in namelessness, but rather in the ‘satisfaction in seeing his face’ that will be granted to us when we awaken” (#182).  

Similar to the linen Shroud, the Veil preserves the face of a man, but on an almost transparent, iridescent fabric—byssus, the most expensive of ancient fabrics, which was woven with painstaking care from mussels’ fibers.  With modern microscopic technology, we find no traces of pigment, so it is not a painting.  Still more, it is simply impossible to apply paint to mussel silk.  The image must be the result of some other process.  To Professor Heinrich Pfeiffer, who meticulously examined it in the 1990s, the veil had probably “been laid on top of the large sheet in which the crucified Christ had been laid.  That would also explain, he said, why the Turin Shroud bore a negative image, and the veil laid on top of it, in accordance with the rules of photography, a positive one” (#1227). 

In Badde’s passionate perspective, the images on the Veil and the Shroud were inscribed by Christ’s face, supernaturally revealing God Himself.  “The Veil of Manoppello is the sudarium of Christ.  This is the mysterious second cloth from the tomb of the crucified Christ that John the Evangelist discovered about forty hours after the death of Jesus in his empty tomb—together with another linen sheet, which is today preserved in Turin” (#3711).  Both cloths are incredible inasmuch as no naturalistic explanations suffice, and together they “reflect nothing less than the miracle of the absolutely inexplicable Resurrection of Jesus Christ from the dead.  They are not photos or paintings; they are themselves marvelous new creations by God.  The two images are as inexplicable as life itself” (#3723).  

264 Diana West

Diana West is a Yale-educated journalist who writes a weekly syndicated column with a decidedly conservative slant.  Determined to understand and explain certain features of modernity, she ties together interesting threads of evidence and teases out possible lines of explanation that prod the reader to ponder her presentations rather than thoughtlessly embrace her perspectives.  Four decades ago Eric Hoffer, in Reflections on the Human Condition, warned:  “If a society is to preserve its stability and a degree of continuity, it must know how to keep its adolescents from imposing their tastes, attitudes, values, and fantasies on everyday life.”  Now Diana West declares the adolescents have done precisely that.  In her first book, The Death of the Grown-Up:  How America’s Arrested Development is Bringing Down Western Civilization (c. 2007), she took a critical look at the effective defection of adults from various crucial societal roles.  Whereas many writers have lamented the “prolonged adolescence” plaguing the Western world, West suggests it has become institutionalized!

She began awakening to this fact when still a child, after spending a year with her family in Ireland (far away from her Los Angeles home) while her father worked on a novel.  Returning to America, she was struck by the strangeness of many things she’d earlier taken for granted.  This included the childish behavior of adults.  She began to see that “Once upon a time, in the not too distant past, childhood was a phase, adolescence did not exist, and adulthood was the fulfillment of youth’s promise.  No more.  Why not?  A profound civilizational shift has taken place, but, shockingly, it is one that few recognize” (#79 in Kindle).  Teenagers no longer aspired to become adults, and adults longed to behave like adolescents.  So “father and son dress more or less alike, from message-emblazoned T-shirts to chunky athletic shoes, both equally at ease in the baggy rumple of eternal summer camp” (#124).  Clergymen, once determined to appear as serious adults, now try to dress more casually than day laborers.  In fact, “More adults, ages eighteen to forty-nine, watch the Cartoon Network than watch CNN.  Readers as old as twenty-five are buying ‘young adult’ fiction written expressly for teens” (#97).  

If only such similarities were merely superficial!  But abetted by Hollywood films and rock-and-roll music and pop journalism and progressive education, adults (and particularly fathers) have abandoned their traditional roles.  Whereas children were once duty-bound to care for their parents, now parents are obligated to make life enjoyable for their offspring; children once circled around their parents, but today’s adults orbit like helicopters around their kids.  Before WWII, homes were adult-centered; following the war they became increasingly child-centered.  The signal adult endeavor in centuries past was what Lionel Trilling termed “making a life,” seriously pursued by all mature persons.  Now we are more likely to be concerned with “enjoying life,” playing with our “toys,” and we no longer revere “what goes along with maturity:  forbearance and honor, patience and responsibility, perspective and wisdom, sobriety, decorum, and manners—the wisdom to know what is ‘appropriate,’ and when” (#173).  Consequently, as Mike Males says:  “‘The deterioration in middle-aged adult behavior has driven virtually every major American social problem over the past 25 years’” (#665).  

Beyond describing—with a journalistic flair for telling anecdotes and exaggerations and provocative examples—the various symptoms of societal decay, West seeks to explain what has happened, why America has changed so dramatically in half a century.  She concludes, in accord with (though never citing) some of the past century’s finest thinkers (notably C.S. Lewis and Popes John Paul II and Benedict XVI), that moral relativism is the culprit.  Once we began talking about “values” rather than “virtues” we tossed aside the moral objectivity needed for a healthy society.  We thus inhabit a moral universe that “no longer sees any point in inculcating ‘good’ or ‘moral’ behavior in its young.  Rather, it labors to encourage ‘better choices.’  Instead of virtues to live by, society provides ‘news you can use’ about hygiene, about cliques, about tattoos, about sex, about STDs, about alcohol, about drunk driving, about rape, about gang rape, about date rape, about date-rape drugs, about other drugs . . . the list of vices to bone up on is endless” (#1729).  Never is it suggested that casual sex is bad—it’s just something to be properly informed about in order to make personal (i.e. “safe”) choices.  Above all one must never be “judgmental” or “prudish” or “xenophobic” about much of anything lest it “offend” someone.  “Openness and acceptance on every and any level—from personal to national, from sexual to religious—are the highest possible virtues of the postmodern Westerner.  This makes boundaries and taboos, limits and definition—anything that closes the door on anything else—the lowest possible sins” (#3140).  

This is dramatically evident in today’s multicultural climate, wherein nothing critical of Islam is allowed.  Our claims to avoid offense out of respect are more likely the silence of fear.  As President George W. Bush quickly discovered, no reference to a “crusade” will pass the scrutiny of political correctness.  No one dare suggest that Muslims shouting Allahu Akbar—“God is greatest”—are following Islamic teachings.  None dare insist that Jihad, in Islamic tradition, always means violent aggression, defeating and subduing non-Muslim peoples.  “Terrorists” there may be, we’re told—but they are incidental extremists, a title easily applied to Christians or Jews as well as Muslims.  All religions are equal and thus equally capable of disreputable behavior.  Accepting this kind of thinking, many Westerners have unwittingly submitted to the dhimmitude described by Bat Ye’or:  the guards around synagogues in Europe and the security lines in airports equally denote a people under siege, a culture capitulating “to the infringement of freedom” orchestrated by the advance wave of militant Islam.  

Unfortunately, “Our leaders and pundits, our generals and academics, pay repetitive and obsequious obeisance to ‘noble Islam’ (with never a bow, of course, to ‘noble’ anything else).  They depict jihad as a mutation of Islam—the ‘distorted,’ ‘hijacked,’ or ‘defiled’ practice by the ‘violent fringe’ or ‘tiny band of extremists’—despite jihad’s central, driving, animating role throughout the history of imperial Islam.  As for dhimmitude, it remains an alien concept, even as non-Muslims in the West are increasingly accommodating themselves to Islamic law and practices.  While the president of the United States appears no longer to consider Islam an out-and-out religion of ‘peace,’ he’s settled into an equally ahistorical formulation by delegitimizing jihad violence as ‘the perversion of a few of a noble faith into an ideology of terror and death’” (#4997).  

How unlike Barack Obama was Winston Churchill!  “Sharp and direct, Churchill says what he has seen, and what he thinks about what he has seen—sans gag, filter, rose-colored glasses, or net.”  Commenting on the Muslims he’d encountered, he wrote:  “‘How dreadful are the curses which Mohammedanism lays upon its votaries!  Besides the fanatical frenzy, which is as dangerous in a man as hydrophobia in a dog, there is this fearful fatalistic apathy.  The effects are apparent in many countries.  Improvident habits, slovenly systems of agriculture, sluggish methods of commerce, and insecurity of property exist wherever the followers of the prophet rule or live.  A degraded sensualism deprives this life of its grace and refinement; the next of its dignity and sanctity.  The fact that in Mohammedan law every woman must belong to some man as his absolute property, either as a child, a wife, or a concubine, must delay the final extinction of slavery until the faith of Islam has ceased to be a power among men.  Individual Moslems may show splendid qualities. . . .  But the influence of the religion paralyses the social development of those who follow it.  No stronger retrograde force exists in the world’” (#5027).  Churchill thought as an adult, facing the oft-harsh reality of things.  We need men like him today.  “Eternal youth is proving fatal; it is time to find our rebirth in adulthood” (#5083).

* * * * * * * * * * * * * * * * * * * *

Few books have sent me to check sources and order cited monographs more than Diana West’s American Betrayal:  The Secret Assault on Our Nation’s Character (New York:  St Martin’s Press, c. 2013).  In part this is because she refers to fresh historical evidence regarding Soviet espionage in America, but mainly because she suggests—almost in a stream-of-consciousness style, studded with journalistic jibes and off-the-cuff comments—connections and plausible interpretations that challenged some of the notions I’d earlier absorbed from mainline historical works.  So I review American Betrayal with a real skepticism regarding West’s position conjoined with an admiration for her willingness to look for fresh explanations while trying to understand this nation’s development.  I also share her concern for what George Orwell discerned in 1936 (when writers dealing with the civil war in Spain lost interest in evidence and objective reporting):  “What is peculiar to our age,” said Orwell, “is the abandonment of the idea that history could be truthfully written.”  Still more, he said:  “I saw, in fact, history being written not in terms of what happened but of what ought to have happened according to various party lines.”  

Unfortunately, some writers (such as Orwell) who have tried to present evidence and stand for truth have all too often been ignored or smeared by devotees of various “party lines.”  Ideology easily trumps truth!  Thus Whittaker Chambers declared, in his memorable memoir, Witness:  “The simple fact is that when I took up my little sling and aimed at Communism, I also hit something else.  What I hit was the forces of that great socialist revolution, which, in the name of liberalism, spasmodically, incompletely, somewhat formlessly, but always in the same direction, has been inching its ice cap over the nation for two decades . . .  [This] is a statement of fact that need startle no one who voted for that revolution in whole or in part, and consciously or unconsciously, a majority of the nation has so voted for years.  It was the forces of that revolution that I struck at the point of its struggle for power” (pp. 741-42).  Equally important, Chambers—and Diana West as well—probes beneath the details to a philosophical hypothesis, linking today’s “cultural relativism” to critical decisions made by this nation’s leaders during the past century.  

West’s story begins with the 1934 appearance of William A. Wirt, a famous Indiana schools superintendent, before a select House committee regarding an insidious plot to destroy “the American social order.”  There were, he’d earlier alleged, schemers (notably some of Franklin D. Roosevelt’s “Brain Trusters,” such as Jerome Frank, who brought Alger Hiss to Washington, and Rexford Tugwell, who was positively infatuated by the “Soviet Experiment”) working inside some New Deal agencies.  So Wirt came to Washington to disclose what (based on first-hand information) he knew.  The Democrat-controlled committee, however, refused to grant Wirt a fair hearing, taking every opportunity to suppress his evidence and smear his character.  FDR and his devotees in the press ridiculed Wirt, and he slid quickly into obscurity.  Six years later, however, one of the Democrats on the committee, John J. O’Connor (D-NY), admitted to helping quash Wirt’s testimony and lamented having helped turn the “thumbscrews” on him.  In retrospect, O’Connor said he’d come to believe much of what Wirt had claimed was in fact true.  

That Wirt was right provides Diana West a guiding light whereby to understand how America was first betrayed by supporters of Stalin and his Communist ideology and more recently by defenders of Islam and its role in the world.  Confronting both movements, American leaders seemed unable to deal honestly with evidence and make clear moral judgments regarding how this nation should respond.  She wrote this book primarily to ask what, “throughout eight years of George W. Bush and four years of Barack Obama, caused our leadership to deny and eliminate categorically the teachings of Islam from all official analysis of the global jihad that has wracked the world for decades (for centuries), and particularly since the 9/11 attacks in 2001?” (#273).  She actually finds many “parallels between America’s struggle with Communism and with Islam” (#395).  Indeed:  “As enemies of the West, godless Communism and godcentric Islam are strangely, eerily similar, in their collectivist, totalitarian natures, in their dysfunctional ideological reliance on the Eternal Foe for forward thrust, and, above all, in our blindness to all related and resulting implications of our struggle against them” (#513).  

When telling the story of Communist inroads West relies on significant historical studies done since Soviet archives opened to Western scholars in the 1990s, though her interpretations are sometimes more assertive than those carefully-nuanced works suggest.  She also emphasizes the importance of earlier truth-tellers such as the English historian Robert Conquest, the Russian novelist Alexander Solzhenitsyn, and the American journalist Eugene Lyons.  Conquest’s delineation of Soviet brutality (ca. twenty million killed under Stalin) began with The Great Terror, a 1968 publication countering the generally pro-Stalinist position of academic historians.  Solzhenitsyn’s One Day in the Life of Ivan Denisovich began his career of exposing the Soviet gulag archipelago.  And Lyons’ Red Decade told of Bolshevik inroads into America, while his Assignment in Utopia revealed his transformation from a “committed fellow traveler and dedicated apologist of the Soviet experiment to outspoken and remorseful anti-Communist” (#2253).  

It is now undeniable that Alger Hiss was a Soviet spy and the Rosenbergs transmitted information regarding America’s nuclear research to the USSR.  There’s little doubt that Elizabeth Bentley and Whittaker Chambers were witnesses to the truth in their post-WWII testimony before congressional committees.  But West finds much more regarding Soviet influence within the Roosevelt administration.  She takes seriously the words of a Russian researcher, Vladimir Bukovsky, who said:  “‘Because of the documents I recovered [in Soviet archives], we now understand why the West was so against putting the communist system on trial.  It is not only that the West was infiltrated by the Soviets much deeper than we ever thought, but also that there was ideological collaboration between left-wing parties in the West and Soviet Union.  This ideological collaboration ran very deep [emphasis added]’” (#1288).  

Consider one of the many instances West investigates, the “Soviet First” policy followed by FDR in his Lend-Lease program.  Initially adopted to help England, struggling to defend itself in its “finest hour,” it turned into a massive funnel moving American industrial goods to Russia, even when it meant denying supplies to American forces under Douglas MacArthur, then embattled in the struggle with Japan in the Philippines.  According to Major George Racey Jordan, the officer in charge of distributing massive amounts of war materials from a base in Montana, the Soviets were given “first priority” and received newly-minted airplanes sorely needed by the U.S. Army Air Force.  Implementing Lend-Lease (dubbed by Jordan “the greatest mail-order catalogue”) delivered “to the USSR those half a million trucks and jeeps that Khrushchev declared in 1970 were indispensable to the Red Army sweep across Eastern Europe, pulling the Iron Curtain down behind them” (#2614).  Among other items Jordan shipped to Stalin were the aluminum tubes and uranium needed to build a nuclear reactor.  Conventional historians think supplying Russia with war materials a wise move, necessary to defeat Hitler.  To West, however, it seems better understood as naively arming an evil tyrant, Stalin, who was above all determined to expand his power throughout Eastern Europe.  Drawing upon Jordan’s diaries, along with other sources, she suggests that Harry Hopkins, FDR’s most influential advisor and virtual “co-president,” was primarily responsible for dispatching so much aid (via Lend-Lease) to the USSR.  Indeed, Jordan said, “Harry Hopkins’s name was invoked daily by the Russians” (#3317) seeking to secure additional Lend-Lease supplies.  

Right at the center of the controversial hypotheses highlighted by West in American Betrayal stands Harry Hopkins, labeled FDR’s “one man cabinet” by Life magazine in 1941.  For several years Hopkins, the one-time social worker elevated to cabinet positions by FDR, lived in the White House and constantly advised the President.  He, or his trusted assistants, accompanied FDR to all the important wartime conferences, and his views were clearly shared by the nation’s chief executive.  At the 1943 Tehran Conference, for example, Charles Bohlen remembered, Hopkins played a central role.  “‘Roosevelt was relying more and more on Hopkins, virtually to the exclusion of others.  At Tehran, Hopkins’ influence was paramount’” (#6636).  Illustrative of his eminence, when Hopkins entered a room, Averell Harriman recalled, Stalin “‘got up, walked across the room and shook hands with him.  I never saw him do that to anybody, not even Roosevelt.  He was the only man I ever saw Stalin show personal emotion for’” (#6336).  The men he fostered and supported form a “Who’s Who of the Roosevelt years:  Army Chief of Staff George C. Marshall, White House Chief of Staff Adm. William D. Leahy, Vice President Henry A. Wallace” (#3178).  FDR’s final Secretary of State, Edward J. Stettinius, who represented the U.S. at Yalta, was a loyal Hopkins protégé who had earlier worked within the Lend-Lease organization.  

Though West stops short of definitively branding Hopkins a Soviet agent, she certainly provides incriminating evidence leading to that conclusion.  For example, she cites Oleg Gordievsky, “a former KGB colonel and KGB London chief who later served as an undercover British secret agent in Moscow (1974-85);” in 1990 he “reported that as a young KGB agent in the 1960s, he had heard Iskhak Akhmerov, the most spectacular of the secret Soviet spymasters or ‘illegals’ in wartime America, devote most of a lecture at KGB headquarters ‘to the man who, he alleged, was the most important of all Soviet wartime agents in the United States:  Harry Hopkins’” (#3401).  If Hopkins was, in fact, “the most important of all” agents—surpassing Alger Hiss and the Rosenbergs and Harry Dexter White—he deserves serious scrutiny!  Thus far, however, conventional historians have dismissed allegations regarding Hopkins—probably seeking to preserve FDR’s reputation.  Without further research, I cannot render a verdict on Hopkins—but I’m now curious and willing to entertain questions regarding his role in shaping FDR’s foreign policy.  

West tackles yet another controversy when she deals with the Allies’ wartime decision to open a “second front” in northern France and attain “total victory” against Hitler.  In 1943, given the recent successes of American and English armies in North Africa and Italy, some military leaders (e.g. General Mark Clark, commander of Allied forces in Italy) and Winston Churchill urged a concerted military movement through the Balkans and Austria to the heart of Germany.  It would be a shorter route, benefitting from bases and troops already in place around the Mediterranean.  Since Nazi forces were still mired down in the USSR, these analysts believed a rapid end to the war could be achieved.  Many of them also feared that Stalin wanted to ultimately occupy and control Eastern and Central Europe—something he could not do if the war ended quickly.  He desired, according to the Russian historian Viktor Suvorov, “the war to last as long as possible in order to exhaust both Germany and its Anglo-American opponents.  Stalin was fighting to expand the Communist Empire.  He wanted open-ended war to do so” (#6421).  So “Uncle Joe” Stalin adamantly insisted on an invasion in France, and FDR (strongly influenced by Harry Hopkins) supported the Russian dictator.  Thus D-Day! 

West certainly leans in the direction of “conspiratorial” suspicions, and her work has incited strongly mixed reviews.  But by challenging conventional histories with impressive documentation, she drives us to engage views and scholars worth considering.  

263 “Playing God” Environmentalism

As we grow older, we frequently regret decisions we’ve made and causes we’ve embraced, simply wishing we’d had more wisdom a few decades ago.  Reflecting on this in his Retractations, St. Augustine looked back over his many decades of preaching and writing and found himself somewhat terrified when he considered the words of Jesus:  “Of every idle word men speak, they shall give account on the day of judgment” (Mt 12:36).  He further reflected on the words of James, warning teachers not to use words wrongly.  Augustine noted that even in old age he was less than “perfect,” and he had been further still from perfection in “early manhood,” when he began to “write or to speak to the people, and so much authority was attributed to me that, whenever it was necessary for someone to speak to the people and I was present, I was seldom allowed to be silent and to listen to others and be ‘swift to hear but slow to speak.’”  

In my “early manhood” I was persuaded, by trusted “experts,” that we faced an “ecological crisis” of massive proportions.  With typically youthful enthusiasm I supported the “environmental” movement and invested considerable time and resources championing its message and goals.  I wish I had known then what I now know!  I wish I could have read Alston Chase instead of Rachel Carson and Paul Ehrlich!  Unfortunately, Alston Chase was himself then in the process of learning, to his sorrow, what we both needed to know.  Recently re-reading and again appreciating two of his books helped me clarify why I now consider myself a “recovering environmentalist,” still in love with the wonders of creation but deeply skeptical of those writers (e.g. Rachel Carson and Aldo Leopold), organizations (e.g. the Sierra Club, the Nature Conservancy, and Greenpeace), and politicians (e.g. Al Gore and Barack Obama) who use environmentalism to justify their political agendas.  

Forty years ago Alston Chase left a tenured academic post (teaching the philosophy of science but increasingly disillusioned by the radical student assaults on the humanities in the ‘60s) and moved, with his wife, to a remote ranch in Montana’s Smith River country.  Building a log cabin 50 miles from the nearest town, they lived without electricity or telephone, enthusiastic “back-to-earth” devotees.  When Chase was later given the opportunity to write a book on Yellowstone National Park—a place he intimately knew and passionately loved—they sold the ranch in 1981 and moved to Paradise Valley, Montana, where he began a research project that culminated with the publication of Playing God in Yellowstone:  The Destruction of America’s First National Park (Boston:  The Atlantic Monthly Press, c. 1986).  A rare blend of scholarly acuity and personal passion, his treatise brilliantly illuminates much about the modern environmental movement.  He is particularly effective in analyzing its philosophical roots and New Left politics. 

Chase began his research project planning to celebrate the conservationism long associated with Yellowstone, which was set aside by Congress in 1872 “for the benefit and enjoyment of the people.”  Plunging into the project, he soon grew alarmed at the park’s conditions and management.  Rather than being preserved, it was being destroyed!  Key creatures (including the beaver) that flourished 50 years earlier had disappeared.  “Perhaps no animal was more important in Yellowstone ecology than the beaver,” and without them “the ponds had silted in, spring runoff in the streams had increased, the water table had dropped, and the drier ground was not producing the crop of palatable browse that it supported when the beaver had been there” (p. 13).  What had happened to the beaver?  (In time Chase discovered that the park’s out-of-control elk population had destroyed the beavers’ habitat and driven them from their ancient home!)  

Thus he discovered that many of Yellowstone’s problems stemmed from the unintended consequences of park management.  Since 19th century hunters had depleted the park’s original buffalo and elk herds, game “managers” a century ago determined to restore them.  Once done, however, the bison and elk, free from the predators (including man) which had once limited the size of the herds, rapidly proliferated.  “In thirty years the bison had been saved from extinction only to become a nuisance” (p. 22).  More significantly, the burgeoning elk herd especially threatened other species (such as beaver) in the park.  To address this problem, President Kennedy’s Secretary of the Interior, Stewart Udall, set up a committee which issued the “Leopold Report” in 1963.  Largely attuned to the emergent environmentalism of the day, the committee urged the park be made a “vignette of primitive America.”  Wolves and grizzly bears and mountain lions (but not hunters!) were to be brought back into the park in order to control the elk herd.  

Ironically, Chase says, no one really knows what “primitive America” actually looked like!  In fact, the report “inadvertently replaced science with nostalgia, subverting the goal it had set out to support” (p. 35).  A growing contingent of environmentalists, moving from non-profit organizations such as the Sierra Club into the ranks of the Department of the Interior, dreamed of “saving the wilderness.”  They shrewdly invented a “wilderness” that had never existed, since “there was never a place on earth untrammeled by man” (p. 45).  They reintroduced “predators” that showed little interest in elk, so the persisting elk problem accelerated.  The newly-prescribed “natural-fire” policy—allowing fires ignited by lightning to burn freely—failed to effectively clear the park of dead wood and underbrush.  The fires died out quickly, it was found, because the elk had consumed the dead grasses.  These environmentalists also neglected to notice that Indians, for many millennia, had lived and hunted in the Yellowstone area, significantly impacting the ecosystem, especially by lighting fires to keep “large areas in open grassland, forests from reaching climax, sagebrush from spreading, and many edible plants prolific” (p. 97).  Had modern “scientists” and “environmentalists” studied and followed such Indian practices rather than denying their ancient presence therein, Yellowstone would be much better than it is today!  “Denied its Indian past, it deprived us of the knowledge needed to keep it pristine.  As it turns out, ignoring the Indian was not only bad history, but bad ecology as well” (p. 115).  

After meticulously detailing and explaining developments in Yellowstone, Chase effectively analyzes the “environmentalists” whose philosophy and political activism underlie various of the park’s problems.  Many (if not most) of them are in fact pantheistic religious zealots.  Invoking John Muir rather than Jesus, they revere Emerson more than Moses and turn to Thoreau rather than Isaiah.  With Thoreau they believe:  “In wilderness is the preservation of the world.”  Rejecting the Judeo-Christian faith in a personal God, they embrace nature photographer Ansel Adams’ commitment to “‘a vast, impersonal pantheism’” (p. 304).  Remarkably, in the name of “ecology” they also reject objective, empirical, environmental science!  Rather than attending to evidence regarding the environment, they follow their convictions—all too often derived from spurious treatises such as Rachel Carson’s enormously influential Silent Spring—and insist everything be subsumed under a self-regulating “web of life” perspective.  Along with Carson’s Silent Spring, Aldo Leopold’s A Sand County Almanac (setting forth a celebrated “land ethic”) serves as a sacred text, veritably the Bible of environmentalism.  Citing Carson and Leopold, an alleged “science of ecology” gained momentum, especially among activists without advanced scientific training.  Reflecting this, the “countercultural” historian Theodore Roszak declared:  “The science we call ecology is the nearest approach that objective consciousness makes to the sacramental vision of nature which underlies the symbol of Oneness” (p. 323).  

Roszak, speaking as one of what Chase calls “the California cosmologists,” found in “ecology” a way to salvation for himself as well as the planet.  His The Making of a Counter Culture inspired young folks to join “hippies” and Zen Buddhists and mythical “Native Americans” and mystics of various sorts following a “new vision that sacralized nature” and liberated them from traditional social and moral structures.  “A California Cosmology materialized, coalescing around three overlapping ideas.  The search for a new religion led to the insight that Everything is sacred.  The search for a new science led to the principle:  Everything is interconnected.  The search for a new politics of commitment centered on the belief that Self-transcendence is possible through authentic experience” (pp. 347-348).  Intrinsic to this cosmology is the pantheistic dogma of the self-regulating nature of the natural world, which became environmentalism’s deepest certainty.  If it were true, nature would function perfectly if simply left alone.  Thus elk and bison, left alone, simply could not overpopulate Yellowstone—a mysterious “invisible hand” would sustain them in healthy numbers.  But as careful scientists—biologists in the field rather than students in “interdisciplinary” college classes—studied the evidence, it became clear that nature does not know best!  The deepest conviction of many ecologists stood refuted by the facts.  “Although few were aware, Leopold’s land ethic—now part of the creed of contemporary environmentalism—rested on no foundation at all” (p. 325).  

Sadly enough, for Yellowstone this foundationless “land ethic” became the prescription for the park’s destruction!  The mule deer and antelope, the bighorn and beaver seen by Theodore Roosevelt can hardly be found.  Following the New Philosophy of Nature dictated by environmentalists, park managers rely on the “interconnectedness of things” rather than biological data.  As its managers try to “deep-freeze an ecology” that never existed, indulging in nostalgia rather than empirical investigation, Yellowstone may very well become the “Victim of an Environmental Ideal” (p. 375).  

* * * * * * * * * * * * * * * * * * * *

A decade after issuing Playing God in Yellowstone:  The Destruction of America’s First National Park, Alston Chase published an equally prescient treatise focused on the Pacific Northwest entitled In a Dark Wood:  The Fight over Forests and the Rising Tyranny of Ecology (New York:  Houghton Mifflin Company, c. 1995).  Writing this book drove him to a deeply “disturbing” conclusion:  “An ancient political and philosophical notion, ecosystems ecology masquerades as a modern scientific theory.  Embraced by a generation of college students during the campus revolutions of the 1960s, it had become a cultural icon by the 1980s.  Today, not only does it infuse all environmental law and policy, but its influence is also quietly changing the very character of government.  Yet, as I shall show, it is false, and its implementation has been a calamity for nature and society” (p. xiii).

Ecology masquerading as science gained credibility in 1962 with the publication and popular reception of Rachel Carson’s Silent Spring.  “The book launched the modern environmental movement, which changed the values and politics of the nation” (p. 1).  Those changes were dramatically evident in the Pacific Northwest, where environmental zealots effectively disabled a logging industry that represented an earlier understanding and use of natural resources.  Throughout most of American history, loggers had helped fuel the growth and economic vitality of the nation.  They were a hardy and highly-respected corps of workers.  Blessed with productive soils and abundant precipitation, the Pacific Northwest (with its fast-growing, demonstrably renewable Douglas fir and coastal redwood trees) was producing one-fourth of the nation’s softwood.  Forestry experts (rooted in the empirical science of silviculture) and federal bureaucrats (entrusted with managing public lands) alike declared that logging was good for the forests, a blessing for both the people who used wood and the lands that grew trees.  

And, indeed, America’s forests were growing, producing more trees than were being cut!  “Private timberlands that had been clear-cut earlier in the century were coming back, ensuring a continuing supply in the next century.  As tree planting reached record levels, wildlife, benefiting from improving habitat conditions, flourished” (p. 71).  The more scientists studied the California redwoods, the more they found evidence favoring clearcutting—an abhorrent thought to tree-hugging “old forest” devotees.  Studious silviculturists “found that the more trees they cut, the greater the redwood regeneration.  The undisturbed stands experienced almost no regrowth of any kind and the heaviest mortality; the areas of light selection encouraged resurgence of shade-tolerant grand fir and hemlock.  But in the clear-cuts redwood sprang back in profusion” (p. 225).  By the mid-1980s, especially on carefully managed private lands, “redwood forests were growing as they never had before” (p. 225).  

An alternate approach to the forests, however, dramatically surfaced on the first “Earth Day” in 1970, featuring a parade of 100,000 people walking up New York’s Fifth Avenue.  Rooted in the pantheistic visions of Emerson and Thoreau, of John Muir and Ansel Adams, shaped by the shifting paradigm of colleges and universities, and sharing the ethos of the ’60s New Left, radical “ecologists” championed “preserving” rather than “conserving” Mother Nature.  Rejecting Western Christian Civilization, they imagined themselves capable of inventing a “new,” far better civilization that respected and followed the “web of life” portrayed by Rachel Carson-type ecologists.  “A new era was dawning in which not sustainability but aesthetics and the desire to maintain forests in their ‘natural’ state would be paramount, and increasing numbers of the public would perceive efficient forestry as an oxymoron.  Forests would be seen by many as cathedrals in which to worship a new god” (p. 74).  

Their worship would be empowered by federal legislation, especially the 1973 Endangered Species Act—termed “a law for all seasons” for its unclear and easily expanded provisions.  The congressmen drafting the law had minimal biological understanding but maximal confidence in what they’d heard about “ecosystems” and “ecology” and “biodiversity” and “the balance of nature.”  They took seriously the pronouncements of “ecosystems ecologists” such as Barry Commoner and environmental activists overflowing with “fuzzy, pantheistic, and animist notions of the unity and spirituality of nature” (p. 103).  “Nature knows best,” Commoner declared in The Closing Circle, and multitudes believed him.  “Few noticed there was little evidence for the doctrine.  Ancient philosophical ideas, resurrected by the government as a means of control and masquerading as science, had captured the public imagination, producing an Endangered Species Act whose consequences no one could anticipate” (p. 104).  

Pushing beyond Commoner’s “nature knows best” mantra, Bill Devall (yet another California professor) delved into the notions of “biospherical egalitarianism” and “deep ecology”; therein he picked up a “sledgehammer of an idea” with which he wanted “to change the world” (p. 120).  Probably unaware of the idea’s 19th century roots (in G.W.F. Hegel and Ernst Haeckel), Devall embraced a monistic philosophy that erased significant differences between human beings and other creatures.  Thus all living creatures are equal and merit respect if not reverence.  If everything is interconnected and equal, humans neither differ from other organisms nor have special rights.  Indeed, those “ecosystems” that constitute “fundamental units of existence” may be more entitled to protection than humans.  Followed by a variety of back-to-earth enthusiasts, “deep ecologists . . . unwittingly embraced ideas that synthesized an American religion of nature with German metaphysics:  a holism that supposed individuals were only parts of a large system and had no independent standing; antipathy to Judaic and Christian values of humanism, capitalism, materialism, private property, technology, consumerism, and urban living; reverence for nature; belief in the spiritual superiority of primitive culture; a desire to go ‘back to the land’; support for animal rights; faith in organic farming; and a program to create nature reserves” (p. 129).  

Thus armed intellectually, radical environmentalists turned their eyes on the Pacific Northwest and determined to “preserve” it in accord with their idyllic image of an “archaic,” primitive forest, unsullied by the hand of man.  And they found an effective tool with which to accomplish their ends—the spotted owl.  Chase’s meticulous examination of the spotted owl should be read by anyone seriously concerned with America’s environment!  With virtually no scientific basis, activists effectively persuaded both the public and the judicial system that the owl was an “endangered species” that needed vast amounts of “old-growth forests” to survive.  Though only 14 owls (that’s right:  14!) provided the basis for the first report on them, the political battle to save the trees (and the owls) was launched.  When carefully examined, another influential paper (by Russell Lande) proves to have been “an exercise in scientific wool-gathering, a collection of calculations based on scanty evidence and laced with false assumptions” (p. 247).  By promoting Lande’s paper as bona fide science, “environmentalists captured the political ground while simultaneously writing a new chapter in the continuing corruption of science” (p. 248).  Ever more “species” and spurious “sub-species” were declared to be endangered—the Pacific yew tree, the marbled murrelet, several kinds of salmon, sparrows, beetles, and trout.  With amazing rapidity, the federal government (primarily through the courts) moved to stop logging throughout much of the Pacific Northwest, bankrupting small logging firms and devastating scores of once-prosperous communities.  Biocentrism reigned!  “Even as evidence accumulated” to the contrary, true believers such as Dave Foreman and his “Earth First!ers” pushed their way into those “positions of power and prestige” that shaped the nation’s future (p. 173).  “Emotion and plausibility, not truth, count in politics” (p. 226).  

Earth First!ers and their allies (notably Judi Bari and her Wilderness Women) used a variety of tactics—spiking trees, staging protests, lobbying politicians, enlisting radicalized professors, filing endless lawsuits.  Chase carefully describes the activists and their proclamations, showing how they were consumed by their biocentric philosophy.  They were in effect waging a class war to defend the forests.  Representing the upper-middle class and supported by affluent city dwellers filling the coffers of elitist environmental organizations such as the Wilderness Society and the Environmental Defense Fund, the activists overwhelmed the generally less educated and significantly poorer loggers, ranchers, and farmers living on the land.  More importantly, by 1990 “biocentrism had become the philosophy of America’s ruling classes” (p. 359).  Journalists, professors, bureaucrats, elementary teachers, and recycling devotees were all passionately committed to a spurious creed that cited computer models resting on “ecosystem” assumptions rather than empirical evidence.  “Teaching that humanity was destroying the earth, they spread fear of global warming, ozone depletion, acid rain, dioxin, asbestos, and indeed anything that was new, was made by humans, or signified change” (p. 362).  

With the election of President Bill Clinton (and his fear-mongering Vice President Al Gore) in 1992, biocentrism spread throughout all levels of the nation’s polity.  “Save the trees!  Save the forests!  Save the fish!  Save the woods!”  Such words, chanted as a “catechism” by Denis Hayes in a 1993 “rock concert for trees” shortly before Clinton presided over a “Forest Conference” in Portland, demonstrated the triumph of the modern environmental movement.  Though Clinton himself “was another reed blowing in the ideological wind,” he adroitly aligned himself with the Gore-style biocentrists, filling “his administration with apostles of the new order” (p. 384).  Like-minded scientists were relied on as “experts,” and activists in various environmental organizations were appointed to head bureaucracies.  “Sustainable development” became the slogan for minutely supervising “every square inch of American real estate” (p. 389).  

262 Israel Today:  “Making David into Goliath”

Anyone seeking to understand Israel today should carefully study Joshua Muravchik’s Making David into Goliath:  How the World Turned Against Israel (New York:  Encounter Books, c. 2014).  A Fellow of the Foreign Policy Institute of the Paul H. Nitze School of Advanced International Studies (SAIS) of the Johns Hopkins University, Muravchik has solidly established himself as a serious scholar determined to chart the historical roots of contemporary affairs.  (His 2002 treatise, Heaven on Earth:  The Rise and Fall of Socialism, remains one of the best analyses of perhaps the most powerful political movement of the past century.)  As the subtitle of his most recent publication indicates, he wants to help readers better understand the remarkable reversal of world opinion regarding the state of Israel.  Far more than describing what’s transpired in the past half-century, he seeks to explain why things have happened.

And though he focuses on Israel, much that he says clearly parallels what’s taken place in America.  To be precise:  the same leftist agenda that has harmfully impacted the only outpost of Western Civilization in the Mideast has successfully subverted much that has traditionally upheld the American way.  During the first 25 years of its existence, the state of Israel enjoyed rather widespread world support and admiration.  Then things began to change.  In part this resulted from the sheer power Muslims wielded by virtue of their numerous oil deposits.  They began exerting influence on the United Nations and those nations dependent on them to fuel modern economies.  More importantly, Muslims benefited from “an ideological transformation that saw the rise of a new paradigm of progressive thought that Arab and Muslim advocates helped to develop.  It involved multiculturalism or race-consciousness in which the struggle of the third world against the West, or of ‘people of color’ against the white man, replaced the older Marxist model of proletariat versus bourgeoisie as the central moral drama of world history” (#147).

Following WWII, largely because of the Holocaust, there was a “reservoir of sympathy for the Jews wider and deeper than they had known over the millennia” (#285).  Conversely, the Arabs (who had generally sided with the Nazis) were disliked if not scorned.  Yet as they succeeded in establishing the new nation of Israel, the Jews—successfully branded Zionists—began to elicit increasing criticism.  Following the Six Day War, an astonishing military triumph, propagandists began to successfully portray Israel as a brutal Goliath pulverizing homeless and helpless Palestinians.  “The altered perspective that made Israel look big instead of small was accompanied by a shift in ideological appearances that was no less important.  The Arab states were seen as autocratic and reactionary.  But the groups that came to speak for the Palestinians presented them as members of the world’s ‘progressive’ camp” (#522).  Emerging as the Palestinian spokesman was Yasser Arafat, mentored by his distant relative, Haj Amin al-Husseini—the Grand Mufti of Jerusalem who spent the war years in Berlin broadcasting vicious anti-Semitic screeds—and early aligned with the Muslim Brotherhood.

         Determined to secure politically what Arab armies had failed to accomplish militarily, Arafat shrewdly ingratiated himself with Leftists in Russia, China, North Vietnam, Cuba, Europe and America. Though not a particularly devout Muslim, “he could channel Das Kapital and the holy Koran with equal conviction. ‘Our struggle is part and parcel of every struggle against imperialism, injustice and oppression in the world,’ he affirmed. ‘It is part of the world revolution which aims at establishing social justice and liberating mankind'” (#739). He especially admired the North Vietnamese, who had politically defeated America despite militarily losing the war. Almost overnight Israel lost “the public relations gift of opponents who were collaborators of Hitler and Goebbels; now they faced the comrades of such chic, romanticized figures as Ho Chi Minh and Che Guevara. Not only had David become Goliath, but on the other side the frog had become a prince” (#822).  

Arafat early envisioned and implemented the use of terror (working through the Palestine Liberation Organization and its Fatah faction) to accomplish his ends—reviving tactics earlier used in Arab uprisings in the 1920s and 1930s.  PLO gangs such as Black September hijacked airplanes, holding passengers and crews hostage in order to exact huge cash ransoms or effect the release of Palestinian prisoners in Israeli jails.  (In 1972 one hijacked plane was successfully stormed and liberated by an elite squad of Israelis including two future prime ministers—Ehud Barak and Benjamin Netanyahu.)  In that same year, eight Black September terrorists attacked the Olympic residence hall of Israeli athletes in Munich, killing two and taking nine hostages.  They demanded a plane to fly them out of Germany, but a firefight at the airport resulted in all the hostages’ deaths as well as several terrorists’.  Though Arafat himself always posed as uninvolved in such terrorist acts, it now “seems clear that Abu Iyad, one of Arafat’s two oldest colleagues and top aides, was the chief of Black September, and that blood-soaked group, at first mysterious in its origins, turned out to be nothing other than Fatah in disguise” (#1103).

Following the Munich attack, Israel’s Mossad methodically hunted down and killed all the surviving terrorists.  America’s Secretary of State, Henry Kissinger, reacting to the killing of some Americans, sent a spokesman to inform PLO representatives “that this killing of Americans has got to stop—or else . . . torrents of blood will flow, and not all of it will be American.”  They add that this “blunt message astonished his listeners [who] had not expected to hear such a direct threat from an American official.”  Kissinger says that after this, “attacks on Americans—at least by Arafat’s faction of the PLO—ceased” (#1141).  Europeans, however, sought to passively placate the terrorists.  “Such appeasement had a corrosive effect on the spirit of Europe, as almost always happens when people bow to threats and violence rather than finding the courage to stand up to them” (#1153).  Arab oil—and the threat of its reduced flow, as was manifest in the 1973 embargo and subsequent recession—brought Europeans to their knees.  Many countries (including 30 out of 33 black-ruled states in Africa) broke ties with Israel in order to bolster their standing with Arabs.

Arab ascendancy in Europe was paralleled by triumphs in the United Nations.  As the UN welcomed delegations from scores of new nations—many of them former European colonies—the balance of power quickly shifted.  Given an opportunity to excoriate the rich and powerful, third world delegates engaged in endless rhetorical attacks on America and the West.  Israel, identified as a Western outpost amidst an ocean of Arabs, was selected as a special target of abuse.  In 1974, the Arab states introduced “Palestine” as an item for consideration and the General Assembly invited the PLO to participate in its discussions.  “No one who was not a representative of a government except the Pope . . . had ever before been granted such a privilege, but the vote was overwhelming, 105 to 4” (#1481).  Representing the PLO was Yasser Arafat, accompanied by “none other than Ali Hassan Salameh, the commander of the Munich Olympics massacre” (#1496).  Brashly disregarding UN protocol, Arafat kept his pistol on his hip and pointedly called for the elimination of the state of Israel.  He also effectively cultivated the strategy of equating “Zionism” with “racism” (a theme quickly spelled out in a UN resolution), thus enrolling all the modern multiculturalists for whom it is virtually the only serious sin.  His “bloodthirsty harangue was greeted in the temple of nations with a standing ovation the likes of which had perhaps never been heard there before” (#1513).

In successive decades, the U.N.’s General Assembly has routinely passed resolutions decrying a laundry list of spurious Israeli “crimes” and “abuses.”  Still more:  various U.N. agencies (effectively aided by well-funded “non-profit” organizations such as Amnesty International and mainline Protestant denominations) actively work to promote the Palestinian cause both in the Mideast and around the world.  “The conclusion is inescapable.  By its countless one-sided resolutions and numerous ‘investigations’ of Israel with predetermined results; by providing a global infrastructure for the movement to boycott, divest from, and sanction Israel; and by UNRWA [the United Nations Relief and Works Agency], which sustains the idea of the ‘right of return,’ the United Nations has served systematically to challenge Israel’s legitimacy and weaken its global position.  This is the crucible of Israel’s demonization” (#1616).

Anti-Israeli rhetoric and maneuvers especially thrive in socialistic environments, whether intellectual or political.  Well-equipped for the task, Muravchik effectively recounts 20th century developments that led to the founding of the new nation of Israel as a thoroughly socialistic state, thereby garnering considerable enthusiasm amongst egalitarian devotees around the world.  The Labor Party, the kibbutzim, the general mood of the infant nation thrilled many who envisioned a socialist Utopia minus the negativities of Stalinist Russia and Maoist China.  “As a kind of socialist model, Israel enjoyed great prestige within the halls of the Socialist International” (#1847).  In time, however, European socialists (such as Austria’s Bruno Kreisky and West Germany’s Willy Brandt) turned away from Israel and cultivated ties with revolutionary movements throughout the “global South,” supporting the likes of Fidel Castro and, naturally, Yasser Arafat.

Kreisky and Brandt clearly represented significant changes in the socialist world.  The “New Left”—birthed in 1968 by European “revolutionaries” and in America by “counter-cultural” agitators such as Tom Hayden and Bill Ayers—quickly infiltrated and transformed Western institutions.  They especially targeted universities—violently seizing control of facilities, imposing demands on administrations, turning campuses into centers of political activism rather than intellectual discipline.  “The books and ideas that for generations were regarded as the backbone of Western civilization were now systematically ‘deconstructed.’  Moses and Jesus, Plato and Aristotle, Augustine and Aquinas, Shakespeare and Tolstoy, Locke and Burke, Hamilton and Jefferson were exposed as but so many ‘dead white males’ whose principal importance was to perpetuate the hegemony of their race, class, and gender.  At long last, their victims’ day had come, and the study of their oppression and resistance replaced the traditional ‘canon’ on the front stage of higher education” (#2106).  Reflecting this transformation, “Jean-Paul Sartre, once an orthodox Stalinist, gave voice to this profound rewrite of leftist canon in his preface to Frantz Fanon’s The Wretched of the Earth.  ‘Natives of all under-developed countries, unite!’ he wrote.  The riveting movement for civil rights of blacks in America melded with the global anticolonial cause to create a larger image of ‘the rest against the West,’ or rather against the White West” (#3769).

Helping orchestrate this New Left agenda was the late Columbia University Professor Edward Said, an American totally devoted to the Palestinian cause.  His works are required reading in nearly a thousand university syllabi.  Entire courses are devoted to him in top-tier institutions such as UCLA and Georgetown University.  Influential leftists, notably Noam Chomsky, took his view of Israel and the Arabs as their own.  Said made his mark in 1978 with the publication of Orientalism, a work perfectly attuned to an era dominated by white guilt for racism.  A year later he published The Question of Palestine, calling for the liberation and self-determination of the oppressed Arabs residing in Israel-controlled lands.  Given his influence, Said deserves examination—something Muravchik does diligently.

Though he glibly postured as a “Palestinian,” Professor Said “largely falsified his background” (#2194).  Said was in fact born to wealthy parents in Cairo, Egypt, and lived there until moving to the United States at the age of 15, where he received an elite education (including degrees from Princeton and Harvard) and remained for the rest of his life.  His dishonesty extended from his autobiographical materials to “all his work, beginning with the most influential, Orientalism” (#2272).  He distorted or ignored evidence and misused his sources (routinely skewing quotations).  At one time these were serious academic sins, but they are easily tolerated in the “postmodern” university, suffused as it is with skepticism, relativism, and nihilism.  Because Said castigated white people as racists, redefining Arabs as oppressed persons of “color” and dressing up his “malignant charlatanry” with academic jargon and oblique references to celebrities such as Foucault, he enjoyed a unique status in the academic world, providing him a platform from which to rebrand Israel as a Goliath walking roughshod over poor Palestinians.

As if dealing with enemies abroad were not enough, Israelis faced mounting internal dissension, largely replicating the New Left’s agitation in Europe and America.  As “the left turned against Israel it was inevitable that Jews would appear in growing numbers among Israel’s fiercest critics” (#4077), generally styling themselves “anti-Zionists.”  They detested the Zionism personified by one of the nation’s founders, Menachem Begin (the powerful leader of the Likud Party and sometime prime minister), who “believed to his core that ‘the Jewish people have an eternal historic right to the Land of Israel’” (#2676)—thus envisioning a nation with the geographic boundaries established under David in the Old Testament.  More secular Israelis worked to establish a compromise with Palestinians leading to a “two-state” position.  And some “anti-Zionists” even promoted a one-state solution, giving Palestinians full control of the nation!

Thus MIT’s Noam Chomsky, an influential American leftist, “long advocated the replacement of Israel with a bi-national socialist state along the lines of what he called the ‘successful social revolution’ of Communist Yugoslavia” (#4100). Political battles between these factions “proved to be an inexhaustible resource for Israel’s enemies, much as the Vietnam War gave rise to an ‘adversary culture’ in America that stoked an anti-Americanism that strengthened the hand of Communist forces” (#2898). Anti-Zionist academics (including “New Historians” who debunked the nation’s official version of its founding) and journalists (some cultivating a “prophetic” rather than reportorial stance) especially aired their discontent with the nation’s policies, promoting a “peace movement” that triumphed in 1993 with the Oslo Accords.

Prime Minister Yitzhak Rabin, who championed the “peace” attained between Jews and Palestinians at Oslo, was gunned down at a Peace Now rally in 1995.  Five years later Yasser Arafat unleashed his intifada, with suicide bombers blowing up buses and restaurants.  We now know “that once the intifada began, Arafat’s forces released from custody hundreds of terror operatives belonging to Hamas and Islamic Jihad whom the Palestinian Authority had incarcerated under the system of security cooperation with Israel that had been a cornerstone of the peace process” (#4311).  It became clear that large numbers of Palestinians, led by the Nobel Peace Prize recipient Arafat himself, wanted to destroy Israel rather than establish an independent state of their own.

Suddenly some of the Peace Now supporters had second thoughts!  Benny Morris, the professor who coined the term “new historians,” said:  “The bombing of the buses and restaurants really shook me.  They made me understand the depth of the hatred for us.  They made me understand that the Palestinian, Arab and Muslim hostility toward Jewish existence here is taking us to the brink of destruction.  I don’t see the suicide bombings as isolated acts.  They express the deep will of the Palestinian people.  That is what the majority of the Palestinians want.  They want what happened to the buses to happen to all of us.”  This led him to declare:  “There is a deep problem in Islam.  It’s a world whose values are different.  A world in which human life doesn’t have the same value as it does in the West, in which freedom, democracy, openness and creativity are alien.  A world that makes those who are not part of the camp of Islam fair game” (#3257).

As Muravchik’s analysis of Israel’s intelligentsia makes clear, modern Israel is generally “on the wrong side of the left’s new paradigm.”  Pro-Palestinian activists such as Rachel Corrie (a 23-year-old American, fresh from her studies at Evergreen State College, working for the International Solidarity Movement [ISM], who died under the treads of a military bulldozer while trying to “non-violently” stop Israel’s clearing of ground meant to deter intifada infiltrations) stirred up anti-Israeli sentiments around the world.  To do so effectively, ISM distributed “a doctored photo display intended to show that the Israeli bulldozer had struck her deliberately” (#3730).  Another American ISM volunteer, Richard Hupper, contributed $20,000 to Hamas, thus supporting that terrorist group’s effort to destroy Israel.  To Hamas, “Jihad is our way.  Dying in the way of Allah is our highest hope” (#3718).  While rockets rain down and grenades are thrown at them, Israelis must deal with protesters such as Hupper and Corrie who insist the Jews are the provocateurs, the occupiers, the villains in the Mideast.

Thinking about Rachel Corrie’s work in Israel, Muravchik is perplexed:  “The delicate child who admonished herself not to step on a flower, who could not endure the thought of whales dying or trees being felled, exhibited cold indifference to the death of Israelis.  What force was it that had wrought such a transformation?” (#3724).  It seems clear that what Eric Hoffer described in The True Believer applies to her.  She (like many before her in the French and Russian and Cuban revolutions) had embraced a leftist “creed” that bred “fanaticism, enthusiasm, fervent hope, hatred and intolerance” (#3724).  She clearly shared what Milovan Djilas—a leader in making Yugoslavia Communist before being sent to jail for some deviant thoughts—discerned as a blind faith that “they have been named by a higher power, which they call history, to establish the Kingdom of Heaven in this sinful world” (#374).  To true believers like Corrie, “the favored groups—blacks, browns, former colonials—were not merely objects of sympathy; they were regarded as the vessels of universal redemption” (#3775).

However demonstrably misguided and meretricious they may be, leftists such as Rachel Corrie have effectively placed “Israel in the Dock.”  Jews rather than Arabs are called upon to justify their policies—indeed, to justify their very existence!  Western news agencies, in their portrayal of Israel, are particularly committed to this approach, routinely filming incidents staged to portray Palestinians as victims.  Academics and churchmen tout boycotts of Israel as a way to liberate the oppressed Palestinians.  Thus the famed physicist Stephen W. Hawking withdrew from a 2013 scholarly meeting in Jerusalem in order to demonstrate his “solidarity” with oppressed Arabs.  His sensitive conscience was, however, apparently untroubled when he keynoted a conference in China, proudly appearing in Beijing’s Great Hall of the People and demonstrating his indifference to (if not support of) the Communist dictatorship.

With the likes of Stephen Hawking—and Jimmy Carter and Desmond Tutu—condemning Israel, the nation today stands truly imperiled.  “For all its might, Israel remains a David, struggling against the odds to secure its small foothold in a violent and hostile region.  The relentless campaign to recast it instead as a malevolent Goliath places it in grave peril” (#4853).  A “second holocaust” is, in fact, not only desired by millions of Muslims but quite possible.  The sheer number of Arabs and their oil-based wealth certainly threaten the existence of the tiny Jewish state.  But more threatening “is the intellectual power of the contemporary Leftist paradigm” that denies Israel’s validity, consigning her “to the side of darkness and villainy, even in the face of the reality that, measured by the Left’s nominal values—freedom, democracy, tolerance of racial, religious, and sexual diversity, equality of status for women, generosity to the needy—Israel is among the world’s best countries and its enemies rank among the worst” (#4794).

261 Britain’s Best-Known Dissident: Roger Scruton

Considered by some “Britain’s best-known intellectual dissident” for his staunch defense of such English traditions as fox-hunting, Roger Scruton is a philosopher who has flourished as a writer, routinely lecturing at universities without making a career as a tenured member of the professoriate.  Thus his writings, while addressing the timeless concerns of a philosopher, are much more accessible and wide-ranging than those of his peers.  Nevertheless, he was invited to give the Gifford Lectures (without question the most prestigious forum for philosophers concerned with religion) in 2010 and published them under the title The Face of God (London:  Continuum International Publishing Group, c. 2012).

         Distressed by the “consequences of the atheist culture that is growing around us”—rejecting both God’s Reality and any morality rooted in His Being, thereby escaping “the eye of judgment by wiping away” His face—Scruton endeavors to address questions awakened by those experiences that provoke us to deal with our own “consciousness, judgment, the knowledge of right and wrong, and all the things that make the human condition so singular” (p. 8). While acknowledging the legitimacy and importance of cosmological evidences regarding God’s existence, he prefers to focus on psychological clues pointing to His Presence.  He is the One to Whom we pray.  “He is in and around us, and our prayers shape our personal relation with him” (p. 13).

To better understand this, Scruton invites us to consider “the meaning of three critical words:  ‘I,’ ‘you’ and ‘why.’  And in exploring those words I shall be constructing a general theory of the face:  the face of the person, the face of the world, and the face of God” (p. 23).  If God is a Person, we might best engage Him through dialogue, intentionally interacting with Him in ways that defy purely naturalistic explanations.  As we reflect on our mysterious ability to communicate in languages, both verbal and mathematical, we enter into a realm of reality unobservable to empirical science, a subjective world full of distinctively personal thoughts and judgments and decisions.  Inwardly we know we are free to think and love and act; we know we are more than biological automata following a pre-determined scheme.  “So maybe God is a person like us, whose identity and will are bound up with his nature as a subject” (p. 45).  As a person He can say “I” and interact with other persons such as I.  

When I say “I” something important is manifest.  I identify myself as a unique being within a world of beings.  I think about yesterday’s weather and today’s schedule and tomorrow’s uncertainties, all freely associated within my mind.  I’m also aware of certain moral judgments and responsibilities accompanying my thoughts.  I am, in short, self-conscious in ways unknown in the purely animal world.  And I recognize, as a self-conscious person, other persons with whom I discourse, to whom I am accountable, and who should be accountable to me.  Such persons are known to me almost exclusively through their faces, “the outward form and image of the soul, the lamp lit in our world by the subject behind it.”  It is through understanding the face that we begin to see how it is that subjects make themselves known in the world of objects (p. 72).  Indeed, “the face is the subject, revealing itself in the world of objects” (p. 80).  In their spontaneous smiles and laughs and tears and blushes and deeply expressive eyes we intuitively know truths about persons we encounter.  In loving relationships we enjoy communion with other persons.  On the other hand, “Fashion models and pop stars tend to display faces that are withdrawn, scowling and closed.  Little or nothing is given through their faces, which offer no invitation to love or companionship.  The function of the fashion-model’s face is to put the body on display; the face is simply one of the body’s attractions, with no special role to play as a focus of another’s interest” (p. 107).  

So how might we see the face of God?  As human beings we are deeply troubled by the guilt, disgrace, sorrow and death that result from decisions freely made in the past.  We long for forgiveness and restoration within the community of persons.  We also crave immortality.  Thus a multitude of religious rites and practices have developed within human history, and some of us now and then discern, in “sacred moments,” a supernatural reality beyond the natural world.  “All sacred moments are moments of gift—of gift revealed as the way things are.  The distinctiveness of the Christian Eucharist is that it makes this wholly specific.  The Eucharist commemorates God’s supreme gift, which is the gift of himself—his own descent into the world of suffering and guilt, in order to show through his example that there is a way out of conflict and resentment—a way to restore through grace the givenness of the world” (p. 172).

For Scruton, the Christian message of God-in-Christ revealing Himself as agape love “gives the greatest insight into our situation,” and “the I that gives itself opens a window in the scheme of things through which we glimpse the light beyond—the I AM that spoke to Moses” (p. 172).  He IS—and in Christ He is really present, really with us (Immanuel, God with us). 

                                                                   * * * * * * * * * * * * * * * * * * * *   

In The Soul of the World (Princeton:  Princeton University Press, c. 2014), Roger Scruton returns to (and reinforces) themes earlier treated in The Face of God:  discerning the Real Presence of God, the mystery basic to mystical experience, divine revelation, and liturgical worship.  To this the celebrated mathematician and philosopher Blaise Pascal gave witness following his nuit de feu, “the night of 23 November 1654 when, for two hours, he experienced the total certainty that he was in the presence of God—‘the God of Abraham, of Isaac and of Jacob, not the God of the philosophers and the wise men’—in other words a personal God, intimately revealed, not conjured by abstract argument.  Père juste, le monde ne t’a point connu, mais moi, je t’ai connu [GR translation:  righteous Father, the world has not known you, but I myself, I have known you], he wrote then, on the scrap of paper on which he recorded the experience:  astonishing words, which only total conviction could have engendered” (p. 12).  

To share Pascal’s conviction in the 21st century requires us first to deal with the highly influential and strident claim of evolutionary psychology (e.g. Patricia Churchland’s Neurophilosophy) that reduces religious reflection and affection to simple chemical processes within the brain.  Scruton endorses Mary Midgley’s dismissal of the “nothing buttery” that reduces “emergent realities to be ‘nothing but’ the things in which we perceive them.”  To the nothing buttery coterie, “the human person is ‘nothing but’ the human animal; law is ‘nothing but’ relations of social power; sexual love is ‘nothing but’ the urge to procreation; altruism is ‘nothing but’ the dominant genetic strategy described by Maynard Smith; the Mona Lisa is ‘nothing but’ a spread of pigments on a canvas; the Ninth Symphony is ‘nothing but’ a sequence of pitched sounds of varying timbre.  And so on.  Getting rid of this habit is, to my mind, the true goal of philosophy” (p. 39).  

Skillfully rejecting such reductionism, Scruton insists “that functional explanations of the evolutionary kind have no bearing on the content of our religious beliefs and emotions” (p. 3).  Much more than matter-in-motion distinguishes us human beings.  We may very well function as animals in many ways (eating, sleeping, copulating), but in our minds we wonder about things true, good, and beautiful; we ponder what philosophers call “qualia” and do math not merely because we want to measure distances but because of the sheer beauty of intricately balanced equations.  We think morally, “reaching beyond” the evolutionary struggle to survive, discerning ethical norms and reasons for proper behavior.  And we also speak coherently in highly complex ways, far beyond the capacity of other animals.  Importantly, “Language enables us to distinguish truth and falsehood; past, present, and future; possible, actual, and necessary, and so on.  It is fair to say that we live in another world from nonlinguistic creatures.  They live immersed in nature; we stand forever at its edge” (p. 5).

Standing forever at nature’s edge, we sense another world, a transcendental realm of realities (theological and ethical as well as mathematical and musical) more vital than the material things we touch and taste.  We experience what Scruton repeatedly refers to as a “cognitive dualism,” somewhat akin to Aristotle’s “hylomorphism,” understanding one Reality in two equally valid ways.  Situated at this horizon—immersed in sacred places, repeating sacred chants, celebrating sacred rites—we open our inner being to the timeless realm of God, hungering for a face-to-face encounter with Him.  Religious aspirations are truly perennial, deeply embedded in human nature.  Thus Scruton says:  “The real question for religion in our time is not how to excise the sacred, but how to rediscover it, so that the moment of pure intersubjectivity, in which nothing concrete appears, but in which everything hangs on the here and now, can exist in pure and God-directed form.  Only when we are sure that this moment of the real presence exists in the human being who experiences it, can we then ask the question whether it is or is not a true revelation—a moment not just of faith but of knowledge, and a gift of Grace” (p. 23).

To Scruton, evidence for God’s existence may be found primarily in the psychological, rather than the cosmological, realm.  Probing the depths of human consciousness and personal relationships, rather than the limits of outer space, brings us in touch with the One Who Is.  I primarily identify myself as a person—“an individual substance of a rational nature,” according to Boethius; “I know that I am a single and unified subject of experience” (p. 72).  Interacting with other persons, I use terms such as good and beautiful, tragic and comedic, necessary for the I-You relationships requisite for us.  In these relationships “ideas of the self and freedom cannot disappear from the minds of the human subjects themselves.  Their behavior toward each other is mediated by the belief in freedom, in selfhood, in the knowledge that I am I and you are you and that each of us is a center of free and responsible thought and action” (p. 64).  “Each human object is also a subject, addressing us in looks, gestures, and words, from the transcendental horizon of the I.  Our responses to others aim toward that horizon, passing on beyond the body to the being that it incarnates.  It is this feature of our interpersonal responses that gives such compelling force to the myth of the soul, of the true but hidden self that is veiled by the flesh” (p. 74).  So too we may interact with God as a Person.

Thus we find the Hebrew Scriptures celebrating God’s covenant relationship with his people.  Almighty God established “a binding agreement, in which God commands obedience only by putting himself under obligations toward those whom he commands.  The idea that God can be bound by obligations toward his creation has had a profound impact on our civilization, since it implies that God’s relation to us is of the same kind as the relations that we create through our promises and contracts.  Our relation to God is a relation between free beings who take responsibility for their actions.  And the simplest form that such a relation can take is that of an exchange of promises—a form that has been recognized by the law since ancient times” (p. 78).  Consequently, Scruton says, if we think through the implications of this divinely-designed covenant “we will arrive at the ancient concept of natural law:  the concept of a law inscribed in human reason itself, and which issues precisely from our disposition to bind ourselves in free agreements and to live with our neighbors on terms.  There is, as I prefer to put it, a ‘calculus of rights, responsibilities, and duties’ that is inherent in our search for agreement, and this calculus lays down the constraints that must be obeyed, if we are to arrive at a consensual political order” (p. 81).

This “natural law” is not the law of physics or biology, for it transcends them.  It reveals to us a deeper—or higher—realm of reality and truth regarding who we are and what we should do as persons.  It prompts us to enjoy people as persons rather than use them as things.  It aligns us with a deeply religious realm wherein we find permanent things—the things that matter most.  We are thus capable of discerning sacred spaces (e.g. the “music of the spheres”) and designating sacred things (e.g. temples and cathedrals).  This “experience of the sacred is interpersonal.  Only creatures with ‘I’ thoughts can see the world in this way, and their doing so depends upon a kind of interpersonal readiness, a willingness to find meanings and reasons, even in things that have no eyes to look at them and no mouth to speak” (p. 134).  Rightly experienced, “The ‘order of the Covenant’ emerges from the ‘order of nature’ in something like the way the face emerges from the flesh or the movement of tones from the sequence of sounds in music.  It is not an illusion or a fabrication, but a ‘well-founded phenomenon,’ to use the idiom of Leibniz.  It is out there and objectively perceivable, as real as any feature of the natural world” (p. 175).  And it comes to us from God, who is the “soul of the world”—the “all-knowing subject who welcomes us as we pass into that other domain, beyond the veil of nature” (p. 198).

                                                           * * * * * * * * * * * * * * * * * * * * * * *

In Gentle Regrets:  Thoughts from a Life (New York:  Continuum, c. 2005), Roger Scruton reflects on those “uncomfortable truths” that have in fact given him lasting “comfort.”  Many of them were early found in classic books such as John Bunyan’s Pilgrim’s Progress, Rainer Maria Rilke’s Letters, and Dante’s Divine Comedy.  He discovered that Shakespeare’s plays are “works of philosophy—philosophy not argued but shown” (p. 9).  He found T.S. Eliot’s Four Quartets an effective antidote to Oswald Spengler’s pessimism.  Though these and many other books were read while young Scruton was in school, many of the most important of them were not part of the prescribed curriculum.

Born in 1944, he “grew to immaturity in the sixties, when disorder was the order of the day.  Like most of my generation, I was a rebel—but a meta-rebel, so to speak, in rebellion against rebellion, who devoted to shoring up the ruins the same passionate conviction that my contemporaries employed in creating them.  How this happened is a mystery.  I have gained nothing whatsoever from my anti-antinomian stance, and discarded my socialist conscience only to discover that a socialist conscience was the one thing required for success in the only spheres where I could aspire to it” (p. 19).  He had become, in his mid-20s, following his days at Cambridge, a conservative!  Early granted a lectureship at Birkbeck College, London, he found himself surrounded by leftist luminaries such as Eric Hobsbawm, the Marxist historian whose “vision of our country is now the orthodoxy taught in British schools” (p. 36).  

Fortuitously, Scruton discovered Edmund Burke, the great 18th century philosopher-statesman, with whom he shared a deep interest in aesthetics.  “Like Burke, therefore, I made the passage from aesthetics to conservative politics with no sense of intellectual incongruity, believing that, in each case, I was in search of a lost experience of home” (p. 39).  Through Burke’s critical analysis of the French Revolution, Scruton realized “that the Utopian promises of socialism go hand in hand with a wholly abstract vision of the human mind—a geometrical version of our mental processes that has only the vaguest relation to the thoughts and feelings by which real human lives are conducted” (p. 40).  Burke stood for such old-fashioned things as individual freedom and sexual standards and religious traditions—things which Scruton celebrated in his 1980 publication, The Meaning of Conservatism, a book that “blighted what remained of my academic career” (p. 41).

Indelibly branded as a “conservative,” he was effectively ostracized by the English intelligentsia, especially when he linked up with a vigorous minority of like-minded thinkers seeking to publish their views in the “belligerently anti-communist” Salisbury Review.  His articles and books elicited general disdain from powerful professors such as A.J. Ayer.  “However hard I tried, however much scholarship, thought and open-minded argument I put into what I wrote, it was routinely condemned as ignorant, sloppy, pernicious, or just plain ‘silly’” (p. 55).  The doors to a university career quickly closed to him, so he determined to make his own way as an independent thinker.  

In the process he slowly shed his youthful atheism and opened himself to the claims of traditional religion.  He found Christianity’s sexual ethos and artistic masterworks persuasive.  And he discovered when interacting with “true believers”—many of them living under oppression in Poland and Czechoslovakia—how “faith transfigures everything it touches, and raises the world to God” (p. 63).  He took to heart some of the words of a devout, and quite conservative, Catholic priest, Monsignor Gilbey, who had been Catholic chaplain in Cambridge when Scruton studied there:  “We are not asked to undo the work of creation, or to rectify the Fall.  The duty of a Christian is not to leave this world a better place.  His duty is to leave this world a better man” (p. 68).

The second person who influenced Scruton was a young Polish university student (Barbara) living in Gdansk.  Asked to teach at the Catholic University of Lublin, he discovered that in “an occupied country with a censored press, there was, comparatively speaking, complete freedom of speech . . . the only university I knew where a right-winger could speak openly in defense of his views” (p. 72).  In discussions with Barbara, he discovered a woman possessed with “the crazy idea . . . that she could help me to salvation” (p. 75).  Her witness—and the series of letters and meetings that followed—introduced Scruton to a person who “observed her world with the eye of religion, seeing in everything the sign of God’s creative power and the call to free obedience.  Hers was a simple, humble, priest-haunted life, and yet it was lived more intensely and more completely than mine” (p. 76).  She “spoke easily and quietly of communism, which she saw as the Devil’s work—a swindle, born of the father of lies, but no different in essence from all other attempts, both great and small, both public and private, to live a lie” (p. 78).  She, like Monsignor Gilbey, insisted that the really important thing “was not to improve the world, but to improve yourself” (p. 79).

Added to his growing interest in religion was his experience of being a father.  After his first marriage ended in divorce, plunging him into “an unhappiness that lasted two decades,” and after sampling some of his generation’s sexual revolution, Scruton remarried and sired a son named Sam.  Then 54 years of age, witnessing his son’s birth, after “decades of arrested development, I grew up” (p. 109).  “To watch a child grow up is to become detached from yourself and attached to another, whose total dependence compels independence in you” (p. 115).  As a father he deeply understood the difference between the family and the State, with which it is at war in modern society.  He and his wife thus “belong to a growing class of dissidents, at war with the official culture and prepared to challenge it” (p. 117).

Gradually, bit by bit, Scruton was “regaining my religion.”  Along with most of his contemporaries, he had little concern for religion in his early years.  But some of his early longings, awakened by reading Rilke and Eliot, prepared him to consider religious truth-claims.  His own analysis of his own self-consciousness persuaded him of the “truth that we are free, accountable and objects of judgment in our own eyes and in the eyes of others” (p. 226).  He learned to appreciate the importance of sacrifice—particularly self-sacrifice—in living well.  Moving to the country, he began attending a small church where he listened to readings from the Book of Common Prayer and volunteered to play the organ.  Though unable to affirm traditional, orthodox Christian belief, he did find himself inwardly persuaded that the religious life was more true to life experiences than the secular scientism of modernity.  And so he became perhaps England’s finest conservative philosopher, with a somewhat heterodox Christian perspective. 

# # # 

260 A Scintillating Curmudgeon: Charles Murray

That few conservatives are invited to speak on college and university campuses is a rarely lamented but easily discovered matter of public record.  But when conservatives (such as Condoleezza Rice) or critics of Islam (such as Ayaan Hirsi Ali) are “disinvited” now and again, they make the headlines.  So we learned that Charles Murray (“arguably the most consequential social scientist alive,” in Jonah Goldberg’s opinion) was recently notified by Azusa Pacific University that his invitation to speak on the institution’s campus had been rescinded so as to salve the sensitivities of “faculty and students of color.” 

Ever controversial and generally espousing a libertarian perspective, Murray has, for more than three decades, prodded us to confront and frequently re-think important public issues.  In 1984 he published a landmark treatise, Losing Ground—a persuasive demonstration of the tragic failure of LBJ’s “war on poverty” and the welfare state in general—and has continued to issue well-researched, tightly-reasoned works.  Quite different from his usual publications is his latest work—the treatise he planned to discuss at Azusa Pacific University—The Curmudgeon’s Guide to Getting Ahead:  Dos and Don’ts of Right Behavior, Tough Thinking, Clear Writing, and Living a Good Life (New York:  Crown Business, c. 2014).  Though mainly targeting young adults wanting to find good jobs and live a good life, the book began as a “lark” sparked by emails at his place of employment, the American Enterprise Institute.  Thinking some of the younger folks needed sound advice in order to succeed, he identified himself as typical of “highly successful people of both genders who are inwardly grumpy about many aspects of contemporary culture, make quick and pitiless judgments about your behavior in the workplace, and don’t hesitate to act on those judgments in deciding who gets promoted and who gets fired” (#90 in Kindle).  

Murray sets forth and defends 34 rather basic injunctions, collected under four categories.  Addressing “On the Presentation of Self in the Workplace,” he advises such things as:  “don’t suck up,” “don’t use first names with people considerably older than you until asked, and sometimes not even then,” “excise the word like from spoken English,” banish obscenities, shun tattoos and body piercings, dress modestly and (especially) work hard, even when the tasks are routine and menial.  The people who really matter—often curmudgeons to the core—are demonstrably serious “grown-ups.  So cater to them” (#138).  Toss aside immature self-expressiveness (reflecting the sophistic “It’s All About Me Syndrome” launched by baby boomers) and snarky rebelliousness!  Learn to do things (i.e. speaking and writing) as they must be done simply because that’s the way the world works.  If college graduates entering the workforce heeded such advice, they’d quickly advance, because:  “Good help is hard to find.  Really hard to find.  Sure there are lots of people with the right degrees and resumes, but the kind of employee we yearn for sticks out almost immediately” (#460).  To find one’s true vocation may well involve a period of trial and error.  So even as a student it is wiser to find summer work as a waiter than to hang around bankers or politicians as an unpaid intern.  And once graduated, you must first leave home and get a “real job,” taking responsibility for your life.

“On Thinking and Writing Well” brings together Murray’s advice for aspiring young adults.  “Unless you’re in the hard sciences, the process of writing is the dominant source of intellectual creativity” (#508).  Take to heart Mark Twain’s classic assertion that “‘The difference between the almost right word and the right word is really a large matter.  It’s the difference between the lightning bug and the lightning.’”  So learn to write well by absorbing the perennially valuable Strunk and White classic, The Elements of Style.  Take seriously grammatical rules and the spelling and meaning of words.  Carefully reread and edit—preferably on hard copy—what you’ve written.  Relish the intellectual rigor required for good writing just as you embrace the exertion needed for physical fitness.

In the process of working effectively, it’s also important to give attention to “the formation of who you are” and “the pursuit of happiness” found embedded in vocation, family, community, and faith.  “Find work that you enjoy, and find your soul mate” (#1015)—“a good marriage is the best thing that can ever happen to you” (#1382).  It’s important to become a fully-developed good person, possessing the classic cardinal virtues (prudence, courage, temperance, justice).  And this leads to an awareness of the perennial truth that “a life well lived has transcendent value” (#1059) most explicitly detailed in religion.  (Though rather irreligious in his early life, Murray has gradually—largely due to his wife’s spiritual odyssey—come to a positive evaluation of the need for and goodness of religious life.  He has also developed a deep appreciation for the profundity and wisdom of great religious thinkers such as C.S. Lewis.)

Exasperated by the state of education (and more specifically the harm done by the “kindly lies” permeating the federal government’s “No Child Left Behind” endeavors), Charles Murray published three articles in The Wall Street Journal in January 2007.  Urged to more fully flesh out his ideas, he elaborated them in Real Education:  Four Simple Truths for Bringing America’s Schools Back to Reality (New York:  Three Rivers Press, c. 2008).  His truths are quite simple:  1) ability varies; 2) half of the children are below average; 3) too many people are going to college; and 4) America’s future depends on how we educate the academically gifted.

Ability varies!  Some of us are outstanding athletes.  Others excel in music or math or chess or forging friendships.  There are many important realms of activity and socially valuable endeavors.  But in the academic world—the world schools should especially address—only three things really matter:  linguistic, logical-mathematical, and certain spatial abilities.  To imagine that all children can fully develop these abilities leads one into the world of fantasy rather than reality.  (Any adoring mother will naturally imagine her child is “above average.”)  Yet while most parents may be persuaded that only a few children will be gifted athletes, they refuse to accept the equally demonstrable fact that only a few children will be academically gifted and that half of all children are below average.  Such children can certainly learn many things.  And they may very well become good, productive persons.  But they should not be misled to think they can excel academically.  “This is not a counsel of despair.  The implication is not to stop trying to help, but to stop doing harm.  Educational romanticism has imposed immeasurable costs on children and their futures.  It pursues unattainable egalitarian ideals of educational achievement … at the expense of attainable egalitarian ideals of personal dignity” (p. 66).

Neither more money nor better schools can obviate the fact that only a few have the intellectual ability to successfully master college material.  Denying this reality prods too many people to go to college, which is suitable, Murray insists, for only 10 to 20 percent of high school graduates (the percentage enrolling in the 1950s).  Denying this reality has led all-too-many colleges and universities (welcoming large numbers of students and banking their money) to “dumb down” their curricula and compromise their integrity.  Denying this reality leads great numbers of sincere students (70 percent of high school seniors think they will become “professionals,” e.g. doctors or lawyers) to enroll in programs they cannot complete and take out loans they struggle to repay.

Yet colleges and universities merit our respect and support, because “America’s future depends on how we educate the academically gifted.”  Inevitably the elites (whether political or business or artistic or religious) that run the country will be highly intelligent.  We must give them the best education possible.  More importantly, we must try to help them become wise as well as smart!  “The encouragement of wisdom requires a special kind of education.  It requires mastery of the tools of verbal expression” and “mastery of the analytical building blocks for making sound judgments.  The encouragement of wisdom requires extended study of philosophy because it is not enough that gifted children grow up to be nice.  They must know what it means to be good.  Finally and indispensably, the encouragement of wisdom requires that we teach students to recognize their own intellectual limits and fallibilities—teach them humility” (p. 113).

                                                                * * * * * * * * * * * * * * * * * *

In American Exceptionalism:  An Experiment in History (Washington, D.C.:  AEI Press, c. 2013), Charles Murray argues that the United States was “the first nation in the world [to] translate an ideology of individual liberty into a governing creed” (#31), something widely recognized by observers in Europe as well as America in 1789.  Carefully defined, “American exceptionalism is a fact of America’s past, not something you can choose whether to ‘believe in’ any more than you can choose whether to ‘believe in’ the battle of Gettysburg” (#53).  To Murray, America’s early exceptionalism resided in four factors:  her geographic setting (the North American continent, with its expanding frontier); her ideology (celebrating the natural rights enumerated in the Declaration of Independence); her distinctive cultural traits (i.e. “honesty, industriousness, religiosity, and morality”); and her political system (a democratic representative republic with significant checks and balances).

The distinguishing factors that marked America 200 years ago appear less evident today.  The geographic setting, once providing isolation and opportunity, “now must be guarded against terrorists and illegal immigration” (#326).  Ideologically, “the common understanding of the limited role of government that united the Founders, including Hamilton,” is “now held only by a small minority of Americans, who are considered to be on the fringe of American politics” (#335).  A nation conceived with a commitment to limited government, which “never spent more than 4 percent of the GDP in any of the 140 years from the founding until 1931,” has morphed into a highly centralized federal system spending (in 2011) some “25 percent of GDP” (#344).

Such an erosion would not have surprised the Founders.  “As Benjamin Franklin left Independence Hall on the final day of the Constitutional Convention, a woman asked him, ‘Well, Doctor, what have we got?  A republic or a monarchy?’  Franklin replied:  ‘A republic, if you can keep it.’  His answer epitomized the views of all the Founders” (#392).  “Half a century later, young Abraham Lincoln, just twenty-eight years old,” gave a speech in Springfield, Illinois, that “first brought him to public attention.  His topic was ‘The Perpetuation of Our Political Institutions.’  In it, he reflected on the prospects of maintaining the American experiment.”  He noted, in 1838, that the nation was facing a “new environment” due to the fact that the revolutionary generation (“a fortress of strength”) had passed away and “the silent artillery of time” threatened to demolish their handiwork (#401).

          Lincoln, of course, effectively preserved the Union.  But whether the nation he loved still exists is, at least to Murray, quite debatable.

                                                        * * * * * * * * * * * * * * * * * * * * * * 

In Coming Apart:  The State of White America 1960-2010 (New York:  Crown Forum, c. 2012), Charles Murray endeavors to describe and explain “an evolution in American society that has taken place since November 21, 1963, leading to the formation of classes that are different in kind and in their degree of separation from anything that the nation has ever known.  I will argue that the divergence into these separate classes, if it continues, will end what has made America America” (p. 11).  What made America America is what Murray frequently calls “the American project,” which “consists of the continuing effort, begun with the founding, to demonstrate that human beings can be left free as individuals and families to live their lives as they see fit, coming together voluntarily to solve their joint problems.  The polity based on that idea led to a civic culture that was seen as exceptional by all the world.  That culture was so widely shared among Americans that it amounted to a civil religion.  To be an American was to be different from other nationalities, in ways that Americans treasured.  That culture is unraveling” (p. 12).  (Having learned—after publishing data in The Bell Curve that documented IQ scores for African Americans—that scholars cannot truthfully report unfavorable information regarding minority communities without igniting a firestorm of “racist” accusations, Murray determined to study only white Americans.  Had he included data from non-white groups, one assumes the picture would be even more bleak.)

A new upper class has emerged—variously referred to as “bourgeois bohemians” (Bobos), “the educated class,” or “the creative class”—that Murray labels “the new upper class.”  Amounting to no more than five percent of the population (with a narrow core of perhaps as few as 10,000), these are the “people who run the nation’s economic, political, and cultural institutions” (p. 17).  They are highly intelligent, affluent college graduates (generally from elite universities), religiously devoted to physical fitness, well-informed about current events, dogmatic liberals, “helicopter parents” determined to insure their children’s future, and (though city-dwellers) staunch environmentalists.  They marry one another and cluster together in fashionable enclaves (most often gated communities) effectively cut off from the rest of society.  They gravitate to places such as the Upper East Side in New York, the swank suburbs of Washington, D.C., Chicago’s North Shore, and Beverly Hills and the Palo Alto area in California.  “The culture of the new upper class carries with it an unmistakable whiff of a ‘we’re better than the rabble’ mentality.  The daily yoga and jogging that keep them whippet-thin are not just healthy things for them to do; people who are overweight are just less admirable as people” (p. 84).  Failing to recycle is “irresponsible” and smokers are “to be held in contempt” (p. 84).

Conversely, there is an emergent, poorly educated, frequently unmarried “new lower class” that is distinguished by its lack of the founding virtues that once marked ordinary Americans:  “industriousness, honesty, marriage, and religiosity” (p. 127).  Without these traits, manifestly displayed throughout the nation’s history before 1950, our nation cannot succeed, and today’s lower class clearly lacks them.  Lower class males—and especially unmarried ones—increasingly lack industriousness and seem unable to hold jobs.  Whereas virtually no crime takes place in upper class neighborhoods, it pervades lower class areas.  The decline of religiosity during the past half-century marks both upper and lower classes, but the secularizing trend is far more pronounced in the lower class.  On a purely sociological level religion contributes much to the well-being of a society, for religious people are far more likely to visit friends, join service organizations, support local schools, and give money to charity.  They live longer, form more stable marriages, and do better as parents.  The decline of religious life bodes ill for the nation.

The dramatic breakdown of marriage—“the fault line dividing American classes”—should especially alarm us.  Healthy marriages and intact families sustain healthy societies.  “No matter what the outcome being examined—the quality of the mother-infant relationship, externalizing behavior in childhood (aggression, delinquency, and hyperactivity), delinquency in adolescence, criminality as adults, illness and injury in childhood, early mortality, sexual decision making in adolescence, school problems and dropping out, emotional health, or any other measure of how well or poorly children do in life—the family structure that produces the best outcomes for children, on average, is two biological parents who remain married” (p. 158).  Never-married women do the worst in child-rearing.  The scholarly evidence is overwhelming.  Yet it is “resolutely ignored by network news programs, editorial writers for the major newspapers, and politicians of both major political parties” (p. 158).

Having made his case that the nation is “coming apart,” Murray urges us to consider this:  “The trends of the last half-century do not represent just the passing of an outmoded way of life that I have identified with ‘the American project.’  Rather, the trends signify damage to the heart of American community and the ways in which the great majority of Americans pursue satisfying lives.  The trends of the last half century matter a lot.  Many of the best and most exceptional qualities of American culture cannot survive unless they are reversed” (p. 235).  More than acknowledging what is, he urges us to recover what has been lost, for the founding virtues truly matter.  Indeed, they largely constitute the good life and enable people to find happiness.  This leads Murray to underscore (as he does in other works) the perennial truths inscribed in Aristotle’s Nicomachean Ethics.  By nature man desires to be happy, and he finds happiness by living virtuously—prudently, courageously, temperately, justly, generously, magnanimously, doing good work, forging a good family, belonging to a good community, following a good faith.  Careful, quantitative studies show that “the qualities in individuals that make them happy in their marriages, satisfied with their work, socially trusting, and strongly involved with their religion are also qualities that are likely to make them successful in their jobs” (p. 265).

Such qualities, however, are rapidly disappearing in America’s new lower class, threatening this nation’s future.  By embracing the welfare state, exchanging individual freedom for social security, and becoming increasingly like Europe, we are abandoning our inheritance.  In many ways Murray fears we are undergoing what Arnold Toynbee, in A Study of History, identified as a “Schism in the Soul,” evident in a “‘lapse into truancy’—a rejection of the obligations of citizenship—and ‘surrender to a sense of promiscuity’—vulgarization of manners, the arts, and language—that ‘are apt to appear first in the ranks of the proletariat and to spread from there to the ranks of the dominant minority’” (p. 286).  What was once clearly understood and embraced as “the code for males” has been generally repudiated.  To be courageous and faithful, to consider your word as your bond, to protect women and children (as did the men on the Titanic) was the ideal.  But no longer!  “The code of the American gentleman has collapsed, just as the parallel code of the American lady has collapsed” (p. 259).  Replacing it, especially among the upper class, “is a set of mushy injunctions to be nice,” and being nice means, above all, being “nonjudgmental.”

But “nice” and “nonjudgmental” qualities don’t make a great nation!  Indeed, they probably insure its decline.  They promise us a country much like Sweden or France, where “the purpose of life is to while away the time between birth and death as pleasantly as possible, and the purpose of government is to make it as easy as possible to while away the time as pleasantly as possible—the Europe Syndrome” (p. 284).  Europeans appear to prefer leisure to work, show little interest in traditional marriage or offspring, and take little interest in religion.  “The alternative to the European Syndrome is to say that your life can have transcendent meaning if it is spent doing important things—raising a family, supporting yourself, being a good friend and a good neighbor, learning what you can do well and then doing it as well as you possibly can.  Providing the best possible framework for doing those things is what the American project is all about” (p. 284).

To repudiate the Europe Syndrome, Murray thinks, requires more than minor corrections.  A major cultural overhaul—indeed a revival of sorts—is needed.

259 Making Gay Okay

Few societal changes have rivaled the rapidity with which homosexuality has been mainstreamed in America!  When the nation’s Supreme Court ruled in favor of “same-sex marriage,” when Attorney General Eric Holder accused the Boy Scouts of bigotry for denying openly gay men positions as leaders in the organization, and when President Barack Obama issued edicts insuring special protections for homosexuals in companies granted contracts by the government, it became obvious that one of the most ancient moral standards had been breached.  To grasp the nature and enormity of such decisions, Robert R. Reilly’s Making Gay Okay:  How Rationalizing Homosexual Behavior Is Changing Everything (San Francisco:  Ignatius Press, c. 2014) merits close attention.  “The love whose name dare not be spoken (once upon a time) is being shouted, if not from the rooftops, at least from the streets in demonstrations and parades, from the platforms in political rallies, and from the pages of various popular and intellectual journals.  And from the White House” (#91 in Kindle).

The great philosopher and Holocaust victim Edith Stein (quoted at the book’s beginning, in a declaration that discloses much of its message) said:  “Do not accept anything as love which lacks truth.”  What’s important when we address homosexual behavior is, above all, discerning truth regarding reality, for no one, as Plato said, wants to “lie in his soul about the most important things.”  Says Reilly:  “My thesis is very simple.  There are two fundamental views of reality.  One is that things have a Nature that is teleologically ordered to ends that inhere in their essence and make them what they are.  In other words, things have inbuilt purposes.  The other is that things do not have a Nature with ends:  things are nothing in themselves, but are only what we make them to be according to our wills and desires.  Therefore, we can make everything, including ourselves, anything that we wish and that we have the power to do.  The first view leads to the primacy of reason in human affairs; the second leads to the primacy of the will.  The first does not allow for sodomitical marriage, while the second does.  Indeed, the problem is that the second allows for anything.  This is what the same-sex marriage debate is really about—the Nature of reality itself.  Since the meaning of our lives is dependent upon the Nature of reality, it too hangs in the balance” (#47).

But denying the nature of things pervades the gay community and explains its many illogical rationalizations.  Conscience-stricken homosexual activists think that by forcing everyone else to approve their behavior they will finally feel themselves righteous.  As Aristotle rightly said:  “Men start revolutionary changes for reasons connected with their private lives.”  Or, as Reilly says, “If you are going to center your public life on the private act of sodomy, you had better transform sodomy into a highly moral act.”  It must be more than tolerated—it must be approved, applauded, normalized.  Anyone daring to disapprove must be publicly rebuked, charged with hate speech, and punished.  Gay rights activists persuade themselves that they are highly moral people intent on universalizing their morality.  “Ironically,” however, “the logic behind this process of legitimization of homosexual behavior undercuts any objective standards by which we could judge the moral legitimacy of anything” (#267).  Severed from the Natural Law, gay apologists glide into an inescapably nihilistic maelstrom.

The Natural Law (as understood by the great architects of the Western Tradition such as Aristotle, Cicero, Augustine and Aquinas) prescribes correct behavior in accord with the given nature of things.  There is a Logos, an understandable, ingrained purpose to all that is.  Acting rightly means following right reason, moving toward a good goal, pursuing true happiness.  With Aristotle, advocates of the Natural Law believe that “Nature ever seeks an end.”  Thus we live well by acting in accord with our design—thinking rationally, eating wisely, and nurturing the absolutely necessary prepolitical institution of the family, touted by Aristotle as the essential foundation for the state.  That our genitals are not suited for sodomy, and that dire health consequences (far more deleterious than smoking tobacco or becoming obese) invariably accompany it, cannot be denied.  And yet precisely such truths are denied by homosexual activists.  Reilly, however, leads his readers through a mass of details regarding the truly unnatural and harmful nature of homosexual activity.  “One might say with Professor Harry Jaffa that ‘nature itself seems to reward chastity with health, and punish promiscuity with disease. . . .  It would certainly seem that nature has an interest in the morality that is conducive to the family, and punishes behavior inimical to it’” (#1201).

  Contrary to assertions frequently made by gay activists, the greatest Greek philosophers (Socrates, Plato, Aristotle) considered homosexual behavior unnatural and thus immoral.  “In the Laws, Plato’s last book, the Athenian speaker says, ‘I think that the pleasure is to be deemed natural which arises out of the intercourse between men and women; but that the intercourse of men with men, or of women with women, is contrary to nature, and that the bold attempt was originally due to unbridled lust’” (#449).  “So what is sex for?” asks Reilly.  “The end of sex . . . is to make ‘one flesh’.  Two becoming ‘one flesh’ encompasses both the generative and unitive Nature of sex.  By Nature only men and women are physically capable of becoming ‘one flesh’.  (Otherwise the pieces don’t fit.)” (#715).  

During the past two centuries, however, this classical notion of man’s nature has been challenged by devotees of Jean-Jacques Rousseau, who insisted that there is no given, essential human nature.  He denied that man is a rational, political animal needing social structures.  Lacking a given (i.e. divinely-designed) nature, we are free to become whatever we desire to be.  The solitary individual, the autonomous self, following his own designs, need not consider anything else.  To live freely, we must shun social bonds such as marriage and follow our inner hungers.  Thus the family, he declared, is an artificial thing that “can be changed and rearranged in any way the state or others may desire” (#597).  Rousseau certainly wrote often and eloquently about Virtue—as did his disciple Robespierre in the midst of the French Revolution’s Terror—but to him “virtue becomes whatever you choose.  Virtue is not conforming your behavior to the rational ends of Nature, but conforming things to your desires.  Reason becomes the instrument for doing this; it rationalizes for you” (#610).  As Reilly persuasively shows, gay-rights activists are, philosophically, followers of Rousseau, denying the Natural Law, inventing their own morality.  And inasmuch as 20th century jurists have discarded the Natural Law, casting “aside millennia of moral teaching,” the gains made by homosexuals have been largely won through the courts.

And indeed the courts, whose decisions Reilly examines in depth, have acceded to the homosexual agenda.  In the Supreme Court’s decision endorsing sodomy—Lawrence v. Texas—Justice Anthony Kennedy cited a passage from an earlier decision (Casey) defending abortion rights.  At “the heart of liberty,” he intoned, is “the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life.”  Following that prescription, Kennedy declared that “persons in a homosexual relationship may seek autonomy for these purposes, just as heterosexual persons do.”  If one inserts persons in other kinds of relationships (e.g. polygamy, bestiality, adult incest) into Kennedy’s decision, the illogic of the Lawrence decision becomes instantly evident.  If, however, the court actually meant that a homosexual is free to rationalize his “sodomitical behavior, even after a one-night stand, into something more pleasing and acceptable to your conscience—and you can not only do that but also seek to enforce your rationalization upon other people and the state of Texas by revoking their laws—then it all becomes clear” (#1662).

Following his careful examination of the rationalizing process basic to the justification of sodomy, Reilly persuasively details how it has successfully marched through our institutions.  Psychiatrists, intimidated by militant gay activists, have redefined homosexuality, transforming what was once considered a perversion into an acceptable and inescapable orientation.  Adoption agencies have been forced to equate same-sex and opposite-sex couples.  Schools now normalize homosexual behavior, verbally pummeling students who dare condemn it.  Inasmuch as educators significantly shape the thinking of a nation, they have been primary targets for homosexual activists, who have almost totally achieved their goals.  Insisting we talk about “gender” (a construct) rather than “sex” (a biological given), demanding we refer to “partners” rather than husbands and wives, homosexuals have successfully changed much of the language used in the schools.  The Boy Scouts and the military have also suffered repeated attacks (what Reilly calls “the unremitting drum roll for allowing open homosexuality”) from the homosexual community.  Gradually these institutions have shifted their positions, clearly normalizing same-sex activity.

“If life is sacred,” Reilly concludes, “then the means of generating it must also be sacred.  If generation is intrinsic to the Nature of sex, then sex possesses immense significance.  It is not a toy, or simply an amusement, or an item for sale.  It is profoundly oriented to creation—creation emanating from union.  It has a telos.  As Dr. Jennifer Roback Morse said:  ‘The human person is meant for love, and the human body cries out to be fruitful.’  As stated earlier, the fruit is the incarnation of the love.  If generation is artificially separated from it, sex lapses into insignificance and triviality.  This denial leads to its desecration and is contemptuous of what human beings are meant to be” (#3711).  

Making Gay Okay is a solidly-researched, finely-reasoned treatise.  Anyone concerned with the health of our culture and the direction of our nation should carefully consider its message.  

* * * * * * * * * * * * * * * * 

The struggle to make gay okay first surfaced, for many of us, in the state of Colorado when concerned citizens passed a constitutional amendment designed to prevent gays and lesbians from receiving preferential treatment from the government.  Steven Bransford, a Texan who arrived in Colorado Springs as the struggle began, found himself involved first as an observer, then a participant, and finally a chronicler, writing Gay Politics vs. Colorado and America:  The Inside Story of Amendment 2 (Cascade, CO:  Sardis Press, c. 1994).  Two decades have passed, but the strategies described and the consequences envisioned have proliferated and altered the social landscape of America as well as Colorado.  The battle began in Colorado in the early ‘90s when the “cities of Aspen, Boulder, and Denver had granted gays protected class status in citywide ordinances” (p. 9).  As Bransford learned—and we all should indelibly remember—“to homosexuals, laws against wrongful firing, violence, and harassment have never been enough.  These laws merely make them equal, giving them no special advantage to force society to affirm their lifestyle.  Forced affirmation requires going beyond policing actions.  It requires the power to punish people for their thoughts, motives, attitudes, prejudices, hatreds, private biases—even their moral convictions” (p. 102).  Thus gay activists in the “mountain state,” determined to impose their agenda on the state as well as selected cities, drafted “H.B. 1095, a sweeping gay rights law disguised under the nice-sounding title, ‘The Ethnic Harassment Bill’” (p. 9).  The homosexual agenda was strongly endorsed by the state’s political and cultural elite, including Governor Roy Romer and Denver’s Mayor Wellington Webb.  In particular, Wilma Webb (the mayor’s wife and a representative in the state’s legislature) championed it.  But a couple of concerned women read with alarm the bill’s fine print and marshaled a modest but effective movement to oppose it.  Once exposed to the light of day in a legislative committee, the bill failed.

Thus awakened to the intent of homosexual activists, a small group of concerned citizens (encouraged by former Senator Bill Armstrong, University of Colorado football coach Bill McCartney, and Colorado Springs automotive dealer Will Perkins) organized themselves as “Colorado for Family Values” (CFV) and determined to use an initiative to add an amendment to the state constitution that would prevent preferential treatment for gays.  Carefully worded, the amendment stated that no branch of government “‘shall enact, adopt or enforce any statute, regulation, ordinance or policy whereby homosexual, lesbian, or bisexual orientation, conduct, practices or relationships shall . . . entitle any person or class of persons to have or claim any minority status, quota preferences, protected status, or claim of discrimination’” (p. 43).

CFV’s first challenge was to solicit sufficient signatures on petitions to place the proposal (Amendment 2) on the ballot for the November 1992 election.  As the volunteers began their work, however, they learned to guard against assaults from homosexual activists.  Some would, for example, volunteer to gather petitions and then destroy them.  After Coach McCartney endorsed the amendment, his job was jeopardized as his university openly censured him.  Congresswoman Pat Schroeder labeled the coach a “self-appointed ayatollah.”  The state’s newspapers snidely smeared McCartney and Bill Armstrong.  To Denver Post columnist Ken Hamblin:  “When shallow people like Armstrong and McCartney are permitted to float like scum on top of a sea of knowledge, they take us back to the 14th century” (p. 56).  After the signed petitions were collected, Will Perkins tried to hire various armored car agencies to haul them to Denver—but their fears of homosexual retaliation kept them from doing so.  In Boulder, gay rights activists started a Sunday morning fire in the basement of the First Presbyterian Church, which had a few months earlier removed a lesbian choirmaster.  Following the amendment’s passage, vandals desecrated the statue of the Virgin Mary in Denver’s Basilica of the Immaculate Conception.  Such intimidation and violence routinely characterized opponents of the amendment.  And their behavior logically followed their ethical nihilism—if there are no standards for sexual conduct there are, similarly, no standards regulating any activity.

But the biggest opponent CFV faced, as the election neared, was the press.  “All naive notions of journalistic integrity” quickly dissolved, for, as Will Perkins noted, “‘Language doesn’t shape the campaign—it is the campaign’” (p. 89).  Rather than truthfully describing the amendment, the press routinely referred to it as an “anti-gay” effort to punish homosexuals, to “legalize discrimination” against a long-suffering minority.  Homosexuals were constantly compared to racial minorities who simply wanted their basic civil rights protected.  Headlines skewed the factual content of the news stories.  Scores of politicians were quoted as opposing the amendment, whereas only a dozen could be found with something favorable to say about it.  TV stations in Denver refused to air advertisements supporting the proposal—revenue apparently meant less than accommodating homosexuals.  As a nationally recognized columnist, Joseph Sobran, observed:  “‘The old journalists had a sense of duty; the new ones have a sense of mission.  There’s a big difference’” (p. 94).  Only talk radio hosts such as Dennis Prager and Mike Rosen provided the public with pro-Amendment 2 information.

Though the gay-rights activists frequently denounced the “religious right,” there were in fact remarkably few pastors (evangelical or otherwise) who openly supported Amendment 2.  “Letters sent to pastors seemed to fall off their desks and into the round file.  Out of 400 initially contacted, only two pastors’ assistants bothered to reply at all.  This became a major source of disappointment within the ranks of CFV.  They were in the public square alone, being shot at by the most powerful media guns in the state.  It would have been nice to have had these voices of moral support behind them, but they were unable to rally the troops” (p. 135).  Said Will Perkins, a devout Presbyterian layman, “‘I knew how Custer felt the day he modeled the first Arrow shirt’” (p. 136).  Nor did Colorado’s Catholic hierarchy assist them.  With his customary clarity, Dennis Prager said:  “‘The Greeks assaulted the family in the name of beauty and Eros.  The Marxists assaulted the family in the name of progress.  And today, gay liberation assaults it in the name of compassion and equality.  I understand why gays would do this. . . .  What I have not understood was why Jews or Christians would join the assault’” (p. 138).

Anti-amendment religious spokesmen, however, abounded.  The American Academy of Religion and the Society of Biblical Literature (the most prestigious of scholarly associations for professors of religion) supported gay rights.  Evangelicals Concerned (renowned for its superstars Ron Sider and Tony Campolo) pushed for the amendment’s defeat.  A Methodist bishop urged 290 of his clergy to oppose it.  The National Council of Churches intoned:  “‘It is blasphemy to invoke the infinite and holy God to assert the moral superiority of one people over another’” (p. 139).  Only the Vatican, belatedly but powerfully, came to the amendment’s defense, declaring:  “‘Sexual orientation does not constitute a quality comparable to race [or] ethnic background’ but ‘homosexual orientation is an objective disorder’” (p. 139).

As November neared, amendment proponents tended toward despair.  Polls predicted the amendment would fail badly, and CFV seemed to find vitriol and hostility everywhere.  Yet on election day, Colorado’s voters, while giving Bill Clinton an easy victory, resoundingly endorsed Amendment 2.  Many of them, apparently, when earlier questioned by pollsters, had feared looking bigoted and thus claimed to oppose the amendment.  But in the privacy of the voting booth, they showed their true convictions.  And one would think, in a democratic society, that the people had spoken and their desires would be implemented.  But Democrats in Denver on election night began crying out “Boycott!” and “Punish the State of Hate.”  The next night 7,000 protesters, encouraged by Governor Romer (who within a week called a secret meeting to plan how to reverse the people’s decision) and Mayor Webb, vowed to annul the amendment.

Thus began a well-orchestrated “boycott Colorado” movement.  Hollywood celebrities urged people to avoid the state.  A host of cities, including San Francisco, Los Angeles, Berkeley, New York, Boston, and Detroit, forbade city council members to conduct official business in Colorado.  Various groups cancelled scheduled conventions, possibly costing the state $39 million in lost revenue.  But here too the gay activists ultimately failed.  Good snow motivated serious skiers to ignore the boycott!  Tourists came in great numbers.  And the bottom line revealed that, the following year, “Colorado posted more rapid economic gains than any other state during the past year” (p. 182).

But while the public was not molded to the homosexual agenda, the courts, with their commitment to the ever “evolving standards” of society rather than the text of the Constitution, proved hospitable.  Days after the election gay activists (following the ACLU script) filed a lawsuit, Romer v. Evans, and found a friendly judge, Jeffrey Bayless, to grant an injunction ordering the state not to enforce the amendment.  He listened for days “to every conceivable ranting from the homosexual plaintiffs” while the state’s Attorney General did little to effectively uphold the amendment (p. 190).  Six months later, the Colorado Supreme Court, arrogating to itself power supposedly reserved to the federal courts, upheld Bayless’ judgment, allegedly defending “‘the right for gays to participate equally in the political process’” (p. 197).

Bransford finished his book while Romer v. Evans was winding its way through the judicial system, headed for the Supreme Court of the United States.  He rather naively assumed that Colorado’s Amendment 2 and the will of the people of Colorado would be upheld, and that what had taken place in Colorado could be duplicated in other states, thus rolling back the gay rights agenda.  But in 1996 the Supreme Court annulled Amendment 2 and provided a crucial legal precedent for the subsequent court decisions granting virtually every demand of the homosexual community, including “same-sex marriage.”