228 Great Men’s Biographies

No 20th century theologian elicits more admiration than Dietrich Bonhoeffer, to whom Eric Metaxas’s recent biography, Bonhoeffer:  Pastor, Martyr, Prophet, Spy, a Righteous Gentile vs. the Third Reich (Nashville:  Thomas Nelson, c. 2010), provides an engaging and insightful introduction.  As the eminent Israeli physicist Gerald Schroeder says:  “Metaxas has the rare skill of taking the mundane but crucial details of life and weaving them into a history that flows like a novel.  For anyone interested in what the strength of belief and conviction can accomplish, Bonhoeffer is an essential read.”  Though other works (especially those by Bonhoeffer’s closest friend, Eberhard Bethge) are more detailed and ultimately definitive, this biography is a wonderful fusion of scholarly accuracy and scintillating style.

In 1906 Bonhoeffer was born into the highest echelon of the German aristocracy, tracing his ancestry through centuries of illustrious physicians, judges, professors and statesmen.  His father was a renowned psychiatrist, a professor at the University of Berlin; his mother was a teacher whose “parents and family were closely connected to the emperor’s court at Potsdam” (p. 6).  He received the finest possible education, both at home and at school, where he excelled in most everything (including music, which ever remained a major avocation for him).  Profoundly affected by the turmoil of WWI—and especially by the death of an older brother—he determined at the age of 13 to study theology.  After a term at Tübingen University, he returned to Berlin to live at home and complete his university studies under the tutelage of the famed Church historian and devotee of theological Liberalism, Adolf von Harnack.  While studying under (and ever respectful of) Harnack, however, young Bonhoeffer was drawn to the insurgent Neo-Orthodoxy of Karl Barth that would sustain the Confessing Church in its struggle with National Socialism.

Following his university years, Bonhoeffer served briefly as a youth pastor for a German congregation in Barcelona, Spain, refining his preaching skills and Christological concerns in the process.  “‘Understanding Christ means taking Christ seriously,’” he insisted.  Doing so requires “‘for us to clarify the seriousness of this matter and to extricate Christ from the secularization process in which he has been incorporated since the Enlightenment’” (p. 83).  Determined to finish his theological training he returned to Berlin, writing a dissertation (Act and Being) that qualified him as a university lecturer in 1930.  An opportunity to further his studies at Union Theological Seminary in New York led him to spend a year in America, where he quickly became disillusioned with the superficiality of students who “‘are unfamiliar with even the most basic questions.  They become intoxicated with liberal and humanistic phrases, laugh at the fundamentalists, and yet basically are not even up to their level’” (p. 99).  In nearby  churches he heard messages delivered by the likes of Harry Emerson Fosdick (presiding over the recently built and prestigious Riverside Church) devoted to most everything but the main thing, “‘namely the gospel of Jesus Christ, the cross, sin and forgiveness, death and life’” (p. 99).  

He did, however, find the gospel proclaimed in the Abyssinian Baptist Church in Harlem by Dr. Adam Clayton Powell Sr., the son of former slaves who “brought an outsized vision of faith to the pulpit,” combining “the fire of a revivalist preacher with great intellect and social vision” (p. 108).  Consequently, Bonhoeffer regularly attended and taught Sunday school at Abyssinian.   Years later, shortly before WWII erupted, Bonhoeffer briefly returned to New York and delighted in attending Broadway Presbyterian Church to hear messages delivered by a fundamentalist pastor (Dr. McComb) who unapologetically preached God’s Word.  To Bonhoeffer, American Fundamentalists were quite akin to the Confessing Church, staying true to the Bible rather than accommodating secular powers.  Siding with the Fundamentalists, in opposition to the Liberals at Riverside Church and Union Theological Seminary, he said:  “‘This will one day be a center of resistance when Riverside Church has long since become a temple of Baal’” (p. 334).  

Though offered a position at Harvard, Bonhoeffer felt led to return to Germany and serve his own people amidst the turbulence of Hitler’s rise to power.  At a crucial moment, he wrote, “‘something happened, something that has changed and transformed my life to the present day.  For the first time I discovered the Bible’” (p. 123).  It alone, he discovered, “‘is the answer to all our questions’” (p. 136).  He’d obviously read and preached from it for years, but suddenly the Bible became a living Word, a Way to live.  He began lecturing at the University of Berlin, insisting that Jesus Christ, not Adolf Hitler, was the only Savior.  As the Nazis tightened their control over all the institutions of society, Bonhoeffer early joined the opposition, especially working to establish the Confessing Church within the state-run  Lutheran Church, which was controlled by the “German Christians” who accommodated Hitler by repudiating the Old Testament and generally promoting Der Fuhrer’s “Nietzschean social Darwinism” (p. 173).  

Metaxas details Bonhoeffer’s valiant struggle to restore an authentic faith and discipleship within the established Lutheran church, noting his conviction that:  “‘The question at stake in the German church is no longer an internal issue but is the question of the existence of Christianity in Europe’” (p. 204).  “‘For me,’” he declared, “‘the fight against National Socialism is essentially a fight in defense of the Christian conception of the world.  Whereas Hitler wants to revive the Old Germanic paganism, I want to revive the Christian Middle Ages’” (p. 232).  This led him to engage in various activities:  making international appeals, establishing an underground seminary, and—as WWII erupted and the Nazis’ genocidal policies against the Jews became clear—joining plots against Hitler.

Bonhoeffer’s family and social ties linked him to the highest echelon of Germany’s military caste—including an anti-Hitler cadre of officers such as Count Helmuth von Moltke and Admiral Canaris—who early sensed the necessity of removing Der Fuhrer for the good of both Germany and mankind.  In 1941, “Bonhoeffer famously said that, if necessary, he would be willing to kill Hitler” (p. 388).  For a pastor who had espoused virtually pacifist positions a decade earlier, this was quite a change!  Several conspiracies almost succeeded, but Hitler seemed to live a curiously charmed life.  Failed assassination attempts led to arrests and executions of those involved and in time the Gestapo apprehended and imprisoned Bonhoeffer.  

Imprisoned for the last two years of his life, Bonhoeffer continued to write and minister to those around him.  Fortunately, his uncle, Paul von Hase, was the military commandant of Berlin and buffered some of the regime’s strictures on his nephew.  He was thus able to receive visits from his family, friends and fiancée as well as continue his theological study and writing.  The failure of yet another attempt to kill Hitler in 1944, however, led to harsher treatment, as everyone remotely connected to the conspiracy was targeted.  Many, including Paul von Hase, were summarily executed.   One of these officers, shortly before he died, said:  “‘The whole world will vilify us now, but I am still totally convinced that we did the right thing.  Hitler is the archenemy not only of Germany but of the world.  When, in a few hours’ time, I go before God to account for what I have done and left undone, I know I will be able to justify in good conscience what I did in the struggle against Hitler’” (p. 487).

While his regime was collapsing, Hitler still sought to destroy those who had dared oppose him, personally approving their executions.  So Bonhoeffer, along with 17 others, was taken from his cell in Berlin and sent to Buchenwald, a notorious concentration camp.  Here an English prisoner, Payne Best, described Bonhoeffer as “‘all humility and sweetness, he always seemed to me to diffuse an atmosphere of happiness, of joy in every smallest event in life, and of deep gratitude for the mere fact that he was alive. . . .   He was one of the very few men that I have ever met to whom his God was real and ever close to him’” (p. 314).   From Buchenwald he was sent to a smaller camp, Flossenbürg, where he was hanged on April 9, 1945, two weeks before the Allies liberated the camp, three weeks before Hitler killed himself.

* * * * * * * * * * * * * * * * *

Admirers of the late Pope John Paul II frequently call him “The Great,” appraising him as the equal of a handful of thus-titled pontiffs such as Leo I and Gregory I.  They regard him as the most significant successor of St. Peter in 500 years.  George Weigel clearly shares this high evaluation in his two-volume (1500 pp.) biography, a richly detailed model of scholarship, “the culmination of twenty years of studying and writing” (p. 14).  The first volume is titled Witness to Hope:  The Biography of Pope John Paul II (New York:  HarperCollins Publishers, c. 1999).

Karol Wojtyla (pronounced Voy-TEE-wah) was born in Wadowice, Poland, in 1920.  His mother died while he was young, but his father, a notably “just man,” an army officer known as “‘the captain,’ was a gentleman of the old school and a man of granite integrity whose army career . . . was based on a combination of intelligence, diligence, dependability, and above all, honesty” (p. 29).  Young Wojtyla flourished in school, mastering his studies and standing out as an athlete.  He and his father moved to Krakow in order for him to enter the Jagiellonian University the year before Germany invaded Poland and ignited WWII.  In short order the Nazis arrested and deported many Polish leaders, including university professors.  Young Wojtyla, though forced to work as a quarryman, quickly engaged in underground cultural activities (mainly theatrical performances), subtly resisting the occupying forces.  He also sensed and obeyed a call to the priestly ministry, especially as a result of reading the works of St. John of the Cross with their call to total surrender to God’s will, and studied in the university’s “underground” seminary.

Wojtyla was completing his theological studies as Russians replaced Germans occupying Poland in 1945.  Ordained in 1946 and obviously gifted, he was then sent to further his studies in Rome’s famed Pontifical Athenaeum of St. Thomas Aquinas, where he excelled in every way, writing (in Latin) a doctoral thesis on St. John of the Cross under the direction of the famed Father Reginald Garrigou-Lagrange.  Returning to his native land he creatively engaged in pastoral ministry, notably honing strategies that included skiing and kayaking expeditions to reach young people who became life-long friends.  His intellectual gifts could not be wasted, however, and he was soon given an academic assignment, becoming an ethics professor at the University of Lublin.  While teaching he sustained a “hard-won conviction about the ‘objective’ reality of the world” which “disclosed important things about the virtues, about the pursuit of happiness, and about our moral duties in life” (p. 126).  

Named auxiliary bishop of Krakow by Pope Pius XII in 1958, Wojtyla joined fellow bishops at the Second Vatican Council, where he took an unexpectedly (for such a young bishop) active role, especially emphasizing the Christian’s call to holiness, which  “was nothing less than a ‘sublime sharing in the very holiness of the Holy Trinity,’ of God himself” (p. 162).    Back in Poland he was installed as Archbishop of Krakow and later named a cardinal.  As a bishop he “governed his diocese (and did his philosophy and theology) ‘on his knees’—or at a desk in the sacramental presence of his Lord” (p. 188).   He also continued to vigorously preach and write as well as represent the Church in her continuous struggle with the country’s Communist authorities who were conducting “the assault on human dignity he had described to Henri de Lubac as ‘the evil of our times’” (p. 227).   

In 1978, following the brief pontificate of John Paul I, Karol Wojtyla was elected Pope and soon uttered his signature call:  “Be not afraid!”  He subsequently launched a series of initiatives designed to promote the Gospel around the world, continually traveling (taking scores of what he called “pilgrimages”), preaching to vast throngs (personally addressing more people than anyone in human history), presiding over unexpectedly momentous “youth days,” teaching through weekly audiences and a steady stream of encyclicals and papal letters, issuing the momentous Catechism of the Catholic Church to clearly define orthodox positions, defending both the rights of the unborn and the freedoms of people enchained by totalitarian regimes, and appointing bishops and cardinals and curial officials (most notably the man who would succeed him, Joseph Ratzinger) chosen to implement his vision for the Church.

* * * * * * * * * * * * * * * * * 

The second volume of George Weigel’s biography of John Paul II is entitled The End and the Beginning:  Pope John Paul II—The Victory of Freedom, the Last Years, the Legacy (New York:  Doubleday, c. 2010).  Inasmuch as one third of the book revisits Karol Wojtyla’s life-long struggle with Communism, it is rightly described as an “amplification and completion” of Witness to Hope, the first volume of the study.  This is largely due to the availability of documents recently made public that reveal the machinations of Communist authorities in various countries to discredit and if possible destroy Wojtyla.  Rightly perceiving him as a signal threat to the iron fist crushing the peoples of Russia and Eastern Europe, Poland’s secret police and Russia’s KGB diligently recruited agents to infiltrate priestly circles in both Poland and Rome, determined both to gain information regarding Wojtyla and to craft strategies to undermine him.  As Aleksandr Solzhenitsyn (the exiled Russian novelist then living in Vermont) declared when Wojtyla was elected Pope, “‘It’s a miracle!  It’s the first positive event since World War I and it’s going to change the face of the world!’” (p. 101).  And indeed—beginning with his epochal nine-day pilgrimage to Poland in 1979—he would!  Within a decade the Communist world was collapsing, and Pope John Paul II (along with his ally Ronald Reagan) played a formidable role in its demise.

Concluding his discussion of Wojtyla’s struggle with Communism, Weigel turns, in part two, to the Pope’s final five years, entitling his presentation “Kenosis.”   Physically vigorous when he became Pope, he had suffered an assassination attempt in 1981 and Parkinson’s disease further weakened him in his final years, but through it all he allowed nothing to compromise his commitment to serving in the place God assigned.  He orchestrated an unexpectedly successful Great Jubilee in 2000, drawing millions of pilgrims to Rome, and despite his infirmities he continued his own pilgrimages to various nations, most notably Israel.  He continued his ambitious project of canonizing saints—especially 20th century martyrs dying at the hands of despots.  

On the basis of three decades of interviews and scholarly studies, Weigel ends his biography with an assessment of his subject.  To do so he begins with an appreciation of Wojtyla’s inner life, organized in terms of faith, hope, and love—the supernatural virtues clearly evident to the Pope’s closest observers.  Still more, the cardinal virtues (prudence, justice, courage, temperance) were equally apparent.  By any standard, his pontificate (despite certain disappointments, including the clerical scandals in America and academic dissidents’ unfaithfulness) was momentously ambitious and successful.  Especially important is the “legacy of ideas” set forth in the vast corpus of his writings, articulating doctrinal and moral positions rooted in tradition but attuned to modernity.

To understand John Paul II—as well as the world he both indwelt and influenced—Weigel’s definitive work is, quite simply, indispensable.  

* * * * * * * * * * * * * * 

Among the 20th century’s “great men” are eminent thinkers who have transformed our understanding of the world.  Among these is John Polkinghorne, whose story is told by Dean Nelson (a journalist) and Karl Giberson (a physicist) in Quantum Leap:  How John Polkinghorne Found God in Science and Religion (Grand Rapids:  Monarch Books, c. 2011).  The authors make no pretense of offering a definitive “conventional biography,” though their work rests upon in-depth interviews and careful reading of their subject’s works.  Rather they seek to “tell the story of Polkinghorne, and along the way, we also unfold some bigger issues.  How do we know what ‘Truth’ is?  How does a leading scientist think about the more mysterious aspects of faith—prayer, miracles, life after death, resurrection?” (p. 7).  Furthermore, they want to reach a broad public, rather than an elite corps of physicists and theologians, providing us a warmly written and engaging work, a wonderful introduction to the man and his ideas.

Polkinghorne’s work on quark theory, as one of an elite group of physicists at Cambridge University, “earned him countless recognitions,” including membership in Britain’s Royal Society, being knighted by Queen Elizabeth II, serving as president of Queens’ College, and delivering the prestigious Gifford Lecture Series (p. 13).  In the midst of his scientific work, however, he sensed a call to the Christian ministry and devoted several years to theological study and pastoral work in an Anglican parish, finding time there to write “one of his most successful books, One World:  The Interaction of Science and Theology” (p. 85).  In his judgment, “‘Christianity affords a coherent insight into the strange way the world is’” (p. 159).  Rejecting some scientists’ contentions that the cosmos has no meaning or purpose, that it’s all “a tale told by an idiot, full of sound and fury, signifying nothing,” he insists we can discover meaning:  “‘In fact the world has a meaning that extends beyond us’” (p. 31).  He envisions God as a “Divine Mind” and “Cosmic Mathematician” who also cares for the individual (p. 48).

In time he returned to Cambridge University, where he set his mind to the task of synthesizing the two great (and at times paradoxical) concerns of his life:  science and religion.  As a physicist he accepted “the idea that light is both a wave and a particle, two fundamentally contradictory viewpoints.  Acceptance that the simple reality of something as familiar as light required deep paradox serves as a preparation for wrestling with the central Christian belief that Jesus is both human and divine, that he lived and died and lives again” (p. 89).  Science, no less than theology, deals with unseen, transcendent realities.  Thus the Resurrection of Jesus, to Polkinghorne, elicits faith:  “On the truth or falsehood of that belief turns the whole Christian understanding of God and God’s purposes in Jesus of Nazareth” (p. 90).  So too miracles may well be credible as “‘perceptions of a deeper rationality than that which we encounter in the everyday, occasions which make visible a more profound level of divine activity.  They are transparent moments in which the Kingdom is found to be manifestly present’” (p. 95).  They do not contradict the “laws of nature, which themselves are expressions of God’s will, but are revelations of the character of the divine relationship to creation” (p. 95).

His unique training and perspective led to appointments to prestigious committees, including “the Human Genetics Commission, which evaluated the ethical consequences of recent advances in human genetic research” (p. 123).   Accepting such appointments demanded insight and judgment as well as skill in working with others (one of Polkinghorne’s strengths).  He also joined distinguished theologians such as N.T. Wright on a fifteen-member Doctrine Commission established to provide direction for the Church of England.  His views on some items (stem cell research, the ordination of women, etc.) cannot be aligned with those of traditional Catholics or conservative Evangelicals, but they are rather middle-of-the-road positions in his church.  And amidst it all he exuded a gentle, gracious spirit bearing witness to his Savior.

227 Scrutinizing Naturalism

 

Plato’s dialogues persistently probe the essence of the good society, and his final treatise, The Laws, insists cosmology and theology serve as the necessary “prelude” to it.  Should a people embrace the “heresy” that the cosmos has “been framed, not by any action of mind, but by nature and chance only,” Plato said, social chaos inevitably ensues.  Thus the history of philosophy reveals an unending struggle for goodness, truth, and beauty—truly a “cosmic struggle” pitting theists (e.g. Plato) against atheists (e.g. Epicurus), shaping and setting forth divergent worldviews.  There are big questions to address:   Where did I come from?  Why am I here?  Where am I going?  Does life have any meaning and purpose?  Is there any Design to Reality, or is all that exists merely a random collection of subatomic bits of matter?

The most vile characters in C.S. Lewis’s space fiction trilogy (Out of the Silent Planet; Perelandra; That Hideous Strength) consistently espoused what he labeled “scientism,” the philosophical commitment to empiricism (holding that all knowledge comes from the physical senses) as the only valid epistemological strategy, reducing all kinds of inquiry—literature, history, philosophy, et al.—to rigorously material means.  Regarding his first story, Lewis noted that it was clearly “an attack, if not on scientists, yet on something which might be called ‘scientism’—a certain outlook on the world which is usually connected with the popularization of the sciences” (Of Other Worlds).  Consequently, reviewing That Hideous Strength in the New York Times, Orville Prescott judged it “a parable (concerning) the degeneration of man which inevitably follows a gross and slavish scientific materialism which excludes all idealistic, ethical and religious values.”  

Still more, the insistence that our minds can be reduced to material brains typified the stance of scientists like Lewis’s fictional physicist Weston, who shared the view of the atheistic Marxist professor J.B.S. Haldane, a stance which necessarily entails the conclusion Haldane himself drew:  “If my mental processes are determined wholly by the motions of atoms in my brain, I have no reason to suppose that my beliefs are true . . . and hence I have no reason for supposing my brain to be composed of atoms” (Possible Worlds).  Such scientism, furthermore, assumed a metaphysical materialism that Lewis often labeled “Naturalism.”  Addressing this phenomenon, Lewis made a simple generalization that was neither simplistic nor hasty, differentiating Naturalism from Supernaturalism.  He insisted (in Miracles) that thinkers like Carl Sagan, who declared that “the cosmos is all that is or ever was or ever will be,” take the philosophical position that only Nature exists (ontological materialism) and irrationally restrict the realm of Reality to atoms in space.  Yet if our reasoning were itself nothing but the product of such non-rational material causes, we would have no grounds for trusting it.  Consequently, it makes sense to believe in a Supernatural Reality, namely God.  Apart from and above matter there must be Mind.

Lewis’s concerns permeate The Nature of Nature:  Examining the Role of Naturalism in Science, ed. Bruce L. Gordon and William A. Dembski (Wilmington, DE, c. 2011), a 1000-page collection of 41 essays by noted scholars (both atheists such as Francis Crick and theists such as William Lane Craig, philosophers such as Alvin Plantinga and scientists such as Steven Weinberg) who attended a conference at Baylor University in 2000.  What Lewis sensed, as Steve Fuller writes in his Foreword to this volume, is “the inadequacy of an unreflexive naturalism to explain the aspirations” of science itself.  In particular, the hard-line, reductionistic materialism espoused by so many in the past has increasingly proved “problematic” (p. xiii), for “it is clear that we have moved a long way from the idea that nature can be understood as if it were the product of no intelligence at all” (p. xiv).

Leading off the book’s essays, Bruce Gordon provides, in “The Rise of Naturalism and Its Problematic Role in Science and Culture,” 60 pages of extensively documented historical details needed to understand the subject.  “Philosophical naturalism,” he argues, “undermines knowledge and rationality altogether, ultimately leading to the instrumentalization of belief and the fragmentation of culture” (p. 1).  He emphasizes—as have Richard Weaver and other noted historians—the influence of Medieval nominalism (which rejected logical realism) in shaping modern science, giving impetus “to empiricism and an explanational preoccupation with material mechanism” (p. 7).  William of Ockham, subordinating God’s intellect to His will, denied He “has an essential nature, and opened the door to divine arbitrariness and universal possibilism, the view that there are no necessary truths, not even logical ones, constraining divine action” (p. 7).  Nominalism also supported empiricism’s attention to particulars rather than universals, leading to an anti-essentialism which incubated philosophical naturalism and the many varieties of relativism that now dominate the intellectual scene.   

This need not have happened, Gordon says, if scientists had retained the transcendent basis for their inquiries that characterized many of the great Christian scientists who reverently read the book of nature.  Contrary to many popular presentations which dismiss (in accord with Edward Gibbon and Carl Sagan) the “Dark Ages” and conjure up a millennium of conflict between religion and science, diligent historians such as Rodney Stark insist we understand and celebrate the remarkable achievements of Medieval Christians, who “provided fertile metaphysical, epistemic, sociocultural, and economic ground for scientific theorizing and experimentation” (p. 20).  The highly vaunted “scientific revolution” of the 17th century was less a revolution than a “continuous logical outworking—derived from developments in scholastic philosophical theology and medieval technological invention—that reached its consummation in this historical period” (p. 22).

The “problematic” aspects of these developments clearly appear, Gordon argues, in Darwin’s “theory of universal common descent, which purports to explain speciation in the history of life solely by means of natural selection acting on random variation in populations.  ‘Random,’ of course, means exactly that:  objectively undirected and therefore without discernible purpose” (p. 24).  Thus Richard Dawkins rejoices “that ‘although atheism might have been logically tenable before Darwin, Darwin made it possible to be an intellectually fulfilled atheist’” and “Daniel Dennett describes Darwinism as a ‘universal acid’ that ‘eats through just about every traditional concept, and leaves in its wake a revolutionized world view’” (p. 25).  Rightly evaluated, Gordon insists:  “Darwinism—as an expression of metaphysical purposelessness—has been an indispensable contributor to the spread of secularism in Western society, an undeniable force in our sense of cultural and existential arbitrariness, a logical antecedent to our inevitable embrace of moral relativism, hedonism, and utilitarianism, and a prodigious catalyst for the broader cultural experience of meaninglessness, especially among the younger generations” (p. 26).  If true, quite an indictment!

Gordon’s essay clearly draws important distinctions separating the contributors to this volume.  Some (e.g. Christian de Duve and Francis Crick) are devoted to the scientism and reductionistic naturalism C.S. Lewis attacked.  Others, notably David Berlinski, espouse an agnosticism open to the possibility of a Cosmic Mind.  Still others, including the book’s editors (Gordon and Dembski), Stephen Meyer, J.P. Moreland, Dallas Willard, and William Lane Craig, make the case for Intelligent Design as a viable way to see the totality of Reality as eminently intelligible (Mind-Designed) by rational minds.

Christian de Duve, a Nobel Prize winner, declares there is no ultimate immaterial “mystery” beyond empirical calculation.   “Science is based on naturalism, the notion that all manifestations in the universe are explainable in terms of the known laws of physics and chemistry.”  This is the “cornerstone of the scientific enterprise,” without which there can be no real scientific research (p. 346).  Scientists such as he have made incredible progress explaining the world in accord with rigidly naturalistic presuppositions.  “It is hardly an exaggeration to say that we have come to understand the fundamental mechanisms of life” (p. 347), and though there remains work to be done “nothing so far has been revealed that is not explainable.”  Consequently:  “There is no justification for the view that living organisms are made of matter ‘animated’ by ‘something else’” (p. 347).  Admittedly there is as yet no explanation for the origin of life.  Perhaps organisms drifted in from outer space (“directed panspermia”)—or perhaps “spontaneous chemical processes” in primeval slime spit out living cells.  But never fear, says de Duve, keep the faith—in time researchers will explain it all in strictly “naturalistic terms.  The fact that the details of this long history have not yet been unveiled is hardly proof that it could not have happened” (p. 350).  So long as it is not at all supernatural, any conjecture is allowed!  

David Berlinski, however, finds dogmatic positions such as de Duve’s less than persuasive.  One of the most gifted advocates of Intelligent Design, Berlinski, a Princeton-educated mathematician, is an unobservant “secular Jew.”  His genial agnosticism extends to both theology and science, and he espouses a sustained commitment to evidence and reason.  Above all he detests illogic and has deftly defused assorted atheistic arguments in The Devil’s Delusion:  Atheism and Its Scientific Pretensions.  An astute scientist, he acknowledges his discipline’s limitations and thinks atheists who declare their religious faith under the pretense of scientific certainty demonstrate little more than their own confusions.  His essay, “On the Origins of Life,” recounts the various failed proposals advanced during the past two centuries (most notably the Miller-Urey hypothesis) to explain how life began on planet earth.  Recent developments, unveiling the mysterious roles of DNA and RNA in producing proteins and transcribing information, have injected novel perspectives into the discussion.

In sum, Berlinski says:  “On the level of intuition and experience, these facts suggest nothing more mysterious than the longstanding truism that life comes only from life.  Omnia viva ex vivo, as Latin writers said.  It is only when they are embedded in various theories about the origins of life that the facts engender a paradox, or at least a question:  in the receding molecular spiral, which came first—the chicken in the form of DNA, or its egg in the form of various proteins?  And if neither came first, how could life have begun?” (p. 281).  Indeed, one of the co-discoverers of DNA, Francis Crick, despite his atheism, grants that “‘an honest man, armed with all the knowledge available to us now, could only state that, in some sense, the origin of life appears at the moment to be almost a miracle’” (p. 281).  Earlier “conjectures about the pre-biotic atmosphere were seriously in error” (p. 283), and speculations concerning how various inorganic elements could combine to form organic life-forms remain precisely that—speculations akin to invoking magic or the “evolutionary biologist’s finest friend:  sheer dumb luck” (p. 286).

Dumb luck, however, looks most improbable to Berlinski, whose mathematical mind grasps the odds against a chance-ruled universe.  As of now, no laboratory has produced a self-replicating RNA ribozyme, basic to life.  We know, however, as geochemist Gustaf Arrhenius explains, that “the minimum length or ‘sequence’ that is needed for a contemporary ribozyme” involves some 100 nucleotides.  Consequently, “Arrhenius notes, there are 4^100, or roughly 10^60, nucleotide sequences that are 100 nucleotides in length.  This is an unfathomably large number.  It exceeds the number of atoms contained in the universe, as well as the age of the universe in seconds.  If the odds in favor of self-replication are 1 in 10^60, no betting man would take them, no matter how attractive the payoff, and neither presumably would nature” (p. 286).  In short, the “just so” stories told by all too many scientists are rooted in unsubstantiated assumptions and most likely untrue!
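The arithmetic behind that figure is easy to check; the following quick calculation is my own illustration, not Berlinski’s or Arrhenius’s, but it confirms the order of magnitude:

$$4^{100} \;=\; 2^{200} \;=\; \left(2^{10}\right)^{20} \;\approx\; \left(1.024 \times 10^{3}\right)^{20} \;\approx\; 1.6 \times 10^{60},$$

so a blind search among 100-nucleotide sequences does indeed face on the order of 10^60 possibilities.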

Stephen C. Meyer, in “DNA:  The Signature in the Cell” (recently expanded into a book entitled Signature in the Cell:  DNA and the Evidence for Intelligent Design), shares Berlinski’s critique of the “just so” stories pervading mainstream biology and sets forth a meticulous argument for Intelligent Design as the best explanation for the mystery of life.  He points out that though “the most common popular naturalistic view about the origin of life is that it happened exclusively by chance,” most “all serious origin-of-life researchers now consider ‘chance’ an inadequate causal explanation for the origin of biological information” (p. 304).  The more we understand the amazing, information-laden complexity of living organisms the more doubtful it appears that they could be the simple result of chance chemical reactions.  “The odds of getting a functional protein of modest length (150 amino acids) by drawing a compound of that size from a pre-biotic soup is no better than one chance in 10^164.   In other words, the probability of constructing a rather short functional protein at random becomes so small . . . as to appear absurd on the chance hypothesis” (p. 307).   Given the astronomical odds, it takes truly astronomical (or blind) faith to believe in an accidental world.  Importantly, Meyer insists, there’s information embedded in all that lives, and we inevitably attribute information to conscious, intelligent minds; “what philosophers call ‘agent causation,’ now stands as the only cause known to be capable of generating large amounts of information starting from a nonliving state.  As a result, the presence of specified information-rich sequences in even the simplest living systems would seem to imply intelligent design” (p. 323).
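To get a sense of the scale involved, consider the raw combinatorial space alone; this is my own back-of-the-envelope illustration, not Meyer’s calculation.  With 20 standard amino acids, the number of possible chains 150 residues long is

$$20^{150} \;=\; 10^{\,150\,\log_{10}20} \;\approx\; 10^{\,150 \times 1.301} \;\approx\; 1.4 \times 10^{195},$$

a figure that dwarfs the roughly 10^80 atoms in the observable universe.  Meyer’s one-chance-in-10^164 estimate concerns any functional protein of that length rather than one specified sequence, but the improbability he describes is of the same staggering kind.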

The philosophical differences between the contributors to The Nature of Nature stand revealed in a section devoted to ethics and religion.  Michael Ruse is an amiable philosopher who was reared as a Quaker but turned agnostic in his 20s.  He appeared as a major witness arguing against allowing “creation science” to be taught in public schools in the 1981 case (McLean v. Arkansas), leading the federal judge to declare it unconstitutional.  In 2001 he joined other scholars in delivering the Gifford Lectures (published in The Nature and Limits of Human Understanding), applying Darwinism to epistemology and ethics.  He finds much to admire in Christian ethics but rejects all theological claims and deplores any notion of Intelligent Design.  Above all he devoutly believes in Evolution and endeavors to apply it to ethics.      

In “Evolution and Ethics” Ruse argues:  “Ethics is an illusion put in place by natural selection to make us good cooperators” (p. 855).  He admires much of the “normative ethics” espoused by theists but insists it can be justified along the purely naturalistic lines he assumes, without support from any metaphysical or theological “substantive ethics” (p. 855).  Rejecting the “Social Darwinism” of earlier evolutionists such as Herbert Spencer, he embraces such maxims as “love your neighbor as yourself” as “precisely” what he would “expect natural selection to have promoted” (p. 856).  “Morality is an adaptation like hands and teeth and penises and vaginas” (p. 858).  All we can actually say about morality is that it is what it is!   Rather than the “selfish gene” famously celebrated by Richard Dawkins, Ruse gratuitously postulates an “altruistic gene” driving the evolutionary process.  We can hardly refrain from loving our neighbor since it’s the natural thing to do.  Any higher source for our ethical convictions, any “substantive morality is a kind of illusion” genetically ordered to facilitate our social needs (p. 858).

This is, Ruse admits, “just a statement rather than a proof” (p. 859).  But it is a statement that squares with his commitment to “Darwinian metaethics,” rooted in the conviction that “evolution is a directionless process, going nowhere rather slowly” (p. 860).  Consequently, just as there is no Cosmic Designer there is no ultimate reason (no Logos) that we should love our neighbor rather than abuse him other than the fact that we have evolved with altruism rather than vindictiveness in our genes.  “In other words, my position is that, if you stay with naturalism, then there is no foundation, and in this sense substantive ethics is an illusion” (p. 862), a matter of psychology (as David Hume insisted) rather than metaphysics.  In sum:  “I regard my position as that of David Hume—brought up to date via the science of Charles Darwin” (p. 862).  

On the contrary, Dallas Willard, a philosophy professor at the University of Southern California, insists, in “Naturalism’s Incapacity to Capture the Good Will,” that without an Ultimate Supernatural Source there can be no ethical standards.  Philosophical “naturalism as a worldview lives today on promises,” assuring us that in time scientists will “show how all personal phenomena, including the moral, emerges from the chemistry (brain, DNA) of the human body” (p. 876).  “But after three hundred years or so of promises to ‘explain everything,’ the grand promises become a little tiresome, and the strain begins to show” (p. 876).  Carefully analyzed, naturalistic positions that insist only physical entities are real cannot be philosophically justified.  Indeed, “even if we regard naturalism as merely a human proposal” (such as, I would add, the one adumbrated by Michael Ruse), “we must still raise the issue of whether straightforward physicalism (the only version of naturalism that makes sense) can deal with ethical phenomena or provide an adequate interpretation of the moral life and moral principles” (p. 869).  Manifestly, says Willard, it cannot!

What’s needed instead is a robust recovery of the great ethical tradition of Western Civilization, imbued with both classical and Christian perspectives and marking the works of “late nineteenth-century thinkers such as Sidgwick, Bradley, and especially T.H. Green” (p. 873).  Sadly enough, during the 20th century, as Alasdair MacIntyre said in After Virtue, “‘we have—very largely, if not entirely—lost our comprehension, both theoretical and practical, of morality’” (p. 872).  Due to the intellectual coup of Hume and Darwin, Ruse and Crick, we now live in a society “where there is no moral knowledge that is publicly accessible in our culture, i.e., that could be taught to individuals by our public institutions as a basis for their development into morally admirable human beings who can be counted on to do the ‘right thing’ when it matters” (p. 877).  To cultivate such persons, only a return to Plato and Aristotle, Augustine and Aquinas will suffice.  

The final essay in The Nature of Nature is William Lane Craig’s “Theism Defended.”   One of the most erudite evangelical scholars now writing, Craig has focused his studies on the intersection of science and theology.  He begins by noting a recent, remarkable intellectual transformation—“a veritable revolution in Anglo-American philosophy that has transformed the face of their discipline” (p. 901).  It became clear, in quantum physics especially, that “physics is filled with metaphysical statements that cannot be empirically verified.”  As philosopher of science Bas van Fraassen nicely puts it:  “Do the concepts of the Trinity [and] the soul . . . baffle you?  They pale beside the unimaginable otherness” of such purely postulated realities as quarks and strings and dark matter (p. 902).    Metaphysics, consigned to the dustbin of history by logical positivists, suddenly sprang to life, accompanied by “a renaissance in Christian philosophy” (p. 902).  Academically respectable Theists stepped forth in major universities and publications.  As an atheistic philosopher, Quentin Smith, lamented:  “‘God is not “dead” in academia; he returned to life in the late 1960s and is now alive and well in his last academic stronghold, philosophy departments’” (p. 903).  Such philosophers (including Craig) have reconsidered, rephrased and revived traditional arguments for the existence of God.

Craig then leads readers through these traditional arguments—contingency, cosmological, teleological, moral, and ontological—and explains why they are as valid as ever.  Though natural science has radically changed since the 13th century, the philosophy of that century’s greatest thinker, St. Thomas Aquinas, remains amazingly astute and Supernaturalism remains an eminently viable Weltanschauung.   Adding a healthy component to modern science, Plato and C.S. Lewis, Dallas Willard and William Lane Craig stand comfortably positioned as persuasive witnesses to what’s really Real.  And it’s more than mere matter!

226 “The Burden of Bad Ideas”

One of this nation’s most eminent playwrights, David Mamet, occupies an elevated position in contemporary culture.  Awarded the Pulitzer Prize for Glengarry Glen Ross and acclaimed for such plays as American Buffalo, responsible for such films as The Verdict and Wag the Dog (both nominated for an Academy Award), he easily embraced, early on, the regnant liberal ethos of his peers.  As a young writer, he “never questioned my tribal assumption that Capitalism was bad, although I, simultaneously, never acted upon these feelings,” earning a good living because of a nicely-functioning free market.  In time, however, he encountered challenges to his blithe assumptions in works such as Friedrich Hayek’s The Road to Serfdom.  “The great wickedness of Liberalism, I saw, was that those who devise the ever new State Utopias, whether crooks or fools, set out to bankrupt and restrict not themselves, but others” (p. 9).  He then wrote a “political play” that subtly reflected his shifting convictions, and that led to an invitation from the Village Voice to write an article on it.  He titled his article “Why I Am No Longer a Brain-Dead Liberal,” which led to an invitation to expand upon the theme.  Consequently Mamet wrote The Secret Knowledge:  On the Dismantling of American Culture (New York:  Sentinel, c. 2011).

Composed of 39 short chapters, The Secret Knowledge is more a compilation of thoughts than a carefully constructed treatise.   He touches on subjects ranging from environmentalism to Israel, from literature to sex, that I’ll not address.  But let me illustrate, in this review, Mamet’s concern with Liberalism’s socialistic schema.  Though it only dawned on him lately, he has awakened to the fact that the Nazis and Italian Fascists and Russian Bolsheviks all “believed, in their beginnings, in Social Justice, and the Fair distribution of goods.  But these sweet ideas are encumbered in execution by the realization that someone, finally, has to do the work; their adamant practice will quite soon reveal this:  ‘Oh.  We will need slaves’” (p. 32).  The American Left, though quick to disavow the havoc done by earlier Leftists (Nazis, Fascists, Bolsheviks), cannot escape the link that binds them together:  socialism, which has become a religion, “the largest myth of modern times” (p. 41).  “Liberalism is a religion.  Its tenets cannot be proved, its capacity for waste and destruction demonstrated.  But it affords a feeling of spiritual rectitude at little or no cost” (p. 81).  In accord with Liberal dogma, “the prime purpose of Government is to expand Equality, which may also be stated thus:  to expand its own powers” (p. 92).

Furthermore:  “The baby boomer generation, my own, is content, if of the Left, to live out our remaining years upon the work and upon the entitlements created by our parents, and to entail the costs upon our children—to tax industry out of the country, to tax wealth away from its historical role and use as the funder of innovation” (p. 43).  These aging boomers still dream of the perfect Commune, a Return to Nature, the abolition of property and marriage, a world of untrammeled self-expression wherein no one “works” but “shares” the surfeit of society.  “It is,” Mamet insists, “only in a summer camp (College or the hippie commune) that the enlightened live on the American Plan—room and board included prepaid—and one is free to frolic all day in the unspoiled woods” (p. 141).  Indeed, “Liberalism is a parlor game, where one, for a small stipend, is allowed to think he is aiding starving children in X or exploited workers in Y, when he is merely, in the capitalist tradition, paying a premium, tacked onto his goods, or subtracted from his income, for the illusion that he is behaving laudably (cf. bottled water)” (p. 141).  In short, the “Socialist vision” is, like a magician pulling a rabbit out of a hat, “a trick” (p. 172).

Rather than magicians, the Socialists have demagogic politicians.  “Demagoguery is the attempt to convince the People that they can be led into the Promised Land—it is the trick of the snake oil salesmen, the ‘energy therapists,’ the purveyors of ‘health water,’ and on the other side of the spectrum, the politician and that dictator into which he will evolve absent a vigilant electorate willing to admit its errors” (p. 193).  Did we think rightly, we’d detect “the similarities between ‘Lose Weight Without Dieting,’ and ‘Hope.’  The magicians say the more intelligent the viewer is, the easier he can be fooled.  To put it differently, the more educated a person, the easier it is to engage him in an abstraction” (p. 193).  For Mamet:  “It has taken me rather an effort of will to wrench myself free from various abstractions regarding human interaction.  A sample of these would include:  that poverty can be eradicated, that greed is the cause of poverty, that poverty is the cause of crime, that Government, given enough money, can cure all ills, and that, thus, it should be so engaged” (p. 193).

The path the leftist boomers (such as Mamet in his youth) follow was identified by Hayek as “The Road to Serfdom.  And we see it in operation here, as we are in the process of choosing, as a society, between Liberty—the freedom from the State to pursue happiness, and a supposed but impossible Equality, which, as it could only be brought about by a State capable and empowered to function in all facets of life, means totalitarianism and eventual dictatorship” (p. 61).  Egalitarian Liberals constantly stress the importance of sympathy and compassion, of caring for others.  Translated into political action, however, these feelings frequently prove destructive, fully evident when Big Government imposes its agenda.  “The judge who forgot the admonition in Proverbs, ‘Do not favor the rich, neither favor the poor, but do Justice,’ who set aside the laws, or who ‘interpreted’ them in a way he considered ‘more fair,’ was, for all his good intentions, robbing the populace of an actual possession (the predictability of the legal codes).  He was graciously giving away something which was not his” (p. 151).   Good intentions can never suffice!  But they “can lead to evil—vide Busing, Urban Renewal, Affirmative Action, Welfare, et cetera, to name the more immediately apparent, and not to mention the, literally, tens of thousands of Federal and State statutes limiting freedom of trade, which is to say, of the right of the individual to make a living, and, so earn that wealth which would in its necessary expenditure, allow him to provide a living to others” (p. 151).

Much that’s wrong with today’s Left, Mamet thinks, stems from a decision to ignore traditional canons of “justice” so as to impose a newly-exalted “social justice,” which can only mean, as Hayek wrote, “State Justice” (p. 46).  Mamet acknowledges “that though, as a lifelong Liberal, I endorsed and paid lip service to ‘social justice,’ which is to say, to equality of result, I actually based the important decisions of my life—those in which I was personally going to be affected by the outcome—upon the principle of equality of opportunity; and, further, that so did everyone I knew” (p. 154).  Inevitably, “social justice” leads to “redistributive justice,” whereby the State “confiscates wealth accumulated under existing laws and redistributes it to those it deems worthy” (p. 46).

“To the Left it is the State which should distribute place, wealth, and status.  This is called ‘correcting structural error,’ or redressing ‘the legacy of Slavery,’ or Affirmative Action, or constraining unfair Executive Compensation; it is and can only be that Spoils System which is decried at the ward level as ‘cronyism,’ and lauded at the national level as ‘social justice’” (pp. 46-47).    “Government programs of confiscation and redistribution are called the War on Poverty, or the New Deal, or Hope and Change, but that these programs are given lofty names” (p. 153) guarantees nothing.  Still more, a State striving to ensure social justice becomes dictatorial, for it is assumed “that there is a supergovernmental, superlegal responsibility upon the right-thinking to implement their visions” (p. 153).  “This progression, from Social Justice to Judicial Activism and control of means of production and distribution, can be seen . . . wherever the Socialists took power and brought terror and yet the Left, longing for the campfire, votes for collectivism, for better and more powerful and more ‘feeling’ Government” (p. 93).

Rather than “social justice,” Mamet urges us to recover a commitment to the rule of law, for “The awe and majesty of the Law are our basic inheritance of freedom.  Without these nothing can exist in Freedom:  here is the bright line, stay to the correct side and the community will protect you, venture across, and you will be at the mercy of its other name, the State” (p. 219).  

* * * * * * * * * * * * * * * * * * * *

During the past two decades no journalist, says George Will, “has produced a body of work matching that of Heather Mac Donald.”  With degrees in literature from Yale and Cambridge universities, plus a law degree from Stanford, she brings unique credentials and scholarly depth to her essays, generally dealing with poverty and education and published in New York’s City Journal.  In a collection of some of her essays—The Burden of Bad Ideas:  How Modern Intellectuals Misshape Our Society (Chicago:  Ivan R. Dee, c. 2000)—she documents the harm done to the recipients of social engineering.   “These essays record,” Mac Donald says, “my travels through institutions that have been perverted by today’s elite intellectual orthodoxy, from an inner city high school that teaches graffiti-writing for academic credit . . . to the Smithsonian Institution, now in thrall to a crude academic multiculturalism; from New York’s Dantean foster care system to Ivy League law schools that produce ‘scholarship’ urging blacks to view shoplifting, and pilfering from an employer, as political expression” (p. xi).  

In “The Billions of Dollars That Made Things Worse” Mac Donald explores the impact of philanthropic foundations such as Carnegie and Ford which long ago abandoned their founders’ aspirations (e.g. Carnegie libraries) and now see themselves as agents of social change, funding radical “community activists” around the country, seeking to transform “a deeply flawed American society” (p. 4).  “When,” for example, “McGeorge Bundy, former White House national security advisor, became Ford’s president in 1966, the foundation’s activism switched into high gear.  Bundy reallocated Ford’s resources from education to minority rights” and “created a host of new advocacy groups, such as the Mexican-American Legal Defense and Educational Fund” and “the Native American Rights Fund, that still wreak havoc on public policy today” (p. 9).  The foundations have routinely provided the funds to establish social justice centers on university campuses devoted to race, class, and gender.  They also have subsidized public interest litigation, enabling legions of lawyers to push for bilingual education, voter rights, racial quotas, sexual equality, prisoners’ rights, etc., all designed to  “establish in court rights that democratically elected legislatures have rejected” (p. 20).  

Paralleling the changes in powerful foundations have come similar changes in powerful media, preeminently evident in the New York Times.  Whereas the paper Adolph Ochs bought in 1896 was devoted to sound money, low taxes, and “no more government than is absolutely necessary to protect society, maintain individual and vested rights, and assure the free exercise of a sound conscience” (p. 39), a century later it championed precisely the opposite positions.  Charting the ways poverty has been portrayed in the Times, Mac Donald shows how appeals for individual charity early in the 20th century shifted to demands for an ever-expanding welfare state.  With the passing decades, “elite opinion came to see the cause of poverty not in individual character and behavior but in vast, impersonal social and economic forces that supposedly determined individual fate” (p. 26).  No longer were individuals (including the poor) held accountable to moral standards, which were discarded in favor of a psychoanalytic model.  Distinctions between the “undeserving” and “deserving” poor disappeared from the Times’s pages.  Bad luck rather than bad character explained the plight of the city’s burgeoning welfare recipients.  In her judgment, one of the paper’s editorials cogently summarized the cultural elite’s agenda:  “first, to deny that welfare had become a trap and that conditions in the inner city reflected a moral, as much as an economic, decline; second, to disparage as greedy, unfeeling, and possibly even racist those who questioned the welfare status quo; and third, to insist that individuals acted not of their own free will but because of environmental conditions beyond their control” (p. 36).  None of these ideas, Mac Donald insists, is true.  Yet, sadly enough, as they have been imposed on the poor they have brought much misery.

Some of the most trenchant essays in The Burden of Bad Ideas deal with one of Mac Donald’s main concerns:  education.  In “Law School Humbug” she dissects the current mission of elite institutions—to purge all racism, sexism, and classism from American society.  Teasing out the implications of the pragmatic jurisprudence of Oliver Wendell Holmes, numbers of law professors espoused varieties of Legal Realism and Critical Legal Studies, producing law review articles devoted to race and feminist theory that “have dispensed with the conventions of legal scholarship—case analysis, statement of legal problem followed by suggestions for its resolution—in favor of personal anecdotes telling of the author’s oppression” (p. 68).  Turning to graduate programs in education in “Why Johnny’s Teacher Can’t Teach,” she probes the mysterious innards of teacher education, which “has been in the grip of an immutable dogma, responsible for endless educational nonsense.  That dogma may be summed up in the phrase:  Anything But Knowledge” (p. 82).  A series of hot items—be it self-esteem or community-building or social justice or saving the planet—quickly becomes the educrats’ theme of the day.  Confidently committed to inducing “critical thinking,” teachers embrace “the central educational fallacy of our time:  that one can think without having anything to think about” (p. 85).  

The titles of other essays—“Compassion Gone Mad,” “Foster Care’s Underworld,” and “Homeless Advocates in Outer Space”—indicate the scope of Mac Donald’s authorial lens, and she successfully pillories many of the conventional liberal ideas that so shape public policy not only in New York but throughout the country.  Refuting the “bad ideas” of the intelligentsia are the realities of a world wherein three things seem clear.  “First was the depth of the dysfunction that I often saw—the self-destruction wrought by drugs and alcohol and promiscuity, the damage inflicted on children by a world from which the traditional family had largely disappeared (though throughout the most troubled neighborhoods I found individuals of extraordinary moral strength fighting for order).  Second was the extent to which government programs shaped life in the ghetto, influencing the choices that individuals made and distorting the forms the social interaction took.  Finally, I was continually amazed by the trenchancy with which those I interviewed could judge their situations and the policies that had gone into making them.  If you want to know how well social policies are working, I learned, ask the poor—when their advocates weren’t around” (pp. vii-viii).  

* * * * * * * * * * * * * * * * * * *

Two decades ago, in The Dream and the Nightmare:  The Sixties’ Legacy to the Underclass (San Francisco:  Encounter Books, 2000, c. 1993), Myron Magnet, the editor of the City Journal, identified “culture, not racism or lack of jobs or the welfare system” as the source of the ominous social crisis we now face.  Slogans from the Sixties (e.g. “if it feels good, do it”) turned toxic when absorbed by the underclass (p. 1).   Eminent baby boomers with their “new morality,” and ambitious cultural elites—the Bill Clintons and John Kerrys, the Bill Ayers and Al Gores—the “Haves” in Magnet’s presentation, “radically remade American culture, turning it inside out and upside down to accomplish a cultural revolution whose most mangled victims turned out to be the Have-nots” (p. 14).  

These privileged Haves have presided over a Nietzschean transvaluation of all values—“letting it all hang out” and “doing it” without any commitment to delayed gratification—that has locked millions in poverty.  Now empowered, this cultural elite—professors and journalists, judges and movie moguls, armed with such texts as Michael Harrington’s The Other America and John Rawls’s A Theory of Justice—vainly wants to “help” the poor!  They mistakenly believe that economic inequities are the “root causes” of all social problems.  However, Magnet asserts:  “The bitter paradox that is so hard to face is that most of what the Haves have already done to help the poor—out of decent and generous motives—is part of the problem.  Like gas pumped into a flooded engine, the more help they bestow, the less able do the poor become to help themselves.   The problem isn’t that the Haves haven’t done enough but that they’ve done the diametrically wrong thing” (p. 21).  “In particular, one belief central to the new culture of the Haves has wreaked incalculable mischief:  the idea that the poor are victims, that poverty is in itself evidence of victimization” (p. 121).  William Blake, two centuries ago, “spoke of ‘mind-forg’d manacles’—the ideas engendered from within one’s own imagination that one invests with power enough to enslave oneself.  Victimization is one such idea” (p. 157).  

The “underclass” that concerns Magnet is worse than poor, for many folks slip in and out of the “poverty” category.  The underclass he considers represents perhaps 2 percent of the population; it “didn’t begin to crystallize as a major American problem until the mid-1960s,” and in the next decade “it tripled in size” (p. 41).  Inexplicably, during economically prosperous times and in conjunction with the many successes of the civil rights movement, “black men suddenly, startlingly, and in ever-increasing numbers began to drop out” of the labor force, and within a decade, “when the underclass had emerged as an obdurate fact, black participation was 8.4 percentage points lower than white participation, a difference that statisticians find colossal” (p. 44).  They also cast aside family responsibilities, sparking a soaring rate of female-headed households that testifies to a massive cultural collapse, sinking into an expanding welfare system—food stamps, Medicaid, AFDC, etc.—which “has been a particularly insidious snare” (p. 57).  “Nothing tells these young women that getting pregnant without being married and having illegitimate babies they can’t support and aren’t equipped to nurture well is wrong.  The culture they live in, both the larger culture and the culture of the underclass, tells them that a life on welfare is perfectly acceptable and, arguably, just as good as the other kind of life” (p. 60).  

The presence of the homeless in our “streets, parks, and train stations in the heart of our cities,” Magnet argues, illustrates “the most extreme and catastrophic failure of the cultural revolution of the Haves and the social policies that resulted from it” (p. 83).  All too many of the homeless are criminals or mentally ill folks who ought to be institutionalized.  In 1963 President Kennedy naively “persuaded Congress to establish community mental health centers for the seriously mentally ill,” and within a decade distributing Thorazine and enrolling clients in various Great Society programs replaced the “insane asylums” of earlier years.   Sad to say:  “advanced ideas about personal liberation came together with advanced ideas about political enfranchisement to create a climate of opinion and a body of social policy that harmed those at the bottom of society in the name of doing them good” (p. 85).  The freedom that sounded so sweet in boomers’ seminars led to a toxic free-for-all chaos on the streets.  

Significantly, such lawlessness now enjoys legal protection, as advocates of the “Living Constitution”—lawyers and judges—follow the rationale of the 1954 Brown v. Board of Education school desegregation decision to implement “the new culture’s vision of change and liberation” (p. 184).  Superficially admirable, the judiciary’s rulings decreed that “nondiscrimination by race would have to give way to discrimination by race” (p. 192), yielding counterproductive plans such as busing and affirmative action.    

225 Christopher Dawson

While in graduate school I fortuitously encountered the works of Christopher Dawson, who significantly shaped my understanding and subsequent teaching of Western Civilization as meaningful only as a manifestation of the Christian Culture that emerged in the Medieval World. To him, a providential perspective on the study of history was fully justified: “‘Whatever else is obscure, it is certain that God is the Governor of the universe and behind the apparent disorder and confusion of history there is the creative action of divine law.’” Committed to this endeavor, he acknowledged that he “‘had to follow my own line of studies and plough a lone furrow for thirty-five years’” because “‘the subject to which I have devoted myself—the study of Christian culture—has no place in education or in university studies’” (p. 10).

In Sanctifying the World: The Augustinian Life and Mind of Christopher Dawson (Front Royal, VA: Christendom Press, c. 2007), Bradley J. Birzer, a professor at Hillsdale College, provides us both a biographical study and an intellectual analysis of one of the major 20th century thinkers, rightly praised by notables such as T.S. Eliot, C.S. Lewis, Etienne Gilson and Thomas Merton. Standing beside them, witnessing the upheavals of the 20th century, alarmed that the “‘dark forces that have been chained by a thousand years of Christian civilization . . . have now been set free to conquer the world’” (p. 11), Dawson necessarily assumed the role of a “Jeremiah, prophesying lament and doom as the world followed down the paths of the various ideologues. But he also played the role of the saint, using his considerable intellectual gifts to demonstrate the necessity of virtue and the light of the Logos to the modern world through his writing, his teaching, and his public speaking” (p. vii).

Born to a distinguished family in 1889 in a Welsh castle, Dawson grew up in a largely pre-mechanical rural world. “Distrust of urban areas and masses of men would haunt Dawson for his entire life and greatly shape many of his views on culture, politics and society” (p. 20). Educated mainly by private tutors, he learned largely through exploring his family’s estate and extensive personal reading. Though his family was Anglo-Catholic, he converted to Catholicism in 1914, having embraced an Augustinian philosophy of history while attending Oxford University. Subsequently he devoted himself, for a decade, to the reading and note-taking necessary for “a writing career as a historian and general man of letters” (p. 28). Properly prepared, he began publishing a profusion of articles and books—more than a hundred all told—designed to remind his readers of the formative role of Christianity in the making of European civilization.

Though generally uninterested in academic appointments, he accepted (at the age of 69) a position at Harvard University as the first occupant of the Chauncey Stillman Chair in Catholic Studies, where he taught for four years until poor health demanded his retirement. Despite this university appointment, however, Dawson generally considered himself a writer committed to describing and explaining the role of Christianity in history. Thus, following his university years he joined a small circle of thinkers committed to “the Aristotelian/Thomist understanding of order” in society. They further embraced the position of “Edmund Burke, who had stressed the need for the ‘moral imagination’—the ability to see clearly beyond the here and now into the reality of eternal forms—thus allowing one to order one’s soul, one’s present community, and one’s soul to the eternal community” (p. 50).  Such order, as St. Augustine insisted, could come only as God’s grace restored a fallen world to its divine plan, and the Church was the agency called forth “to sanctify the world, and the individual person—if properly ordered in his soul—plays a vital role in the process of sanctification” (p. 57).

Deeply influenced by St. Augustine, Dawson found powerful parallels between the fifth and twentieth century worlds, and to Aidan Nichols his “work is itself ‘best thought of as a latter-day City of God’” (p. 66). Both men sought to affirm and advance classical, Christian culture amidst barbarian invasions—Vandals in Augustine’s fifth century world, secularized disciples of the French Revolution, the Deists and doubters and relativists in Dawson’s day. Both did so with words, which they thought more powerful than swords or plows, rifles or computers. “With St. John, Dawson proclaimed the importance of the Word to the human person as well as to history and culture. As ‘little words’—that is, human persons as imago Dei—humans pass on their civilization through the rational use of language” (p. 84).

During the 1930s, Dawson joined like-minded writers seeking to awaken the English to the importance of religious faith and practice, contending that “‘the whole universe is, as it were, the shadow of God, and has its being in the contemplation or reflection of the Being of God’” (p. 110). He found in the Apostle John’s Logos theology the foundational truth that “Jesus ‘was not only the Christ, the Son of the living God; He was also the Divine Intelligence, the Principle of the order and intelligibility of the created world’” (p. 111). Discerning His Light we may conform both ourselves and our world to Him. Thinking thus, he said: “‘A Christian only has to be in order to change the world, for in that act of being there is contained all the mystery of the supernatural life’” (p. 112).

During the same decade he spoke out against some of the pernicious ideologies that were enlisting enthusiasts for centralized bureaucratic systems, critiquing not only Stalin’s Soviet Communism and Hitler’s German National Socialism but FDR’s New Deal as “‘a constitutional dictatorship’” (p. 124). Indeed, he thought: “‘It may be harder to resist a Totalitarian state which relies on free milk and birth-control clinics than one which relies on castor oil and concentration camps’” (p. 125). “The Europe of the 1930s, Dawson believed, faced the same fate as Republican Rome in 43 B.C. It would either die, or it would remake itself under a centralized government. In either case, it would never find any meaningful spiritual fulfillment” (p. 136). So it was a time, as Pope Leo XIII had earlier declared, for the Church to “‘set up a wall and a bulwark to save human society from falling back into barbarism’” (p. 133).

Barbarism, of course, had been vanquished as Christians patiently shaped European Civilization during the Middle Ages. Rooted in the cultures of Greece and Rome, Western Christian Culture preserved the best of antiquity. As Germanic barbarians inundated the Roman Empire, monks and missionaries such as St. Boniface patiently led them both to Christ and civilization, “creating what we would now recognize as the beginnings of Europe, a synthesis of the classical, Christian, and Germanic” (p. 167). Equally important, medieval scholars such as Aquinas (best popularized by Dante) understood “grace as a ‘new spiritual principle which transforms and renews human nature by the communication of the Divine Life'” (p. 174).

This medieval synthesis began dissolving in the 14th century, first under assault from nominalist thinkers and nation states, then from 16th century Protestant theologians and princes. The leading 14th century nominalist, “William of Occam, according to Dawson, played one of the most important roles in the breakup of Christendom and in the growth of nationalism. As ‘the leading mind of his age,’ Occam ‘was the initiator—the “venerable inceptor”—of the via moderna [nominalism] which took the place of the classical scholasticism of the 13th century—the via antiqua—as the accepted doctrine of the universities for nearly two centuries, down to the time of Luther’” (p. 178). To his theological nominalism Luther added a staunch nationalism and thus secured an unprecedented alliance between Church and State that tragically divided Europe into warring factions.

Those warring factions flared forth in WWII, a deeply distressing event to Dawson. During and after the war he continued to write articles and books (most notably what Birzer considers his best work, The Judgment of the Nations) pleading for a restoration of Western Christian Culture, heeding Pope Pius XII’s “call for a new Crusade, ‘to bring men back from the broken cisterns of national interest to the fountain of Divine justice’ and to promote a new and international understanding of the natural law” (p. 194). The 20th century witnessed, he wrote, “‘the unloosing of the powers of the abyss—the dark forces that have been chained by a thousand years of Christian civilization and which have been set free to conquer the world.’ Together, these dark forces have ‘the will to power.’ The darkest forces first emerged in the French Revolution, and then re-emerged in the Soviet Union, spreading ‘westward, into the very heart of Europe’” (p. 199). To do battle, all Christians (Protestants, Catholics, Orthodox) needed, above all, to employ the Sword of the Spirit, for only in His strength could the battles and wars be won.

Following the war, Dawson was honored by being asked to deliver the distinguished Gifford Lectures and appointed a professor at Harvard. His erudition and insight were rewarded with highly public recognition and an enduring legacy. Yet in many ways, he was an “oddity” immersed in Catholicism and committed to the Reality and role of God in His world. Summing up his fine, exhaustively researched and documented study, Birzer says: “He offered an Augustinian vision of culture and history to the twentieth century; he encouraged men and women to act like men and women in the best of the western tradition—through the virtue of love; he attacked the ideologues of the left and right as nothing more than false prophets promoting false religions and false gods; and, to revive the world through the imagination, he promoted a new and vigorous understanding of the liberal arts. . . .  He desired to sanctify the world, through grace, to embrace truth, beauty, and goodness” (p. 271).

* * * * * * * * * * * * * *

The Gifford Lectures, devoted to the subject of Natural Theology, in some ways resemble the Nobel Prize for thinkers in the history and philosophy of religion.  So Christopher Dawson was honored to be invited to deliver the lectures in 1947. He began with an anthropological and sociological analysis later published under the title Religion and Culture (London: Sheed & Ward, 1948). To explain his approach to Natural Theology he noted that until modern times it was simply a familiar aspect of Christian theology. Since the Renaissance, however, it has increasingly become a facet of humanism, with its scientific presumptions, rather than traditional faith-based reason.

As an historian Dawson insists we broaden our vision and take in the totality of human experience, acknowledging that from the beginning of recorded history “we can never find a time or place where man was not conscious of the soul and of a divine power on which his life depended” (p. 41). Rightly understood, “All religion is based on the recognition of a superhuman Reality of which man is somehow conscious and towards which he must in some way orientate his life. The existence of the tremendous transcendent reality that we name GOD is the foundation of all religion in all ages and among all peoples” (p. 25). Thus Natural Theology was rooted in the basic human awareness “that man has only to look out and to look up in order to see the manifest proofs of Divine power and wisdom” (p. 30).

Discerning this Divine Reality, people developed cultures in accord with it. Consequently, Dawson insists: “Religion is the key of history” (p. 50). “Religion and art are older than agriculture and industry. In the beginning was the word, and man was a seer and an artist before he was a producer” (p. 132). Above all we are by nature homo religiosus, and “every great historic culture, viewed from within through the eyes of its members, represents a theogamy, a coming together of the divine and the human within the limits of a sacred tradition” (p. 54). Furthermore, “every culture, even the most primitive, seeks, like the old Roman civic religion, to establish a jus divinum which will maintain the pax deorum, a religious order which will relate the life of the community to the transcendent powers that rule the universe. The way of life must be a way of the service of God” (p. 62).

Discerning the divine Design are prophets; propitiating the divine Power are priests; ordering society in accord with the divine Order are kings. “The Prophet is the organ of divine inspiration, the King is the organ of sacred power, but the Priest is the organ of knowledge—the master of sacred science” (p. 102). With enormous erudition—an encyclopedic knowledge of human history—Dawson illustrates the importance of religion in vibrant cultures. In sum: “It is the traditional teaching of Natural Theology that the elements of religious truth are common to the human race and accessible to every rational creature—that the Divine Being is the transcendent end towards which all the different ways of life converge and the divine law the universal norm by which all the different patterns of human behaviour can be co-ordinated” (p. 211).

Tragically, to Dawson, a “new scientific culture [that] is devoid of all positive spiritual content” has gained control of much of the world (p. 214). What emerged in the 20th century could hardly be considered a culture at all! Given the tyrannies and bureaucracies and nihilism everywhere evident in the West following WWII, our culture “may become the enemy of human life itself” (p. 215). Consequently: “We are faced with a spiritual conflict of the most acute kind, a sort of social schizophrenia which divides the soul of society between a non-moral will to power served by inhuman techniques and a religious faith and a moral idealism which have no power to influence human life. There must be a return to unity—a spiritual integration of culture—if mankind is to survive” (p. 217).

* * * * * * * * * * * * * * * * * * *

The cultural integrity needed for mankind to survive was resplendently evident, Christopher Dawson argued, in the Christian Culture of the High Middle Ages. He articulated this case in one of his best books, Religion and the Rise of Western Culture (Garden City, NY: Image Books, 1958; c. 1950 by Sheed & Ward), the second volume of his Gifford Lectures, delivered at the University of Edinburgh in 1948-1949. He declared (citing Lord Acton) that “‘Religion is the key of history’” (p. 15). The energy and creativity that have distinguished the West can only be explained in light of a spiritual vigor “independent of political power or economic prosperity” (p. 18). “The beginnings of Western culture,” Dawson says, “are to be found in the new spiritual community which arose from the ruins of the Roman Empire owing to the conversion of the Northern barbarians to the Christian faith” (p. 26). The missionary nature of Christianity shines forth in the patient work of monks and priests, soldiers and scholars, slowly making Europe Christian. They did so not with the intent of civilizing barbarians or orchestrating “social progress, but with a tremendous message of divine judgment and divine salvation. . . .  Only by way of the Cross and by the grace of the crucified Redeemer was it possible for men to extricate themselves from the massa damnata of unregenerate humanity and escape from the wreckage of a doomed world” (p. 35). Calling sinners to salvation, however, involved means whereby they could be sanctified. Thus liturgical worship and uplifting architecture and proper education were emphasized.

The men who made the West during the Medieval Era were mainly monks, doing the Opus Dei (work of God) by daily reciting “the divine liturgy of prayer and psalmody” (p. 48). Through them “religion exercised a direct formative influence on the whole cultural development of these centuries” (p. 44). Luminaries such as St. Boniface, the missionary “Apostle of Germany,” and Alcuin of York, who established Charlemagne’s school of the palace, bear witness to the grandeur of Medieval monks. They were “the watchmen or guardians who ‘kept the walls’ of the Christian City and repelled the attacks of its spiritual enemies” (p. 45). In addition to praying the monks worked—ora et labora. “It was the disciplined and tireless labour of the monks which turned the tide of barbarism in Western Europe and brought back into cultivation the lands which had been deserted and depopulated in the age of the invasions” (p. 53).

While Irish and Benedictine monks effectively converted the barbarians and established the Church in the West, equally effective representatives of Byzantium converted Magyars and Bulgars and Slavs and others in the East, dealing with “a series of Asiatic barbarian empires, which constituted a continual threat to the Balkan provinces and the capital itself” (p. 104). Especially important, radiating from Kiev following the conversion of Vladimir in 988, a vigorous Russian Orthodox community expanded and flourished. Indeed “the conversion of Russia opened a new channel by which Christian culture could penetrate the pagan North, so that the whole continent seemed about to become a Christian orbis terrarum” (p. 114). The Mongolian invasions, along with the Islamic Turks’ assault on Byzantium, profoundly thwarted this possibility, but successful missionary endeavors of both Eastern and Western Christians in the Middle Ages certainly laid the foundation for Europe.

However externally successful, the Church constantly needed internal reform. Monasteries lost their integrity, bishops neglected their calling, priests flouted their vows, and princes violated the central precepts of God’s law. To read reformers such as St. Odo of Cluny in the 10th century, the Church had lost her way and needed the radical renewal evident in the High Middle Ages—the feudal world of chivalry, cathedrals and crusades, of universities and mendicant monastic orders, of saints and scholars such as St. Francis of Assisi and St. Thomas Aquinas. Dawson surveys and documents the rich cultural life of this Medieval world, arguing (much like Henry Adams in Mont-Saint-Michel and Chartres) that it was one of the finest epochs in human history. “It finds an almost perfect literary expression in Dante’s epic, and it was embodied in visible form in the great French cathedrals. But, above all, it found supreme expression in the philosophic systems of the thirteenth century—those great ‘cathedrals of ideas’, as Professor Gilson has called them, in which all the acquisitions of Aristotle and Arabic science have been organically incorporated with the Christian tradition in an intelligible unity” (p. 197). And it was fueled by fervent religious faith—a faith, Henry Adams declared, more powerful than the 19th century’s electric dynamos. Inevitably, the rich cultural synthesis in the Medieval World declined and dissipated. Reforming orders lost their zeal and wandered into the labyrinth of “ecclesiastical power politics” (p. 215).

Scholasticism lost its intellectual resiliency amidst overly rational speculation and increasingly skeptical nominalism. “This tragic crisis of the medieval spirit,” Dawson says, “is reflected in the greatest literary achievement of that age, the Divina Commedia of Dante. Nowhere can we find a more perfect expression of the power and the glory of the medieval cultural achievement which reached from Heaven to Hell and found room for all the knowledge and wisdom and all the suffering and aggressiveness of medieval humanity in its all-embracing vision of judgment. Yet at the same time it is a most drastic indictment of the medieval Church” (p. 216).

To Dawson, the grandeur of Western Culture, best displayed in the Medieval World, “is not to be found in the external order they created or attempted to create, but in the internal change they brought about in the soul of Western man—a change which can never be entirely undone except by the total negation or destruction of Western man himself” (p. 224). Most importantly, that world exemplified the truth that “moments of vital fusion between a living religion and a living culture are the creative events in history, in comparison with which all the external achievements in the political and economic orders are transitory and insignificant” (p. 224).

###

224 Leviathan Rising

Few of the past century’s developments rival the growth of big government in America.   Once upon a time Americans lived freely, barely aware of the federal government, for there were no income taxes, no Social Security payments, few if any regulations requiring permits and payments.  As Robert Higgs shows, however, in Crisis and Leviathan:  Critical Episodes in the Growth of American Government (New York:  Oxford University Press, c. 1987), such “days, alas, are long gone.  Now, in virtually every dimension, our lives revolve within rigid limits circumscribed by governmental authorities; we are constrained continually and on all sides by Big Government” (p. ix).  Government may now, Warren Nutter said, “‘take and give whatever, whenever, and wherever it wishes’” (p. 4).  Bureaucrats “rather than private citizens effectively decide how resources will be allocated, employed and enjoyed” (p. 28).  

This remarkable growth of government has generally been supported by the public, especially since it has occurred during times of crisis.  Thus Rahm Emanuel’s oft-cited remark, as the Obama Administration took control in 2009, declaring they could not let a crisis go to waste, reflects historical reality.  Given a sense of national emergency, individuals may easily ignore their own self-interest in order to support collective action (through taxation or regulation) portrayed as necessary for national security or economic justice.  During the 20th century, three grand crises augmented government growth—the two world wars and the Great Depression.  Once the crises passed, however, the government always retained its hastily-crafted, crisis-forged powers.  “As William Graham Sumner observed, ‘it is not possible to experiment with a society and just drop the experiment whenever we choose.  The experiment enters into the life of a society and never can be got out again’” (p. 58).  

To argue his thesis, Higgs first examines the “crisis under the Old Regime, 1893-1896,” when serious proposals to expand the federal government were debated and rejected.  The wrenching depression of the ‘90s ignited populist protests across the land, particularly in the South and West.  Giving voice to this discontent, William Jennings Bryan gained the Democrat Party’s presidential nomination in 1896 and promised to fundamentally change the American way in accord with the 1892 People’s Party platform which, “besides proposing unlimited coinage of silver and the graduated taxation of incomes, called for the nationalization of the railroads, telegraph, and telephones and declared support for the organization and objectives of labor unions.  Americans of all political persuasions sensed that the future of the political economy lay in the balance.  ‘The election,’ said newspaperman William Allen White, ‘will sustain Americanism or it will plant Socialism’” (p. 78).  

What White termed “Americanism” was widely supported by the American people at that time.  To “James Bryce, perhaps the most perspicacious foreign observer of American society in the late nineteenth century,” the people believed “that certain rights of the individual, such as the ‘right to the enjoyment of what he has earned . . . are primordial and sacred.’  Moreover, all governmental authorities ‘ought to be strictly limited’ and ‘the less of government the better. . . .  The functions of government must be kept at their minimum’” (p. 83).  Rather than call for revolutionary changes to rectify economic inequities, Bryce continued, “‘the honesty and common sense of the citizens generally’” led them to insist “‘that the interests of all classes are substantially the same, and that justice is the highest of those interests.  Equality, open competition, a fair field to every body, every stimulus to industry, and every security for its fruits, these they hold to be the self-evident principles of national prosperity’” (p. 83).  

Standing for these self-evident principles and resisting the Populist program was the Democrat elected in 1892, President Grover Cleveland, as well as the Supreme Court.  Philosophically opposed to the radicalism of men such as Bryan, Cleveland determined “to save the gold standard, threatened by silverite inflationists; to preserve an orderly and free labor market, jeopardized by unionists, rioters, and proponents of governmental work relief for the unemployed” (p. 78).  During his first term (1885-89) Cleveland had vetoed a bill calling for sending $10,000 to Texas farmers suffering from a drought.  That they were suffering was manifestly evident, but “he could ‘find no warrant for such an appropriation in the Constitution.’  Further, ‘A prevalent tendency to disregard the limited mission of [the government’s] power and duty should be steadfastly resisted, to the end that the lesson should be constantly enforced that, though the people support the Government, the Government should not support the people’” (p. 84).  

Supporting oneself—that was the American way in the 19th century.  Thus the Supreme Court rejected (as unconstitutional) legislative efforts to impose a federal income tax.  Arguing for it, attorney James C. Carter “freely admitted that it was class legislation,” taking from a few rich folks to supply the needs of the less fortunate.  In response, the opposing attorney, Joseph H. Choate, declared:  “‘there are private rights of property here to be protected. . . .  The act of Congress which we are impugning before you is communistic in its purposes and tendencies, and is defended here upon principles as communistic, socialistic—what shall I call them—populistic as ever have been addressed to any political assembly in the world’” (p. 100).  Justice Field, appointed to the Court in 1863, concurred with Choate, predicting “that ‘[t]he present assault upon capital is but the beginning.  It will be but the stepping-stone to others, larger and more sweeping, till our political contests will become a war of the poor against the rich; a war constantly growing in intensity and bitterness’” (p. 102).  

What Justice Field feared came to pass within two decades, as the Progressive Movement and World War I substantially altered America’s political and economic system.  Presidents Theodore Roosevelt and Woodrow Wilson led the way, frequently bolstered by muckraking journalistic pronouncements and reflecting a profound change in political theory, moving in a socialistic direction.  Inculcating socialism were “‘literary socialists such as William Dean Howells and Upton Sinclair,’” and economists including Henry Demarest Lloyd and Richard T. Ely.  “‘Socialism supplied the critique, if not the technique for much Progressive reform; and though not always recognized, its effect was felt in all social sciences’” (p. 116).  

Progressivism’s impact upon America, however, was paltry when compared with the “war socialism” imposed upon the nation during WWI.  By the war’s end, “the government had taken over the ocean shipping, railroad, telephone, and telegraph industries; commandeered hundreds of manufacturing plants; entered into massive economic enterprises on its own account in such varied departments as shipbuilding, wheat trading, and building construction; undertaken to lend huge sums to businesses directly or indirectly and to regulate the private issuance of securities; established official priorities for the use of transportation facilities, food, fuel, and many raw materials; fixed the prices of dozens of important commodities; intervened in hundreds of labor disputes; and conscripted millions of men for service in the armed forces” (p. 123).  Amidst it all, President Woodrow Wilson assumed an alarmingly dictatorial stance.

Though many of the measures adopted during WWI were scaled back during the 1920s, an ideological shift had occurred, facilitating the enormous expansion of Big Government in the 1930s.  The Great Depression, to New Dealers, John Garraty has written, “‘justified the casting aside of precedent, the nationalistic mobilization of society, and the removal of traditional restraints on the power of the state, as in war, and it required personal leadership more forceful than that necessary in normal times’” (p. 170).  The political rhetoric of class conflict was revealing:  “‘“Competition” became “economic cannibalism,” and “rugged individualists” became “industrial pirates.”  Conservative industrialists, veteran antitrusters, and classical economists were all lumped together and branded “social Neanderthalers,” “Old Dealers,” and “corporals of disaster”’” (p. 179).  

Rather than serving as a guardian of individual rights, the Constitution was construed by New Dealers and ultimately rationalized by the Supreme Court as an enabler of federal power.  Virtually anything desirable was deemed doable, leading many traditionalists to consider the Constitution shredded beyond recognition.  The New Deal’s radical innovations transformed the nation, putting the federal government in control of significant sectors of public life.  Beyond all the regulations and subsidies, social security and labor law, the New Deal’s “most important legacy,” Higgs insists, “is a certain system of belief, the now-dominant ideology” that justifies getting what one wants at the expense of others.  “To take—indirectly if not directly—other people’s property for one’s own benefit is now considered morally impeccable, providing that the taking is effected through the medium of government” (p. 195).  

The unrestrained government established by Franklin D. Roosevelt and the New Deal was further expanded during WWII—years which “witnessed the creation of an awesome garrison economy” (p. 196).  A plethora of federal bureaucracies controlled prices and dictated policies.  “Whether one calls the prevailing political economy ‘war socialism,’ ‘war fascism,’ or something else is largely a matter of linguistic taste, but capitalism it definitely was not” (p. 211).  Given the hand-in-glove relationship between government and industry (as well as government and labor), Higgs says, “vast profits and losses were at stake, and governmental officials had their hands on the levers that controlled how the mechanism would operate.  The politicians, observed Fiorello La Guardia (who knew something about politicians), ‘are drooling at the mouth and smacking their jowls in anticipation of the pickings once they get their slimy claws into the price administration’” (p. 209).   

Following WWII the powerful centralized government retained its war-time powers.  The laissez faire, free-market economy and limited government favored by the nation’s founding documents had lasted only into the 1920s.  As the distinguished economist Joseph Schumpeter noted in the title of a talk delivered in December 1949, just days before his death, the U.S. had been taking a “March into Socialism” (p. 239), thus risking the tyranny generally associated with such an economy.  Big Government cannot but become a dictatorial Leviathan.  And it has materialized, in crisis-induced spurts, Higgs persuasively argues, under the auspices of progressivism and liberalism.  

* * * * * * * * * * * * * * * *

Whereas Higgs, an economist, examines historical epochs that illustrate the expansive Leviathan, Kenneth Minogue seeks to explain the same phenomenon from a more philosophical perspective, wondering if “the moral life can survive democracy” in The Servile Mind:  How Democracy Erodes the Moral Life (New York:  Encounter Books, c. 2010).   The democracy envisioned in the 19th century hardly resembles what now flourishes throughout the Western world; though it once meant “a government accountable to the electorate, our rulers now make us accountable to them” (p. 2), illicitly “telling us how to live” (p. 3).  This reflects a profound ethical shift, substituting an imposed “politico-moral” agenda for the individually free and virtuous standard evident in Aristotle and Aquinas.  

As it has evolved, Minogue thinks, modern democracy “leads people increasingly to take up public positions on the private affairs of others.  Wherever people discover that money is being spent, either privately or by public officials, they commonly develop opinions on how it ought to be spent.  In a state increasingly managed right down to small details of conduct, each person thus becomes his own fantasy despot, disposing of others and their resources as he or she thinks desirable” (p. 214).  As Aristotle expressly warned, given the opportunity “the property-less will exploit those with property” (p. 237).  While once promoting liberty, today’s democracies righteously curtail any freedoms deemed injurious to either public or private well-being.  Independent individualists have been replaced by servile dependents.  “Voting yourself rich,” as P.J. O’Rourke quipped, sure beats working and saving!  And this is no minor matter, for:  “We should never doubt that nationalizing the moral life is the first step toward totalitarianism” (p. 3).  

Whereas 19th century democrats (e.g. Thomas Jefferson) celebrated individualism, their 20th century heirs (e.g. Lyndon Johnson) insist governments feed and house, comfort and care for everyone needing help.  Individuals once considered themselves duty-bound (to their country, spouse, community), but they now think their government obligated to them—thus food stamps, social security, health care, etc.  The right to be free from government control has morphed into the right to demand goods and services from the government.   Schools provide free lunches, whether or not students learn to read and write.  Churches once preaching personal salvation now promote “social justice.”  Replacing the right to pursue happiness on one’s own is the right to demand happiness at the breast of the nanny state.  Living virtuously has been replaced by efforts “to legislate the kind of society we want, or think we want” (p. 123).  Ironically, Minogue argues, the intellectual and political elites that celebrate the wisdom and maturity of “the people” treat them as incompetent children who need constant supervision and subsidies.  They flatter the masses when giving commencement speeches but enact policies predicated upon the assumption that environment (poverty, discrimination) dictates behavior.  

Pervading it all is the ethical utilitarianism birthed by Jeremy Bentham and John Stuart Mill and today evident in philosophers such as Richard Rorty and Princeton’s notorious Peter Singer.   Morality was reduced to sentiment by 18th century skeptics such as David Hume; so a subjective, “sentimental moralism” has replaced the sturdy objective “natural law” principles of Christian ethicists such as Thomas Aquinas and Thomas Reid.  The “greatest good for the greatest number” has become the democratic imperative.  Outcomes—in schools and churches and federal agencies—become the sole criterion whereby we distinguish good from evil.  Whether taxing the rich or endorsing affirmative action, the question is not whether it is just but whether it promotes economic equality or compassionate feelings.  “Redistributionist taxation,” for instance, “is often defended as socially just, and therefore as being a moral as well as a civic obligation, but no one who observes the incompetence of governments in first raising large sums by taxation and then spending so much of it wastefully is likely to be impressed by this invocation of morality” (p. 64).  

Discrimination of virtually any sort violates the modern democratic creed—the “religion of equality” that has become our regnant “piety” (p. 83).   Though ignoring the evident reality of human nature by trying to mandate equality (racial, sexual, economic) is the equivalent of “making water run up hill” (p. 81), radicals routinely embark upon utopian endeavors designed to do so.   Even attitudes and “feelings” must be legislatively normalized—thus university speech codes and “hate crime” legislation!  “Liberation” movements inevitably promote the politico-moral agenda subtly “denying the basic reality on which modern Western Civilization is based” (p. 317).  

In brief, Minogue defines the servile mind “as the abdication of moral autonomy and independent agency in favor either of some unreflective collective allegiance or some inevitably partial and personal impulse for illicit satisfaction” (p. 192).  Increased attention to “victims” (necessarily slaves rather than free men) and their “rights” has transformed modern democracies.  We have, he laments, sold our birthright (with Esau of old) for a mess of pottage!  

# # #

223 Women and the Future of the Family

In 1998 Elizabeth Fox-Genovese, a distinguished humanities professor at Emory University, gave the Kuyper Lectures at Eastern College in Pennsylvania.  The lectures appear in Women and the Future of the Family (Grand Rapids:  Baker Books, c. 2000).  As an historian, she sought to place current issues in their proper context; as a concerned woman she sought to contribute to the formation of better, stronger families.  She began by noting the problems of sex and violence among our young.  While many “experts” blame external factors such as pornography and firearms, something more profound is at work, and refusing to recognize “direct connections between the aberrant behavior of children and the nature of family life” (p. 16) is manifestly self-deceptive.  To the author, “even while it is impossible to blame a child’s family for his or her behavior, it is entirely appropriate to draw connections between prevailing types of families and prevailing patterns of behavior among children and youth” (p. 16).  

Discerning “the signs of the times,” she acknowledges that the modern family faces great stresses and that our children are doing poorly.   It is obvious “that children would fare better if their mothers did not work outside the home, or, at least, if one of their parents were at home when the children return from school.  These days only the most unreconstructed traditionalists—many with some hesitation—dare to suggest that a mother and a father may play different roles in a child’s life and, hence, have different responsibilities” (p. 17).  Disturbingly, there is an “astounding complacency toward the ominous tendencies of our political, social, and cultural life, for within a remarkably brief period we have, almost without noticing, embraced a cataclysmic transformation of the very nature of our society” (p. 17).  

In part this results from the historic rise of individualism.  Whether one considers economics, with men and women seeking salaried employment, or religion, where neither men nor women seem particularly heedful of their churches’ admonitions, Americans clearly value their freedoms.  To many, personal autonomy—releasing the chains that tie them to spouses or children or parents, communities or traditions, churches or theology—is life’s summum bonum.  This love of freedom flourished, particularly on the frontier, from the country’s earliest days, but it was, until recently, counterbalanced by strong families—still quite hierarchical and traditional—wherein the father assumed “authority over all, including his wife,” and parents assumed “authority over their children” (p. 21).  By reacting against any husband’s authority, however, 19th century feminists launched a liberation movement that subtly bore fruit in the 20th century.  Yet, Fox-Genovese asserts:  “One thing is blindingly clear:  The transformation of women’s lives and expectations during recent decades has no historical precedent, and its consequences reach into every aspect of family and societal life.  Above all, the changes in women’s lives and expectations are having a radical impact on families and the very idea of the family, and therefore on the lives of children, and therefore on the character and prospects of future generations” (p. 24).  

Much has improved for women, thanks to the feminist movement, and “the comparative improvement in the position of women relative to that of men has been revolutionary, vastly surpassing the improvement secured in a comparable span of time by any other working group in history” (p. 26).  But the very freedoms enjoyed by modern women bring with them another set of challenges.  Sexual liberation, secured by abortion rights, has certainly been less than an unmixed blessing!  Importantly, preeminently:  “Defense of abortion on demand has remained a sacred tenet of feminists, who regard it as the cornerstone of women’s sexual freedom and who oppose any restrictions on it” (p. 28).  Autonomous individuals cast aside all ties that bind!  Autonomous women must, above all, be freed from childbearing, even if it involves killing infants in the womb.  Recourse to abortion, of course, frees a woman from “children—the possible consequence of her sexuality.  This strategy effectively divorces children from any social institution by labeling them the concern of a woman rather than of a woman and a man” (p. 28).  The courts, notably the Supreme Court in Planned Parenthood of Central Missouri v. Danforth, declared “the husband has no more stake in his wife’s pregnancy than any other individual, which effectively strips him of any stake in the family and strips the family of any standing as an organic unit.  More disturbing, as Tiffany R. Jones and Larry Peterman argue, Danforth, by shredding the husband’s stake in children, establishes that ‘there is nothing of one’s own in the most serious sense left for husbands in the family’” (p. 29).  

Inevitably—it necessarily follows—the family loses standing, subject to the volatile desires of adults and children who may or may not choose to live together.  Sexual liberation cannot but cause “the disintegration of the family” (p. 31).  Though virtually all careful studies demonstrate how children suffer when their parents divorce, roughly half of them will spend at least part of their lives in a single parent home.  With an inexorable inevitability, in the wake of the “equality” of the sexes came the “skyrocketing number of out-of-wedlock births and the declining rate of marriage” (p. 32), developments hardly anticipated by the champions of women’s liberation 50 years ago.  Like it or not, Fox-Genovese says:  “The sexual liberation of women, combined with the feminist campaign against marriage and motherhood as the special vocation of women, has directly contributed to the declining birthrate, the proliferation of single-parent or single-mother families, and the number of children born outside of marriage” (p. 35).  

Even apparently bland feminist demands—calls for egalitarian marriages and insistence that men and women abandon traditional roles, with men cleaning and cooking and keeping house—spoil domestic tranquility, for “couples in which men share domestic tasks with their wives are more likely to divorce than those in which they do not; those in which the man earns more than 50 percent of the family’s income are less likely to divorce than those in which he does not; and the larger the share of the family’s income the wife earns, the more likely her husband is to abuse her” (p. 34).  Feminists insist that marriage be a “contract,” obligating both partners to share equally in all aspects of life together.  Consequently, a woman defending her “rights” within such a relationship easily feels umbrage when it seems she is doing more than her fair (i.e. equal) share.  Forgotten is the fact that marriage, unlike a business deal, demands surrendering rather than promoting one’s rights!  

Women asserting individual autonomy encouraged men to claim it as well, and “the sexual liberation of women has realized men’s most predatory sexual fantasies.  As women shook themselves free from the norms and conventions of sexual conduct, men did the same” (p. 31).  The permission granted our sexual license “effectively destroys the ideal of binding moral norms.  By definition, when morality becomes a matter of personal preference, it ceases to be a binding social norm, and personal preference is merely the logical application of the consumer choice vigorously promoted by global corporations.  The discrediting of binding social norms in turn undermines our ability to protect children, who themselves are now seen to enjoy virtually the same individual rights as adults” (p. 39).  

While this great social upheaval has transformed our social world, the only institution (the Church) capable of providing guidance amidst it all “showed little enthusiasm for condemning the disintegrative forces out of hand” (p. 37).   Quite the opposite!  The churches have in fact become agents for sexual liberation and feminist theology.  To Fox-Genovese—so lately returned to the Christian faith—this poses a major challenge.  Indeed:   “The greatest danger of all may lie in the dissemination of sexual egalitarianism within our churches, for the core of Christianity has always lain in the simultaneous reality of our particularity and our universality” (p. 44).  

This brief book, composed of five succinct lectures, is followed by responses by Stanley J. Grenz, Mardi Keyes, and Mary Stewart Van Leeuwen, all taking exception to some of Fox-Genovese’s views.  Basically they all support some strain of evangelical feminism.  Other than illustrating enduring tensions within the Christian world, they merit only cursory attention.  

* * * * * * * * * * * * * * * * *

In the 1980s Elizabeth Fox-Genovese was a lustrous fixture in the pantheon of academic feminism.  A decade earlier she’d aligned herself with militant feminists who “supported a woman’s right to have an abortion, equal pay for equal work, a married woman’s right to keep her name, women’s equal access to credit, and no fault divorce” (p. 15).  Along with her husband, Eugene Genovese, she articulated a Marxist vision of history, and she was selected to establish and head a department of women’s studies at Emory University.  Having done so, however, she found that the women she recruited as professors quickly envisioned themselves as social activists rather than serious scholars, and in time she was shoved aside by supposed “scholars” determined to indoctrinate naive students.  

By the 1990s Fox-Genovese had become both disillusioned with radical feminism and drawn to the Christian Faith—and in particular the Catholic Church which she joined.  To better understand her final views on the subject one must read her “Feminism Is Not the Story of My Life”:  How Today’s Feminist Elite Has Lost Touch with the Real Concerns of Women (New York:  Doubleday, c. 1996).  In writing the book Fox-Genovese blends an extensive number of interviews with the academic literature so well known to her.  She found that for ordinary women, almost without exception, “Feminism is not talking about my life” (p. 2).  Feminists talked incessantly about themselves and their theories, but they knew little or nothing about common women who love men and deeply treasure marriage and children.  The author’s “book is no conservative manifesto,” says Maggie Gallagher.  “Instead it is one feminist writer’s attempt to understand why, at a time when feminist ideas about work and equality are widely accepted, so few American women identify with feminism as a political cause.”  

Consequently most of these women avoid any association with “feminism.”  For example, to a “tough, independent and strikingly beautiful” New Mexico rancher’s wife, who routinely saddled up and helped work cattle, “feminism has nothing to do with her life and feminists, whom she views as soft as well as softheaded liberals, would not last two days on her ranch” (p. 22).   Such women bear bad news for feminists, for the “overwhelming majority of American women perceive feminism as irrelevant.  In their view, feminism has no answer for the women’s issues that most concern them” (p. 33).  And Fox-Genovese has increasingly identified with these hard-working common folks rather than her peers in the academic elite!  For example:  “I have always known that, faced with a choice, my marriage would come before my career” (p. 6).  As a childless academic, she never faced that choice, but in her heart she’d already decided—her man comes first! 

The author’s love for and commitment to her husband revealed her deeply feminine nature.  But feminists, she notes, “have not had much patience with femininity, which they see as a trap that distracts women from the pursuit of power and independence.  For what is femininity except a disguise that women adopt to appeal to men?  As it happens, most women still do want to appeal to men, which may help to explain why they do not have much use for feminism” (p. 36).  This conviction leads Fox-Genovese to justify and actually celebrate many of the alluringly feminine interests and endeavors so despised by prominent feminists—shopping with friends, fashionable dresses, sculpted nails, regular perms, etc.  Feminists who condemn femininity ignore the blatant truth that women themselves cheerfully embrace it.  

For many radicals, Fox-Genovese says, “feminism is mainly about sex,” and their fervent commitment to pre-marital sex, shacking up, no-fault divorce and abortion rights—“the litmus test” of the movement—illustrates it (p. 59).  Sexual activity has been freed from moral restrictions, reduced to a pleasurable realm wherein most anything goes.  Almost overnight (in the 1960s) “a solid majority of young people in their late teens and twenties saw no connection between premarital sex and morality at all” (p. 75).  Women could, at long last, enjoy unfettered sex as freely as men. “This much is clear:  The sexual revolution has irreversibly transformed the lives of American women, who are trying to understand what that transformation means for them” (p. 60).  For most women, however, the sexual revolution has proven problematic, and for a large number of them its consequences have been injurious.  “What these apostles of liberation were unwilling to imagine was that sex itself might make women unequal to men” (p. 63).  Unmarried, impoverished, rearing children by themselves, numbers of mature women pay the price of our nation’s sexual liberation!

Coping with their sexual liberation certainly challenges the women Fox-Genovese interviewed for this book.  Feminism “has convinced a surprising number of Americans that ‘fairness’ to women requires permitting them virtually the same sexual freedom as men, although they obviously face immeasurably greater risks.  Uncertainty about what that freedom should mean has undermined their willingness to defend any single public moral standard” (p. 90).  Cut loose from the firm standards once prescribed by Christian churches, left to devise their own “personal” perspectives, they struggle to make sense of their world and look critically at feminists’ promulgations.  “Slightly more than half have come to believe that the increased acceptance of premarital sex has been bad for society, and only 30 percent think more sexual freedom in the future a good idea” (p. 94).  Now that they’re rearing their own children, the promiscuous freedoms of the ‘60s generation seem less and less exciting!  “Teen pregnancy, AIDS, drugs, and pornography intertwine to threaten everything they believe in and, especially, everything they want for their children.  They do not view the collapse of traditional values as liberation” (p. 101).  While wealthy and well-educated women—the ruling class and leaders of NOW—insulate themselves from the consequences of their social engineering, ordinary women find themselves saddened by them.  

The feminist movement, of course, helped establish career opportunities for millions of women.  Indeed:  “Modern feminism emerged as a direct response to the economic revolution that has transformed our world” (p. 111).  Yet the sword that granted employment proved two-edged, effectively severing ties between mothers and children!  There’s a monumental difference between working women and working mothers!  And successful careerists, whether male or female, cannot but prioritize work—80 hour weeks for young attorneys, constant-on-call status for young doctors.  Charging up the career path, there’s simply no good time to pause and have children.  But normal women deeply desire children!  “Hence the grim threat of the economic revolution:  As workers women need to be liberated from children; as mothers, they need to be liberated from work” (p. 114).  Most all the alleged “inequities” women suffer in the workplace stem from their strong maternal desire to be present with and actively engaged in the lives of their children.  Feminists angrily insist they pursue careers, dispatching their young to day care centers and schools, content to spend a few hours of  the ever-elusive “quality time” each week with their kids.  But their children, inevitably, “know that they cannot count on their mothers’ always being there” (p. 123).  

To radical feminists, the mother simply should not be there, if by “there” you mean home!  Let nannies or day care workers do the nitty-gritty work of nursing babies and changing diapers.  A real woman should be working, breaking through the glass ceiling and making her mark on the world, proving herself the equal of any man.  “Feminists tend to see any talk of women’s responsibilities to mother as a male plot” (p. 188).  Most women, however, find motherhood important and rewarding, and “most would prefer jobs close to home with flexible hours and higher pay for the time worked.  Half would prefer not to work at all while their children are young” (p. 189).  Challenging the feminist elite which frequently condemns women who stay home with their children, Fox-Genovese defends their right to choose what suits them.  “By any reasonable standard,” she concludes, “the rearing of children is the most important thing that individuals—or, for that matter, societies—do.  And the evidence is mounting on all sides that, especially in a society as complex and dangerous as our own, that rearing takes time” (p. 197).  

Without question, Fox-Genovese says, children torpedo the feminist message of sexual equality and personal autonomy.  This message regarding marriage and family, succinctly stated, is “driven by two convictions:  first, that women should not be forced to marry in order to have children, and, second, that children do not need relations with parents of both sexes” (p. 235).  Such thinking, however, has little contact with reality.  “Children, not men, restrict women’s independence; children, not men, tend to make and keep women poor.  Few but the most radical feminists have been willing to state openly that women’s freedom requires their freedom from children.  Yet the covert determination to free women from children shapes much feminist thought and most feminist policies even, and especially those policies aimed at having the government assume a large part of the responsibility” (p. 229).   

Sadly enough—and despite so many mothers’ sacrificial efforts—children are not being reared well.  Symbolizing our indifference to our young is the massive killing of the unborn through abortion.  In countries where children flourish, there are “significant limitations on abortion, which none of them defines as a woman’s ‘right’” (p. 244).  Given our elites’ support of abortion rights, however, it ought not surprise us that “the United States stands out among industrialized nations as the one in which women do best and children do worst.  Our society is unmistakably failing its children, who are increasingly being left to cope alone with a world that adults find daunting.  American parents spend 40 percent less time with their children than they did only a few decades ago—down from thirty hours a week to seventeen” (p. 201).  Thanks largely to radical feminists, firmly established in socially powerful positions (universities, media, and bureaucracies), neither traditional marriage nor devotion to children receives much praise, and stay-at-home moms frequently find themselves subject to ridicule and discrimination.  

Yet women want men, as well as children, nearby.  They actually like men!  Unfortunately, radical feminists, pushing their cause beyond legal equity and economic opportunities, have launched “an assault on all manifestations of masculinity” (p. 145).  The ordinary women Fox-Genovese interviewed, however, consider “the struggle against men as actually an attack on their own femininity and sense of what it means to be a woman.  Increasingly, the ‘backlash’ against feminism is coming from women who are appalled by the claims and efforts presumptuously made on their behalf” (p. 145).  They appreciate and even celebrate the differences between the sexes and deeply crave a romantic union with a strong man who will protect and care for them.  However permissively they may regard premarital sex, they still want a lasting marriage with a faithful man.  Admittedly they frequently find their husbands “just plain impossible,” acting all too often like little boys!  But they still want to marry and stay married and, “with eyes wide open, women have clung to love and sex as central, if risky, to a woman’s life” (p. 168).  Contrary to the message conveyed by TV programs celebrating single mothers, ordinary women know how difficult it is to rear children without a husband, though large numbers of them are doing so.  “Ask any woman who has tried,” (p. 174) Fox-Genovese insists, and they will disabuse you of any fantasies regarding their lot.  Single mothers, unlike TV characters (or the elite female professors in universities), are overwhelmingly poor, and there’s usually too little money, too little time, too little help to make life enjoyable.  

222 Housing Boom & Bust

In 2002—under the auspices of “compassionate conservatism”—President George W. Bush promoted affordable housing for all Americans, declaring:  “We can put light where there’s darkness, and hope where there’s despondency in this country.  And part of it is working together as a nation to encourage folks to own their own home.”  A year later he proudly signed the “American Dream Downpayment Act,” implementing his aspirations.  Yet thoughtful critics, both academic and congressional, warned against such policies, with Barron’s magazine prophetically decrying as spurious any compassion that exposed “taxpayers to tens of billions of dollars of possible losses, luring thousands of moderate-income families into bankruptcy, and risking the destruction of entire neighborhoods. . . .  Free down payments carry catastrophic risks. . . .  Transferring the risk of homeownership from buyers to taxpayers does not endow virtue in America.  Giving people a handout that leads them to financial ruin is wrecking-ball benevolence” (p. 46).  What a memorable phrase—“wrecking-ball benevolence”!  Six years later, amidst the economic meltdown, a baffled Bush asked his Secretary of the Treasury, “How did we get here?”  Amazingly, Thomas Sowell notes, “neither he nor many others in politics and the media saw any connection between their housing crusades and the economic crisis now facing the nation” (p. 100).  

Sowell’s The Housing Boom and Bust, rev. ed. (New York:  Basic Books, 2010) makes this connection and helps us understand the “great recession” of the past three years.  Though politicians such as Barney Frank and Barack Obama feverishly blame “corporate greed” and Wall Street “fat cats” and unregulated capitalism, in truth:  “The development of lax lending standards, both by banks and by Fannie Mae and Freddie Mac standing behind the banks, came not from a lack of government regulation and oversight, but precisely as a result of government regulation and oversight, directed toward the politically popular goal of more ‘home ownership’ through ‘affordable housing,’ especially for low-income home buyers.  These lax lending standards were the foundation for a house of cards that was ready to collapse with a relatively small nudge” (p. 57).  

As an economist (who has taught at prestigious universities such as UCLA) and syndicated columnist, Sowell deftly analyzes and explains what actually happened, beginning with “the economics of the housing boom.”  Housing sales skyrocketed during the first half-decade of the 21st century largely as a consequence of risky policies promoted in Washington D.C. (Fannie Mae; Freddie Mac; HUD; the Federal Reserve System) and Wall Street (banks and brokers).  Underlying it all was a “smart growth” process launched in the 1970s that radically restricted land use in some areas, notably California, under the aegis of “preserving ‘open space,’ ‘saving farmland,’ ‘protecting the environment,’ ‘historical preservation’ and other politically attractive slogans” (p. 11).  In fact, “vast amounts of land for which the local inhabitants have paid nothing are nevertheless controlled by them politically for their own benefit, to provide a buffer zone between themselves and less affluent people” (p. 131).  Consequently, one could buy the same house in Houston for a fraction of what was required in San Francisco, so “most of the country was not suffering from skyrocketing housing prices, which were largely confined to particular communities in which there were severe limitations on the building of housing” (p. 16).  Housing prices and risky loans were, consequently, concentrated in these areas.  Add to this the “creative financing” that surged in the 1990s—low (or no) down payment loans, adjustable-rate mortgages, bundling mortgages—and there were soon millions of people “buying homes that they would not be able to afford in the long run” (p. 19).

Much of this resulted from political stratagems promoted by the likes of Barney Frank and Christopher Dodd (most recently co-authors of the Dodd-Frank Wall Street Reform and Consumer Protection Act, the massive financial regulatory mandate imposed by the Obama administration), designed to ensure “affordable housing” for everyone.  The Community Reinvestment Act of 1977, hugely expanded by the Clinton Administration in the 1990s, enabled federal agencies to pressure banks and mortgage companies to finance “underserved” groups, especially low income and racial minorities.  When, under Clinton, HUD secretaries Henry Cisneros and Andrew Cuomo were given oversight of Freddie Mac and Fannie Mae—transforming staid conservative loan agencies into depositories for high-risk mortgages—new banking strategies were put in place, ripe for abuse.  And abused they were!  Community activists such as Jesse Jackson extracted millions of dollars from financial institutions fearing any accusation of racial profiling.  Crying out for “social justice,” these activists, including Saul Alinsky disciples such as Chicago’s Gale Cincotta, declared:  “‘We want it.  They’ve got it.  Let’s go get it’” (p. 117).  All told, Sowell calculates:  “Over the years, the sums of money extracted from financial and other business organizations by community activist organizations, using a variety of tactics, have amounted to more than a trillion dollars, according to the National Community Reinvestment Coalition—nearly all of this money being received since 1992” (p. 119).  

Having described the phenomena, Sowell succinctly analyzes the problem by distinguishing “enabling causes from impelling causes from precipitating causes” (p. 138).  Easy credit, available on virtually every street corner, was the primary enabling cause.  Impelling the process “were growing pressures from government regulatory agencies for mortgage lenders to reduce their lending requirements,” allowing most anyone who wanted a home to acquire one (p. 139).  The primary precipitating factor was the abrupt fall in housing prices, especially impacting those speculators (“flippers”) who banked on rapid, booming home values, resulting in the tsunami of defaults, leaving us amidst the ruins of lost savings and battered IRAs.  “Few things,” Sowell laments, “blind human beings to the actual consequences of what they are doing like a heady feeling of self-righteousness during a crusade to smite the wicked and rescue the downtrodden” (p. 162).  Proclaiming themselves champions of social justice, politicians and community activists polished their images in the light of a pandering press and acted “like scavengers, able to extract large sums of money from banks and other institutions by raising claims of discrimination, whose power to delay government approval of bank mergers and other business decisions made pay-offs to these activists the only prudent course for those accused” (p. 162).  

* * * * * * * * * * * *

Paul Sperry, former Washington bureau chief for Investor’s Business Daily, begins The Great American Bank Robbery:  The Unauthorized Report about What Really Caused the Great Recession (Nashville:  Thomas Nelson, c. 2011) with a somber note:  “It is official:  According to the Federal Reserve Board, the financial crisis has wiped out $14 trillion in American household wealth—an amount equal to the entire gross domestic product, and the worst loss of wealth since the Great Depression.  This equates to an average loss of more than $123,000 per household.  Yet Americans didn’t lose it.  It was taken” (p. xi).  And it was taken not by “predatory lenders”—the “fat cats” maligned by President Obama—but “by Washington social engineers and housing-rights activists who used lenders to integrate them into the economic mainstream—regardless of their financial wherewithal” (p. ix).  In fact, the data indicate that the government, not Wall Street, was responsible for more than two thirds of the risky loans that caused the financial collapse.  

Leading the charge to close the “mortgage gap” by expanding the Community Reinvestment Act in the ‘90s was President Bill Clinton.  He issued executive orders, appointed activists to key bureaucratic posts in HUD, Fannie Mae and Freddie Mac, and used Janet Reno’s Department of Justice to further his agenda.  In 1995, and again in 2000, HUD pressured Fannie and Freddie to reduce their underwriting standards and approve loans they would have earlier rejected.  Clinton “plunged Fannie and Freddie into the subprime market and turned them into the twin towers of toxic debt they are today” (p. 10).  He “undercut traditional rules for lending” and “created an easy credit orgy” that resulted in the 2008 crash.  Before the crash, however, “Clinton’s top regulators boasted that their policies helped create both the primary and secondary markets for subprime loans” providing “minorities a ‘good option’ to buy houses and refinance debt.  Clinton himself at the time bragged about plundering banks for record hundreds of billions of dollars in loans for minority communities, before falling silent as those loans defaulted” (p. 4).  In retrospect, “Clinton’s brawnier CRA created a multitrillion-dollar shakedown industry that has devastated the financial industry.  The graveyard of banks bullied into making unsafe loans by ACORN and its clones piles higher and higher” (p. 153).  

At the time Republican Senator (and former economics professor) Phil Gramm, then chairman of the Senate Banking Committee, warned that Clinton’s policies enabled agitators “to blackmail banks for ‘kickbacks and bribes’” (p. 134).  And sure enough, fearing punitive measures from the federal government, banks “pledged billions of dollars in urban loans to ACORN and other radical community organizers to make them go away” (p. 134).  Consequently, Sperry calculates, “community organizers have shaken the banking industry down for an eye-popping $6.1 trillion . . . in total CRA agreements and commitments to poor and minority communities” (p. 134).  

Supporting, and profiting from, the Clinton policies of the ‘90s was a young “community organizer” in Chicago with close ties to the Association of Community Organizations for Reform Now (ACORN), Barack Obama, whose “fingerprints are on the subprime scandal” (p. 37).  He was tutored by John McKnight, the Northwestern University professor who recommended him for admission to Harvard Law School. Implementing the strategies of Saul Alinsky, McKnight now directs the National People’s Action and trains street agitators to coerce banks to underwrite housing in minority neighborhoods.  Moving from Chicago to Washington, Obama determined to amplify, through executive orders, the Community Development Act far beyond Clinton’s goals.  Still more, claiming the financial crash was due to poorly regulated financial institutions, Obama established (through the Dodd-Frank Act) “the Consumer Financial Protection Bureau, a huge new federal bureaucracy that will, among other things, police lenders’ underwriting for ‘traditionally underserved consumers,’ and punish companies who do not do enough of it” (p. 45).  Rather than reverse the policies that precipitated the Great Recession, the president and his party have resolved to expand them!  “Overhauling the banking system without fixing Fannie and Freddie is like fighting terrorists without attacking the jihadist ideology motivating them.  All that’s changed with passage of the Dodd-Frank Act—which should be renamed the Fannie-Freddie Protection Act—is the size of government’s hand in the economy, now bigger than ever” (p. 216).  

This should awaken us, Sperry argues, to this reality:  “This country is in very serious danger of transitioning from an entrepreneurial economy to a parasitic economy—whereby race racketeers, grievance mongers, and street agitators (or as the First Lady euphemistically calls them, ‘social entrepreneurs’) along with group-identity politicians suck the lifeblood out of the real entrepreneurs in private industry” (p. 206).  Personifying the gravity of the danger is Elizabeth Warren, the interim director of Obama’s Consumer Financial Protection Bureau.  A professor at Harvard Law School, she appeared in “Michael Moore’s market-bashing film, Capitalism:  A Love Story” and “is an anti-business crusader who favors nationalizing banks and capping the interest rates they can charge consumers” (p. 211).  To Sperry, she represents the “guilt-ridden country club radicals with their Ivy League pedigrees, who exploit underprivileged minorities and enlist them as foot soldiers in their romantic revolution for ‘social justice’ as atonement for their own privileged family status” (pp. 211-212).  They’re doing an inside job, not cracking safes but still looting the national treasury under the guise of remaking the world.  

* * * * * * * * * * * * * 

In Architects of Ruin (New York:  HarperCollins Publishers, c. 2009) Peter Schweizer explains How Big Government Liberals Wrecked the Global Economy—and How They Will Do It Again If No One Stops Them.  He rejects, as patently untrue and self-serving, assertions by Congressman Barney Frank and New York Times columnist Paul Krugman that only the federal government can clean up and prevent the mess created by the unregulated private sector—what George Soros brands “free market fundamentalism.”  Au contraire, argues Schweizer:  American capitalism is, in fact, already tightly regulated; there was no deregulation during the decade leading up to the Great Recession; and government agencies energetically policed and intimidated financial institutions.  

The title of chapter one sets the tone for Schweizer’s tome:  “The Robin Hood Agenda:  How a Gang of Radical Activists and Liberal Politicians Set the Stage for the Biggest Bank Heist in History.”  Their agenda rested “on the Marxist premise that all accumulated wealth is ipso facto an unjust expropriation of collective resources” (p. 39).  The progressive Robin Hoods were frequently community activists and lawyers working with organizations such as Operation PUSH and ACORN who sued banks, accusing them of racist policies—i.e. red-lining loans in minority communities.  In a typical case, two plaintiffs received a total of $60,000 while their lawyers collected $950,000 of the million-dollar out-of-court settlement.  “The charge of racism in banking launched a movement in the 1970s that has utterly transformed the American financial system” (p. 5).  

Targeting banks (the most vulnerable link in the capitalist system) was a basic strategy of Saul Alinsky, who spoke much of “helping people” but actually sought to gain access to power—taking it from the Haves and giving it to the Have-Nots.  Alinsky deeply appealed to a whole generation of young radicals—Cesar Chavez and Jesse Jackson, Hillary Clinton and Barack Obama.  Less well-known, but especially important, was a Chicago housewife, Gale Cincotta, who made forcing banks to subsidize “affordable housing” her life’s work.  She organized the National People’s Action on Housing and sponsored a conference in Chicago that attracted much national attention.  In time she brought together and coordinated an alliance of some 60 community organizations, all dedicated to changing the lending policies of area banks.  Congressional liberals, including Senator William Proxmire (its primary promoter), were impressed by her endeavors and ultimately passed the Community Reinvestment Act (CRA) in 1977.  

Though little noticed at the time, it laid the foundation for America’s financial collapse three decades later.  Alan Greenspan, testifying before the House Committee on Oversight and Government Reform in October 2008, said:  “‘It’s instructive to go back to the early stages of the subprime market, which has essentially emerged out of the CRA’” (p. 45).  Subprime lending “increased twentyfold between 1993 and 2000” (p. 71), becoming a major lever wielded by President Clinton to “embark on a massive social engineering program that would, in the hallowed name of civil rights, dramatically undermine the lending standards of banks all over the country.  He thereby set into motion a series of events that would shake the financial foundations of the country—and the world—sixteen years later” (p. 47).  

Add to the expansion of the CRA Clinton’s take-over of Freddie Mac and Fannie Mae—government agencies established during the Great Depression and heretofore “steady, even boring entities that simply served to lubricate the mortgage market so that middle-class Americans would find it easier to get a loan” (p. 77).  Such GSEs (government sponsored entities) are unique inasmuch as they “are private companies but are implicitly guaranteed by the federal government (that is, by us taxpayers)” (p. 79).  Once controlled by President Clinton, they began to “redistribute wealth by taking on the affordable housing mission” (p. 80).  They—assisting lenders such as Countrywide Credit, the “largest lender to Hispanics and blacks in the country” (p. 94)—sucked up the subprime loans issued by banks and sold them to Wall Street institutions.  The big players in this highly politicized process naturally rewarded their allies.  Sweetly lucrative spots on the boards of directors of Countrywide and similar institutions were given to prominent politicians—Henry Cisneros, following his stint at HUD; Nancy Pelosi’s son; California Governor Jerry Brown’s sister.  Fannie and Freddie, in their “private” role, also gave generously to the political campaigns of friendly politicians—e.g. Kit Bond, Christopher Dodd, Barack Obama, Rahm Emanuel.  And the fall-out of all this nearly defies comprehension.  “Today Fannie and Freddie are behemoths of debt and, as such, prime incubators of the economic crisis.  If you add together the mortgages they hold and the mortgages they have sold to investors around the world and on which they have offered a payment guarantee, these two companies hold potential liabilities of some $5 trillion. . . .  In effect, these two government-sponsored entities have liabilities equaling about half the current U.S. national debt” (p. 104).  

Suitably encouraged (or coerced) by the government, a litany of lending agencies—Wells Fargo, Washington Mutual, Countrywide, et al.—quickly entered the subprime mortgage business.  Some of them were led by “do-good capitalists” such as Robert Rubin, a close friend of Bill Clinton, who envisioned themselves as part of a great societal transformation.  “Together, these two groups—the Washington and Wall Street branches of the emerging boomer overclass—forged a new form of liberal state capitalism” (p. 123).  Jon Corzine, accumulating a fabulous fortune working for Goldman Sachs before successfully running for senator and then governor of New Jersey, always identified himself as “a child of the ‘60s” and proved himself reliably liberal, championing “affirmative action, same-sex marriage, gun control, and universal health care” (p. 124).  Corzine and Rubin (and their financial institutions) prospered in the ‘90s, in part, because the federal government issued bailouts for countries such as Mexico and South Korea, Thailand and Indonesia, lest they default and endanger Goldman Sachs, Citibank, et al.  Such bailouts “made sure not only that the banks and investment houses were protected but that they made a nice return on their investments.  This is the essence of state capitalism:  the profits go to the financial firms, the losses are covered by taxpayers” (p. 138).  

Our Ruling Class assumed that their risky financial adventures would be covered by the federal government—as long as their businesses were “too big to fail.”  Less concern was evident for the ordinary folks who were borrowing money to buy houses.  People with no prospect of repaying them were granted home loans, and one cannot but wonder at such flagrant violations of common sense, but the Clinton social agenda mandated them and for a brief time “the number of minority homeowners soared” (p. 71).  In time, inevitably, this “massive experiment in socially engineered housing equality created a whole new class of debtors in America.  And by far the greatest victims were the very people Clinton was trying to help” (p. 73).  Sadly enough:  “The mortgage crisis—and especially the meltdown of minority neighborhoods—is directly related to well-meaning efforts by liberals in government to tilt the housing market in their favor” (p. 159).  Joining the ranks of most revolutionaries, they failed to heed the law of unintended consequences.  

221 Intellectuals and Society

“Experts,” according to an ancient and prescient adage, “should be on tap, not on top.”  Specialists of all sorts are invaluable—as brain surgeons or shortstops, actors or accountants, plumbers or professors—but they almost always err egregiously when they assume their mastery of a given subject qualifies them to pontificate on or control areas apart from their expertise.  This is especially true regarding “public intellectuals” determined to shape society.  The 20th century, troubled by totalitarian regimes, sadly witnessed, noted Mark Lilla in The Reckless Mind, how famed “professors, gifted poets, and influential journalists summoned their talents to convince all who would listen that modern tyrants were liberators and that their unconscionable crimes were noble, when seen in the proper perspective.”  

Still more, as Eric Hoffer said:  “One of the surprising privileges of intellectuals is that they are free to be scandalously asinine without harming their reputation.  The intellectuals who idolized Stalin while he was purging millions and stifling the least stirring of freedom have not been discredited.  They are still holding forth on every topic under the sun and are listened to with deference.  [Jean-Paul] Sartre returned in 1939 from Germany, where he studied philosophy, and told the world that there was little to choose between Hitler’s Germany and France.  Yet Sartre went on to become an intellectual pope revered by the educated in every land” (Before the Sabbath).  Amazingly, says Thomas Sowell in Intellectuals and Society (New York:  Basic Books, c. 2009):  “As late as 1932, famed novelist and Fabian socialist H.G. Wells urged students at Oxford to be ‘liberal fascists’ and ‘enlightened Nazis.’  Historian Charles Beard was among Mussolini’s apologists in the Western democracies, as was the New Republic magazine” (p. 99).  “W.E.B. Du Bois was so intrigued by the Nazi movement in the 1920s that he put swastikas on the covers of the magazine he edited, despite protests from Jews” and celebrated, in 1936, the fact that “Germany today is, next to Russia, the greatest exemplar of Marxian socialism in the world’” (p. 99).  

As we all know, wisdom easily eludes highly educated individuals.  There are no “wise fools,” but “smart fools” populate most every coffee shop and faculty lounge.  Similarly, one may be highly intelligent without being an “intellectual,” and it is important to carefully define the “intellectuals” Sowell studies, for he restricts them to “an occupational category, people whose occupations deal primarily with ideas—writers, academics, and the like” (p. 2).  As “intellectuals” they easily take their own “notions” to be certain knowledge, and they often ignore the “common sense” of the multitude (living and dead) whose accumulated knowledge far exceeds their own.  Consequently, as Robert L. Bartley, late editor of the Wall Street Journal, stated, the free market “‘is smarter than the smartest of its individual participants’” and provides “more knowledge for decision-making purposes, through the interactions and mutual accommodation of many individuals, than any one of those individuals possesses” (p. 16).  

Given this guiding principle, Sowell applies it, in successive chapters, to various realms of society:  economics, social visions, the media and academia, the law, and war.  In economics (Sowell’s specialty) one finds utterly unlearned intellectuals (novelists and preachers, journalists and professors) making “sweeping pronouncements about the economy in general, businesses in particular, and the many issues revolving around what is called ‘income distribution’” (p. 34).  Demonstrable factual errors—e.g. the “widening gap between the rich and the poor”—gain credence simply through endless repetition.  Renowned thinkers, ranging from George Bernard Shaw to Bertrand Russell to John Dewey, have made inexcusable (and elementary) mistakes when they became self-anointed authorities on the economy.  Decrying “greed” and calling for “compassion,” such intellectuals generally reveal the poverty of their understanding of basic economic principles and realities.  Apparently innocent of any historical grounding, they cite as true the Progressive muckrakers’ calumnies regarding “robber barons” and the New Dealers’ claims to having “ended” the Great Depression.  What actually happened is quite irrelevant as a fabricated past is promulgated to advance a current cause.  Sowell patiently—and cogently—shows the baneful consequences of taking seriously most intellectuals’ economic pronouncements.

Intellectuals in the media serve mainly as filters, promoting their social visions.  Thus in the 1930s Walter Duranty, the Moscow correspondent for the New York Times, won a Pulitzer Prize for articles on Stalin’s Russia “‘marked by scholarship, profundity, impartiality, sound judgment and exceptional clarity’” (p. 122).  That he failed to mention the massive starvation that killed millions of Ukrainians or the brutalities of the gulag hardly dented his reputation since he aired politically correct views.  Politicians from the heartland, such as Harry Truman, a voracious reader and thoughtful leader, are often tarred as country bumpkins, while the likes of Adlai Stevenson, who rarely read anything but “had the rhetoric and the airs of an intellectual,” are touted for their sophistication.  In our day:   “No factual information that could reflect negatively on homosexuals is likely to find its way through either media or academic filters, but anything that shows gays as victims can get massive coverage” (p. 126).

The law too suffers at the hands of Progressive intellectuals who assume themselves superior to legislators past and present and determine to impose “social justice” through judicial fiat.  With jurists such as Roscoe Pound (dean of Harvard Law School a century ago) and Supreme Court Justice Louis Brandeis, they find the Constitution to be a “living document” ever malleable in the hands of a select few.  Pound “advocated that an anointed elite change the nature of law to conform to what they defined as the ‘vital needs of present-day life,’ despite being at variance with (‘in advance of’) the public, with whose ‘moral sense’ the law was supposedly being made to conform.  Law, according to Pound, should also reflect what he repeatedly called—without definition—‘social justice’” (p. 164).  This is not the “legal justice” embedded in documents but a “social justice” emanating from the warm hearts of robed reformers!  

Sowell devotes two chapters to the intellectuals’ stance on war.  Anti-war sentiments have been cascading through our elite institutions since WWI, the “great war” that precipitated the laments of the “lost generation.”  Woodrow Wilson’s dream of a “war to end all wars” that would establish a “world made safe for democracy” lay shattered in the trenches of France.  Pacifism quickly became the mantra of the ‘20s and ‘30s, and anti-war novels such as All Quiet on the Western Front and A Farewell to Arms molded a new consciousness in the West.  School teachers (“one of the elements of the intelligentsia in the penumbra surrounding the inner core of intellectuals” {p. 294}) especially focused on the victims of the conflict who suffered and died, not the heroes who exemplified patriotic virtues.  While Hitler rearmed Germany, intellectuals in the West championed disarmament and “peace at any price.”  The Second World War would briefly revive support for the military, but within a decade the pleas for peace (especially with the USSR) resounded again.  

Evaluating this, Aleksandr Solzhenitsyn declared, in his 1970 Nobel Lecture:  “The timid civilized world has found nothing with which to oppose the onslaught of a sudden revival of barefaced barbarity, other than concessions and smiles.”  Anti-war activists, chanting pacifist slogans, subverted America’s effort to defend South Vietnam by alleging (with Walter Cronkite following the ’68 Tet offensive) that we were “mired in stalemate” in an unwinnable “civil war.”  We now know Cronkite was wrong, for the guerrillas were crushed in Tet, but the “Communists’ political success consisted precisely in the fact that media outlets like the New York Times declared their military offensive successful” (p. 250).  As in France in the ’30s, so too in America in the ‘70s—anti-war forces demonized the military and championed appeasement, the Neville Chamberlain strategy which “became the leitmotif of media and academic discourse during the decades of the Cold War” (p. 259).  And it revived, with rancorous fury, as America invaded Iraq in 2003.  “Disdain for patriotism and national honor,” Sowell concludes, “was just one of the attitudes among the intellectuals of the 1920s and 1930s to reappear with renewed force in Western democracies in the 1960s and afterwards.  How far history will repeat itself, on this and other issues, is a question for the future to answer.  Indeed it is the question for the future of the Western world” (p. 280).  

Should “intellectuals” continue to shape Western culture, there’s little hope for the future, Sowell suggests.  If the past century is a prelude to the next, there’s every reason to believe that intellectuals such as MIT’s Noam Chomsky and Stanford’s Paul Ehrlich, lambasting capitalism and predicting catastrophic ecological events, will continue to shape public policy.  “Many among the intelligentsia see themselves as agents of ‘change,’ a term often used loosely, almost generically, as if things are so bad that ‘change’ can be presupposed to be a change for the better” (p. 306).  But such is not often the case!  Indeed, should we entrust the future to the kinds of thinkers who have brought us to this point, we may very well witness the loss of those most precious goods secured by our traditions.  

* * * * * * * * * * * * * * * * 

In logic one learns of the ad hominem fallacy—discrediting the argument by demeaning the advocate, shooting the messenger rather than considering the message.  Though logically fallacious, there remains a certain legitimacy in evaluating what’s claimed in accord with the character of the speaker.  Two decades ago the distinguished British historian Paul Johnson did precisely this when he penned Intellectuals (New York:  Harper & Row, Publishers, c. 1988), stressing that “the rise of the secular intellectual has been a key factor in shaping the modern world” (p. 1).  As many of these intellectuals were personally reprehensible, Johnson demands we consider this before taking their theories as true.  His work is perennially worth pondering, for by drafting biographical vignettes of eleven influential thinkers, he builds a case against trusting them for guidance in most all areas of life.  

To compress his case Johnson cites an illuminating 1946 essay wherein Evelyn Waugh enumerated these ten objectives of mid-century intellectuals:  “(1) the abolition of the death penalty; (2) penal reform, model prisons and rehabilitation of prisoners; (3) slum clearance and ‘new towns’; (4) light and heating subsidized and ‘supplied free like air’; (5) free medicine, food and clothes subsidies; (6) abolition of censorship, so that everyone can write, say and perform what they wish . . .; (7) reform of the laws against homosexuals and abortion, and the divorce laws; (8) limitations on property ownership rights for children; (9) the preservation of architectural and natural beauty and subsidies for the arts; (10) laws against racial and religious discrimination” (p. 316).  Rejecting any Christian foundations, they lead the public in “eroding social disciplines and rules” (p. 317).  That their objectives have been realized cannot be denied. 

Johnson begins with Jean-Jacques Rousseau—“An Interesting Madman.”  That Rousseau largely shaped the modern world cannot be doubted, for his ideas guided the French Revolution and suffuse those subsequent revolutions designed to perfect both man and society.  Overcoming a variety of obstacles, sampling a dozen vocations, he failed in most everything for 30 years and was censured by an employer, the Comte de Montaigu, for exuding a “‘vile disposition’ and ‘unspeakable insolence’, the product of his ‘insanity’ and ‘high opinion of himself’” (p. 6).  The famed philosophe Diderot, who knew him well, “summed him up as ‘deceitful, vain as Satan, ungrateful, cruel, hypocritical and full of malice’” (p. 26).  He did, however, have a facility with words and wrote (at the age of 39) his Discours on the arts and sciences which established his reputation in Parisian circles.  To this he then added political prescriptions such as the Social Contract, novels such as Emile (containing his educational philosophy) and La Nouvelle Heloise, all igniting the fires of Romanticism that would impact the 19th century.  He ingratiated himself in polite circles by acting impolite, highlighting “his ostentatious rejections of social norms by a studied simplicity and looseness of dress, which in time became the hallmark of all the young Romantics” (p. 12).  

Carefully examined, however, Rousseau appears eminently untrustworthy!  Touting himself for his commitment to truth and virtue, he filled his Confessions with half-truths, prevarications and distortions.  He wrote eloquently of love but showed remarkably little of it in his personal life, maligning his father and slandering his benefactors.  Quite authoritative in giving advice on educating children, he took an illiterate laundress, Therese, as his mistress for 33 years and heartlessly discarded their five children in an orphanage wherein two-thirds of all babies died in the first year.  Justifying himself, he asserted that children would have hindered his career as a writer and also declared:  “‘I know full well no father is more tender than I would have been’ but he did not want his children to have any contact with Therese’s mother:  ‘I trembled at the thought of entrusting mine to that ill-bred family’” (p. 22).  What he decided was that all children should be entrusted to the State—the perfect patrie he envisioned.  Therein they would be properly reared and become virtuous citizens.  But detached from his evil being, Rousseau’s enthralling words took flight.  Few remembered him as the “monster” David Hume described.  Rather, he became, in the words of George Sand, “Saint Rousseau”—the guiding light for philosophers ranging from Kant to Schiller and Mill and novelists such as George Eliot and Victor Hugo.  Amazingly, “Tolstoy said that Rousseau and the Gospel had been ‘the two great and healthy influences of my life’” (p. 27).  

After devoting a chapter to “Shelley, or the Heartlessness of Ideas,” Johnson turns to Karl Marx, who “has had more impact on actual events, as well as on the minds of men and women, than any other intellectual in modern times” (p. 52).  He was reared in a “quintessentially middle-class” family headed by a father who “was a liberal and described as ‘a real eighteenth-century Frenchman, who knew his Voltaire and Rousseau inside out’” (p. 53).  Though inclined to scholarship, being “totally and incorrigibly deskbound,” Marx was actually much more a poet and moralist with an eschatological message; he was “not interested in finding the truth but in proclaiming it” (p. 54).  “In fact his greatest gift was as a polemical journalist.  He made brilliant use of epigrams and aphorisms” (p. 56).  

Marx marshaled copious quantities of data to support his arguments regarding the plight of the working man, but he seemed strangely indifferent to real workers “and so far as we know Marx never set foot in a mill, factory, mine or other industrial workplace in the whole of his life” (p. 60).  Citing articles by his colleague Friedrich Engels, he routinely misrepresented economic statistics, portraying England as it was decades earlier and “omitting to tell the reader of the enormous improvements brought about by enforcement of the Factory Acts and other remedial legislation since the book was published and which affected precisely the type of conditions he had highlighted” (p. 66).  He treated both primary and secondary sources with “gross carelessness, tendentious distortion and downright dishonesty” (p. 66) and he deliberately falsified statements by statesmen such as Gladstone and economists such as Adam Smith.  Whatever the subject:  “He can never be trusted” (p. 68).  

Still more, what Marx wrote bears witness to what he was, showing “his taste for violence, his appetite for power, his inability to handle money, and, above all, his tendency to exploit those around him” (p. 69).  These traits are treated in depth by Johnson as he demonstrates the man’s utterly loathsome personality.  In the words of Bakunin:  “‘Marx does not believe in God but believes much in himself and makes everyone serve himself.  His heart is not full of love but of bitterness and he has very little sympathy for the human race’” (pp. 72-73).  Nor did he extend much sympathy to his immediate family, consigning wife and children to poverty through his Bohemian sloth.  His mother “is credited with the bitter wish that ‘Karl would accumulate capital instead of just writing about it’” (p. 74).  

“Of all the intellectuals we are examining,” Johnson says, “Leo Tolstoy was the most ambitious.   His audacity is awe-inspiring, at times terrifying,” for he imagined he could, through his own genius, “effect a moral transformation of society.  His aim, as he put it, was ‘to make the spiritual realm of Christ a kingdom of this earth’” (p. 107).  He declared he had never “‘met a single man who was morally as good as I’” and rejoiced to find within himself an “‘immeasurable grandeur’” (p. 107).  Without doubt he discovered and developed remarkable literary skills and penned some of the greatest novels ever written.  Indeed:  “There are times when he writes better than anyone who has ever lived, and surely no one has depicted nature with such consistent truth and thoroughness” (p. 112).  

But Tolstoy wanted more.  Controlling characters in his novels was not enough—he wanted to control actual persons, to mold them and their society into his vision of perfection.  Though haunted with guilt for his financial prodigalities and sexual indulgences, he deemed himself a Messiah called to rectify this fallen world.  As early as 1855 he aspired “to create a faith based on ‘the religion of Christ but purged of dogmas and mysticism, promising not a future bliss but giving bliss on earth’” (p. 124).  His religion seemed vaguely pantheistic with a sentimental sympathy for the poor (including the serfs on his own estate who occasionally elicited his attention).  He exemplified, as his wife noted, the intellectual who loves mankind but fails to love living persons, beginning with his own family.  “Tolstoy’s case is another example of what happens when an intellectual pursues abstract ideas at the expense of people” (p. 137).  Sadly for Russia, a cadre of intellectuals would take control of the country in 1917 and provoke an “infinitely greater national catastrophe” (p. 137).  Beyond harming himself and his family, his words helped ignite “a millenarian transformation of Russia herself . . . in one volcanic convulsion” that “made nonsense of all he wrote about the regeneration of society” (p. 137).    

“No intellectual in history offered advice to humanity over so long a period as Bertrand Russell” (p. 197).  Making his reputation early on as a philosopher whose A History of Western Philosophy “is the ablest thing of its kind ever written” (p. 200), he devoted most of his energies to advocating various social causes, especially pacifism, opposing armed conflicts from WWI to Vietnam and promoting nuclear disarmament.  Ironically, as Johnson notes, “Kingsley Martin, who knew Russell well, often used to say that all the most pugnacious people he had come across were pacifists, and instanced Russell.  Russell’s pupil T.S. Eliot said the same:  ‘[Russell] considered any excuse good enough for homicide’” (p. 204).  On a personal level, he not only married and divorced a handful of women but took and tossed aside a variety of mistresses throughout his life.  Calling others to be truthful, he lived dishonestly, even taking credit (and money) for articles written by others.  Lascivious, arrogant, dishonest, unkind—hardly a model worth emulating or lauding!

What’s true for Russell applies equally to the other intellectuals (including Ibsen, Brecht, Sartre and Hemingway, Edmund Wilson and Lillian Hellman) Johnson describes.  What we should learn, quite simply, is this:  “beware intellectuals.  Not merely should they be kept well away from the levers of power, they should also be objects of particular suspicion when they seek to offer collective advice” (p. 342).  “Above all, we must at all times remember what intellectuals habitually forget:  that people matter more than concepts and must come first.  The worst of all despotisms is the heartless tyranny of ideas” (p. 342).  

220 Defending Life

For 20 years I showed my ethics students The Silent Scream, a moving anti-abortion film narrated by the late Dr. Bernard Nathanson, who died earlier this year.  He not only clearly described the abortion procedure (using an ultrasound) but shared his sorrow for playing a major role in legalizing it in 1973.  When he made the film he professed no religious faith, merely a humanistic concern about the taking of innocent life.  But in time, as he joined various anti-abortion endeavors, he came to faith and joined the Catholic Church.  To explain his spiritual journey, including his early commitment to abortion-on-demand, and to evaluate the evil of killing unborn babies, he wrote The Hand of God:  A Journey from Death to Life by the Abortion Doctor Who Changed His Mind (Washington:  Regnery Publishing, Inc., c. 1996).  “This book,” he said, “will be semi-autobiographical, using myself as a paradigm for the study of the systematic fission and demise of one system of morality, no matter how fragmented, fatuous, and odious, and the painful acquisition of another more coherent, more reliable, and less atomistic one” (p. 3).  

His father, a brilliant obstetrician, “was a formidable, dominant force in my life and in many ways forged the ruthless, nihilistic pagan attitudes and beliefs that finally drove me to unleash—with a handful of co-conspirators—the abortion monster” (p. 5).  Though nominally Jewish, the Nathansons (father and son) were thoroughly secularized, much attuned to the relativism of modernity.  Highly intelligent, Nathanson moved easily through Cornell University and McGill Medical School.  Importantly, at McGill he “forged a strong, even compelling teacher-student relationship” (p. 45) with Professor Karl Stern, an alluring lecturer who had left Judaism to enter the Catholic Church in 1943—a journey beautifully portrayed in The Pillar of Fire.  While unaware of this at the time, 20 years later, “floundering in the wake of my hegemony of the abortion clinic and the doubts that were beginning to crack my own pillars of certainty,” Nathanson learned “that even as I had spoken to him on so many occasions about so many other things, he [Stern] possessed a secret I had been searching for all my life—the secret of the peace of Christ” (p. 46).  

While at McGill Nathanson impregnated a young woman.  To eliminate the problem his father sent him money to kill the baby, and he slipped easily “into the satanic world of abortion” (p. 58).  Years later he would impregnate another woman (who begged to give birth to the child) and perform, without remorse, the abortion himself, killing his own child.  “I have aborted the unborn children of my friends, colleagues, casual acquaintances, even teachers.  There was never a shred of self-doubt, never a wavering of the supreme confidence that I was doing a major service to those who sought me out” (p. 61).  Practicing medicine at Women’s Hospital in New York, he came to see abortion as a valuable service, particularly for the poor, making life better for the disadvantaged.  

His commitment to abortion rights led to a relationship with Larry Lader, an “ardent feminist and a great admirer of Margaret Sanger” who “was obsessed with abortion” (p. 87).  Nathanson and Lader teamed up to legalize abortion, forming the National Association for Repeal of Abortion Laws (NARAL) and other action committees.  Manipulating the media, recruiting ideological feminists and liberal clergy, fabricating statistics, making emotional appeals to pity and equity, they effectively orchestrated a repeal of New York’s abortion laws in 1970.  A year later Nathanson became director of the Center for Reproductive and Sexual Health, an abortion clinic launched with the assistance of the Rev. Howard Moody and his Clergy Consultation Referral Service.  He continued practicing obstetrics and gynecology and toured the country urging politicians to legalize abortion (unexpectedly accomplished through judicial fiat in Roe v. Wade in 1973) and “was known as the abortion king” (p. 124).  

While promoting the cause, however, new technologies (preeminently the ultrasound) confronted and troubled Nathanson with the stark truth of abortion.  To actually see the fetus in the womb revealed its fully human form, and he began to see the vapidity of all the assorted pro-abortion arguments he’d earlier espoused.  He publicly expressed his doubts and performed his last abortion in 1979, persuaded “that there was no reason for an abortion at any time; this person in the womb is a living human being, and we could not continue to wage war against the most defenseless of human beings” (p. 128).  He clearly describes pre-natal developments, insisting “we have a virtually unbroken series of quantifiable, noncontingent, scientifically verifiable and infinitely reproducible events that signifies the beginning of a new human life” (p. 138).  

His growing pro-life convictions led to an alignment with pro-life people—virtually all deeply religious and unusually at ease with themselves.  He was amazed at the “sheer intensity of the love and prayer” evident in those who gathered to protest outside abortion clinics.  His convictions regarding the sanctity of life, coupled with his amazement at Christians witnessing to their faith, led to an openness to the Lord and Giver of life.  So, “for the first time in my entire adult life, I began to entertain seriously the notion of God—a god who problematically had led me through the proverbial circles of hell, only to show me the way to redemption and mercy through His grace.  The thought violated every eighteenth-century certainty I had cherished; it instantly converted my past into a vile bog of sin and evil; it indicted me and convicted me of high crimes against those who had loved me, and against those whom I did not even know; and simultaneously—miraculously—it held out a shimmering sliver of Hope to me, in the growing belief that Someone had died for my sins and my evil two millennia ago” (pp. 193-194).  

Now he is “no longer alone” (p. 196).  The lost is found.  The blind now sees.  The sinner’s saved.  With his mentor Karl Stern, Nathanson has discovered, as Stern wrote in a letter, that “‘toward Him we had been running, or from Him we had been running away, but all the time He had been in the center of things’” (p. 196).  Along with Nathanson’s earlier treatises—Aborting America and The Abortion Papers—this book provides invaluable insight into the monumental battle between the cultures of life and death.  

* * * * * * * * * * * * * * * * * 

In her junior year at Texas A&M Abby Johnson decided to work as a volunteer in a nearby Planned Parenthood facility, persuaded she could thereby help establish equal rights for women.  Her compassion was ignited by a Planned Parenthood recruiter, who told her about anti-abortion zealots determined to deprive women of their rights and criminalize the procedure, “forcing women to choose between greater poverty and unwanted babies they couldn’t care for or dangerous back-alley butchers” (p. 17).  Johnson’s volunteer work morphed into a full-time job and she became director of the clinic, effectively working for eight years, engaged in what she believed to be a righteous endeavor, helping women prevent unwanted pregnancies.

One day, however, she was unexpectedly drafted to handle the ultrasound probe while an abortionist killed a baby.  Despite her role as director of the clinic, she had always avoided any involvement in the procedure, thinking of herself as a counselor providing birth control information and materials, helping women in need.  Holding the probe, however, she saw a twelve-week-old baby and thought about her own little girl, who had looked the same at twelve weeks.  “What am I about to see?” she asked herself.  “My stomach tightened.  I don’t want to watch what is about to happen” (p. 4).  But she couldn’t escape.  Indeed she was a part of the team aborting the baby.  In those crucial moments she faced the enormity of the act:  “What was in this woman’s womb just a moment ago was alive.  It wasn’t just tissue, just cells.  That was a human baby—fighting for life!  A battle that was lost in the blink of an eye.  What I have told people for years, what I’ve believed and taught and defended, is a lie” (pp. 6-7).  

Johnson (with Cindy Lambert) tells the story of her transformation in unPLANNED:  The dramatic true story of a former Planned Parenthood leader’s eye-opening journey across the life line (Wheaton:  Tyndale House Publishers, c. 2010).  Her account is powerful not only because of her change of heart but also because it gives us a glimpse into the abortion rights world—populated by tender-hearted, church-going, professing Christians such as Johnson.  She grew up in a pro-life family and regularly attended a church that opposed abortion before going off to college.  There she turned into a party girl and became sexually involved with an older man, Mark, when she was 20.  This led to an unintended pregnancy, terminated (at his suggestion) by abortion.  The “problem” solved, she “had no regrets.  No sadness.  No struggle over whether what I’d done was right or wrong.  Just a definite sense of relief.  Whew.  That’s behind me.  I can get on with my life now” (p. 25).  

She buried her abortion deep in her heart, determined to forever keep it a secret.  She and Mark married, and she returned to A&M where she pursued a degree in psychology.  The marriage soon collapsed, but just as the divorce was granted she found herself pregnant again.  So she procured yet another abortion, reciting all the rationalizations she was learning at the Planned Parenthood facility, where her volunteer work had flourished and she was on staff.  She also began interacting with pro-life protesters who stood and prayed outside the clinic, trying to persuade women to protect their unborn children.  At first she saw them as enemies to be attacked, but in time their love and sincerity prodded her to consider their position.  She was especially moved by Sister Marie Bernadette who silently prayed, serving as a “conscience” for them all.  

She also met, married, and had a baby with Doug, a pro-life believer who challenged her (without ever attacking her personally) to evaluate her position with Planned Parenthood.  They attended an evangelical church but were not allowed to join the congregation because of her work.  So they found an Episcopal church that officially approved of abortion and affirmed her conviction that she was doing the Lord’s work by helping needy women.  Her conviction was reinforced by other “Christians” in the clinic, including some Roman Catholics, who claimed to be untroubled by promoting what their church decried.  (Indeed, one of the messages of this book is the relative impotence of church teaching—folks like Johnson pretty much make up their own morality independent of religious teachings.)    

Johnson loved her work, but as she received promotions and awards and moved into the upper echelon of Planned Parenthood management, she grew distressed by what she encountered.  Money seemed to preoccupy the organization’s leaders, and since “abortion services” generated significant revenue they figured prominently in Planned Parenthood’s agenda.  Her own clinic in Bryan, which only scheduled abortions every other Saturday, was ordered to find ways to increase the numbers.  The bottom line, not needy women, dictated decisions.  Then came the day when she held the ultrasound wand and helped abort a baby.  “Now that the scales had begun to fall from my eyes, the guilt of countless abortions, including my own two, came crashing down on my shoulders” (p. 123).  

The next day she and Doug went to church, where she recited the liturgy’s confession of sin and did so with total honesty.  In that moment, as the ancient words “spilled out, I sensed God’s love and forgiveness pouring in” (p. 129).  She realized that this was a turning point for her, though she needed to find ways to rightly terminate her career with Planned Parenthood.  She found comfort and assistance in the folks she’d for years considered foes—the Coalition for Life pro-life protesters who gathered each Saturday to decry abortions!  They listened to her, loved her, and tried to help her.  And her husband, above all, totally supported her.  Her trusted colleagues at Planned Parenthood, however, betrayed her.  She confided to two of them she considered friends, and they assured her that they likewise wanted to sever their relationship with the clinic, even enlisting her help in filling out resumes for job applications.  But when Planned Parenthood initiated legal action (a restraining order) against her, it became clear that they “had not just turned against me but had apparently given false statements to the court” (p. 206).  When the case came to trial, however, she was ably defended by an attorney affiliated with the Coalition for Life and the judge dismissed the case.  

Amidst all the publicity generated by Johnson’s departure from Planned Parenthood—and the legal actions that followed—her decision gained national attention.  She appeared on such programs as The O’Reilly Factor and was able to bear witness to the wonderful workings of God in her life as she came to the Light.  Doors opened for her to speak and write to advance the pro-life cause, and when she tells her story she joins “in the legacy of prayer begun in Bryan, Texas,” and prays “for the women and men whom God is going to touch next, the lives He will save, the people he will use” (p. 256).  This treatise, unPLANNED, eloquently advances that endeavor.

* * * * * * * * * * * * * * * *

In Defending Life:  A Moral and Legal Case against Abortion Choice (Cambridge:  Cambridge University Press, c. 2007), Francis J. Beckwith writes primarily “to provide a thorough defense of the pro-life position on abortion and its grounding in a particular view of the human person, a view I will argue is the most rational and coherent one that is at the same time consistent with our deeply held intuitions about human equality” (p. xi).  While focused on abortion, the book really argues that anyone defining man as the imago dei—the “intellectual scaffolding for the Declaration of Independence, the abolitionist movement, Abraham Lincoln’s second inaugural address, and Martin Luther King Jr.’s speech on the steps of the Lincoln Memorial” (p. xi)—automatically presumes the sanctity of life.  

Beckwith, currently a philosophy professor at Baylor University, succinctly summarizes his argument:    

1.  The unborn entity, from the moment of conception, is a full-fledged member of the human community.

2.  It is prima facie morally wrong to kill any member of that community.

3.  Every successful abortion kills an unborn entity, a full-fledged member of the  human community.

4.  Therefore, every successful abortion is prima facie morally wrong.  

Declarative statements (premises and conclusions to a syllogism) such as these obviously defy the pervasive moral relativism and tolerance so dominant today.  (That most moral relativists turn absolutistic is, of course, a given—the same Peter Singer who allows infanticide becomes utterly dogmatic when asserting animal rights!)  But when pro-abortion spokesmen address the issue they generally insist that moral decisions are simply personal or cultural preferences without objective standing.  Beckwith examines and rejects such relativism, proving it self-refuting.  If you’re truly a relativist you simply have nothing to say:  since nothing is true or false, right or wrong, why would one bother making any moral pronouncements?  

Yet moral relativists routinely impose their personal preferences through the judiciary!  Reversing a century of legal tradition holding that the “unborn is constitutionally a person protectable under the Fourteenth Amendment” (p. 22), today’s abortion laws, shaped by Roe v. Wade (1973) and Doe v. Bolton (1973), allow abortion on demand.  But during the past four decades, Beckwith says, legal scholars have overwhelmingly rejected the spurious historical assertions (based upon two shoddy articles by a NARAL lawyer!) Associate Justice Harry Blackmun employed to make his case in Roe.  Inexplicably, Blackmun appeared utterly uninterested in scientific or philosophical evidence; he declared that the Court need not determine when life begins, thereby decreeing that the “unborn is not a human person” (p. 30)!  In his judicial fiat, Blackmun “seems to be confusing physical independence with ontological independence; he mistakenly argues from the fact of a pre-viable unborn’s lack of independence from its mother that it is not an independent being, a ‘meaningful life.’  ‘Once again,’ writes Hadley Arkes, ‘the Court fell into the fallacy of drawing a moral conclusion (the right to take a life) from a fact utterly without moral significance (the weakness or dependence of the child).  The Court discovered, in other words, that novel doctrines could be wrought by reinventing old fallacies’” (p. 36).  

Beyond the Court’s decisions, Beckwith addresses and declares vacuous the various popular and philosophical arguments set forth by abortion rights advocates.  He shows, scientifically, that a human being lives in the womb from the moment of conception:  it is “indisputable that at syngamy a new human being, an individual human being, exists and is in the process of development and is not identical to either the sperm or the ovum from whose uniting it arose” (p. 66).  Unlike a house or a bench, which we assemble over time, adding part after part, a fully human being appears in an instant!  “From this point until death no new genetic information is needed to make the unborn entity an individual human being” (p. 67), and “the unborn at any stage of her development looks perfectly human because that is what humans look like at that time” (p. 152).  

Pro-abortion advocates such as Judith Thomson, appealing to a multitude of fallacious arguments—pity, tolerance, feelings, etc.—almost always avoid this manifest reality:  what is killed in abortion is a fully human being.  And nothing justifies such killing!  With meticulous care Beckwith explains and rejects such “pro-choice” appeals and extends his discussion to the related issues of cloning, bioethics and reproductive freedom, for “the answer to the philosophical question lurking behind abortion—Who and what are we?—turns out to be the key that unlocks the ethical quandaries posed by these other issues” (p. 203).  It really matters that what is killed in abortion is truly an innocent human—a being different in kind from ants or antelopes.  

“This moral truth,” says Beckwith in the book’s final paragraph, “is the one strand in the tapestry of republican government that, if removed, will put in place premises that will facilitate the unraveling of the understanding of ourselves and our rights that gave rise to the cluster of beliefs on which the rule of law, constitutional democracy, and human equality depend.  As my dear friend Hadley Arkes has elegantly argued, if we are, as even the supporters of abortion must assume, bearers of moral rights by nature (including the ‘right to choose’), then there can be no right to abortion, for the one who has the ‘right to choose’ is identical to her prenatal self.  Consequently, the right to abortion can only be purchased at the price of abandoning natural rights and replacing them with the will to power.  It is a price not worth paying” (p. 229). 

“‘Statecraft,’ Aristotle wisely instructed his pupils, ‘is soulcraft,’ by which he meant that the moral premises embedded in the social and legal fabric of a political regime provide direction and sustenance for the character and beliefs of its citizens” (p. 42).  Our laws, as well as our schools, shape our minds.  Our abortion laws espouse a culture of death that cannot but destroy our body politic in the long run.  Deeply thoughtful treatises such as Beckwith’s, however intellectually challenging, are needed to restore all persons to full equality and standing in our nation.  

219 Reason & Christianity

Rodney Stark, a professor at Baylor University, clearly declares the thesis of The Victory of Reason (New York:  Random House, c. 2005) in his subtitle:  How Christianity Led to Freedom, Capitalism, and Western Success.  More precisely:  “Faith in reason is the most significant feature of Western Civilization” (p. 105).  He explores the historical processes within Christianity wherein “reason won the day, giving unique shape to Western culture and institutions” (p. x), beginning with Early Church thinkers such as Irenaeus, Origen, and Augustine, who insisted “that reason was the supreme gift from God and the means to progressively increase their understanding of scripture and revelation.  Consequently, Christianity was oriented to the future, while the other major religions asserted the superiority of the past” (p. x).  

To Stark:  “The success of the West, including the rise of science, rested entirely on religious foundations” (p. xi), and, furthermore:  “the rise of the West was based on four primary victories of reason.  The first was the development of faith in progress within Christian theology.  The second victory was the way that faith in progress translated into technical and organizational innovations, many of them fostered by monastic estates.  The third was that, thanks to Christian theology, reason informed both political philosophy and practice to the extent that responsive states, sustaining a substantial degree of personal freedom, appeared in medieval Europe.  The final victory involved the application of reason to commerce, resulting in the development of capitalism within the safe havens provided by responsive states.  These were the victories by which the West won” (p. xiii).  

Uniquely among all religions, Christians urged individuals to reason.  “In the beginning was the Word, and the Word was with God, and the Word was God.  All things were made by Him, and without Him was nothing made that was made.”  With these words the Apostle John paved the way for a rational religion with mystical implications.  Accordingly, Augustine demanded:  “‘Heaven forbid that God should hate in us that by which he made us superior to the animals’” (p. 6).  Reason enabled believers to delve ever deeper into the storehouse of Scripture, better discerning God’s revelation, and successive Church councils refined doctrines as well as refuted heresies.  Creation and Scripture both reveal God, so Christians such as St. Albert the Great (Aquinas’ mentor) encouraged careful, scientific study of the world.  Consequently, as Alfred North Whitehead concluded in his definitive Science and the Modern World, scientific development took place in the West because Medieval thinkers insisted “‘on the rationality of God, conceived as with the personal energy of Jehovah and with the rationality of a Greek philosopher’” (p. 15).  

Still more, as theologians pondered the mystery of the Trinity they developed a unique understanding of persons who freely think and will.  “Saint Augustine wrote again and again that we ‘possess a will,’ and that ‘from this it follows that whoever desires to live righteously and honorably, can accomplish this’” (p. 25).  As persons free to think and make decisions, we find that freedom is our natural milieu, and our common human nature provides the foundation for natural rights and justice.  So while slavery was tolerated within Christian circles for several centuries, there was a strong bias against it.  As Lactantius noted, in his Divine Institutes, Christians considered others “brothers,” equal in worth before God.  “‘Since human worth is measured in spiritual not in physical terms, we ignore our various physical situations:  slaves are not slaves to us, but we treat them and address them as brothers in the spirit, fellow slaves in devotion to God’” (p. 77).  Slavery simply disappeared in the Medieval world.   

Contrary to egregious stereotypes still circulating in many schools and universities—“a hoax originated by antireligious, and bitterly anti-Catholic, eighteenth-century intellectuals” such as Voltaire (p. 35)—science flourished (often within monasteries) throughout the Medieval period.  The “Dark Ages” were in fact hardly dark at all!  As the Roman Empire collapsed, millions of individuals were increasingly free to innovate and prosper.  New technologies—water mills, wind mills, horse collars and shoes increasing horse power, wheeled plows, fish ponds, cloth making, chimneys, eyeglasses, clocks, compasses—gradually improved the living standard of ordinary folks.  Simultaneous advances in high culture—pipe organs, harpsichords, violins, polyphonies, Romanesque and Gothic architecture, Dante and Chaucer, scores of universities such as Oxford and Salamanca and Prague—demonstrated the sophistication and originality of Christians throughout the Middle Ages. 

Capitalism developed in the ninth century as Catholic monks managing profitable farms sought to “reformulate fundamental doctrines to make their faith compatible with their economic progress” (p. 55).  As Stark defines it:  “Capitalism is an economic system wherein privately owned, relatively well organized, and stable firms pursue complex commercial activities within a relatively free (unregulated) market, taking a systematic, long-term approach to investing and reinvesting wealth (directly or indirectly) in productive activities involving a hired workforce, and guided by anticipated and actual returns” (p. 56).  As monasteries acquired more and more land, devout monks sought to manage them well, appropriating new technologies, envisioning a cash economy with just prices far better than antiquated barter systems, and realizing the importance of property rights, profits, mortgages and credit.  Private property, Thomas Aquinas argued, must be defended “‘because human affairs are more efficiently organized when each person has his own distinct responsibility to discharge’” (p. 79).  Eminent theologians such as Aquinas “declared that profits were morally legitimate, and while giving lip service to the long tradition of opposition to usury, these same theologians justified interest charges” (p. 63).  They intuited the “miracle” of capitalism—“as time goes by, everyone has more” (p. 106).  

During the late Middle Ages a vigorous capitalistic system flourished.  Abacus schools (often called “Italian schools”) proliferated and trained clerks (adept at double-entry bookkeeping) for slots in burgeoning businesses.  International banks, bills of exchange, and venture capital loans all fueled a dynamic economy.  Additionally, as Christians, “medieval capitalists often were concerned about the personal morality of those whom they employed” (p. 111) and stressed frugality and charity.  Within capitalist circles was an association of folks known as the Humiliati, devout Catholics who eschewed luxury and committed themselves to “‘austerity, prayer, fellowship and manual labour, while living with their families’” (p. 121).  They also “pledged to give all of their ‘excess income’ to the poor” (p. 121).  Moving north, capitalism subsidized the woolen mills in Flanders and Holland.  A “precursor to the modern stock exchange” was evident in Bruges, a booming city with a population of 90,000 as early as 1453.  In Antwerp and Amsterdam—and indeed wherever free enterprise capitalism thrived—prosperity ensued.  

When nation states developed in the 15th century, however, these capitalist centers collapsed as the increasingly absolute monarchs of Spain and France determined to control (and expropriate for themselves) their nation’s wealth.  They extended their tentacles into Italy and Holland, crushing (through taxation and regulation) the industries that enabled Florence and Bruges to prosper in earlier centuries.  What Adam Smith would label mercantilism led to economic stagnation and repressive policies subverting the common weal throughout Europe.  England also turned in a despotic direction under the Tudors as Henry VII and his descendants sought to centralize power and control the nation’s wealth.  But important historical events (e.g. the Magna Charta) and traditions (e.g. a genuinely decentralized economy) countered the centralizing tendencies of absolutism and allowed certain kinds of representative government and free enterprise to flourish.  Consequently, Englishmen at home and abroad nourished a capitalist commitment, and Alexis de Tocqueville described the United States “early in the nineteenth century as ‘one of the freest and most enlightened nations in the world’” (p. 212).  

So “Christianity created Western Civilization” (p. 233).  Concluding his work, Stark cites “one of China’s leading scholars” who wondered why the West now dominates the world and “‘studied everything we could from the historical, political, economic, and cultural perspective.  At first, we thought it was because you had more powerful guns than we had.  Then we thought it was because you had the best political system.  Next we focused on your economic system.  But in the past twenty years, we have realized that the heart of your culture is your religion:  Christianity.  That is why the West is so powerful.  The Christian moral foundation of social and cultural life was what made possible the emergence of capitalism and then the successful transition to democratic politics.  We don’t have any doubt about this’” (p. 234).  And, says Stark, “Neither do I” (p. 235).  

* * * * * * * * * * * * * * * * * * 

On September 12, 2006, Pope Benedict XVI delivered an oft-misrepresented lecture at the University of Regensburg addressing “faith, reason and the university.”  Reminding his hearers of the historic importance of such scholarly conclaves, he pointed to an earlier gathering near Ankara, Turkey, in 1391, between Manuel II Paleologus, the Byzantine emperor, and a noted Persian intellectual.  Challenging the historic Islamic commitment to Jihad, the Christian ruler highlighted the blatant contradiction between one of Mohammed’s early declarations—“There is no compulsion in religion”—and his later endorsement and implementation of Jihad, holy war.  Manuel II issued a challenge:  “Show me just what Mohammed brought that was new, and there you will find things only evil and inhuman, such as his command to spread by the sword the faith he preached.”  

Spreading the faith with the sword is wrong, Manuel argued, because such violence violates the very nature of God.  “God,” the emperor said, “is not pleased by blood—and not acting reasonably is contrary to God’s nature.  Faith is born of the soul, not the body.  Whoever would lead someone to faith needs the ability to speak and reason properly, without violence and threats.”  To a Christian ruler, “shaped by Greek philosophy,” Pope Benedict says, “this statement is self-evident.  But for Muslim teaching, God is absolutely transcendent.  His will is not bound up with any of our categories, even that of rationality” (#14).  Unlike Muslims, Christians believe in the Logos and an intricate harmony between faith and reason.  Unfortunately, this balance was lost as Nominalists (such as William of Occam) in the late Medieval world joined Muslims in elevating God’s Voluntas above His Logos.  

Protestants such as Luther further tended to dehellenize theology—even despising, in Luther’s case, reason itself.  Immanuel Kant’s 18th-century effort to anchor “faith exclusively in practical reason” (#35) arbitrarily reduced religious faith to a purely subjective response, making it a “personal experience,” and liberal theologians in the 19th century (following Schleiermacher) effectively discarded the “God of the philosophers” in order to seek personal encounters with the “God of Abraham, Isaac and Jacob.”  This complex historical development, cogently summarized by Benedict XVI, has led to the pervasive skepticism and relativism so baneful in the modern academy.  Removing reason from religion, voluntarists—whether Mohammed or Duns Scotus, Luther or Kant—paved the way for the pervasive irrationalism so evident everywhere.  With Kant, they mistakenly tried to protect religion by discarding metaphysics and setting “thinking aside in order to make room for faith” (#35).  So we now face a world clearly described by Socrates, in Phaedo, when he appraised the many conflicting philosophies pervading Athens and said:  “It would be easily understandable if someone became so annoyed at all these false notions that for the rest of his life he despised and mocked all talk about being—but in this way he would be deprived of the truth of existence and would suffer great loss.”  However difficult the challenge, Benedict urges us to side with Socrates—and Emperor Manuel II—and discern the truth of being, the Logos whereby all exists, and find our real raison d’être.  

Benedict’s lecture, says James Schall in The Regensburg Lecture (South Bend:  St. Augustine’s Press, c. 2007), “is one of the fundamental tractates of our time” (p. 9).  It deserves extended commentary as well as reading and re-reading.  The Pope “has an amazing capacity to get to the heart of things.  He is a wise man in the proper sense of that term.  That is, he knows how to find the order in things.  He knows the foundational issues” (p. 5).  Open to Reality, he wants to “straighten out our minds about where we are and what we are about.  Acting correctly presupposes thinking correctly, presupposes understanding what is” (p. 10).  Rightly understanding what is, above all about God, rightly concerns us.  “Is He logos or not, is He sola voluntas or not.  We need to grasp the import of such inquiry” (p. 44).  Christians who worship a reasonable God and Muslims who worship an arbitrary, voluntarist Allah do, in fact, worship different gods!  Therein lie the radical, irreconcilable differences between these two religions.  Christian martyrs die for their Lord; Muslim suicide bombers randomly kill, taking others’ lives in the name of Allah.  

Remarkably akin to Islam, the modernity crafted by Western intellectuals such as Descartes and Rousseau and Marx assumes “that the first principles of reason are themselves subject to will.  Contrary to Aristotle, they do not ‘bind’ reason to what is.  Modernity, in its philosophic sense, means that we are bound by nothing.  There is no order in things or in the mind, for that matter, that would ground any order.  There is only the order we ourselves make and impose on things.  This view of modernity has developed, in large part, to protect us from the notion that truth obligates us.  The real question thus becomes, in the classical sense, what ‘limits’ reason?  The answer is what is, reality” (p. 106).  

Eminent Christian theologians, from Origen onwards, have relied on Greek philosophers such as Plato and Aristotle as well as biblical revelation.  Western Civilization stands as witness to their invigorating intellectual work.  We need (personally and collectively) both good philosophy and theology.  As the 21st century begins, the West has, sadly, fallen on hard times.  (I saw hints of this when, midway through my career as a history professor, World Civilizations replaced Western Civilization as a staple in the general core.  Jesse Jackson’s orchestrated marches at Stanford University toward the end of the ‘80s—chanting “Hey, Hey, Ho, Ho, Western Civ Has Gotta Go”—memorably marked that transition.)  

Professor Schall’s exegesis of Benedict’s lecture amplifies and illuminates its message.  We must reestablish the truth that all things were made by the Word—God, essentially, is the Mind making all that is coherent.  As rational creatures, we are called to behold this Mind and conform our minds to His.  “Mind is universal, as Cicero often said” (p. 128).  Either God capriciously calls for Jihad or rationally pleads for brotherly love.  Between the two gods there is simply no common ground.  Reasonable discussion, the Pope hopes, might lead to a common commitment to what our minds, open to what is, must tell us is true regarding the “One in whom we live, and move, and have our being.”  

* * * * * * * * * * * * * * * * * * * *

Roger Scruton—an academically trained English philosopher who now writes full time, the author of eminent treatises such as Modern Philosophy and The Aesthetics of Music—explores, in The West and the Rest:  Globalization and the Terrorist Threat (Wilmington:  Intercollegiate Studies Institute, c. 2002), “the vision of society and political order that lies at the heart of ‘Western civilization’” (p. x).  From its inception Christianity held aloft Jesus’ words:  “Render unto Caesar the things that are Caesar’s, and unto God the things that are God’s.”  Two realms, St. Augustine’s two cities—the religious and the political—should co-exist and retain their proper boundaries.  “The idea persists in the medieval distinction between regnum and sacerdotium, and was enshrined in the uneasy coexistence of Emperor and Pope on the two ‘universal’ thrones of Medieval Europe” (p. 4).  Consequently, “throughout the course of Christian civilization we find a recognition that conflicts must be resolved and social order maintained by political rather than religious jurisdiction.  The separation of church and state was from the beginning an accepted doctrine of the church” (p. 5).  While many religions are tribal or national, Christianity was ever a “creed community,” open to Jews and Gentiles, men and women, slaves and freemen.  Flourishing within the Roman Empire, it “adopted and immortalized the greatest of all Roman achievements, which was the universal system of law as a means for the resolution of conflicts and the administration of distant provinces” (p. 21).  In the secular realm, Christians were loyal, law-abiding citizens of the state; in the spiritual realm, however, they obeyed God only.  

Islam, on the other hand, insists there can be no separation of church and state—all is one under the sovereign authority of Allah and the ulama who claim divinely imparted knowledge and imams who interpret Mohammed’s edicts in the Koran.  Islam is not a political system, but it insists on controlling the political order.  “Like the Communist Party in its Leninist construction, Islam aims to control the state without being a subject of the state” (p. 6).  Islamic law, the sharia, minutely regulating all kinds of behavior, is envisioned as the perfect resolution to all social as well as religious issues. 

In Scruton’s judgment, the separation of church and state began to unravel in the West during the Enlightenment.  This was because it is “impossible to understand the French Revolution if one does not see it as primarily a religious phenomenon” (p. 44).  Both the monarchy and the Church were to be destroyed by the Revolution’s “fanaticism and exterminatory zeal” (p. 45).  As revolutionary movements and ideologies grew empowered during the next two centuries, a “godless theology” gained momentum and a rather unanticipated “culture of repudiation” emerged, extending even to such hallowed entities as the family.  This is evident in a pervasive “demand for rights” wherein politics degenerates into “a scramble to claim as much from the common resources as they will yield” (p. 68).  Added to this is the postmodern repudiation of objective truth and, indeed, reason itself!  Enamored of Nietzsche, eminent intellectuals such as Foucault and Derrida recite his prescription:  “There are no truths, only interpretations.”  An inconsistent and self-contradictory relativism necessarily devolves from this, placidly holding that all cultures, as well as all viewpoints, are equally valid.  “All distinctions are ‘cultural,’ therefore ‘constructed,’ therefore ‘ideological,’ in the sense defined by Marx—manufactured by the ruling classes in order to serve their interests and bolster their power.  Western civilization is simply the record of that oppressive process, and the principal purpose of studying it is to deconstruct its claim to our membership” (p. 79).  Nothing can be judged, nothing condemned—except, of course, universal truths and objective values and anyone who dares challenge the regnant relativism.  

Thus today we have a newly apologetic West facing a suddenly militant and aggressive Islam.  In an economically globalizing world conflicts inevitably erupt, and “the Islamists have identified the core component of the system that they wish to destroy” (p. 134).  Globalization’s tentacles, spreading into Muslim lands, revealed a secular society without foundation in divine law, challenging most all the traditions sacred to Islam.  “It is the very success of America in founding a common loyalty without a shared religious faith that so incenses the Islamist extremists” (p. 65).  Scruton’s analysis is fresh and insightful.  While not the whole story, it tells important truths regarding what distinguishes “the West from the rest” (p. 159).