258 Things That Matter

One of the nation’s most eminent columnists and TV commentators, Charles Krauthammer, has collected a variety of his Washington Post columns in Things That Matter:  Three Decades of Passions, Pastimes and Politics (New York:  Crown Forum, c. 2013).  When he began the project, he planned to print exclusively non-political essays (tentatively titling the collection There’s More to Life than Politics, thus suggesting that personal and cultural things matter most), but he quickly realized that just as he had moved from psychiatry to journalism years ago because he believed in the “sovereignty” of politics, so too his book should reflect his conviction that:  “Politics, the crooked timber of our communal lives, dominates everything because, in the end, everything—high and low and, most especially, high—lives or dies by politics.  You can have the most advanced and efflorescent of cultures.  Get your politics wrong, however, and everything stands to be swept away.  This is not ancient history.  This is Germany 1933” (#2).  

Krauthammer assembles his essays thematically rather than chronologically.  He treats things “personal” and gives us some insight into his life story as well as his perspective on subjects ranging from chess matches to baseball, from dogs to Columbus to Nixon, from psychiatry to art.  He considers things “historical,” looking especially at Israel and recent American history, including a number of columns spawned by the terrorist attacks on September 11, 2001.  Written by an obviously well-read and erudite author, these columns serve as tiny windows on the world he finds fascinating and thereby reveal facets of his personality.  But it’s when he turns to politics that he clearly deals with what’s most important to him.  Here he often provides important historical materials, such as the role of the French Revolution in shaping the modern world.  The bloodbath beginning in France in 1789 with the attack on the Bastille has been routinely replicated by 20th-century totalitarian movements.  Whereas the American Revolution in 1775 focused singularly on liberty, the French pursued both liberty and state power, thereby birthing “the model, the monster, of the mobilized militarized state” fully realized in Lenin’s and Stalin’s USSR.  “In Saint-Just’s famous formulation:  ‘The Republic consists in the extermination of everything that opposes it.’  This brutal circularity of logic is properly called not revolution but nihilism” (#1810).  

In less virulent forms this infatuation with state power has played an important role in American history and is surely evident in the convictions and aspirations of our current President, Barack Obama, who routinely embraces what Krauthammer calls the “fallacy” of “equating society with government, the collectivity with the state” (p. 136).  Thus he promotes redistributionist taxation and central planning to “spread the wealth around.”  The “Julia” advertisement in the 2012 Obama campaign celebrated a woman “swaddled and subsidized throughout her life by an all-giving government of bottomless pockets and ‘Queen for a Day’ magnanimity.  At every stage, the state is there to provide—preschool classes and cut-rate college loans, birth control and maternity care, business loans and retirement.  The only time she’s on her own is at her grave site” (p. 137).  

Invoking state power to sanction traditionally perverse and ultimately destructive sexual behaviors further typifies today’s left.  Gay activists, promoting same-sex marriage, have opened the door for polygamy as well.  It is, indeed, “utterly logical for polygamy rights to follow gay rights.  After all, if traditional marriage is defined as the union of (1) two people of (2) opposite gender, and if, as advocates of gay marriage insist, the gender requirement is nothing but prejudice, exclusion and an arbitrary denial of one’s autonomous choices in love, then the first requirement—the number restriction (two and only two)—is a similarly arbitrary, discriminatory and indefensible denial of individual choice” (p. 163).  

Under the New Deal, state power was wielded to care for the elderly through the Social Security Administration.  It has, for nearly 80 years, proved a boon for the millions whose retirement years are blessed by a monthly check.  Rightly defined, however, Krauthammer insists, it’s clearly a Ponzi scheme!  It’s a “pay-as-you-go” program whereby recipients get paid by newcomers to the system, which is by definition a Ponzi scheme.  “The critical distinction between a Ponzi scheme and Social Security is that Social Security is mandatory.  That’s why Ponzi schemes always collapse and Social Security has not” (p. 175).  Still more:  if Social Security can “rustle up enough new entrants” and change some of its standards to fit the new demographics, it can evade failure for at least the foreseeable future.  It is indeed a Ponzi scheme but a “vital, humane and fixable” one!  

Whereas Social Security may be something of a sustainable fraud, “‘The largest threat to freedom, democracy, the market economy and prosperity,’ warns Czech president Vaclav Klaus, ‘is no longer socialism.  It is, instead, the ambitious, arrogant, unscrupulous ideology of environmentalism’” (p. 178).   Throughout the 20th century, “social planners, scientists, intellectuals, experts and their left wing political allies—arrogated to themselves the right to rule either in the name of the oppressed working class (communism) or, in its more benign form, by virtue of their superior expertise in achieving the highest social progress by means of state planning (socialism)” (p. 178).  At the dawn of the 21st century these experts and social planners have nestled into the environmental movement and shrewdly used the power of the state to implement their left wing goals and regulate everyone’s activities in order to save the planet.  

Though not a full-fledged Libertarian, Krauthammer shares many libertarian convictions.  His “conservatism” is mainly of a fiscal and constitutional variety.  And ironically, in a book titled Things That Matter he has little to say about things that matter most!  A rather thoroughly secularized Jew, he occasionally lifts his pen to defend Israel and his kinsmen.  But as an agnostic he has little or nothing to say about things religious.  Thus his columns offer information and insight into the passing scene but provide little sure guidance as to our status as pilgrims passing through it.  Things That Matter, consequently, is best appreciated as “light reading”—light in both its journalistic style and ultimately transient subject matter.  

* * * * * * * * * * * * * * * * * * * * 

Looking back over a distinguished career as a professor of philosophy at Georgetown University, James V. Schall, SJ, recently published something of an encomium to Christian philosophy in Reasonable Pleasures:  The Strange Coherence of Catholicism (San Francisco:  Ignatius Press, c. 2013).  He launches each of the book’s chapters with a quotation worth pondering.  Thus, his Introduction—“On What Proves Profitable to Examine”—begins with a statement in Aristotle’s Ethics:  “However, we should examine the origin not only from the conclusion and premise [of a deductive argument], but also from what is said about it; for all the facts harmonize with a true account, whereas the truth soon clashes with a false one.”  To discern the truth is, Schall says, a “reasonable pleasure,” for it is “the delight we take in knowing the truth of things, especially the truth about ourselves and our place in the existence of things” (#65).  As rational creatures we’re designed to enjoy thinking, just as, being sensual creatures, we’re designed to enjoy eating.  So when we “get the point” of a demonstration we are pleased; when we weary of a speaker’s interminable meanderings, we wish he’d “get to the point;” when we “see the point” of a joke we enjoy laughing.  

Throughout his life Schall has found G.K. Chesterton superbly stimulating.  Thus his chapter on dogma—defined as “an accurate statement of what is true”—cites a passage from GKC’s Orthodoxy which declares:  “If I am asked, as a purely intellectual question, why I believe in Christianity, I can only answer, ‘For the same reason that an intelligent agnostic disbelieves in Christianity.’  I believe in it quite rationally upon the evidence.  But the evidence in my case, as in that of the intelligent agnostic is not really in this or that alleged demonstration; it is in an enormous accumulation of small but unanimous facts.”  Carefully weighed, the facts point to faith in Christ.  Certainly we hunger for a personal relationship, to experience God, to see Him, to feel Him.  “But we also long to understand, to make sense of what it is that we behold or hold in faith and in observation” (#293).  Thus dogma!  Rather than constricting the mind, dogma enables it to relish what has been grasped, to feast upon the truth that nurtures the soul.  

Along with a small but distinguished cadre of philosophers such as Henri Bergson and Michael Novak, Schall finds “wit and humor” essential human traits.  Still more:  “laughter in itself . . . is one of the signs of eternal life” (#552).  It’s also a vital way to pry open men’s minds to truth.  Heeding Boswell’s Life of Johnson, we learn that the learned doctor’s “trade is wit.  It would be as wild in him to come into company without merriment, as if a highwayman were to take the road with his pistols.”  Thus Schall supports “this proposition:  That particular philosophical or theological theory is most correct, most likely to be true, which can best account for laughter and joy” (#552).  When we laugh we demonstrate our ability to think, to see the incongruities and paradoxes and novelties of existence.  As Aristotle wisely prescribed:  “we cannot laugh all the time at the risk of becoming frivolous buffoons, nor can we refuse to laugh at all at the risk of becoming bores and dullards.  Save us from the witless man who laughs at nothing” (#602).  

What’s true of wit and humor easily extends to “play and sports,” distinctively human activities  manifesting the creativity of the human mind.  Contrary to legions of rigorists who disdain such recreational activities, Schall thinks they reflect “the real world in ways that constantly surprise us” (#781).  Here again he finds support in Aristotle, who noted that games, like songs, are played for their own sake and “suggested that watching games was rather like contemplation” (#806).  Obviously games are not “the most important things in the world, but, in their own way, I think that they prefigure what is.”  We certainly ponder life’s meaning in various ways, but an important way is by understanding “what we do when we see a good game” (#891).  Absorbed in a game, easily losing track of ordinary time, we cease to think of ourselves; we realize there’s a reality other than our mind.  If we love the game, we know how to love something other than ourselves.  

Turning to a dramatically different topic, Schall makes a “‘Reasonable’ Case for Hell,” something he’s thought much about.  There is, in fact, good reason for such a place.  Centuries ago, in Phaedo, Plato (upholding a transcendental standard of justice unattainable through human institutions) said:  “Those who are deemed incurable because of the enormity of their crimes, having committed many great sacrileges or wicked and unlawful murders and other such wrongs—their fitting fate is to be hurled in Tartarus never to emerge from it.”  Many folks think “the teaching on hell was revealed to us to make us fear punishment.  I tend to think it was first given to us to make us think, and to be careful where we tread” (#1227).  Truthful thinking, rejecting rationalizations, leads us to confession and repentance, which “is nothing less than restoring to order what we chose to reject.  We live with the consequences of our sins, but, on repentance, no longer with their self-justifying principle, which is that of preferring our will to God’s reason in giving us being” (#1388).    

Even in Christian circles today the doctrine of hell has been shoved to the periphery of church teachings.  A really good God, many think, simply could not eternally punish any of His creatures.  Thus when celebrities such as Princess Diana are tragically killed their fans instantly assume (rather than hope) they’re happily in heaven.  Thinking thusly, they fail to understand, with Plato, that if earthly “crimes are not punished properly, the world is incoherent.  In other words, it would be even crueler if our anguished minds know that the greatest crimes were not punished or the greatest deeds not rewarded” (#1314).  It would also be unjust for God, who granted us free will, to override that freedom.  “Free creatures can abuse themselves and others by their free acts” (#1405), and if we bear no responsibility for them we’re not truly free.  Having designed us to make choices, our good God respects those choices—even if we choose to eternally sever our ties to Him.  “No one is punished unjustly.  No one escapes the justice due to his free acts.  All can be forgiven if they will.  God would not have it otherwise.  He could not make a free being not to be free.  He could not permit a free being to escape the logic of his choosing himself over others” (#1411), including The Holy Other One, God.  

In a chapter titled “the earthly city,” Schall finds Aristotle our best guide to politics.  As social beings we clearly need to discover ways of getting along with each other.  “The great illusion of the twentieth century was that we could ‘force’ men to be virtuous by careful planning of their politics and economics” (#1514).  “Most modern ideologies are little more than efforts to bring the Kingdom of God to this earth by some sort of human means” (#2115).  Charmed by that illusion, millions of men suffered as utopians of various hues (Lenin; Hitler; Mao; Castro; Pol Pot) sought to build societies wherein everyone is equally cared for and content.  They failed to grasp the necessary distinction drawn by both Plato and Aristotle between politics and economics.  They’re not the same thing, and providing for material well-being is not the main raison d’etre of politics.  

Still more, these utopians failed to understand what Saint Robert Cardinal Bellarmine declared centuries ago:  “‘If you are wise, then, know that you have been created for the glory of God and your own eternal salvation.  This is your goal; this is the center of your life’” (#1526).  More recently, Alexander Solzhenitsyn said much the same, explaining what had so sadly gone wrong in the Soviet Union:  “Men have forgotten God.”  By nature, created in the “image and likeness of God,” we rightly understand ourselves only in the light of Christ.  Man is “created not to be merely human but to be more than human.  Homo non proprie humanus sed superhumanus est.  This inner core of his being explains why he is never satisfied with the goods, power, or glory that he initially thinks will satisfy him” (#1611).  

What most deeply satisfies us, Schall shows, is worship—an activity much akin to wit and humor, sports and play, something in and of itself meaningful and satisfying, something deeply contemplative.  Worshiping false gods, worshiping creatures (especially those of a political sort) rather than the Creator, has marred the human story.  But worshiping the true God, conforming our minds to what’s really Real (He Who Is), is the most reasonable of all pleasures.  Turning to the rich history of the Church, we find:  “The motto of the Order of Saint Benedict is Ora et labora, ‘Pray and work’: that of the Order of Saint Dominic, along with Veritas, ‘Truth’, is Contemplata aliis tradere, ‘To give to others the fruits of contemplation’; and that of the Society of Jesus is In Actione contemplativus, ‘Contemplation in action’.  By calling these phrases to mind, I want next to consider how it is possible to look on the world and ourselves as engaged in the worship of God in all we do” (#1879).  Praying, thinking, teaching, parenting, evangelizing, sharing the Good News in a multiplicity of ways, reveal a people rightly worshiping God.

And such worship leads us as pilgrims along the way of eternal life, union with God Himself.  Everything good in life posits the need for and availability of salvation.  To be saved from sin, to be freed from sin, to be healed within and prepared for life everlasting, is the point of the human story.  In the wonderful words of T.S. Eliot, in “The Dry Salvages”:  

But you are the music
While the music lasts.  These are only hints and guesses,
Hints followed by guesses; and the rest
Is prayer, observance, discipline, thought and action.
The hint half guessed, the gift half understood, is Incarnation.

* * * * * * * * * * * * * * * * * * * 

For many years Reb Bradley has spoken and written about family life and parenting in such publications as Child Training Tips.  In Born Liberal, Raised Right:  How to Rescue America from Moral Decline—One Family at a Time (Los Angeles:  WND Books, c. 2008), he seeks to “answer questions asked by hundreds of thousands of people across the country:  What has happened to America in the last fifty years?  Where did we go wrong?  Why do so many children raised in good homes grow up with values so different from their parents?  Who is responsible for America’s moral downturn—the schools, the entertainment industry, the media?  Is our problem rooted in poverty, racism, and a lack of good education; or are insensitivity, intolerance, and oppressive religion the real culprits?” (#83).  

We’re daily reminded by the violence in rap music, video games, and schools that something dangerous has infiltrated youth culture.  We’re routinely alarmed by our kids’ academic stagnation, as evident in SAT scores that have significantly slipped since 1963.  But the real problem cannot be blamed on “society,” since it comes, Bradley thinks, from parents (however personally conservative, however unconsciously) raising little liberals who lack “the key ingredient of maturity—self-control.  Any society that is out of control is comprised of individuals who lack self-control” (p. 4).  To identify this deficit as the core issue is not a simplistic, though it is certainly a simple, step, for:  “In modern America, most individuals are ruled by their passions—they lack self-restraint—they cannot say ‘no’ to themselves.  If they had the virtue of self-control, they and the society they comprise would not be so ‘out of control’” (p. 4).  Too many of us lack what America’s Founders found essential for a free republic:  self-governing citizens.  In a letter to his brother John, Samuel Adams said Americans “must learn ‘. . . the art of self government without which they never can act a wise part in the government of societies.’  James Madison, known as the ‘Father of our Constitution,’ emphasized the importance of personal moral restraint when he declared, ‘We have staked the whole of all our political institutions on the capacity of mankind for self-government, upon the capacity of each and all of us to govern ourselves, to control ourselves’” (p. 14).  Ironically, in a nation where people routinely celebrate their “freedoms,” increasing numbers of us are enslaved by the cruelest of all taskmasters:  intemperance.  

Just as it’s easy to see the baleful impact of such intemperance, it’s easy to discover its source.  “Self-control is a deep-rooted character trait trained into children by their parents in the first few years of life.  It was the observation of Thomas Jefferson that ‘the qualifications for self-government in society are not innate.  They are the result of habit and long training.’”  Thus:  “The true cause of America’s decline is that most modern parents were not raised to value self-control as a virtue, so few have trained their children to have it” (p. 7).  Misled by educators and therapists promoting “self-expression,” “self-actualization,” and “self-esteem,” parents have equated love with permissiveness and predictably reared self-indulgent kids.  They are, in fact, aiding and abetting expressions of their children’s sinful nature, doing what’s enjoyable rather than what’s right!  We are all born little hedonists, following the mantra of the ‘60s:  “If it feels good, do it.”  “It has been said that evil triumphs when good men do nothing.  So also, for liberalism to succeed, parents need not do anything” (p. 10).  

To encourage and teach parents how to lead their little liberals into mature adulthood, Bradley offers, in this insightful and readable book, suggestions regarding what must be done.  

# # # 

257 The 21st Century Church

  Discerning the face and future of the Church at the beginning of the 21st century cannot but perplex us, since we struggle to envision the invisible.  But some thoughtful analysts at least give us food for thought by reflecting on certain phenomena or trends that may reveal clues as to what’s happening and what’s likely to come.  In An Anxious Age:  The Post-Protestant Ethic and Spirit of America (New York:  Image, c. 2014), Joseph Bottum finds our “moment more tinged by its spiritual worries than any time in America since perhaps the 1730s” (#39 in Kindle).  We live in a nation alive with a “spiritual” ferment that eludes traditional denominational (and even Christian) categories.  In fact:  “We live in a spiritual age when the political has been transformed into the soteriological.  When how we vote is how our souls are saved” (#59).  This results, in a peculiar way, from the dramatic decline of the Mainline Protestant churches—“the central political fact of the last 150 years of American history” (p. 79)—that once dominated the religious life of America, leaving as their inheritance an “elect class” of folks forging “a post-Protestant ethic for the spirit of America” (#140).  This generally successful, upper-middle-class cadre—what Bottum calls the “Poster Children”—has replaced their grandparents’ Mainline Protestant faith with their own creeds and convictions.  

Bottum’s Poster Children are products of the ‘60s, snidely sneering at their elders’ Ozzie and Harriet world.  “They are, for the most part, politically liberal, preferring that government rather than private associations (such as intact families or the churches they left behind) address social concerns.  They remain puritanical and highly judgmental, at least about health, and like all Puritans they are willing to use law to compel behavior they think right.  Nevertheless they do not see themselves as akin to their Puritan ancestors, for they understand Puritanism as concerned essentially with sexual repression, and the post-Protestants have almost entirely removed sexuality from the realm of human action that might  be judged morally” (p. 14).  And though they do not identify as Protestant Christians they actually demonstrate the inevitable unraveling of the “Social Gospel” proclaimed from Mainline Protestant pulpits throughout the 20th century.  

The “Social Gospel,” as formulated by Walter Rauschenbusch a century ago, redefined sin as a social condition needing correction, not a personal evil requiring redemption.  Jesus was an inspired teacher proposing social justice rather than a suffering Savior dying to save our souls.  Egregious social evils—Rauschenbusch listed “bigotry, the arrogance of power, the corruption of justice for personal ends, the madness of the mob, militarism, and class contempt” (p. 37)—require social action culminating in social justice.  Rauschenbusch, of course, called the churches to spearhead assaults on such evils, but today those churches are largely empty and invalided.  Yet his agenda has been internalized (without any of his theological trappings) by the Poster Children.  They effectively embrace “a social gospel, without the gospel.  For all of them, the sole proof of redemption is the holding of a proper sense of social ills.  The only available confidence about their salvation, as something superadded to experience, is the self-esteem that comes with feeling that they oppose the social evils of bigotry and power and the groupthink of the mob” (p. 39).  Feeling righteous (e.g. endorsing government programs to care for the poor), not being righteous (e.g. personally giving to charities actually caring for the poor), is the ultimate good.

As the Mainline Protestant churches declined, both in numbers (from 50% to 10% of the population in fifty years) and public influence, some observers thought Roman Catholics would fill the vacuum.  But the “Catholic Moment” envisioned by Richard John Neuhaus seems to have passed and Bottum sees little likelihood of Catholicism replacing Mainline Protestantism as the nation’s core religion.  Since Vatican II, of course, the Catholic Church in America has experienced something of its own implosion, exacerbated by appalling sex-abuse scandals, and seems significantly weakened in many ways.  “Twenty-five years of the prestige built up by John Paul II and Mother Teresa drained away in an instant.  And at every moment since, whenever the bishops have tried to influence public affairs, there has been someone ready to remind us of their sins” (p. 218).  Intellectually, there has certainly been some effective work, such as that done by the late Cardinal Avery Dulles, Richard John Neuhaus, George Weigel and Robert George.  And there are certainly numbers of devout and intellectually vigorous young Catholics sustaining the Church, but just as the swallows no longer return to San Juan Capistrano so too the prospects of Catholicism becoming the foundational faith of America seem slight.  

Though Bottum’s analysis regarding the future for Christianity in America may be unduly pessimistic, it merits thoughtful reading, especially for its penetrating exploration of the Social Gospel’s deleterious impact on orthodox faith and practice.  A published poet, Bottum writes fluently and provides those fresh and masterful figures of speech characteristic of poets.  That we Christians feel considerable anxiety when pondering the state of the church goes without saying—and to a degree this has been true for 2000 years.  But to explain our anxiety, as Bottum has done, may well help us pray and work more wisely.

* * * * * * * * * * * * * * * * 

While best known for his magisterial biography of Pope John Paul II, George Weigel has written widely on Catholic concerns.  In Evangelical Catholicism:  Deep Reform in the 21st Century Church (New York:  Basic Books, c. 2013), he endeavors to show that:  “The deep reform of the Catholic Church has in fact been underway for more than one and a quarter centuries.  It began with Pope Leo XIII” and was sustained by “the revitalization of Catholic biblical, liturgical, historical, philosophical, and theological studies in the mid-twentieth century.  It continued in another, and at least as important, way in the martyrdom of millions of Catholics at the hands of the mid-twentieth century totalitarian systems” and was enriched by the work of the Second Vatican Council and popes Pius XII, Paul VI, John Paul II and Benedict XVI (p. 2).  In process since Pope Leo XIII’s election in 1878, this reform movement embraces much that is good in the secular world, leaving behind many Counter-Reformation distinctives while invigorating the traditions and truths that hallmark it as perennially compelling.  Neither the antiquarians longing for a pre-Vatican authoritarianism nor the progressives following the path of Protestant Liberalism provide the reform pattern.

But it has been made possible by the remarkable (and now officially Saint) Pope John Paul II, who orchestrated what seemed impossible when he was elected in 1978—effectively implementing the decisions of Vatican II without dividing or destroying the Church.  He “led the Church into the kind of new Pentecost that John XXIII had envisioned, through the experience of the Great Jubilee of 2000; and pointed the Church firmly and confidently into the future, declaring that a ‘New Evangelization’ would be the Catholic Church’s grand strategy in the twenty-first century and the third millennium.  That grand strategy has been followed by Benedict XVI, whose pontificate has been one of dynamic continuity with that of his predecessor, whose accomplishments may lead history to remember him as Pope St. John Paul the Great” (p. 10).  Shedding the fortress mentality that characterized many Catholics since the French Revolution, Evangelical Catholicism sees itself as a “counterculture that seeks to convert the ambient public culture by proclaiming certain truths, by worshipping in spirit and in truth, and by modeling a more humane way of life.  Evangelical Catholicism does not seek to ‘get along’; it seeks to convert” (p. 19).  

Such Catholicism is countercultural inasmuch as it dissents from the various highly subjective “spiritualities” which stress the individual’s search for God—and from the myriad churches designing programs to entice “seekers”—frequently boiling down to little more than the erroneous notion that “it’s all about me.”  Sweetly catering to what Weigel calls “the imperial autonomous Self” is not the Catholic way!  In fact, Christianity is all about Christ and what He’s done for us!  Christ’s Gospel calls us to repent and believe, to enter into His Kingdom, and to enjoy communion—a “mature friendship”—with Him.  Christians tell the story, from the beginning, of God’s search for us.  “The conviction that Christianity is a revealed religion is thus the conviction on which Evangelical Catholicism rests:  the supernatural gift of divine revelation (i.e., God coming in search of us) is given to men and women so that, by an act of faith that is itself made possible by supernatural grace, they may be set on the path of salvation, which is the glorification of the human person within the light and life of God the Holy Trinity (i.e., our responding to God’s search for us by learning to take the same path through history that God is taking)” (p. 28).  

In important ways Weigel’s “Evangelical Catholics” have more in common with conservative Protestant Evangelicals than liberal Catholics.  They share, for example, a commitment to the inspiration and integrity of Scripture.  Pope Leo XIII’s Providentissimus Deus and Vatican II’s Dei Verbum reaffirm “St. Jerome’s axiom that ignorance of the Scriptures is ignorance of Christ, who is the living center of the Word of God read and the word of God preached” (p. 75).  Evangelicals and Evangelical Catholics both call for an often radical conversion—a turning away from sin and the “spirit of the age” that encourages it—and sustained commitment to Jesus and His Way, engaging in charitable works.  Unlike Evangelicals, however, Evangelical Catholics deeply revere and adhere to the Seven Sacraments of the Church and her liturgical services as the primary means of grace available to us.  

“The first criterion of authentic Catholic reform reflects the promise of the Lord to his first disciples:  that through him, they would ‘know the truth, and the truth will make you free’ [John 8:32].  This is the criterion of truth:  all true Catholic reform is built out from the truth that is Christ and reflects the truths that have been entrusted to the Church by Christ” (p. 92).  The second criterion is mission, proclaiming the Gospel of salvation to the whole world.  Contrary to the World Council of Churches, declaring that “the world sets the agenda for the Church,” reforming Catholics have an agenda for the world:  come to Jesus!  To rightly blend truth and mission requires nothing less than holiness, the standard by which all the Church’s endeavors must ever be measured.  “Holiness is the ultimate antidote to infidelity, fear, and the evangelical paralysis that follows from infidelity and fear.  Holiness is what binds together a Church in which centrifugal forces are always at work” (p. 105).  “In its Dogmatic Constitution on the Church, the Second Vatican Council spoke eloquently about the universality of the call to holiness.  The Lord himself, the Council Fathers recalled, ‘preached holiness of life . . . to each and every one of his disciples without distinction . . . :  “You, therefore, must be perfect, as your heavenly Father is perfect”’ [Matthew 5:48]” (p. 257).  Consequently:  “the first task of Evangelical Catholicism is to foster the holiness of all the people of the Church” (p. 257).  

The reforms Weigel envisions involve the clergy—popes, bishops, priests—resulting in a corps of men modeling John Paul II, to whom the author devotes many pages.  Priests “must be thoroughly converted to Christ, must be living a life of friendship with the Lord Jesus, and must have shown at least some capacity to invite others to meet the Lord before he can be seriously considered as a candidate for the diocesan priesthood” (p. 139).  Seminary training must focus on the “Great Tradition of the Christian faith” summed up in the Catechism of the Catholic Church and persuasive apologetics rather than the critical categories of secular graduate schools.  Priests must learn to effectively preach expository sermons reflecting an orthodox understanding of Scripture.  “A man who cannot preach well will not be a priest capable of being an icon of the priesthood of Jesus Christ, the Word incarnate, in the Evangelical Catholicism of the future” (p. 146).  Leading the people in worship, the priest will understand himself to be the “servant of the liturgy, not its master, and that he must never think of the sacred liturgy as an occasion for the expression of his charming (or winsome, or glowing) personality” (p. 147).  “Indeed, the Catholic priest embodies in a special way the Gospel truth that human flourishing comes through self-gift, not self-assertion.  The postmodern world celebrates the imperial, autonomous Self; the evangelical Catholic priest lives in the life of an obedient, deeply ecclesial man-for-others” (p. 150).

In addition to the clergy, Evangelical Catholics need to reform the liturgy, including music, restoring it to some of its ancient majesty.  Religious orders (monastic communities) need serious attention, restoring them to their authentic purposes.  The laity (politicians included) needs to participate, living out the Gospel in marriage, child-rearing, vocation.  Universities, to deem themselves Catholic, must heed the admonitions of popes John Paul II and Benedict XVI and recover their distinctively Christian character.

* * * * * * * * * * * * * * * * * * *

After serving several years as a Presbyterian pastor, T. David Gordon began teaching “media ecology” courses at Grove City College, only to be diagnosed with stage III colorectal cancer and given a 25% chance of survival.  Having pondered the sorry state of the pulpit for many years, he was prodded by his illness to share his burden in Why Johnny Can’t Preach:  The Media Have Shaped the Messengers (Phillipsburg, NJ:  P&R Publishing, c. 2009).  He laments that “less than 30 percent of those who are ordained to the Christian ministry can preach an even mediocre sermon” (#44 in Kindle).  They generally fail to make a basic point, rooted in a biblical text, and then propose legitimate applications.  This results not so much from the intrinsic ability of the pastors as from “societal changes that led to the concerns expressed in the 1960s to 1980s in educational circles—societal changes reflected in a decline in the ability to read (texts) and write—[which] have led to the natural cultural consequence that people cannot preach expositorily” (#86).  American culture has shifted dramatically in the past half-century:  “A culture formerly dominated by language (reading and writing) has become a culture dominated by images, even moving images” (#306).  Even magazines and textbooks are now packed with images designed to arrest the “reader’s” attention.  

Enlisting the aid of Robert Lewis Dabney’s classic Lectures on Sacred Rhetoric, Gordon identifies these essential elements of a good sermon:  1) textual fidelity; 2) unity; 3) evangelical tone; 4) instructiveness; 5) movement; 6) point; 7) order.  Few of these fundamentals can be found in today’s sermons, Gordon says.  This is because to compose a good sermon requires, firstly, the ability to carefully read the biblical text.  Unfortunately, few Americans (including those with college degrees) actually read great texts, whether Shakespeare or T.S. Eliot, Dostoevsky or Mark Twain.  What reading is done is focused on information rather than literary skill and understanding.  As C.S. Lewis observed, modern readers tend to “use” rather than “receive” texts.  And, to complicate things, post-modern readers simply read into the texts their own predispositions!   

Secondly, composing a good sermon requires, needless to say, composition!  And in a world of cell phones (with continuous “texting”) there is hardly a second invested in thoughtfully composing one’s thoughts and skillfully putting words on paper!  Talking on the phone easily destroys our ability to write!  “The consequences for preaching should be very obvious.  Telephone conversations rarely have unity, order, or movement; it isn’t surprising that those who spend more time on the phone than in private written correspondence preach sermons that rarely have unity, order, or movement” (#610).  Nor does significant content distinguish telephone conversations.  Thus contemporary sermons say little about Sin and Salvation!  They generally seek to amuse and gently suggest ways to cope with life’s challenges.  Or they may offer moral guidance, staking out positions in the nation’s “culture wars.”  What’s really missing is any consistent Christological focus—proclaiming the person and work of Christ, who alone can enable us to cope with life or embody the Gospel (including its ethics).  

* * * * * * * * * * * * * * * * * *

Having earlier explained Why Johnny Can’t Preach, T. David Gordon examines another current church quandary in Why Johnny Can’t Sing Hymns:  How Pop Culture Rewrote the Hymnal (Phillipsburg, NJ:  P&R Publishing, c. 2012).  In brief, he says we no longer sing hymns because we cannot do so—we’ve lost the musical capacity to do so.  Immersed in a world of “pop music,” we no longer know how to either attend to or make good music.  Thus the music of the Church (hymnology) developed over the centuries has become as incomprehensible as Latin and Greek; its hymns are “strange, unfamiliar, and inaccessible” (#34).  “For nineteen centuries, all previous generations of the church (Greek Orthodox, Catholic, Protestant, or Revivalist), in every culture, employed prayers and hymns that preceded them, and encouraged their best artists to consider addition to the canon of good liturgical forms” (#280).  Sadly, he says, this tradition was buried by the cultural avalanche triggered in the ‘60s.  Consequently:  “My own generation and my children’s generation are—to use the Greek term—musical idiots” (#70).

Illustrating this reality is the centrality of the guitar in today’s worship services.  The organ, so basic to church music for centuries, has been displaced (and often removed), giving way to guitar-driven bands singing choruses that express certain sentiments but generally lack content or coherence.  Primarily designed to engage the emotions and express personal experiences, “contemporary” church music clearly reflects 21st century popular culture.  Notably absent is the biblical conviction that:  “Song is the divinely instituted, divinely commanded, and divinely regulated means of responding to God’s great works of creation, preservation, and deliverance” (#202).  “Contemporary worship music deliberately attempts to sound like the music we hear every day in the culture around us.  It goes out of its way not to sound foreign or different.  But if meeting our Maker and Redeemer is different from all other meetings, why shouldn’t the various aspects of that meeting be different from the aspects of other meetings?” (#518).  

Gordon sees clearly that this is a deeply theological issue, not simply a matter of musical taste.  Indeed, the “dictatorship of relativism” denounced a decade ago by Pope Benedict XVI applies to aesthetics as well as ethics.  To sing what we “like” rather than reflect on what is God’s Truth sucks us into the cultural nihilism that defines our world, celebrating a narcissistic self-assertiveness rather than costly discipleship.  “Neil Postman rightly said:  ‘I believe I am not mistaken in saying that Christianity is a demanding and serious religion.  When it is delivered as easy and amusing, it is another kind of religion altogether.’  So what is at stake in the kind of religion presented in music that is easy, trivial, light, inconsequential, mundane, or everyday?” (#450).  

Though there is clearly a polemical edge to Gordon’s treatise, and though he is clearly more musicologically proficient than the general reader (including this one), he does his best to avoid pouring fuel on the fire of the “worship wars.”  What he does—and what makes this book worth pondering—is to show how easily contemporary Christians have thoughtlessly embraced a music form that militates against the very foundations of the faith.  

# # # 

256 Communist Infiltration

  Given the currently aggressive behavior of Vladimir Putin—positioning Russia to recapture some of her lost empire—we’d be wise to reflect on whatever lessons we can learn from the Cold War.  As archives have opened to historians during the past two decades it has become obvious that Communist agents effectively infiltrated and impacted America throughout the 20th century.  In Stalin’s Secret Agents:  The Subversion of Roosevelt’s Government (New York:  Simon & Schuster, Inc., c. 2012), M. Stanton Evans (a veteran journalist) and Herbert Romerstein (formerly the head of the Office to Counter Soviet Disinformation at the U.S. Information Agency) document the degree to which USSR agents both secured information regarding this nation and also helped shape her policies.  As Whittaker Chambers personally witnessed:  “‘In a situation with few parallels in history, the agents of an enemy power were in a position to do much more than purloin documents.  They were in a position to influence the nation’s foreign policy in the interest of the nation’s chief enemy, and not only on exceptional occasions, like Yalta (where Hiss’s role, while presumably important, is still ill-defined) or through the Morgenthau plan for the destruction of Germany (which is generally credited to [Soviet agent Harry Dexter] White) but in what must have been the staggering sum of day to day decisions’” (p. 7).  

When Franklin D. Roosevelt and Winston Churchill secretly met and signed the Atlantic Charter in August 1941, they committed their countries to seeking “no territorial changes that do not accord with the freely expressed wishes of the people.”  During the course of the war they were joined by Joseph Stalin in strategic conferences—especially Teheran and Yalta—to discuss and plan for the future; they also reiterated the ideals of the Atlantic Charter.  Yet in retrospect it is obvious that Stalin intended to capitalize on military victory to promote the Soviet agenda in Asia and Eastern Europe.  The guns of war had barely cooled before Russian troops occupied Poland, East Germany, and Hungary; then Mao Tse-tung took control of China.  Grasping the import of this process, Churchill lamented “that ‘after all the exertions and sacrifices of millions of people, and of victories of the Righteous cause, we will not have found peace and security and that we lie in the grip of even worse perils than we have surmounted’” (p. 19).  

Unfortunately for millions of people around the globe, FDR and his associates ignored experts on Soviet affairs such as George F. Kennan and saw this process not as “something to be combated, deplored, or counterbalanced, but rather an outcome to be accommodated and assisted” (p. 19).  Roosevelt generally sided with Stalin rather than Churchill in making Big Three decisions.  Importantly, the authors say:  “Not to be omitted in this context was the presence in the White House of Mrs. Roosevelt, who had around her a coterie of youthful leftists and was a point of contact for outside forces who took a favorable view of Moscow, the American Communist Party, and all manner of pro-Soviet causes” (p. 22).  One of Roosevelt’s closest aides, Harry Hopkins, gladly acknowledged that Russia would fill the vacuum left by a defeated Germany and dominate Eastern Europe; to him:  “‘Since Russia is the decisive factor in the war, she must be given every assistance and effort must be made to obtain her friendship’” (p. 19).  This certainly seemed to be FDR’s approach in Tehran and Yalta, where he said “‘of one thing I am certain; Stalin is not an imperialist.’  And at a post-Yalta meeting, the President observed to his presumably nonplussed cabinet that as Stalin early on had studied for the priesthood, ‘something entered into his nature of the way in which a Christian gentleman should behave’” (p. 21).  

That President Roosevelt could so naively misjudge Stalin cannot be attributed to his patently failing health early in 1945 when the Big Three met at Yalta (a resort on the Crimean coast).  Rather, he relied on a handful of trusted advisers to draft the documents and craft the accords to which he assented.  Above all, he singled out Alger Hiss to be at his right hand and it is obvious that Hiss played an important role in the conference.  The papers of Edward Stettinius Jr. (the then just-appointed Secretary of State) reveal Hiss’s activity.  Working on his own autobiography soon after the war, Stettinius would frequently tell the historian helping him to “‘See Alger Hiss about this’” (p. 45), since he had put together the background papers used by the State Department at Yalta.  A veteran bureaucrat, Hiss understood how to direct the flow of information (or disinformation, as the case may be) in ways amenable to himself.  

In his subversive activities Hiss clandestinely carried on the more public work of pro-Communists (if not staunch Communists) such as John Reed (a journalist who championed the Bolsheviks), Raymond Robins (a confidant of Lenin and Trotsky in the early days of the revolution), Armand Hammer (a businessman with close connections to the Soviets), and Walter Duranty (a New York Times correspondent now proven to have been a Soviet agent who deliberately misinformed readers about the Stalin-orchestrated famine in the Ukraine).  In the 1930s Hiss linked arms with Soviet agents such as Whittaker Chambers and Elizabeth Bentley and worked to abet the Communist cause.  As FBI special agent Guy Hottel asserted, in his 1946 memo to FBI Director Hoover:  “‘It has become increasingly clear in the investigation of this [Bentley] case that there are a tremendous number of persons employed in the United States government who are Communists and who strive daily to advance the cause of Communism and destroy the foundations of this government. . . .  Today nearly every department or agency of this government is infiltrated with them in varying degree.  To aggravate the situation, they appear to have concentrated most heavily in those departments which make policy, particularly in the international field, or carry it into effect . . . [including] such organizations as the State and Treasury departments . . .’” (p. 100).  These charges, Evans and Romerstein show, can be amply documented with materials now available in the archives.  

Aiding the Communist cause during the Roosevelt years were “friends in high places” such as Vice President Henry Wallace, who in 1948 would run for President as the candidate of “the Communist-dominated Progressive Party” (p. 113).  More importantly, FDR’s “longtime aide and crony, Harry Hopkins” adroitly promoted the Soviet cause; indeed, during “the war years, Moscow had no better official U.S. friend than Hopkins” (p. 113).  A social worker by vocation, he used his positions to funnel money not only to needy Americans but to assist the Soviet Union as well.  Under the rubric of the Lend-Lease program, for example, he sent the Russians thousands of pages of documents containing information on such things as uranium and heavy water basic to the atomic weapons being developed at Oak Ridge, TN.  When the Polish government in exile demanded an inquiry into the notorious slaughter of Polish officers in Russia’s Katyn Forest, Hopkins joined Moscow in vehemently opposing any investigation.  Soviet responsibility for this brutal massacre is now historically demonstrable.  During the war Hopkins often opposed Churchill and assailed the British for their imperialism and colonialism; conversely, he often spoke highly of Stalin.  At the Yalta Conference, Hopkins loyally supported Stalin and his agenda.  In the judgment of Ivan Yeaton, one of the best-informed and experienced American officials dealing with the USSR and China, “‘From our first meeting I considered him disloyal to the trust that had been imposed on him.  After learning of the manner in which he high handedly handled security at our end of the Alaska Siberian [Lend-Lease] pipeline, I changed it to perfidious or traitorous, if you like’” (p. 133).  

Though less openly pro-Soviet than Hopkins, Secretary of the Treasury Henry Morgenthau pursued many policies Stalin supported, in part due to the presence of Harry Dexter White and other Russian agents (named in the Venona decrypts) on his staff.  Morgenthau thus argued, in what is known as the “Morgenthau Plan,” for punitive measures (including “unconditional surrender” and the destruction of Germany’s industrial base) as the war wound down.  White and his associates even discussed shooting large numbers of German soldiers as they surrendered—at Teheran Stalin said 50,000 Germans should be promptly shot when the Allies prevailed.  They also discussed conscripting Germans to work in Russia as “reparations,” and such was certainly done as the Russians prevailed in Eastern Europe.  Apparently “slave labor for Russia had to be sanctioned by the United States to keep from offending Moscow” (p. 190).  Sadly enough:  “there isn’t any doubt that forced-labor-as-reparations was approved at Yalta” (p. 191).  

“Stalin’s coup in Asia” was also facilitated by his agents working within the American government.  In his meetings with the Soviet dictator, FDR (Churchill, importantly, was not involved in these discussions) “agreed to a vast array of benefits for Moscow:  sanctioning Soviet control of Outer Mongolia, ceding to Russia the southern part of Sakhalin Island north of Japan and the Kurile chain that stretches between Japan and Russia, plus de facto control of seaports and railways in Manchuria, the main industrial zone and richest part of China” (p. 200).  FDR made such concessions despite the fact that Russia had done nothing of consequence to assist in the war against Japan in the Pacific!  The Asia protocol secretly providing for all this was “written verbatim by the Russians.  The go-between in this was Averell Harriman” (p. 207), a wealthy businessman who had promoted American investments in Russia and was quite close to Harry Hopkins.  In the authors’ opinion, FDR embraced “postwar plans crafted for him by Harry White and other Soviet secret agents” (p. 208) that ultimately placed China under Stalin’s control.  

As the authors conclude their treatise they lament the fact that many documents dealing with the Cold War are still inaccessible.  They’ve just dropped into what George Orwell labeled the “memory hole” so characteristic of modern bureaucratic states.  FBI files dealing with communist activities within the U.S. government “are still heavily ‘redacted,’ with page after page of information blacked out by official censors” (p. 249).  Clearly “there has been a deliberate cover-up of crucial information reflecting the extent of the pro-Red penetration and the policy effects that followed” (p. 250).  Thus we have barely begun to understand the full thrust of “Stalin’s Secret Agents” and their role in shaping American history.  But now we know at least three things.  First, it’s demonstrable “that Communist and pro-Soviet penetration of the American government was extensive, involving many hundreds of suspects, and that by the era of World War II and early stages of the Cold War it reached up to significant levels” (p. 254).  Second, these infiltrators powerfully shaped American foreign policy in pro-Soviet ways.  And third, this “occurred because Soviet agents preyed on the credulity of officials who were ignorant of Communist methods and apparently had no interest in learning” (p. 255).  

“The net effect of these converging factors was a series of free-world retreats, as pro-Communist forces triumphed in a host of European countries during the earliest stages of the Cold War, followed by the fall of China to Communism a few years later.  These events would be a prelude to Marxist conquest elsewhere, in places as disparate as Indochina; the Latin American states of Cuba and Nicaragua; African nations, including Zimbabwe and Angola; and numbers of other cases of like nature” (p. 255).  In short, much of the massive human suffering during the past half-century can be traced back to the effective work in America of Stalin’s Secret Agents!  

* * * * * * * * * * * * * * * * 

For four decades Allen Weinstein, The Archivist of the United States from 2005 to 2009 and earlier a distinguished professor at several prestigious universities, has researched and written detailed studies of Alger Hiss.  In his most recent edition (the third) of Perjury:  The Hiss-Chambers Case (Stanford, CA:  Hoover Institution Press, c. 2013) he adds yet more damning evidence proving Alger Hiss’s Communist ties and activities.  When Weinstein began his research, 40 years ago, it was fashionable to dismiss Joseph McCarthy’s accusations as “Red baiting” and defend the integrity and patriotism of Alger Hiss, ever the golden boy for upper-crust socialites.  It was also fashionable to denigrate and defame Whittaker Chambers, the man who presented evidence that Hiss was, indeed, a Soviet spy.  Trendy liberals in the ‘50s “were well disposed to believe Hiss’s version of events.  His innocence was a matter of faith, if only because Chambers, [Richard] Nixon, [J. Edgar] Hoover and others on the anti-Communist right were his political enemies.  Hiss’s fate symbolized for young liberals the quintessence of McCarthyism, its paranoid fear of any public figure to the left of Dwight Eisenhower” (p. 3).  Senator McCarthy was indeed too heavy-handed and irresponsible in some of his charges, but he was clearly correct regarding Communist infiltration of the American government.  

As the decades have passed, it has become clear to any fair-minded analyst that Whittaker Chambers was right!  Evidence accruing from Russian archives and Soviet memoirs demonstrates “Hiss’s complicity as an agent” (p. 1).  “Decades after Alger Hiss had left government, new evidence would emerge from both Soviet and U.S. intelligence sources that reinforced the likelihood that he had maintained a link with Soviet Military Intelligence operatives beyond the 1930s and throughout World War II” (p. 386).  We now know, for example, that Noel Field (who with his wife Herta was a close friend of Alger and Priscilla Hiss), who sought asylum in Hungary after WWII, named Alger Hiss as a fellow Communist underground agent in the State Department during the mid-thirties:  “Field said that he had been involved [while at the State Department] and that ‘Hiss was the other one involved’ after he joined the department” (p. 218).  “Field freely conceded his prior involvement in espionage for the Soviet Union in 1954 statements made to Hungarian State Security officers,” admitting that:  “‘From 1927 gradually I started to live an illegal life completely separate from my official life . . . [committing] espionage for the Soviet intelligence service’” (p. 219).  Elaborating, Field told the Hungarian interrogators that Hiss “had also been active in working for the Soviets during this period:  ‘[I]n Fall 1935 Hiss at one point called me to undertake espionage for the Soviet Union. . . .  I informed him that I was already doing such work’” (p. 219).  

Weinstein begins his presentation with events in 1948, when Elizabeth Bentley testified before the House Un-American Activities Committee.  Bentley, labeled the “Red Spy Queen” by the press, said she had frequently delivered documents from American officials to Russian agents.  She named names—Lauchlin Currie, one of FDR’s chief aides, and Harry Dexter White, former assistant secretary of the Treasury, “chief architect of the World Bank and, after 1946, director of the International Monetary Fund” (p. 14).  In addition to Bentley, Whittaker Chambers appeared before the Committee, delivering still more distressing details, especially regarding Alger Hiss, a graduate of Harvard Law School and law clerk for Supreme Court Justice Oliver Wendell Holmes, well-known to FDR insiders such as Dean Acheson (who later became President Truman’s Secretary of State).  Hiss could hardly have occupied any higher standing in society, and his demeanor demonstrated an aristocratic character.  As one of the young idealists eager to advance the New Deal, Hiss had joined the State Department in 1936, attended the Yalta Conference, and helped establish the United Nations.  He was clearly the protégé of Secretaries of State Stettinius and Acheson.  When brought before the HUAC he absolutely denied any truth to Chambers’ accusations—a denial that would persist for the rest of his life and be largely believed by the progressive establishment.  

In time, however, it became clear that Chambers spoke truthfully and that Alger Hiss was significantly involved in the Communist movement.  Indeed much of what Chambers subsequently wrote in his classic memoir Witness, a book eminently worth reading, is confirmed by Weinstein.  In its final report, “HUAC left little doubt as to credibility:  ‘The verifiable portions of Chambers’s testimony have stood up strongly; the verifiable portions of Hiss’s testimony have been badly shaken’” (p. 68).  To fill in the historical picture, Weinstein provides extensive and insightful biographical portraits of both Hiss and Chambers, including considerable evidence that the men and their wives were indeed “friends” for several years.  The contrast between the two men could hardly be more vivid.  What the two shared was a commitment to Communism—a “Soviet America” according to Chambers.  

With painstaking attention to detail, Weinstein shows how Hiss was deeply involved in spy ring activities during the ‘30s and, when interrogated by the FBI in 1948 and testifying before grand juries and congressional committees, consistently lied.  He perjured himself!  In time he would be indicted, tried, convicted, and sentenced to five years in prison for perjury.  Despite the evidence, Hiss continually declared himself innocent of any wrongdoing, and he was supported by legions of friends and political allies.  Dean Acheson, for example, said “‘that whatever the outcome of any appeal which Mr. Hiss or his lawyers may take in this case I do not intend to turn my back on Alger Hiss’” (p. 529).  Thenceforth significant sectors of the Democrat Party, the media, and academia refused to turn their backs on Alger Hiss.  Indeed many of them made defending him something of a sacred cause.  As a Columbia University philosophy professor declared:  “‘Even if Hiss himself were to confess his guilt, I wouldn’t believe it’” (p. 540).  Once released from prison, he cultivated “a new clientele in the 1960s and 1970s among college audiences, faculty and students, both in this country and in England.  He became a steady, if unspectacular, fixture on the university lecture circuit, and with each brief burst of renewed interest in the case he reiterated his polite but firm claim of innocence” (p. 556).  Despite his criminal record, he gained re-admission to the bar in Massachusetts.  “Revisionist” scholars, riled by the Vietnam War, found in Hiss an icon for their anti-anti-Communist fervor, and (as a supposedly innocent victim of McCarthy-style persecution) he enjoyed a significant rehabilitation in such quarters.  He died still protesting his innocence. 

Whittaker Chambers, on the other hand, became something of a celebrity in conservative political circles.  William F. Buckley and his National Review welcomed Chambers’s essays.  Unlike “the relaxed and amiable Hiss, Whittaker Chambers ‘bore witness’ to his version of events after the trials with a mixture of public discomfort and private despair” (p. 558).  His 800-page Witness became an instant best-seller and a Book-of-the-Month Club selection for May 1952, quickly meeting his pressing financial needs.  It also elicited praise from “leading European ex-Communists like Arthur Koestler and Andre Malraux, the latter writing:  ‘You have not come back from hell with empty hands’” (p. 561).  He died in 1961.  

As a diligent historian, Weinstein consults all available documents, knows intimately the secondary literature regarding the case, and makes considered judgments regarding Hiss’s activities.  Admittedly, only readers truly interested in the case will have the patience to peruse the entire text, with its almost hour-by-hour accounting of the facts, but they cannot but be impressed by the diligence with which the author has accumulated and presented his material.  

255 “The Experience of God”

One of my best discoveries in graduate school—learned quite apart from my course work—was the intellectual depth of the Christian tradition.  Judged purely by their acuity, Christian thinkers such as Augustine and Aquinas, Jacques Maritain and C.S. Lewis, were frequently superior to their secular counterparts.  Thus, while I myself lack the mental capacity necessary to defend or nourish the Christian worldview in elite circles, I can rely on those who have done so most effectively.  So I commend David Bentley Hart, for he is one of the more powerful contemporary theologians whose works reward careful reading.  Subscribers to First Things (for years my most favored periodical) recognize him as a regular contributor as well as the monthly essayist on the “Back Page.”  An easy introduction to his thought is available in a collection of 21 of his magazine essays—In the Aftermath:  Provocations and Laments (Grand Rapids:  William B. Eerdmans Publishing Company, c. 2009), wherein he addresses topics as varied as “Evelyn Waugh’s Travel Writings,” “The Pornography Culture,” and “Tsunami and Theology.”  In such essays, written for the general reader, he seeks to entertain as well as instruct.  

An Eastern Orthodox theologian, Hart has taught at places such as the University of Virginia and Providence College.  His most recent publication, The Experience of God:  Being, Consciousness, Bliss (New Haven:  Yale University Press, c. 2013), seeks to set forth persuasive reasons for adhering to the classic Christian convictions regarding the reality of God.  He writes with full awareness of what seems to be a resurgence of naturalistic atheism in “enlightened” circles aligned with Richard Dawkins and Christopher Hitchens, noting disdainfully that their “texts are manifestoes, buoyantly coarse and intentionally simplistic, meant to fortify true unbelievers in their unbelief; their appeal is broad but certainly not deep; they are supposed to induce a mood, not encourage deep reflection; and at the end of the day they are probably only a passing fad in trade publishing, directed at a new niche market” (p. 5).  

Hart simply does “not regard true philosophical atheism as an intellectually valid or even cogent position; in fact, I see it as a fundamentally irrational view of reality, which can be sustained only by a tragic absence of curiosity or a fervently resolute will to believe the absurd.  More simply, I am convinced that the case for belief in God is inductively so much stronger than the case for unbelief that true philosophical atheism must be regarded as a superstition, often nurtured by an infantile wish to live in a world proportionate to one’s own hopes or conceptual limitations” (p. 16).  In general, today’s celebrated atheists are people who “appear to know almost nothing about the religious beliefs they abominate, apart from a few vague and gauzily impressionistic daubs or aquarelle washes, and who seem to have no real sense of what the experience of faith is like or of what its rationales might be.  For the most part, they seem not even to know that they do not know” (p. 20).  Still more:  quite often “those who make the most theatrical display of demanding ‘proof’ of God are also those least willing to undertake the specific kinds of mental and spiritual discipline that all the great religious traditions say are required to find God” (p. 327).  

Hart’s disdain for atheism’s intellectual vacuity is set forth in his 2009 treatise, Atheist Delusions:  The Christian Revolution and Its Fashionable Enemies (New Haven:  Yale University Press), wherein he contends that history’s zenith was reached when Christian theology, shaped by classical Church Fathers and Medieval Schoolmen, best sounded the depths of Reality and crafted Western Civilization.  An unapologetic apologist (much like C.S. Lewis) for pre-modern thought, he believes there was a “peculiar and radical nature” in the Christian faith that transformed the world, liberating it “from fatalism, cosmic despair, and the terror of occult agencies” and creating “a new conception of the world, of history, of human nature, of time, and of the moral good.”  Compared with the grandeur of the Christian worldview, modernity—most clearly defined by its atheism—is rather like a parasite destroying the goodness, truth, and beauty most needed by our species.  

Still, the “deep reflection” absent in atheism is what’s needed when thinking about God, and Hart pitches his text at the highest level of philosophical thought.  Indeed, he believes that “There are, in fact, truths of reason that are far surer than even the most amply supported findings of empirical science because such truths are not, as those findings must always be, susceptible of later theoretical revision” (p. 71).  And such “truths of reason” have been plumbed profitably by great theologians.  Deeply immersed in Church Fathers such as Athanasius and Basil, Gregory of Nyssa and Dionysius the Areopagite, Augustine and Aquinas, Hart makes no new claims for his presentation.  Rather, he purports to set forth “a faithful digest of the primary claims made about the nature of God” by the true masters of theology.  “Far from being some weak, etiolated remnant of the more robust flora of the age of faith, it is the strongest and most comprehensive set of claims about God that it is possible to make.  There is no note of desperation or diffidence in this language; it forthrightly and unhesitatingly describes a God who is the infinite fullness of being, omnipotent, omnipresent, and omniscient, from whom all things come and upon whom all things depend for every moment of their existence, without whom nothing at all could exist” (p. 7).  

“Deep reflection” should nurture wisdom, which Hart defines as “the recovery of innocence at the far end of experience; it is the ability to see again what most of us have forgotten how to see, but now fortified by the ability to translate some of that vision into words, however inadequate” (p. 9).  “God is not only the ultimate reality that the intellect and the will seek but is also the primordial reality with which all of us are always engaged in every moment of existence and consciousness, apart from which we have no experience of anything whatsoever.  Or, to borrow the language of Augustine, God is not only superior summo meo—beyond my utmost heights—but is also interior intimo meo—more inward to me than my inmost depths.  Only when one understands what such a claim means does one know what the word ‘God’ really means, and whether it is reasonable to think that there is a reality to which that word refers and in which we should believe” (p. 10).  

This leads Hart to propose “being, consciousness, and bliss” as keys to knowing ultimate reality.  To simply wonder at their sheer reality prompts the experience of God.  To ponder the indubitable certainty of my own being, immersed in a welter of other beings, cannot but provide “a genuine if tantalizing brief glimpse into an inexhaustibly profound truth about reality.  It is the recognition, simply said, of the world’s absolute contingency.  The world need not be thus.  It need not be at all.  If, moreover, one takes the time to reflect upon this contingency carefully enough, one will come to realize that it is an ontological, not merely an aetiological, mystery; the question of existence is not one concerning the physical origins of things, or of how one physical state may have been produced by a prior physical state, or of physical persistence across time, or of the physical constituents of the universe, but one of simply logical or conceptual possibility:  How is it that any reality so obviously fortuitous—so lacking in any mark of inherent necessity or explanatory self-sufficiency—can exist at all?” (p. 90).  

To know what a being is—its essence—involves inductive and deductive thinking; a specific tree or a species of trees may be analyzed and categorized with precision.  But to know that a being is—its existence—demands an entirely different kind of thinking, metaphysical thinking, and “an old and particularly sound metaphysical maxim says that between existence and nonexistence there is an infinite qualitative difference.  It is a difference that no merely quantitative calculation of processes or forces or laws can ever overcome” (p. 95).  Before we even begin to ask what a thing is we have already acknowledged that it is.  Why it is, no physicist can say!  Only the metaphysician or theologian, peering through the phenomenal world as an icon, can catch a glimpse of “some truly unconditioned reality (which, by definition, cannot be temporal or spatial or in any sense finite) upon which all else depends; otherwise nothing could exist at all.  And it is this unconditioned and eternally sustaining source of being that classical metaphysics, East and West, identifies as God” (p. 106).  

“And God, therefore, is the creator of all things not as the first temporal agent in cosmic history (which would make him not the prime cause of creation but only the initial secondary cause within it), but as the eternal reality in which ‘all things live, and move, and have their being,’ present in all things as the actuality of all actualities, transcendent of all things as the changeless source from which all actuality flows.  It is only when one properly understands this distinction that one can also understand what the contingency of created things might tell us about who and what God is” (p. 107).  Given this understanding, we can develop reasonable positions regarding God’s simplicity, infinity, omnipotence, omniscience, omnipresence, freedom, impassibility, etc.  With vast erudition and verbal dexterity Hart moves throughout the history of philosophy and theology, interacts with the most trenchant of postmodern thinkers, and illustrates the perennial power of ontology, the study of being.

We not only know that we are—and that other things are—but we know that we know!  Regarding the reality of human consciousness, Hart says:  “No less wonderful than the being of things is our consciousness of them:  our ability to know the world, to possess a continuous subjective awareness of reality, to mirror the unity of being in the unity of private cognizance, to contemplate the world and ourselves, to assume each moment of experience into a fuller comprehension of the whole, and to relate ourselves to the world through acts of judgment and will” (p. 152).  “Being is transparent to mind; mind is transparent to being; each is ‘fitted’ to the other, open to the other, at once containing and contained by the other.  Each is the mysterious glass in which the other shines, revealed not in itself but only in reflecting and being reflected by the other” (p. 152).  

Neuroscientists running brain-scans, determined to reduce such self-awareness to matter-in-motion, are as blinded by their descriptive calculations as naturalistic physicists, shackled to their laboratory devices, who deny metaphysical realities.  When I’m told that my thoughts are merely neurons firing in my brain and I compare such diagrams with my rich inner world (filled with memories of childhood and thoughts about building projects and concerns about loved ones and debates regarding decisions), I cannot but question the wisdom of “brain science.”  For I’m intensely aware of qualities (such as the colors of Colorado aspen trees in the fall), of abstract ideas (such as geometric circles or distributive justice), of my reasoning capacities (logic), and of mental intentions (freely turning my thoughts to assorted items), and I cannot believe they are merely physical perturbations.  Indeed, “considered solely within the conceptual paradigms we have inherited from the mechanical philosophy, it is something of a conundrum that such a thing as consciousness should be possible for material beings at all” (p. 154).  

This modern, mechanistic vision, solidified by the theories of Newton and Darwin, had been presciently rejected by most of the great philosophers Hart defends, for “neither Platonists, nor Aristotelians, nor Stoics, nor any of the Christian metaphysicians of late antiquity or the Middle Ages could have conceived of matter as something independent of ‘spirit,’ or of spirit as something simply superadded to matter in living beings.  Certainly none of them thought of either the body or the cosmos as a machine merely organized by a rational force from beyond itself.  Rather, they saw matter as being always already informed by indwelling rational causes, and thus open to—and in fact directed toward—mind.  Nor did Platonists or Aristotelians or Christians conceive of spirit as being immaterial in a purely privative sense, in the way that a vacuum is not aerial or a vapor is not a solid.  If anything, they understood spirit as being more substantial, more actual, more ‘supereminently’ real than matter, and as in fact being the pervasive reality in which matter had to participate in order to be anything at all.  The quandary produced by early modern dualism—the notorious ‘interaction problem’ of how an immaterial reality could have an effect upon a purely material thing—was no quandary at all, because no school conceived of the interaction between soul and body as a purely extrinsic physical alliance between two disparate kinds of substance.  The material order is only, it was assumed, an ontologically diminished or constricted effect of the fuller actuality of the spiritual order” (p. 168).  Thus my soul, as Thomas Aquinas said, is the form of my body, the true substance of my being.   

As is true of the mystery of being, Hart argues, the mysterious reality of consciousness leads directly to the mysterious reality of God, who “is in himself the absolute unity of consciousness and being, and so in the realm of contingent things is the source of the fittedness of consciousness and being each to the other, the one ontological reality of reason as it exists both in thought and in the structure of the universe” (p. 235).  Still more, He is the “bliss” present in our moments of ecstasy and our longing for endless joy.  We desire many things, but above all we desire, as Aristotle wisely said, lasting happiness.  We surely desire to know “truth,” and we desire it because there is a delight in discovering it.  Knowing it, we “transcend” the flux of finite things and experience a bit of a higher reality.  We also desire moral goodness and make decisions in accord with our understanding of it, for “the good is an eternal reality, a transcendental truth that is ultimately identical with the very essence of God” (p. 253).  “Simply said, if there were not God, neither would there be such a thing as moral truth, nor such a thing as good or evil, nor such a thing as a moral imperative of any kind.  This is so obviously true that the need to argue the point is itself evidence of how inextirpable our hunger for a transcendent moral truth is” (p. 256).  

Finally, punctuating our experience with openings to the reality of God, there is the blissful apprehension of beauty, the finest of all avenues to the Author and Finisher of our faith.  To Hart, beauty more than anything else grants us access to the immediate presence of God in His world, for through the beautiful “we are granted our most acute, most lucid, and most splendid encounter with the difference of transcendent being from the realm of finite beings.  The beautiful affords us our most perfect experience of that existential wonder that is the beginning of all speculative wisdom” (p. 283).  This thesis was articulated by Hart a decade ago in his first (and in many ways most intellectually daunting) major publication, The Beauty of the Infinite:  The Aesthetics of Christian Truth (Grand Rapids:  William B. Eerdmans Publishing Company, c. 2003).  Ultimately, he argues, the Christian message appeals to us more as a writer’s story than a lawyer’s brief; God reveals himself in history and episodes and biographies more clearly than in propositions and deductions.  He seeks “to defend a theological reappropriation of what I have called the ‘covenant of light’—a trust in the evidence of the given, an understanding of knowledge as an effect of the eros stirred by the gift of the world’s truth” (Loc #2294 in Kindle edition).  

“What Christian thought offers the world is not a set of ‘rational’ arguments that (suppressing certain of their premises) force assent from others by leaving them, like the interlocutors of Socrates, at a loss for words; rather, it stands before the world principally with the story it tells concerning God and creation, the form of Christ, the loveliness of the practice of Christian charity—and the rhetorical richness of its idiom.  Making its appeal first to the eye and heart, as the only way it may ‘command’ assent, the church cannot separate truth from rhetoric, or from beauty” (#107).  “Phrased otherwise, the truth of being is ‘poetic’ before it is ‘rational’—indeed is rational precisely as a result of its supreme poetic coherence and richness of detail—and cannot be truly known if this order is reversed.  Beauty is the beginning and end of all true knowledge: really to know anything, one must first love, and having known one must finally delight; only this ‘corresponds’ to the Trinitarian love and delight that creates” (#2069).

To lay a foundation for his presentation, Hart devotes a lengthy section of his book (one-third of it) to “postmodern” thinkers such as Foucault and Derrida as well as formative philosophers such as Kant, Nietzsche, and Heidegger.  Only the most diligent of academicians can profit from this discussion—but it clearly reveals Hart’s erudition and dazzling rhetorical prowess.  That done, he moves to his central task:  celebrating “the beauty of the infinite” disclosed in the Trinitarian theology of classic Christian thought, for “the Christian understanding of beauty emerges not only naturally, but necessarily, from the Christian understanding of God as a perichoresis of love, a dynamic coinherence of the three divine persons, whose life is eternally one of shared regard, delight, fellowship, feasting, and joy” (#2427).  

This truth stands profoundly revealed at the inception of Jesus’ ministry, when He was baptized by John in the Jordan River.  In this theophany the Father, Son, and Spirit joined in celebrating their communal mission on earth, for “the descent of the dove at Christ’s baptism reveals that every act of God, as Basil says, ‘is inaugurated by the Father, effected by the Son, and perfected by the Holy Spirit’ (De Spiritu Sancto 16.38), it reveals also that God’s love is always entirely sufficient in itself” (#2740).  Thus America’s greatest theologian, Jonathan Edwards, “calls the Spirit the beautifier, the one in whom the happiness of God overflows and is perfected precisely as overflowing, and so the one who bestows radiance, shape, clarity, and enticing splendor upon what God creates and embraces in the superabundance of his love.  And this beauty is the form of all creaturely truth; . . . .  Delight in beauty ‘corresponds’; joy in beauty, when it is truly joy, reflects the way in which God utters himself, and utters creation, in the Spirit’s light; joy repeats, in some sense, the gesture that gives being to beings, and alone grants knowledge of being as original peace” (#2783).

Thus we experience God immediately as we open ourselves to beauty in all creation, whether it be the soaring Alps or the intricate, contrapuntal compositions of Bach (who, Hart asserts, is the greatest of all theologians!).  Thus Gregory of Nyssa “likens the soul partaking of divine blessings to a vessel endlessly expanding as it receives what flows into it inexhaustibly; participation in the good, he says, makes the participant ever more capacious and receptive of beauty, for it is a growth into the goods of which God is the fount; so no limit can be set, either to what the soul pursues or to the soul’s ascent” (#3053).  And this involves what patristic theologians routinely called “deification,” for “‘when the bridegroom calls to the soul,’ writes Gregory, ‘she is refashioned into the yet more divine and, by a beneficent change, changed from her glory to one still more exalted’” (#3131).  Inasmuch as we see all that is as a gift, graciously given us by a loving LORD, and insofar as we rejoice at the sheer beauty of these gifts, we discover the divine design orienting us to eternal life through Christ the Lord. 

254 Disinformation

For two centuries the world has been shaped by currents unleashed in the French Revolution, the fountain of a “heinous iniquity” (according to Erik von Kuehnelt-Leddihn), “historically the mother of most of the ideological evils besetting civilization, not only of the West but of the entire world.”  A utopian ideology (generally identified as Leftist) aspiring to societal transformation has variously informed movements ranging from Jacobins in France to Bolsheviks in Russia to Maoists in China to Sandinistas in Nicaragua.  This phenomenon was brilliantly analyzed years ago in Kuehnelt-Leddihn’s Leftism Revisited:  From de Sade and Marx to Hitler and Pol Pot, wherein he showed how the past 200 years vividly illustrate the folly of the French Revolution and its slogan, “liberty, equality, fraternity.”  As an eyewitness to those events, the poet Goethe declared, “‘Legislators and revolutionaries who promise equality and liberty at the same time are either psychopaths or mountebanks’” (p. 9).  

You just can’t enact or impose both liberty and equality, so by and large revolutionaries promise liberty and then strangle it in order to implement the economic equality demanded by the masses.  As Kuehnelt-Leddihn says, we humans by nature need liberty.  Created in the image of God, who alone is truly free, we need the freedom to be the spiritual beings we’re created to be.  Every man wants to be . . . and needs to be . . . free!  Leftists, however, consider human beings basically physical creatures with purely material needs.  “Leftism is basically materialistic.”  Thus attaining equality, dividing up the economic pie fairly, becomes the goal.  If it’s necessary to sacrifice individual liberty to attain economic equality, the die is cast in favor of equality.  

Equalitarian regimes, i.e. communist dictatorships, flourished and fell in the 20th century, repeating what seems to be an inevitable pattern.  The connections drawn between the French Revolution’s Marquis de Sade and Hitler’s Nazi nihilism and the Bolshevik brutality of Joseph Stalin may be tenuous, but common principles and tactics make them at least distant bedfellows!  Along with imposing their will on populations through violent assaults on established powers, Leftists have resolutely sought to manipulate public opinion and political policy through propaganda.  This was spectacularly evident in the aftermath of WWII when the USSR successfully extended its rule over Eastern Europe, as David Martin meticulously documents in The Web of Disinformation:  Churchill’s Yugoslav Blunder (New York:  Harcourt Brace Jovanovich, Publishers, c. 1990).  

As WWII began many European nations quickly collapsed and submitted to Nazi control, but others (especially in Southeast Europe) nurtured substantial guerilla resistance to occupation.  Leading the battle in Yugoslavia was General Draza Mihailovic, a valiant patriot supported by large portions of his countrymen and celebrated early on throughout the Anglo-American world as “an international symbol of resistance to Nazi tyranny.  The readers of Time magazine [in 1941] voted him ‘Man of the year’” (p. 29).  He “and his followers wanted an independent and democratic peasant Yugoslavia” (p. 57), and “the thirty-odd British officers who served with Mihailovic at various times from October 1941 to June 1944, as well as the eight American officers who served with him . . . from August 1943 to 1944, were convinced from their own experience that the Mihailovic forces represented a genuine and vitally important resistance movement” (p. xxii).  He fully enjoyed the support of Winston Churchill and the Allies, who saw him as a key to weakening Hitler’s power in that area.  

In the early days of the war the Soviets also supported Mihailovic, but they turned against him when a committed Communist, Tito, emerged as his rival.  Aligned with the USSR rather than Britain, Tito was clearly more interested in establishing Communism than defeating Nazism and fought Mihailovic more than Hitler.  For Tito and Stalin to succeed in extending Soviet hegemony it was imperative for them to discredit and undermine Mihailovic—primarily by persuading Prime Minister Winston Churchill to abandon him.  This was effectively accomplished by the end of December, 1943, with the assistance of a Soviet agent, James Klugman, who worked in Britain’s Cairo office during the war and skillfully falsified the information reaching Churchill.  Klugman himself was probably the “most brilliant” of the Cambridge University circle in the 1930s that birthed “such notorious Soviet agents as Kim Philby, Guy Burgess, Donald Maclean, and Anthony Blunt” (xvii).    Extensive research, including over 100 interviews, persuaded Martin that Klugman—responsible for summarizing reports of British officers in Yugoslavia—cleverly crafted disinformation designed to establish Communism in Yugoslavia.  

Consequently, as Soviet troops moved into Yugoslavia in the final days of WWII, they joined Tito in attacking Mihailovic’s supporters as well as the Nazis.  Churchill and Roosevelt, as was evident at the Yalta Conference, failed to oppose Stalin’s designs on Eastern Europe.  A vital section of that region fell quickly behind Stalin’s Iron Curtain—though in time Tito would break away from the USSR in order to orchestrate his own version of the Communist creed.  But had Mihailovic been rightly supported by Britain and America, Martin argues, that region might have been spared the trauma that ensued.  And that support failed to materialize, as The Web of Disinformation makes clear, because a well-placed secret agent, James Klugman, effectively shaped British opinion.  

* * * * * * * * * * * * * * * * * * * * 

One of the greatest 20th century leaders, Czechoslovakia’s Vaclav Havel, lamented the “communist culture of the lie” that had ravaged his nation under Soviet rule.  Such should have been expected, of course, for Vladimir Lenin had openly declared that the only “morality” he and his followers embraced was whatever advanced the cause of socialism; thus lying as well as killing and stealing were easily embraced inasmuch as they helped promote the dictatorship of the proletariat.  In its Chinese version, Mao Zedong routinely declared that a lie repeated a hundred times becomes the truth, and even the Russian word Glasnost, adroitly employed by Mikhail Gorbachev in the ‘80s and warmly celebrated in the West, “really means lying, and lying is the first step toward stealing and killing” (p. 17).  

In Disinformation:  Former Spy Chief Reveals Secret Strategies for Undermining Freedom, Attacking Religion, and Promoting Terror (Washington, D.C.:  WND Books, c. 2013), Lt. Gen. Ion Mihai Pacepa and Prof. Ronald J. Rychlak make clear the devious designs and strategies implemented by Communists around the world during the past century.  For many years, before defecting to the United States in the late ‘70s, Pacepa headed Romania’s intelligence agency, the DIE.  Rychlak is a distinguished scholar, well-versed in history and law, particularly well-known for his work on Pope Pius XII.  In his helpful Introduction to the book, R. James Woolsey, former Director of the CIA, says:  “This remarkable book will change the way you look at intelligence, foreign affairs, the press, and much else besides.  Lt. Gen. Ion Mihai Pacepa is the highest-ranking defector we have ever had from a hostile intelligence service.  As chief of the Romanian intelligence he was for many years in the key meetings with heads of state and a participant in some of the most sensitive discussions by our enemies during the Cold War” (#96 in Kindle).  

Intelligence operatives primarily sought to “frame” information by infiltrating organizations (such as the World Council of Churches), altering records, rewriting history, and manipulating media to advance their cause.  They especially sought to discredit political and religious leaders as well as regimes that opposed their endeavors.  Thus, the authors explain, their book’s title, Disinformation, focuses on what “has been the Kremlin’s most effective weapon in its war on the West, especially on Western religion.  Iosif Stalin invented this secret ‘science,’ giving it a French-sounding name and pretending it was a dirty Western practice.  As this book will show, the Kremlin has secretly, and successfully, calumniated leading Roman Catholic prelates, culminating in Pope Pius XII; it almost succeeded in assassinating Pope John Paul II; it invented liberation theology, a Marxist doctrine that turned many European and Latin American Catholics against the Vatican and the United States; it has promoted anti-Semitism and international terrorism; and it has inspired anti-American uprisings in the Islamic world” (#290).  

As a youngster growing up in Romania, Pacepa dreamed of going to America—some of his relatives lived in what was then a most prosperous Detroit!  But when the Soviets occupied his country at the close of WWII that dream faded; he was drafted into the Securitate, the intelligence department, and ultimately became its head officer, working for the tyrannical Nicolae Ceausescu, who “was more or less a Romanian version of the current Russian president, Vladimir Putin—an empty suit who morphed into his country’s president without having held any productive job, who knew nothing about how the real world worked, and who believed that lying to the world and killing off his critics were the magic wands that would keep him in power” (p. 9).  In that position, Pacepa says, and “because Romania was a relatively small country, I believe that I, as its top intelligence officer, very possibly had a clearer picture of how the Kremlin and its dezinformatsiya really functioned than perhaps all but the very innermost Soviet inner circle” (p. 6).  “Change the public image of the leader, and you change history, I heard over and over from Khrushchev’s lips” (p. 178).  

Importantly, disinformation is not misinformation!  Whereas misinformation comes directly from government spokesmen and is easily detected, disinformation “is a secret intelligence tool, intended to bestow a Western, nongovernment cachet on government lies” (p. 30).  Prominent Western intellectuals, such as Jean-Paul Sartre, were recruited by the KGB to promote the Soviet agenda; thus Sartre, for example, “vilified the United States as a racist country suffering from political rabies” (p. 34).  Others—such as the Italian writer Carlo Falconi (who relied on communist forgeries to write The Silence of Pius XII), the German playwright Rolf Hochhuth (whose 1963 play The Deputy popularized a malign image of the Pope), and John Cornwell (whose 1999 Hitler’s Pope:  The Secret History of Pius XII revived the slurs)—were used to attack the Catholic Church, particularly by promoting the slander of Pius XII as “Hitler’s Pope.”  These and other publications disregarded all the evidence, alleging that the Pope had nefariously supported the Nazis when in fact he had done much to resist and ultimately defeat them!  To rectify the record, Pacepa devotes several well-documented chapters to demonstrate the truth about Pius XII’s heroic efforts to assist the Jews and defeat the Nazis.

And Pope Pius XII was only one of many Western clergymen subject to Soviet disinformation, for a “global war on religion” was part and parcel of the Communists’ strategy throughout the Cold War.  They worked especially hard to get leftists in the West to organize “peace movements” and promote anti-Semitism.  And, when Latin and South America seemed impervious to Castro-style violent revolutions, “the KGB was able to maneuver a group of leftist South American bishops into holding a conference in Medellin, Colombia.  At the KGB’s request, my DIE provided logistical assistance to the organizers.  The official task of the conference was to help eliminate poverty in Latin America.  Its undeclared goal was to legitimize a KGB-created religious movement dubbed ‘liberation theology,’ the secret task of which was to incite Latin America’s poor to rebel against the ‘institutionalized violence of poverty’ generated by the United States” (p. 101).  

Though Pope John Paul II, who knew a great deal about Communism, repudiated it, “Liberation Theology” enjoyed much support in elite seminaries, denominational headquarters, and ecumenical organizations such as the National Council of Churches and World Council of Churches.  So easily manipulated were these groups that the KGB in 1983 sent 47 “agents to attend the WCC General Assembly in Vancouver, and the following year the KGB took credit for using its agents on the WCC selection committee to arrange for the right man to be elected WCC general secretary” (p. 102).  Simultaneously, “a black version of liberation theology began growing in a few radical-leftist black churches in the United States.  Black liberation theologians James Cone, Cornel West, and Dwight Hopkins have explicitly stated their preference for Marxism because Marxist thought is predicated on a system of oppressor class (whites) versus victim class (blacks), and it sees just one solution:  the destruction of the enemy” (p. 103).  Importantly:  James Cone mightily influenced Jeremiah Wright, the Chicago pastor and (for 20 years) religious mentor of Barack Obama.  

Disinformation in America was long promoted by I. F. Stone, a prominent, influential journalist who was (as recently published KGB documents reveal) a paid Soviet spy who wrote “articles on subjects recommended to him by Moscow” to distribute dezinformatsiya.  Thus he praised and promoted The Deputy (the maliciously false portrait of Pius XII) when it was brought to the American stage.  “M.S. (Max) Arnoni . . . onetime editor of the Encyclopedia Britannica, and publisher of A Minority of One, a highbrow magazine for the liberal American elite, also jumped in to promote The Deputy.  According to former KGB general Oleg Kalugin, now an American citizen, Arnoni received money from the KGB for promoting the Soviet line in the American media” (p. 139).  Much the same can be said about leftist magazines such as Mother Jones and think-tanks such as the Institute for Policy Studies.  

Such disinforming writers and organizations have effectively subverted American policies and incubated anti-Americanism around the globe.  Postmodern intellectuals, notably Jacques Derrida, allegedly abandoned Marxism but still justified “the Islamic war against the United States” and called for “a ‘new Internationale’ to unite all the environmentalists, feminists, gays, aboriginals, and other ‘dispossessed and marginalized’ people who were combating American-led globalization” (p. 288).  Pacepa provides insights into such things as the alleged “missile gap” John F. Kennedy claimed while running for President in 1960, Lee Harvey Oswald’s Soviet ties (documented in Pacepa’s Programmed to Kill:  Lee Harvey Oswald, the Soviet KGB, and the Kennedy Assassination), and John Kerry’s outrageous accusations during the Vietnam War.  And though the old USSR has collapsed, a KGB empire survives under the direction of Vladimir Putin; Communist ideology remains and primarily works these days through anti-Israeli and pro-Islamic operatives.  For instance:  Pacepa personally observed how the KGB shrewdly transformed “an Egyptian-born Marxist . . . into a Palestinian-born Yasser Arafat” (p. 284).  “Documents in the Mitrokhin Archive describe Arafat’s close collaboration with my Romanian DIE and with the KGB in the early 1970s.  Other documents disclose the KGB’s secret training provided to Arafat’s guerillas, and reveal the super secret channels used by the KGB to provide arms shipments to the PLO” (p. 323).  

Sadly, Pacepa believes, as a result of disinformation “the ghost of Marx lives on” in the United States.  Given what he witnessed in his own native land after WWII, he finds developments in his adopted homeland, America, alarming.  “After forty-five years of Cold War, and still more years of war in Iraq and Afghanistan, millions of young Americans, unaware of history or unable to learn from it, have come to believe that capitalism is their real enemy, and that it should be replaced with socialism.  They found a home in the Democratic Party, whose primary 2008 election theme was the promise to redistribute America’s wealth” (p. 306).  Barack Obama’s central theme in 2008, as all of us should recall, was “change.”  Voters intoxicated with the message rarely realized, as Pacepa says, that “the quintessence of Marxism is change, which is built on the dialectical materialist tenet that quantitative changes generate qualitative transformations.  Thus ‘change,’ through the redistribution of the country’s wealth, became the electoral slogan in all Soviet bloc countries” (p. 310).  

To Pacepa, Barack Obama’s 2008 campaign was “a major cause of déjà vu.  It felt as though I were watching a replay of one of those election campaigns of Ceausescu’s in which I was involved during my years in Romania.  Ceausescu’s media painted the Romania of his predecessor, Gheorghiu-Dej, as a decaying, corrupt, economically devastated country and demanded it be changed by redistributing the country’s wealth.  It was a disinformation campaign” (p. 309).  Still more:  “Just as Ceausescu loved to remind everyone that someone as great as he ‘is born once every five hundred years,’ so did Sen. Barack Obama portentously proclaim, ‘We are the ones we have been waiting for,’ artfully substituting the regal ‘we’ to convey his actual meaning:  I am the One you have been waiting for.  Meanwhile, Obama’s Houston campaign headquarters had a large poster of communist idol Che Guevara hanging on the wall” (p. 309).  The crowds thronging around Obama reminded Pacepa of “Ceausescu’s revival meetings—more than eighty thousand people were gathered in front of the now-famous Greek temple resembling the White House that had been erected in Denver, to demand that America’s wealth be redistributed.  It was a superb show of disinformation” (p. 311).   

Following Obama’s election, he and his appointees “began changing into a Ceausescu-style nomenklatura (in the Soviet bloc, the special elite class of people from which appointees for top-level government positions were drawn) with unchecked power.  This new nomenklatura started running the country secretly, just as Ceausescu’s nomenklatura did.  ‘We have to pass the bill so that you can find out what is in it,’ then-leader of the US House of Representatives nomenklatura, Nancy Pelosi, once told the media.  That was a first in American history.  It didn’t take long before this nomenklatura—this arrogant, new elite class—began to take control of banks, home mortgages, school loans, automakers and most of the healthcare industry” (p. 311).  Pacepa realizes that many may find his comparisons between Obama and Ceausescu quite dubious, but he points to some alarming similarities between them that should concern us.  

“This book,” Pacepa says as he concludes his treatise, “is another open letter, this time written jointly with Professor Rychlak (whose ancestors had immigrated from Poland) and addressed especially to our fellow Americans.  Let us reject the Marxist redistribution of wealth which has transformed so many once-noble countries into lands looking like giant trailer camps hit by a hurricane, with their leaders roasting in Dante’s Inferno.  Indeed, all Marxist redistributionists who have ever risen to lead a country have ended up in hell—all, from Trotsky to Stalin, Tito to Zhivkov, Enver Hoxha to Matyas Rakosi, Sekou Toure to Nyerere, Khrushchev to Ceausescu.  All had their days of temporary glory but all ended in eternal disgrace.  A few remnants, like the Castro brothers, are still hanging on, but they certainly have a place in hell reserved and waiting for them.

“Let us, once and for all, reject Marxism’s ‘science’ of disinformation, its glasnost, and its political necrophagy that has been used so destructively over the years to squash freedom and bankrupt countries.  Let us recognize them for what they are—and expose them with all our might—when such deceitful campaigns rear their ugly heads.  Let us return to our own American exceptionalism and its traditions of patriotism, honesty and fairness.  The United States of America is the greatest country on earth.  Let us keep it that way for future generations” (p. 350).  

253 What Darwin Got Wrong

Two distinguished professors of cognitive science—Jerry Fodor (Rutgers) and Massimo Piattelli-Palmarini (University of Arizona)—argue, in What Darwin Got Wrong (New York:  Farrar, Straus and Giroux, c. 2010), “that there is something wrong—quite possibly fatally wrong—with the theory of natural selection” (p. xiii).  The theory makes two claims:  1) natural selection is an observable process wherein “creatures with adaptive traits are selected”—i.e. survivors procreate; 2) natural selection is a guiding mechanism whereby “creatures are selected for their adaptive traits” (p. xv).  The first premise is historical—certain things happened; the second is philosophical—why these things happened.  To infer the second premise from the first is clearly illogical—what philosophers call an “intensional fallacy.”  But this is precisely what Neo-Darwinians do and thereby render the theory suspect.  As self-identified atheists, the authors firmly pledge allegiance to the philosophical naturalism their guild demands, but they do insist that clear thinking demands doubt regarding the “just so” Darwinian story.  

To build their case, Fodor and Piattelli-Palmarini make a rigorous evaluation of the evidence (especially the information in-forming genetic activity) now available and believe natural selection fails to explain it.  Rather than sheer randomness, there seem to be “laws of form” giving direction to (i.e. causing) biological formation.  But neither Darwin nor his modern epigones set forth a credible theory of causation, though they claim to do so under the rubric of “natural selection.”  Darwinians, as natural historians, trace what happened, often in the dim and distant past.  They tell us what apparently happened—but not what “had to happen,” which is “the domain of theory, not of history; and there isn’t any theory of evolution” (p. 152).  “Natural history isn’t a theory of evolution; it’s a bundle of evolutionary scenarios.  That’s why the explanations it offers are so often post hoc and unsystematic” (p. 159).  Along with Marx, Darwin imagined he could extract scientific theory from history; both men were grievously mistaken.  

“‘OK; so if Darwin got it wrong,’” the authors write by way of summary, “‘what do you guys think is the mechanism of evolution?’  Short answer:  we don’t know what the mechanism of evolution is.  As far as we can make out, nobody knows exactly how phenotypes evolve.  We think that, quite possibly, they evolve in lots of different ways; perhaps there are as many distinct kinds of causal routes to the fixation of phenotypes as there are different kinds of natural histories of the creatures whose phenotypes they are” (p. 153).  Dogmatically insisting there is no God or Intelligent Designer or Mother Nature or any kind of supervising Mind, the authors simply leave unanswered the really important question:  why did all this occur?    But they do, at least, have the intellectual fortitude to point out why Natural Selection cannot be the answer.

* * * * * * * * * * * * * * * * * * 

Not long ago I heard the leader of an atheist movement in England elucidate why he is establishing fellowship centers to substitute for churches among folks who disbelieve in God.  Explaining his views, he said “we come from nothing and go to nothing,” so living as painlessly as possible here is life’s only goal.  To say we come from nothing, of course, violates one of the clearest logical principles, for obviously nothing could come from nothing.  That a reasonably articulate man could so cheerfully espouse nonsense illustrates one of the distressing marks of modernity:  the lack of philosophical perspicacity evident wherever scientism reigns.  As a healthy antidote, one of the 20th century’s greatest philosophers, Etienne Gilson, provides a valuable perspective on modern science in From Aristotle to Darwin and Back Again:  A Journey in Final Causality, Species, and Evolution (San Francisco:  Ignatius Press, 2009, a new translation of Gilson’s 1971 treatise).  

He begins where anyone concerned with meticulous science, meaningful distinctions, and coherent logic must:  with Aristotle.  Importantly, Aristotle understood that different subjects demand  different ways of thinking—doing math differs significantly from composing music, though the two disciplines certainly share some commonalities.  When they considered questions concerning the origins of things, Pre-Socratic thinkers had, by-and-large, invoked chance and necessity, churning along in a mechanical fashion.  To Aristotle, however, it made more sense to see things as designed, with discernable purpose, much like an artistic work reflecting the mind of its maker.  Such reasoning led him to posit, when explaining things, four essential causes:  material; efficient; formal; final.  In the common sense tradition following Aristotle’s paradigm, it makes sense to understand a house as composed of material things, put together by workmen, following a blueprint, in order to provide suitable shelter.  To eliminate formal and final causes from the equation—as has been done for three centuries by scientists fixated solely on material and efficient causes—renders reality ultimately unintelligible.

For Aristotle there is an undeniable telos—an end-oriented ingredient—to all that is.  His voracious investigations of the natural world filled him with wonder:  “‘Every realm of nature is marvelous’” (p. 25).  Still more:  “‘Absence of haphazard and conduciveness of everything to an end are to be found in Nature’s works in the highest degree, and the resultant end of her generations and combinations is a form of the beautiful’” (p. 25).   Thus he “found teleology so evident in nature that he asked himself how his predecessors had been able to avoid seeing it there, or, still worse, had denied its presence” (p. 21).  Two millennia later Gilson asks the same question!  How can highly intelligent people not see the obvious design in things?  “In brief, if there is in nature at least an apparently colossal proportion of finality, by what right do we not take it into account in an objective description of reality?” (p. 31).  

Gilson takes us on a 400-year historical journey, showing how philosophical naturalism, with its mechanistic explanations, has become dominant in the West.  At the apex of the account stands Charles Darwin, who scrupulously expunged any hint of design (with its powerful suggestion of a Creator) from his version of evolution through Natural Selection.  Unfortunately, in his and his followers’ writings the word “Evolution has served the purpose of hiding the absence of an idea” (p. 103).  Claiming to explain everything, it explains very little if anything.  Thus a distinguished French naturalist, Paul Lemoine, lamented:  “‘The theories of evolution with which our studious youth are lulled to sleep actually compose a dogma which everyone continues to teach; but, each in his specialty, zoologist or botanist, takes cognizance of the fact that any of the explications furnished cannot stand’” (p. 104).  “‘The result of this expose,’” Lemoine said in closing his article in the Encyclopedie francaise, “‘is that the theory of evolution is impossible’” (p. 105).  And it is impossible because it cannot withstand the kind of rigorous analysis given it by philosophers such as Gilson.  

Living things cannot be explained mechanistically.  Aristotle saw this clearly centuries ago, and the “facts that Aristotle’s biology wished to explain are still there.  He is reproached, sometimes bitterly, with having explained them poorly, but up to the present no one has explained them any better” (p. 141).  To truly understand our world we must, as did Lemoine, allow that “‘vital phenomena tend toward a precise end, from which tendency the name of “final causes” is derived’” (p. 142).  And indeed, most biologists, when seeking to explain anything, silently rely on (and develop euphemisms for) such final causes.  

* * * * * * * * * * * * * * * * 

In Darwin’s Doubt:  The Explosive Origin of Animal Life and the Case for Intelligent Design (New York:  HarperOne, c. 2013), Stephen C. Meyer takes seriously Charles Darwin’s personal doubt regarding his celebrated theory, popularly portrayed as the evolutionary tree of life.  Darwin was deeply troubled by the lack of fossil evidence for the universal common ancestry of all living things, gradually flowering into various species through natural selection, basic to his theory of evolution.  Indeed, his work was vigorously disputed by the most celebrated fossil expert of his day, the Swiss paleontologist then teaching at Harvard, Louis Agassiz.  During a markedly brief period of time—the Cambrian Era—a great variety of animal species just suddenly appeared, with no hint of common ancestry.  To Agassiz, this “posed an insuperable difficulty for Darwinian theory” (p. 8).  

Darwin recognized this difficulty, noting:  “‘The abrupt manner in which whole groups of species suddenly appear in certain formations has been urged by several paleontologists . . . as a fatal objection to the belief in the transmutation of species.  If numerous species, belonging to the same genera or families, have really started into life all at once, the fact would be fatal to the theory of descent with slow modification through natural selection’” (p. 17).  To preserve his theory despite the missing fossils, Darwin insisted that in time further paleontological expeditions would uncover a fuller geological record that would confirm his belief in common ancestry and natural selection.  

But a series of meticulous 20th century paleontological expeditions has failed to find the evidence Darwin envisioned.  Instead there stands exposed in the fossils of the Cambrian Era a “geologically abrupt appearance of a menagerie of animals as various as any found in the gaudiest science fiction.  During this explosion of fauna, representatives of about twenty of the roughly twenty-six total phyla present in the known fossil record made their first appearance on earth” (p. 31).  Though much of the field work was done in the Burgess Shale in the Canadian Rockies, there is an even richer fossil depository in China—the Maotianshan Shale near Chengjiang—which affords us “an even greater variety of Cambrian body plans from an even older layer of Cambrian rock” (p. 50).  This site, inspected by Chinese scientists, shows “that the Cambrian animals appeared even more explosively than previously realized” (p. 51).  Thus the renowned Chinese paleontologist J. Y. Chen declares that the evidence turns “upside down” Darwin’s tree of life imagery.  Ironically, Chen noted:  “‘In China we can criticize Darwin, but not the government.  In America you can criticize the government but not Darwin’” (p. 52).  The Chinese scientists have also, during the past 20 years, refined the methodology for dating the geological record, leading them to believe that the “Cambrian Explosion” took place within five to ten million years—a mere moment in earth’s five-billion-year history.  

Evidence regarding biological development in the Cambrian Explosion, Meyer argues, calls into question the hallowed “tree of life” depicted in standard textbooks.  Rather than a single tree, it looks like a score or more separate bushes, all beginning at the same time.  For over 3 billion years, only single-celled organisms (notably bacteria and algae) existed; then, some 560 million years ago some complex multicellular organisms, such as sponges, appeared; shortly thereafter came the Cambrian Explosion and “the oceans swarmed with animals” as a “carnival of novel biological forms arose”—all given structure by “an explosion of genetic information unparalleled in the previous history of life” (p. 163).  And the more we understand about genetics the more difficult it is to even imagine, much less demonstrate, how such living creatures emerged and evolved in accord with the Darwinian theory.  

Consequently, various naturalistic alternatives to the standard evolutionary model have been proposed.  Some biologists envision “self-organizing” patterns following certain natural laws, rather as crystals seem to spontaneously assemble.  Decades ago Stephen Jay Gould theorized that the slow process of gradual evolution advanced through inexplicable jumps, suddenly developing new life-forms.  More recently Jeffrey Schwartz, in Sudden Origins, admitted:  “‘We are still in the dark about the origin of most major groups of organisms.  They appear in the fossil record as Athena did from the head of Zeus—full blown and raring to go, in contradiction to Darwin’s depiction of evolution as resulting from the gradual accumulation of countless infinitesimally minute variations’” (p. 318).  Instead, he postulated that changes in as yet poorly understood “Hox genes” might better explain such sudden appearances.  

“Clearly,” Meyer says, “standard evolutionary theory has reached an impasse” (p. 337).  So he proposes a better approach:  Intelligent Design.  If one is not committed to a purely materialistic metaphysic, if one is open to the possibility of a mental dimension to reality, then looking for an information-giving intelligent milieu or agent might make sense.  Meyer’s approach “affirms that there are certain features of living systems that are best explained by the design of an actual intelligence—a conscious and rational agent, a mind—as opposed to a mindless, materialistic process.  The theory of intelligent design does not reject ‘evolution’ defined as ‘change over time’ or even universal common ancestry, but it does dispute Darwin’s idea that the cause of major biological change and the appearance of design are wholly blind and undirected” (p. 339).  

Meyer’s fascination with intelligent design began when, as a young scholar, he encountered the work of a chemist, Charles Thaxton, whose book The Mystery of Life’s Origin demonstrated the improbability of nonliving chemicals evolving into living biological organisms.  Thaxton and his co-authors “suggested that the information-bearing properties of DNA might point to the activity of a designing intelligence—to the work of a mind, or an ‘intelligent cause’ as they put it.  Drawing on the analysis of the British-Hungarian physical chemist Michael Polanyi, they argued that chemistry and physics alone could not produce the information in DNA any more than ink and paper alone could produce the information in a book.  Instead, they argued that our uniform experience suggests a cause-and-effect relationship between intelligent activity and the product of information” (p. 341).  

This possibility drew Meyer to the University of Cambridge in England, where he pursued his interests in the history and philosophy of science.  There he discovered the important role historical scientists assign to “abductive inference,” a method that infers “past conditions or causes from present clues.”  Unlike deductive logic, abductive reasoning leads to plausibility rather than certainty.  It is an “inference to the best explanation.”  Historians who understand their discipline know it’s as much an art as a science since they deal with particular events rather than universal laws.  Thus there have ever been rival hypotheses regarding past events such as the fall of the Roman Empire.  So too scholars studying evolution necessarily engage in historical work, and “whether they always realize it or not . . . typically use the method of inference to the best explanation” (p. 351).  

Applying this method to the Cambrian Era, we encounter creatures possessing layers of highly sophisticated digital information akin to “systems known from experience to have arisen as a result of intelligent activity.  In other words, standard materialistic evolutionary theories have failed to identify an adequate mechanism or cause for precisely those attributes of living forms that we know from experience only intelligence—conscious rational activity—is capable of producing” (p. 358).  Reading Macbeth we reasonably infer it was written by a literary genius, a mind telling the story—Shakespeare.  Listening to The Messiah we reasonably infer it was composed by a musical genius, a mind orchestrating text and score—Handel.  Encountering pictographs on the rocks near Boise, Idaho, we reasonably infer they were drawn centuries ago by rational human beings—Indians residing in that area.  Inevitably we think materials containing and conveying information necessarily come from conscious minds.  

Thus, as we now know, living organisms—whether molecules or cells, plants or animals, as preserved in the fossils of the Cambrian Era—“require specified and highly improbable (information-rich) arrangements of lower-level constituents in order to maintain their form and function” (p. 365).  Still more:  “Conscious and rational agents have, as part of their powers of purposive intelligence, the capacity to design information-rich parts and to organize those parts into functional information-rich systems and hierarchies.  We know of no other causal entity or process that has this capacity” (p. 366).  And “both the Cambrian animal forms themselves and their pattern of appearance in the fossil record exhibit precisely those features that we should expect to see if an intelligent cause had acted to produce them” (p. 379).  These ancient animals suddenly appeared, “without any clear material antecedent; they came on the scene complete with digital code, dynamically expressed integrated circuitry, and multi-layered, hierarchically organized information storage and processing systems” (p. 381).  

In the light of all this, might it make sense to infer, as the best explanation, an intelligence of some sort as the cause of it all?  Yes, Meyer insists, it does.  As the popular novelist Dean Koontz says, “Meyer writes beautifully.  He marshals complex information as well as any writer I’ve read. . . .  This book—and his body of work—challenges scientism with real science and excites in me the hope that the origins-of-life debate will soon be largely free of the ideology that has long colored it . . . a wonderful, most compelling read.”  

* * * * * * * * * * * * * * 

Though Charles Darwin is generally presented to the public as a virtuous scientist, motivated by a dispassionate desire to understand the world, Benjamin Wiker argues, in The Darwin Myth:  The Life and Lies of Charles Darwin (Washington:  Regnery Publishing, Inc., c. 2009),  that he often made misleading statements regarding his life and his claims to intellectual originality, and that these assertions were naively embraced by most of his biographers, who have portrayed him as exemplary in every way—a “secular saint who single handedly brought enlightenment to a world shrouded in the darkness of superstition and ignorance” (p. ix).  To provide the context that shows he was notably less than honest about himself and fair to his rivals, Wiker sketches a succinct overview of his life and intellectual development, giving considerable care to his theological and philosophical orientation. 

By 1859, Darwin was ready to provide the public with a “long argument” in “two long books,” setting forth his notions of “evolution through natural selection” and the purely naturalistic “descent of man.”  Especially notable in his presentation—and the clandestine case he actually advocated, Wiker contends—was the absence of God in creation.  While he at times made vague allusions to some higher power, Darwin clearly envisioned an essentially godless world.  “Darwin’s principle of natural selection was chosen by him precisely because it excluded any creative action by God” (p. 139).  

Darwin also envisioned an essentially amoral world, for “morality does not govern evolution.  If it did, then we might expect a divine overseer” (p. 92).  Darwin knew that “if there were a moral standard outside the process of natural selection, if the evolution of morality progressed toward that standard, if the actions of men and societies were judged by that standard, then we would be admitting a theistic account of evolution” (p. 145).  Unwilling to grant this, he considered morality a survival technique—constantly changing, relative to various environments, without real substance.  Social Darwinism, celebrated by eugenicists and might-makes-right thinkers such as Nietzsche and dictators such as Stalin, naturally and necessarily followed Darwin’s philosophical views.  Consequently, Wiker suggests, he, along with Marx and Freud, may rightly be acknowledged for his importance while decried for his influence.  

252 Admirable Autobiographies

For forty years Michael Novak has been a very visible and influential public intellectual, writing generally at the crossroads of politics and religion.  In his recent autobiography, Writing from Left to Right:  My Journey from Liberal to Conservative (New York:  Image, c. 2013), he details his remarkable career, filled with insights into many of the most powerful men of our time as well as his own intellectual development.  Gracefully written, irenic and generous in presentation, his memoirs open for us vistas of understanding.  Though autobiographies are perhaps the most particularistic of all historical materials, Novak sees in his life broader patterns; thus:  “This book,” he tells us, “is about political and economic upheavals between the years 1960 and 2005, and the navigation through heavy waves that many of us chose.  This is not just my story, but the story of thousands, even millions.  Many more are likely to join us over the next decade.  Reality does not flinch from teaching human beings hard lessons” (Kindle, #100).  

All four of Novak’s grandparents came from the same part of rural Slovakia and settled in western Pennsylvania.  He was born and reared in Johnstown, a place in the Allegheny Mountains best known as the site of a disastrous 1889 flood.  Reared in a solidly Catholic, working class family, he early sensed a call to the priesthood and for 12 years immersed himself in studies for the religious life.  But in 1960 he “judged that my true vocation was in lay, not priestly, life” (p. 13) and resolved to become a writer.  Settling in New York, he finished and saw published his first book (a novel) and began reading widely in politics and economics.  Offered a position as a writer for an aspiring Democratic politician, he wrote a speech about “The New Frontier” that was never given, but the phrase made its way into a memorable address by John F. Kennedy, whose ultimate election proved deeply satisfying.  Politically, in accord with JFK, Novak was happily situated as a middle-of-the-road Democrat.  

Sensing the need for further academic preparation, he entered the philosophy department at Harvard in pursuit of a Ph.D.  But his professors were immersed in such things as symbolic logic whereas Novak was more interested in metaphysics and ethics.  While there he managed to publish several papers and two books, but he was hardly attuned to the main concerns of his professors.  He did, however, meet a visiting scholar, Gabriel Marcel, whose lectures “fanned the sparks in me of a lifelong interest in the ‘person’” (p. 27).  He also encountered, while at Harvard, the works of Reinhold Niebuhr and determined to write his doctoral thesis on Niebuhr’s Christian Realism.  His graduate studies were interrupted, however, when he was given the opportunity to spend four months in Rome, observing and writing about the second session of the Second Vatican Council.  In time he published The Open Church, momentarily aligning himself with the “progressives” who saw the Council as a vehicle whereby the Church could be significantly updated and improved.  “Within a few years after the Council, I found myself reacting more and more negatively to the large faction of the ‘progressives’ who failed to grasp the truly conservative force of Vatican II—its revival of ancient traditions, its sharper disciplines, its challenges to mere worldliness and mere politics” (p. 52).  His religious convictions thus prompted an intellectual shift, as noted in the book’s title, from left to right.  

His move to the left, shared by many younger folks in the ‘60s, began when he started teaching philosophy at Stanford University, where he spent some of his happiest days.  Identifying with his students, he gradually embraced their anti-war radicalism and even left Stanford to help start an experimental college of the State University of New York at Old Westbury.  Designed to align with the ethos of the counter-culture—making students colleagues rather than apprentices—the experiment degenerated into “crazy, rebellious, and anarchic” chaos.  As Novak “faced the full implications of the deep leftist principles” (p. 93) he was, to put it mildly, appalled.  To imagine them applied to the nation—as envisioned by Tom Hayden and the Students for a Democratic Society—left him aghast.  

Soon departing Old Westbury, in 1970 Novak was hired by Sarge Shriver (whom LBJ had almost picked as a vice presidential running mate and whom George McGovern eventually did pick) to work (as a writer) for the election of Democrats to Congress.  Shriver, of course, was married to Eunice Kennedy, so Novak entered and intellectually helped shape a significant faction of the Democrat Party.  “Shriver loved the vein of Catholic thought that wanted to ‘reconstruct the social order,’ ‘put the yeast of the gospel in the world,’ ‘feed the hungry, comfort the afflicted’—generally, that is, to make a difference in the world” (p. 112).  In addition to working for Shriver, the prolific Novak wrote articles and finished a book, The Rise of the Unmeltable Ethnics, anticipating themes enunciated within a decade by Ronald Reagan.  

This publication demonstrated his “declaration of independence from the cultural left” (p. 124), a rather painful move inasmuch as that left presided imperiously over much of the nation’s intellectual life.  Novak also had to acknowledge that left-wing Democrats were taking control of the party and locking it into a pro-abortion, anti-Jewish/Christian, multicultural agenda.  And he also began re-thinking his loosely socialistic economic positions, prodded by Richard John Neuhaus and Peter Berger to consider the empirical evidence of the actual good wrought by capitalism around the world.  “Socialism, I was beginning to infer, is not creative, not wealth producing.  It is wealth consuming (it consumes the wealth of others); it is parasitical” (p. 149).  LBJ’s “Great Society,” like almost all welfare states, has clearly failed to fulfill its promises.  

Fortuitously invited to join (as a theologian) the American Enterprise Institute in 1977, Novak secured “a front row seat at the great economic debates of the next three years” (p. 174).  He found the “supply side” economics, as espoused by Jack Kemp and Ronald Reagan, surprisingly persuasive.  President Reagan appointed him U.S. ambassador to the UN Commission on Human Rights, and in 1982 he published what is probably his most influential book, The Spirit of Democratic Capitalism, providing an ethical rationale for free enterprise.  When she met him, Margaret Thatcher exclaimed:  “‘I have so wanted to meet you.  I have been reading your book.  You are doing the most important work in the world.’”  “‘Exposing the moral foundations of capitalism,’ she went on, ‘is so important.  The fate of the poor all around the world depends on it’” (p. 218).  That truth was confirmed by Vaclav Havel, who with a handful of friends met secretly to discuss the book, chapter by chapter, and determined to implement its principles in Czechoslovakia.  

“In my lifetime,” Novak says, “I have been favored to meet and often work with many great political leaders, presidents, artists, and builders of brand-new industries in new technologies of the Electronic Age.  I especially prized working with Reagan, Thatcher, and Vaclav Havel.  But of all the great human beings I have met—and even been invited into friendship with—none is closer to my heart than John Paul II” (p. 298).  This leads to a warmly-written chapter on “the Pope who called me friend.”  John Paul II had apparently read Novak’s The Spirit of Democratic Capitalism as he prepared his social encyclical, Centesimus Annus, in 1991.  Being invited, on several occasions, to dine and discuss ideas, as well as to celebrate mass with the Pope, culminated, in a profound way, Novak’s life as a Christian writer.  

* * * * * * * * * * * * * * 

For many years Melanie Phillips was an acclaimed reporter and columnist—“a darling of the left”—for the Guardian, probably the leading leftist newspaper in England, “the paper of choice for intellectuals, the voice of progressive conscience, and the dream destination for many, if not most, aspiring journalists” (#347).  Thus she titles her judicious autobiography Guardian Angel:  My Story, My Britain (New York:  emBooks, c. 2013).  As a serious writer she has ever sought to tell the truth and follow the evidence wherever it led.  This, however, challenged her ideological assumptions, and in time she found a journalistic home in more conservative quarters.  Consequently, looking back on a career of “charting social and political trends for more than thirty years,” she declares “there were now two Britains:  the first adhering to decency, rationality and duty to others, and the second characterized by hatred, rampant selfishness, and a terrifying repudiation of reason” (Kindle, #23).  There were obviously battles raging, so her memoir “is the story of my culture war:  the account of my battles with the hate-mongering left” (#28).  

Reared in London as an only child in a working-class, somewhat dysfunctional Jewish family, she found solace in “the magical world of books,” excelled in school and entered Oxford University, where she studied English Literature and “dabbled in student politics and mildly left-wing circles” (#309).  Inwardly insecure, she often compensated by thrusting herself to “the center of the stage in order to validate my existence by the approval of an audience” (#314).  Needing employment, she fell into a training post with a suburban newspaper, learned the craft of journalism, and was named Young Journalist of the Year.  This led, in 1977, to a staff position on the Guardian.  She was 26 years old and shared, for many years, the staff’s conviction that they, speaking for the liberal left, “were the embodiment of virtue itself” (#406).  They espoused “a set of dogmatic mantras.  Poverty was bad, cuts in public spending were bad, prison was bad, the Tory government was bad, the state was good, poor people were good, minorities were good, sexual freedom was good” (#449).  Still more:  by the ‘90s post-modernism had infiltrated journalism under the label of “attachment” journalism.  “Truth was now said to be an illusion; objectivity was a sham; journalists who tried to be dispassionate were therefore perpetrating a fraud upon the public.  The only honest approach was for journalists to wear their hearts on their sleeves; this was not called bias, but honesty” (#999).  

And they were, Phillips laments, very much the vanguard of the cultural revolution that has totally transformed Britain!  Her disillusionment with the Left began when she honestly followed the evidence while researching and writing articles on a wide variety of subjects—immigration, education, environmentalism, marriage and family, feminism, multiculturalism, health care, Israel and foreign affairs.  Though only nominally Jewish, she found to her surprise that her colleagues on the Guardian branded her as a Jew who could not deal dispassionately with Israel.  Indeed, anything but a pro-Palestinian stance was anathematized by Britain’s leftists, who routinely equate Israelis with Nazis!  “The more I read, the more horrified I became by the scale of the intellectual and moral corruption that was becoming embedded in public discourse about the Middle East—the systematic rewriting of history, denial of law and justice and the corresponding demonization and delegitimisation of Israel” (#1669).  

As a mother of two, she became increasingly distressed with the nation’s schools, which had demonstrably been “hijacked by left-wing ideology.  Instead of being taught to read and write, children were being left to play in various states of anarchy on the grounds that any exercise of adult authority was oppressive and would destroy the innate creativity of the child” (#738).  When she dared suggest that teachers should actually teach something, the left’s reaction was vitriolic—she was instantly branded “right-wing” and ostracized by many colleagues.  By 1990 she “realized something very bad indeed was happening to Britain.  What was being described was more akin to life in a totalitarian state.  Dissent was being silenced, and those who ran against the orthodoxy were being forced to operate in secret; still more, the very meaning of concepts such as education, teaching, and of knowledge was being unilaterally altered, and thousands of children, particularly those at the bottom of the social heap, were being abandoned to ignorance and institutionalized disadvantage” (#812).  Her alienation from the Left “was not so much political as moral.  The left was rejecting all external authority and embracing instead moral and cultural relativism—the idea that ‘what is right’ is ‘what is right for me’, and declaring any hierarchy of values illegitimate” (#878).  

Aligned with her critique of education, Phillips’ sharpest point of disagreement with the Left arose from her defense of the traditional family.  All the evidence proved that children flourished best in intact homes, where mothers and fathers shared responsibility for rearing them.  Divorce, single-parenting and step-parenting all harm children.  But despite the data, the cultural left triumphantly validated such behaviors.  The facts were never cogently debated, only denied.  She was by no means refuted, only reviled.  Ultimately she concluded “that the destruction of the traditional family had as its real target the destruction of Biblical morality.  I thought I was merely standing up for evidence, duty and the protection of the vulnerable.”  But her foes saw clearly “that the banner behind which I was actually marching was the Biblical moral law which put chains on people’s appetites” (#980).  

When she began investigating the claims of environmentalists, including the furor over global warming, she immediately “smelled charlatanry.”  Determined to follow the evidence, she found none!  Instead what posed as environmentalism simply “brought together deeply obnoxious strands of thinking” hitherto evident in the anti-Western, anti-capitalist frothing of leftist ideologues.  “To me,” she says, “the clear message of environmentalism was that the planet would be fine if it wasn’t for the human race.  So it was a deeply regressive, reactionary, proto-fascist movement for putting modernity into reverse, destroying the integrity of science, and threatening humanity itself” (#892).  Her position was reinforced by discussions with dozens of first-rate scientists who demonstrated that “the science” was not at all “settled” and the alarmists’ views were often akin to a “scam.”  

* * * * * * * * * * * * * 

During the past decade Joseph Pearce has emerged as one of the better writers dealing with literary figures—Literary Converts; Wisdom and Innocence:  A Life of G.K. Chesterton; C.S. Lewis and the Catholic Church; Tolkien, Man and Myth; Solzhenitsyn:  A Soul in Exile.  Bradley Birzer, the author of J.R.R. Tolkien’s Sanctifying Myth, says:  “Pearce writes with historical insight on one hand and poetic imagination on the other.  Perhaps our greatest living biographer, Pearce has the uncanny ability to get into the minds, hopes, fears, and motivations of his subjects.”  Having written at length and with discernment about others, in Race with the Devil:  My Journey from Racial Hatred to Rational Love (Charlotte, NC:  Saint Benedict Press, c. 2013) he now tells his own life story.  

Reared in a rather conventional middle-class home near London, Pearce had little exposure to or interest in religion as a child, though there was a residual Christianity here and there.  His father, remarkably well self-educated (and to whom the book is dedicated), was the strongest influence in his life, and he has “nothing but gratitude to him for all the good things he taught me and all the love he bestowed upon me” (Kindle Loc #276).  He did, however, have a strong anti-Catholic bias rooted in “a general misunderstanding of the English Reformation and its aftermath” (#295).  And he was also quite hostile to the immigration policies of Great Britain, which were transforming the land he loved into a multicultural jungle.  Yet unlike “the philanthropist who proclaims his love for Man but despises men,” his father “loved men but despised those who spoke in the abstract about the brotherhood of man” (#310).  Consequently, “the extent to which I love my fellow man is attributable, under grace, to the example my father gave me.  My love of poetry and history has its roots in his love for these things.  My omnivorous hunger for knowledge is a gift that he gave me.  The path of the autodidact, which he took through life, is the path that I have followed also.  I am happy to have followed in my father’s footsteps, though equally happy that I ceased to do so when I came to realize he was not always walking in the right direction” (#357).  

Following the elder Pearce’s example of learning on his own, young Joseph read widely—far beyond his classroom assignments—and developed a consuming passion for politics.  Indeed, at the age of 17 he published an article entitled “‘Red Indoctrination in the Classroom,’ which critiqued the Marxist orientation of my high school education” (#458).  Still more, he became active in the National Front, a political party noted for its overtly racist agenda.  This involved him in numerous street demonstrations and occasional fights with militant Marxists.  Determined to advance the cause, he founded Bulldog, a magazine designed to incite racial hatred and target young people with the National Front message.  This made him a highly visible and controversial figure, and he began to work full-time for the Party in 1978.  In time he was twice arrested and imprisoned for “hateful” articles he had written.  

Looking back on this period of his life, Pearce says:  “The animus of my political creed to which I subscribed was not animosity towards aliens but a love of my own people, albeit a love that became an idol, a false god that I worshipped at the expense of my own spiritual wellbeing” (#799).  Providentially, he began to discover, through a variety of experiences, a better way that began with an awakening to beauty.  For instance, visiting the family of a friend in rural England sparked within him “a fuller and better vision of the beauty of life, particularly country life, to which I had been largely unaware until then” (#774).  Away from city lights he clearly saw, for the first time, the stars in all their majesty.  “Having my eyes awakened to such beauty was a baptism of the imagination—a baptism of desire—which I now see as foundational to my path to religious conversion” (#789).  

He was also nourished by the works of Alexander Solzhenitsyn, who helped sow “seeds of faith and hope in my understanding of reality and exorcise the demons of nihilism and pessimism that lurked in the darkest recesses of my soul” (#1049).  (Much later, while writing a biography of Solzhenitsyn, he was privileged to meet him—a rare encounter granted when Solzhenitsyn learned Pearce sought to emphasize his Christian convictions.)  His love of pop music exposed him to Elvis Presley, whose gospel recordings played a vital role in opening Pearce to the Gospel!  And then he was (much like C.S. Lewis, decades earlier) “surprised by Chesterton”!  First fascinated by some of his “distributist” economic ideas, he then stumbled into the works of G.K.C. and would never be the same again!  In Chesterton he “found a new friend who would become the most powerful influence (under grace) on my personal and intellectual development over the following years” (#1729).  

Such influences—in concert with an inner “baptism of desire” that sharpened his hunger for truth—prepared him for some unexpected changes while serving his second prison term.  It was, for him, the dark night of the soul described by St John of the Cross.  Longing for freedom, disillusioned with his political work, and strangely drawn to theological inquiries and halting efforts to pray, he emerged from prison determined to change his life.  He began visiting a small Catholic chapel and sensed therein the answer to his heart’s disquiet.  In time he converted to the Church and began to live in accord with her doctrines and ethics, finding self-sacrifice the key to the good life.  Rather than writing racist screeds he began to consider Christian materials and wrote a biography of his beloved G.K. Chesterton.  “Whereas my previous writing had led people astray, I hoped that my gifts as a writer could now help lead people to the truth.”  

251 Vietnam Revisited

When I began my college teaching career in January, 1966, America’s involvement in the Vietnam War was beginning to evoke controversy.  But since I didn’t teach recent American history and had other compelling concerns I naively accepted the news as reported in Time Magazine and by Walter Cronkite on CBS.  Thus I first supported what I understood to be a just war defense of the people of South Vietnam; subsequently, accepting what was said about the Pentagon Papers and promoted by Cronkite et al., I came to believe it impossible to win the war and that we had been misled regarding the reasons we had entered it.  (Especially important in shaping my own views in those days was Sojourners Magazine, whose editor, Jim Wallis, claimed to provide an Evangelical appraisal of world affairs—something I now realize was deeply, if not deviously, flawed.)  And when Saigon fell and South Vietnam was melded into the Communist orbit I effectively closed my mind to what seemed then to have been a sad and misguided American endeavor.  Older and wiser now, I have recently revisited the conflict with the assistance of sources suitably critical of the popular opinions established by the media 45 years ago.  

Uwe Siemon-Netto is a German journalist who spent five years covering the war in Vietnam.  Unlike most American correspondents, who stayed in the safety of Saigon, he spent much time in the countryside, where the war was waged, and developed meaningful relationships with a variety of individuals.  He talked with and knew not only the political and military elites but ordinary people (such as the boy Duc, for whom the book is named).  In his deeply moving memoir—Duc:  A Reporter’s Love for the Wounded People of Vietnam (c. 2013)—he laments the many casualties of a conflict sadly won by the Communists in 1975.  He personally observed the “heinous atrocities the Communists committed as a matter of policy,” serving as “a witness to mass murder and carnage beside which transgressions against the rules of war perpetrated on the American and South Vietnamese side—clearly not as a matter of policy or strategy—appear pale in comparison” (p. xii).  As “a collection of personal sketches of what I saw, observed, lived through and reported in my Vietnam years,” Duc gives us an enlightening slice of a story we need if we are to better understand a critical phase in our nation’s history.

One message Siemon-Netto makes clear:  ordinary Americans were misinformed by intellectually dishonest “apologists for the Hanoi regime, such as philosopher Noam Chomsky” and New Left leaders, personified by Jane Fonda, who denied demonstrable truths regarding such events as the Hue Massacre (where Communists ruthlessly slaughtered thousands of  innocent civilians) and the highly successful Tet Offensive, a decisive victory for both the American military and South Vietnamese government.  “More than half of the 80,000 Communist soldiers who participated in the Tet Offensive were killed; the Vietcong infrastructure was smashed.  This was a big military victory.  It was a hard-won victory for the allies, but a victory it was.  All things being equal, this should have been the Allied triumph bringing this war to a successful end.  We combat correspondents could testify to this, irrespective of the pacifist and defeatist spin opinion makers, ideologues and self-styled progressives in the United States and Europe put on this pivotal event” (p. 209).   

Indeed, as Peter Braestrup said:  “‘Rarely has contemporary crisis-journalism turned out, in retrospect, to have veered so widely from reality.  Essentially, the dominant themes of the words and film from Vietnam . . . added up to a portrait of defeat for the allies.  Historians, on the contrary, have concluded that the Tet Offensive resulted in a severe military-political setback for Hanoi in the South’” (p. 140).  But on February 27, 1968, less than two months after the military triumph, “CBS Evening News anchorman Walter Cronkite, home from a flying visit to Vietnam after the Tet Offensive, pronounced sonorously before an audience of some 20 million viewers the Vietnam War unwinnable.  This flew in the face of everything many combat correspondents . . . had lived through at Tet” (p. 221).  “But Walter Cronkite’s opinion trumped reality, turning a military victory into a political defeat” (p. 221) and the antiwar movement would surge significantly with Senator Eugene McCarthy surfing the swell.  Soon President Lyndon Baines Johnson would acknowledge he had lost middle America along with Cronkite and abandon his re-election campaign.  

Communist leaders (such as Gen. Vo Nguyen Giap, who died in 2013 at age 102) shrewdly manipulated mouthpieces such as Fonda, knowing quite well that the war would be won in America’s living rooms as well as on the battlefields of Vietnam.  “‘During the latter half of the 15-year American involvement,’ wrote Robert Elegant, ‘the media became the primary battlefield.  Illusory events reported by the press as well as real events within the press corps were more decisive than the clash of arms or the contention of ideologies.  For the first time in modern history, the outcome of the war was determined not on the battlefield but on the printed page and, above all, on the television screen.  Looking back coolly, I believe it can be said . . . that American and South Vietnamese forces actually won the limited military struggle’” (p. 113).  Wielding their pens to oppose the war, the “new journalists” employed their skills, less concerned with accurate writing than with improving the world.  Rather than researching and reporting, they “followed a drift in journalism that became fashionable when the profession changed its character from a down-to-earth craft to another pseudo-academic ivory tower” (p. 74).  American journalists in Vietnam tended to “preach, pontificate and browbeat like the scribes of Joseph Goebbels’ propaganda ministry in Nazi Germany or of the Soviet agitprop service” (p. 74).  “The traditional journalists, the craftsmen, were still on the job in my time in Vietnam.  They had their stories published, albeit in many cases further and further in the back pages.  The limelight was reserved for the new journalists, the pundits, stars who opined rather than reported.  They flew in and out of Saigon on ‘special assignments’ and, clad in freshly pressed fatigues, pontificated before millions of television viewers, not on the basis of what they had experienced in the jungles and villages . . . but on the basis of the stereotypical antiwar ideology they themselves were imposing on the American public square” (p. 75).   

Reading Duc, however, can help rectify the record regarding Vietnam.  Siemon-Netto’s  compelling concern for the people of Vietnam—as well as the American servicemen who sacrificed so much in that conflict—finds its voice in this memorable account.  

* * * * * * * * * * * * * * * * * * * * *

Phillip E. Jennings served as a Marine Corps helicopter pilot in Vietnam and has written The Politically Incorrect Guide to the Vietnam War (Washington:  Regnery Publishing, Inc., c. 2010) to challenge some of the widely-embraced assumptions regarding the history of that conflict.  The author of comic novels as well as the CEO of Molecular Resonance Corporation, he writes not as an academic (though he’s consulted 300 books) or journalist but as a war veteran determined to expose falsehoods and defend the propriety of America’s involvement in Southeast Asia.  

After a quick overview of the labyrinthine developments leading to the division of Vietnam by the Geneva accord in 1954, he notes that the United States came to the aid of South Vietnam (as it had done in Korea in 1950) following North Vietnam’s 1959 decision to dispatch fighters to enlarge Ho Chi Minh’s communist dictatorship.  President John F. Kennedy, who strongly supported America’s commitment to the Ngo Dinh Diem regime and the South’s military (the ARVN), approved increased military aid and involvement in the war.  Unfortunately, JFK’s ambassador (Henry Cabot Lodge) and two young journalists (Neil Sheehan and David Halberstam, who falsely claimed that 30 Buddhist monks had been killed by the government) orchestrated a process culminating in a military coup and the execution of Diem, the only man who conceivably could have effectively ruled and defended his country.  

Following Kennedy’s assassination in 1963, President Johnson decided to massively escalate America’s role in Vietnam.  His ill-focused and inept strategies, generally attuned to domestic politics rather than military developments, resulted in what often appeared an insoluble stalemate.  Nevertheless, “the situation in South Vietnam in 1967 was far from dire for the Americans and their allies” (p. 88).  Much had gone wrong, but not all was lost!  That was demonstrated in the 1968 Tet Offensive, wherein the Communists were decisively defeated.  Indeed, “Had the United States followed up on the destruction of the Viet Cong in the Tet Offensive by mining Haiphong Harbor and bombing Hanoi (as Nixon did in 1972), the war might have ended in 1968” (p. 103).  Even then, though the American media refused to report it, during the four most important years of the war (’68-’73) there was an “unheralded victory.”  

With the ’68 election of Richard Nixon as President, America’s strategies shifted—and so did the course of the war, during which South Vietnamese soldiers took “over the war on the ground, and pacified 90 percent of the countryside” (p. 105).  “Nixon was decisive where Kennedy waffled; and he was a tough-minded statesman while Johnson was an over-promoted congressional enforcer.  Nixon succeeded where his Democratic predecessors (and political opponents) failed” (p. 115).  He launched the Christmas bombing of military targets in Hanoi in 1972—next to the Tet Offensive the most successful American campaign.  Reeling under the assault, Hanoi began cooperating with the peace talks in Paris that promised to secure a future for South Vietnam.  

Tragically, Nixon’s successes unraveled amidst the Watergate scandal.  Emboldened Democrats, enjoying majorities in both houses of Congress, moved to curb the president’s powers and (following his resignation in 1974) curtail aid to South Vietnam.  In short order Laos and Cambodia, as well as South Vietnam, “were sacrificed to Communism” (p. 145).  

* * * * * * * * * * * * * * * * *

One of the best analyses of the Vietnam War, Norman Podhoretz’s Why We Were in Viet Nam (New York:  Simon and Schuster, c. 1982, 1983), was published 30 years ago.  It is simply structured to answer four questions:  Why did we go in?  Why did we stay in?  Why did we withdraw?  And whose was the immorality?   

We entered the war under the guidance of President John F. Kennedy, who, in a 1956 speech given while still a senator from Massachusetts, had declared South Vietnam “‘the cornerstone of the Free World in Southeast Asia,’” an outpost of democracy which “‘would be threatened if the red tide of Communism overflowed into Vietnam’” (p. 19).  Still more:  it was in America’s national interest to protect her representatives and investments in that region.  JFK was clearly committed to the Truman Doctrine of “containment” and thought South Vietnam worth defending.  Following this policy, Truman had involved the U.S. in the Korean War, and “as Guenter Lewy puts it in his authoritative history of the Vietnam War, ‘no serious discussion or questioning appears to have taken place of the importance of Southeast Asia to American security interests, of the correctness of the dire predictions regarding the consequences of the loss of the area’” (p. 34).  

Having gone into Vietnam under JFK, we stayed in because President Lyndon B. Johnson supported the Truman Doctrine and “‘made a solemn private vow’” to “devote himself to ‘seeing things through in Vietnam’” (p. 64).  He orchestrated the passage of the Gulf of Tonkin Resolution in 1964 and deftly enlisted the support of Arkansas’s J. William Fulbright and Idaho’s Frank Church—senators who later became some of his harshest critics.  Though promising during the ’64 electoral campaign to limit our involvement in Vietnam, LBJ dramatically expanded America’s role in the war, costing the nation considerable blood and treasure.  Professors, protesters, and politicians soon surfaced to denounce the effort, siding “with the enemy with complete impunity” (p. 85).  Skillfully infiltrating the “Movement,” hard-core “communist groups worked on increasingly close terms with the non-Communist radicals who made up the ever-swelling constituency of what had only recently become known as the New Left” (p. 89).  

Pro-Communist intellectuals such as Susan Sontag, Mary McCarthy and Noam Chomsky praised and supported North Vietnam, many of them making pilgrimages to Hanoi to bask in the limelight Ho Chi Minh provided.  To McCarthy, Hanoi was a wonderful place, full of well-fed cheerful children and free of prostitutes and refuse.  So too in the countryside, she “‘saw no children with sores and scalp diseases . . . no rotten teeth or wasted consumptive-looking frames’” (p. 93).  All was well in the workers’ paradise, whereas south of the border she found nothing but chaos and corruption.  Sadly enough, General Edward Lansdale lamented, Ho and his minions malevolently devastated Vietnam, yet our public intellectuals never sought to portray them as earlier writers had done with the Kaiser in WWI or Hitler in WWII.  “‘For some baffling reason, we accepted the self-portrait of Ho Chi Minh as a benevolent old “uncle” who was fond of children—and of other Politburo leaders as speakers for a people they did not permit to have opinions.  So we let their claims to leadership go unchallenged while their people suffered and died’” (p. 108).  

Determined to defend America’s effort in Vietnam, Podhoretz condemned the media for enlisting in the anti-war movement and deliberately misleading the public.  Given his illustrations and citations, no fair-minded reader could deny the fact that influential journalists and academics effectively supported North Vietnam and shaped public opinion to that end.  “Thus did the North Vietnamese go on fighting in the reasonably secure belief that even if they lost on the battlefield, American public opinion—like French public opinion before it—would force the United States to withdraw on terms that would eventually ensure the Communist conquest of the south” (p. 130).    

Consequently we withdrew from the war.  LBJ decided not to run for another presidential term in 1968 and Richard Nixon was elected promising to end the war, though he certainly wanted to save South Vietnam from Communism and, following the Christmas bombing in 1972, stood poised to actually prevail in the struggle.  In the opinion of Sir Robert Thompson, one of the most knowledgeable authorities on the war:  “‘In my view, on December 30, 1972, after eleven days of those B-52 attacks on the Hanoi area, you had won the war.  It was over!  . . . They would have taken any terms’” (p. 156).  And, indeed, Hanoi’s representatives to the Paris Peace Accords quickly signed on to the Nixon-Kissinger proposals.  But the antiwar movement in America despised any hint of victory!  Prominent Democrats, such as George McGovern and Harold Hughes, railed against the “immorality” of this nation’s support of South Vietnam, and the collapse of the Nixon presidency brought to power anti-war ideologues who rapidly orchestrated the process of exiting Southeast Asia.    

Clearly distressed by this nation’s failure in Vietnam, Podhoretz addresses an important ethical question:  whose was the immorality?  In retrospect, as we consider the millions who died in Vietnam and Cambodia as a result of Ho Chi Minh’s aggressions, as we calculate the atrocities wrought by the Communists, as we reflect on the radical shifts in American foreign policy under Jimmy Carter, it becomes clear to Podhoretz that we should have persevered in the war and spared the world from a series of catastrophes.  Despite the many failures of the U.S. military, including isolated atrocities such as My Lai, American soldiers could hardly be accused of “war crimes,” whereas nothing short of “genocide” took place under Ho Chi Minh’s and Pol Pot’s direction.  

* * * * * * * * * * * * * * * * 

For many years Bruce Herschensohn was an influential member of California’s Republican establishment, working for both Richard Nixon and Ronald Reagan.  In 1992 he would probably have been elected to the United States Senate had not his opponent, Barbara Boxer, issued a last-minute and utterly false smear asserting he habitually visited strip clubs.  Narrowly defeated, he turned to writing and lecturing in universities such as Claremont, Pepperdine, and Harvard.  Determined to rectify a slice of this nation’s historical record, so badly distorted by left-wing ideologues, he recently published An American Amnesia:  How the U.S. Congress Forced the Surrender of South Vietnam and Cambodia (New York:  Beaufort Books, c. 2010).  “Voluntary amnesia,” he asserts, “is a crime against history” (Kindle #2127), and we must at all costs avoid it.  He fully understands the story he tells, having participated in it himself and consulted the notes and clippings he made while serving as a speechwriter for Richard Nixon.

The 1973 Paris Peace Accords, precipitated by the Christmas bombing a month earlier and negotiated by Secretary of State Henry Kissinger, established two independent Vietnams.  But when President Nixon was forced from office, says North Vietnam’s Colonel Bui Tin, “‘we knew we would win’” (#828), and, defying the accords, within three years North Vietnamese forces had successfully invaded and conquered the South while Pol Pot’s Khmer Rouge took control of Cambodia.  Sustained by enormous assistance from China and the Soviet Union, Communist forces prevailed while the United States Congress did everything possible (overriding President Gerald Ford’s repeated objections) to abandon and ignore Indochina.  Evaluating all this, Senator J. William Fulbright announced that he was no more depressed “‘than I would be about Arkansas losing a football game to Texas’” (#778).  

In this endeavor the Congress was aided and abetted by the media, typified by Sidney Schanberg (egregiously celebrated in the 1984 film The Killing Fields), who was awarded the Pulitzer Prize in 1976 for his reporting on Cambodia.  According to him, “‘I have seen the Khmer Rouge and they are not killing anyone’” (#543).  “NBC’s Jack Perkins watched Saigon’s War Memorial being toppled into the street by North Vietnamese soldiers, and he said to his American television audience that the statue had been ‘an excess of what money and bad taste accomplish.  I don’t know if you call it the fall of Saigon or the liberation of Saigon’” (#759).  That “liberation” led quite quickly to the renaming of Saigon as Ho Chi Minh City and the expulsion of many residents to the countryside, where “reeducation camps” imposed Ho’s ideology.  Ultimately millions of innocents were slain.  Amazingly, due to the agitation of anti-war protesters and the stratagems of the 94th Congress, more died in the year following Saigon’s fall “than during the preceding decade of war” (#857).  But the American media studiously ignored the genocide!  

America’s reaction to the war in Vietnam, says Herschensohn, had unintended consequences, revealed in incidents such as 9/11.  The 94th Congress not only refused to grant President Ford funds to defend Cambodia and South Vietnam but enacted policies to hamstring the CIA, leading to a series of intelligence failures around the world.  Under Jimmy Carter, America retreated everywhere—turning away from El Salvador, Nicaragua and Iran.  When the Ayatollah Khomeini seized control of Iran, Carter’s Ambassador to the United Nations, Andrew Young, stated, “‘Khomeini will be somewhat of a Saint when we get over the panic’” (#2067).  A saint indeed!  And a murderous saint to boot!  

Thus Herschensohn warns:  “Because of congressional actions taken in the mid-1970s, the nation today faces risks to our survival, and risks to the very survival of civilization as we know it” (#2127).  

250 The Roots of Radical Islam

To understand radical Islam’s emergence during the last half of the 20th century, Lawrence Wright’s The Looming Tower:  Al-Qaeda and the Road to 9/11 (NY:  Alfred A. Knopf, 2006) remains one of the best-researched, most readable surveys.  He tells how a small cadre of religious zealots—most notably Sayyid Qutb, Ayman al-Zawahiri, and Osama bin Laden—deliberately upended our world.  And they did so, in part, because the United States routinely failed to understand, withstand, and respond to their assaults.  Amazingly, despite recurrent warnings and violent episodes, almost no one in America took them seriously.  “It was too bizarre, too primitive and exotic.  Up against the confidence that Americans placed in modernity and technology and their own ideals to protect them from the savage pageant of history, the defiant gestures of bin Laden and his followers seemed absurd and even pathetic” (p. 6).  

Wright begins his account by portraying Sayyid Qutb, an Egyptian school teacher who came to the United States in 1949.  Filled with hatred for the new nation of Israel and shocked by its triumph over Arab armies, he found in America added fuel for the Islamic zeal consuming his soul.  The shame and shock at the establishment of Israel “would shape the Arab intellectual universe more profoundly than any other event in modern history.  ‘I hate those Westerners and despise them!’ Qutb wrote after President Harry Truman endorsed the transfer of a hundred thousand Jewish refugees into Palestine.  ‘All of them, without any exception:  the English, the French, the Dutch, and finally the Americans, who have been trusted by many’” (p. 9).  His hatred, interestingly enough, didn’t deter him from coming to study in America! 

Though generally well-treated by the ordinary folks in Washington D.C. and Greeley, Colorado, where Qutb briefly studied and continued his writing projects, he looked for and found proof of America’s degeneracy in such things as a church dance, the freedom enjoyed by women, and publications such as the spurious Kinsey Report.  Hostility to America meshed easily with hostility to Israel to form the core of his world view, and when he returned to Egypt he believed:  “Modern values—secularism, rationality, democracy, subjectivity, individualism, mixing of the sexes, tolerance, materialism—had infected Islam through the agency of Western colonialism.  America now stood for all that” and he was persuaded “that Islam and modernity were completely incompatible” (p. 24).  Ultimately imprisoned by Gamal Abdel Nasser—the first truly native-born Egyptian to rule Egypt in 2500 years—he wrote Milestones, an enormously influential treatise, to recall Muslims to the pristine purity of their 7th century origins.  For radical Muslims, Qutb’s Milestones resembles Hitler’s Mein Kampf or Lenin’s What Is to Be Done?  

The second significant Islamist was Ayman al-Zawahiri, a medical doctor from a prominent family who was reared in an upscale Cairo suburb.  During his student years he absorbed and quickly promoted Qutb’s version of Islam.  He too was distressed by the mere existence of Israel and felt humiliated by Egypt’s collapse in the 1967 war—a decisive “psychological turning point in the history of the modern Middle East.  The speed and decisiveness of the Israeli victory in the Six Day War humiliated many Muslims who had believed until then that God favored their cause.  . . . The profound appeal of Islamic fundamentalism in Egypt and elsewhere was born in this shocking debacle” (p. 38).  Fiercely nationalistic, Zawahiri envisioned reestablishing the Muslim Caliphate centered in Egypt and enabling Islam to authentically flourish, dominating planet earth.  He launched an underground movement (al-Jihad) designed to overthrow the secular regime in his country.  Accused of involvement in the assassination of President Anwar Sadat, Zawahiri was imprisoned for three years and soon emerged as the spokesman for the defendants.  

The third and most infamous protagonist in Wright’s story is “The Founder,” Osama bin Laden, one of the many sons of Mohammed bin Laden, one of Saudi Arabia’s most prosperous businessmen.  The elder bin Laden was especially close to King Abdul Aziz and did much of the construction work on the renovation of the Grand Mosque in Mecca, which can hold a million worshippers.  Though expected to take his place in his father’s extensive business empire, young Osama bin Laden joined the Muslim Brothers while in high school and began to show less interest in making money than in establishing Islamic states.  While studying at King Abdul Aziz University in Jeddah he turned increasingly religious, taking the Salafist position that declares versions of Islam other than that espoused by Saudi Arabian Wahhabis heretical.  Like Zawahiri he was deeply moved by the writing of Sayyid Qutb and embraced his anti-American agenda.  “Bin Laden would later say that the United States had always been his enemy.  He dated his hatred for America to 1982, ‘when America permitted the Israelis to invade Lebanon and the American Sixth Fleet helped them’” (p. 151).  

When Soviet troops invaded Afghanistan in 1979, radical Muslims rallied to defend Islam, so both Zawahiri and bin Laden made their way to the fields of conflict.  Much of their activity, however, took place in nearby Pakistan, where bin Laden proved especially useful in fundraising.  Here they and their followers engaged in endless discussions regarding jihadist strategies and sought to train young warriors to give their lives to the cause.  The Afghans fought and won the war against Russia, whereas the Arabs recruited by Zawahiri and bin Laden mainly looked for opportunities to die as martyrs for Islam.  Thus forged amidst the Afghan War, the ideology and methodology of Al Qaeda were basically in place by 1988.  In particular, Islamic rationalizations for suicide missions and terrorist attacks on innocent civilians coalesced within the principle of takfir—a license for true believers “to kill practically anyone and everyone who stood in their way; indeed, they saw it as a divine duty” (p. 125).    

As the war in Afghanistan wound down, bin Laden returned to Jeddah, Saudi Arabia, where he enjoyed a celebrity status for his “divine mission” in Afghanistan.  In his native land the Wahhabi version of Islam had gained strength:  theaters were closed, music (“the flute of the devil,” bin Laden said) virtually disappeared, and women’s activities were seriously circumscribed.  But for radicals like bin Laden even this was not sufficient, and his activities increasingly irritated King Fahd and the princes ruling the Kingdom.  When, for example, Iraq conquered Kuwait and threatened Saudi Arabia, bin Laden objected to allowing American troops to defend the kingdom.  He and his jihadists, he declared, could (with Allah’s aid) repel any invasion of the Arabian peninsula’s sacred soil.  But King Fahd, trusting in tanks rather than jihadists, invited the Americans to establish bases and successfully overturn Saddam Hussein’s conquests.  

At odds with Saudi rulers (who ultimately revoked his citizenship), bin Laden then moved to Sudan, where he bought land near Khartoum and tried to both farm and launch various business enterprises.  His very presence added considerably to Sudan’s financial status and he seemed momentarily content.  But he soon fell in with a radical Imam (Abu Hajer) who encouraged him to attack the United States, “the last remaining superpower” threatening Islam.  He and al-Qaeda would henceforth target American troops and murder innocents—concentrating “not on fighting armies but on killing civilians” (p. 175).  By this time he had come to despise the United States as “weak and cowardly,” urging his followers to remember Vietnam and Lebanon.  When a few of their soldiers die, he said,  Americans retreat!  “For all its wealth and resources, America lacks convictions.  It cannot stand against warriors of faith who do not fear death” (p. 187).   President Bill Clinton’s cowardly withdrawal from Somalia in 1993 had further confirmed bin Laden’s growing contempt for the USA.    

Amidst deteriorating conditions, bin Laden left Sudan in 1996 financially ruined, his family scattered, and his organization broken.  “He held America responsible for the crushing reversal that had led him to this state” (p. 223).  On August 23, 1996, in his “Declaration of War Against the Americans Occupying the Land of the Two Holy Places,” he said:  “You are not unaware of the injustice, repression, and aggression that have befallen Muslims through the alliance of Jews, Christians, and their agents, so much so that Muslims’ blood has become the cheapest blood and their money and wealth are plundered by the enemies” (p. 234).  Barred from returning to Saudi Arabia, he settled in Afghanistan, now controlled by Mullah Mohammed Omar and the Taliban.  Joined by a group of Egyptians following Zawahiri, he began training terrorists such as Mohammed Atta to take down America.  In 1998, Zawahiri drafted a document calling on “all of the different mujahideen groups that had gathered in Afghanistan” to launch “a global Islamic jihad against America” (p. 259).  

This fatwa, signed by bin Laden as well as Zawahiri, declared that the killing of “Americans and their allies—civilian and military—is an individual duty for every Muslim who can do it in any country in which it is possible to do it” (p. 260).  Soon thereafter the jihadists orchestrated the nearly simultaneous bombings of American embassies in Kenya (killing 213 and injuring thousands of people) and Tanzania (killing 11 and wounding 85).  Two years later the USS Cole was nearly sunk by a suicide attack in Aden, Yemen’s deep water port, killing 17 sailors.  To bin Laden:  “The destroyer represented the capital of the West, and the small boat represented Mohammed.”  

But other than haphazardly launching a few missiles and issuing threats, Bill Clinton and his administration did nothing.  In the waning days of his presidency he tried “to burnish his legacy by securing a peace agreement between Israel and Palestine” (p. 331).  Within a year, however, the jihadist offensive culminated on September 11, 2001, and with the collapsing New York towers the world woke up to al-Qaeda, bin Laden, and the threat posed by radical Islam!  

* * * * * * * * * * * * * * * * * * * 

In Nazi Propaganda for the Arab World (New Haven:  Yale University Press, c. 2009), Professor Jeffrey Herf “documents and interprets Nazi Germany’s propaganda efforts aimed at Arabs and Muslims in the Middle East and North Africa” (p. 1).  From 1939 to 1945 a steady stream of anti-Semitic, anti-Allied propaganda reached millions of Muslims via shortwave radio.  These broadcasts both “attributed enormous power and enormous evil to the Jews” (p. 2) and promised that an Axis victory would free “the countries of the Middle East from the English yoke and thus realize their right to self-determination” (p. 3).  In time, of course, the Allies won WWII and little came of the Nazi endeavor to establish a foothold in the Islamic world.  But the broadcasts’ rhetoric, it can be argued, helped shape the mindset of today’s radical Muslims, for the same anti-Semitic, anti-Western message routinely circulates throughout their world.

Central to the story is Haj Amin el-Husseini, the Grand Mufti of Jerusalem, who resided in Berlin during WWII and was “the most important public face and voice of Nazi Germany’s Arabic-language propaganda” (p. 8).  He and his family were influential and he “led opposition to the Balfour Declaration and to Jewish immigration to Palestine” (p. 8).  In Berlin, he met and associated with Adolf Hitler, Heinrich Himmler, and other important Nazis.  He assured Hitler that “the Fuhrer was ‘admired by the entire Arab world.’  He thanked him for the sympathy he had shown to the Arab and especially the Palestinian cause” (p. 76).  Hitler responded by assuring Husseini that Arabs would be liberated from English domination, Jews in North Africa and the Middle East would be destroyed, and “the Mufti would be the most authoritative spokesman of the Arab world” (p. 78).  “Husseini was a key figure in finding common ideological ground between National Socialism, on the one hand, and the doctrines of Arab nationalism and militant Islam, on the other” (p. 8).  Following the war he mysteriously “escaped” and found shelter in Cairo, where he was protected and lauded for the remainder of his life, ever promoting an anti-Jewish, anti-American agenda.  

From one perspective, this book is a chronological record of what was said by the Nazi propaganda machine.  Chapter by chapter, Herf describes the shifting nature of the broadcasts, reflecting the course of WWII.  As the war began, the Nazis sought to enlist Arab support in the Middle East, where England in particular controlled considerable territory.  Thus similarities between Islam and National Socialism were stressed.  As General Rommel seemed on the verge of victory in North Africa, the broadcasts promised both the extermination of the Jews (primarily in Palestine) and freedom from British rule.  When the Allies began to turn back the German advance, the broadcasts shifted to emphasize the potential harm Muslims would suffer should the British, American, and Soviet armies succeed.  As the Third Reich collapsed, the broadcasts shifted again to emphasize conspiracies afoot in Islamic lands, blaming Jews and their supporters (especially America) for various evils.

From another perspective, however, there was a constancy to the broadcasts:  hostility to Jews and their allies.  “Radical anti-Semitism was a central component throughout the broadcasts” (p. 11).  No labels were too vicious, no rumors too unfounded, no accusations too malicious for assertion on the radio!  Arabs in North Africa and the Middle East were urged to kill Jews, following the Nazi example, aiming at the “final solution.”  They were reminded that the prophet Mohammed expelled the Jews from Arab lands and then urged to follow his example.  A broadcast in 1942 was titled “Kill the Jews before They Kill You.”  Egyptians were urged to do their duty “to annihilate the Jews and to destroy their property’” (p. 125).  Husseini always made it clear that “his hatred of the Jews was ineradicably bound to his Muslim faith and to his reading of the Koran” (p. 154).  He charged that “‘they lived like a sponge among peoples, sucked their blood, seized their property, undermined their morals yet still demand the rights of local inhabitants’” (p. 185).  Still more, he cried out:  “‘Arabs!  Rise as one and fight for your sacred rights.  Kill the Jews wherever you find them.  This pleases God, history and religion.  This serves your honor.  God is with you’” (p. 213).  

As was evident in the Grand Mufti’s messages, the Koran was the great authority invoked to appeal to Arab listeners.  Neither Hitler’s Mein Kampf nor The Protocols of the Elders of Zion was much discussed, nor were the speeches of Hitler or Himmler invoked.  Rather, texts from the Koran were continually cited to justify Nazi propaganda.  “Nazism thus stood with the ‘faithful’ and ‘noble’ Muslims against traitors who deviated from the path laid down in the Koran” (p. 197).  Himmler even urged his German scholars to link the Shi’ite hope for the coming of the Twelfth Imam to Hitler, suggesting that “‘the Koran predicts and assigns to the Fuhrer the mission of completing the Prophet’s work’” (p. 199).  Hitler could be portrayed, Himmler said, “‘as Jesus (Isa) who the Koran predicts will return and, as a knight . . . defeats giants and the king of the Jews who appear at the end of the world’” (p. 199).  

Hostility to the Jews was conjoined with hostility to the Allies (preeminently England and America) in the broadcasts.  Despite the fact that the British restricted Jewish immigration to Palestine and the Americans equivocated regarding the establishment of a Jewish state, both nations were accused of actively promoting such activities.  Egyptians particularly were portrayed as victims of British oppression and urged to drive out the foreigners.  As American troops increasingly played a role in the war, the broadcasts besmirched the USA and President Franklin D. Roosevelt in particular.  He was declared to be not only a tool in the hands of conniving Jews (such as Henry Morgenthau and Bernard Baruch) but to be a Jew himself!  In one of his broadcasts Husseini “stated that the ‘wicked American intentions toward the Arabs are now clearer, and there remain no doubts that they are endeavoring to establish a Jewish empire in the Arab world.  More than 400,000,000 Arabs oppose this criminal American movement’” (p. 213).  

Though Herf focuses almost exclusively on the historical details, it takes little imagination to apply his insights to current affairs.  Virtually the same rhetoric employed by the Nazis is evident throughout Islamic lands.  The link is quite clear in the Muslim Brotherhood, which was founded by Hassan al-Banna (a graduate of the most prestigious Islamic university, Al-Azhar in Cairo), who had “‘made a careful study of the Nazi and Fascist organizations’” (p. 225).  “The Brotherhood wanted to establish a government based on pure Koranic principles and sought to counter reliance on Western culture, which it regarded as having brought about an ‘abasement of morals, conduct and character, for having increased the complexity of society and for having exposed the people to poverty and misery’” (p. 225).  

Picking up on Muslim Brotherhood themes following WWII, Sayyid Qutb furthered the fanatical message of radical Islam, writing Our Struggle with the Jews.  “The title itself,” notes Professor Herf, “evokes disconcerting comparisons to Hitler’s Mein Kampf (My Struggle).  Most important, in its views of the Jews and in its conspiratorial mode of analysis the book displayed a striking continuity with the themes of Nazism’s wartime broadcasts, with the important difference that it was far more embedded in the Koran and Islamic commentaries” (p. 255).  Qutb asserted that the Koran “‘spoke much about the Jews and elucidated their evil psychology’” (p. 257).  This alone authorized “war against the Jews in Israel” (p. 258).  Qutb probably “listened to Nazi broadcasts and traveled in the pro-Axis intellectual milieu of the radical Islamists in and around Al Azhar University.” Thus, Herf reasons:  “Just as the Nazis had threatened the Jews with ‘punishment’ for alleged past misdeeds, so Qutb offered a religious justification for yet another attempt to ‘mete out the worst kind of punishment’ to the Jews then in Israel.  In terms that his audience understood, Our Struggle with the Jews was a call to massacre the Jews living in Israel” (p. 259).  Executed in Egypt in 1966, Qutb “became both a martyr and an ideological inspiration for such radical Islamist groups as Al Qaeda, Hezbollah, and Hamas” (p. 255).  His influence clearly permeates the thought and action of radical Muslim terrorists, including Osama bin Laden.  The vitriol regarding Jews, the anger at America, the dishonest renditions of history, the constant complaints of victimization—nothing much has changed in more than half-a-century!  

Professor Herf draws upon previously untapped documentary sources, especially a cache of materials—transcriptions of the broadcasts made in Cairo by an American ambassador and sent to Washington—that provide extensive evidence for his case.  A professor of history at the University of Maryland, he has written extensively about the Third Reich’s animosity towards the Jews.  His books include:  Reactionary Modernism:  Technology, Culture, and Politics in Weimar and the Third Reich; The Jewish Enemy:  Nazi Propaganda During World War II and the Holocaust; and Divided Memory:  The Nazi Past in the Two Germanys.  He writes as a scholar for scholars, an historian for historians, meticulously footnoting every assertion.  Above all he wants to fully, conclusively document his argument.  Consequently, as he demonstrates the recurrent message, year after year, there is an unavoidable redundancy to the presentation that taxes the reader’s patience.  But his treatise provides important evidence that enables us to understand important aspects of Islam, then and now—from Mohammed onwards, Muslims have distrusted and detested Jews and anyone else disinclined to submit to Allah.  This, rightly understood, is part and parcel of Islam, which means “surrender to God’s will” as manifest in the Prophet’s followers.  

249 “Mind and Cosmos”

   Academic philosophers rarely grace the covers of newsmagazines, but the March 25, 2013, issue of The Weekly Standard portrayed Professor Thomas Nagel bound with ropes, surrounded by demonic monks, and roasting in a fire, illustrating an article titled “The Heretic—professor, philosopher apostate.”  The reason for such attention was the recent publication of Nagel’s Mind and Cosmos:  Why the Materialist Neo-Darwinian Conception of Nature Is Almost Certainly False (New York:  Oxford University Press, c. 2012).  A professor at New York University, he enjoys an eminent position within the elite galaxy of revered intellectuals.  Before publishing this treatise he had refrained from openly questioning the entrenched naturalistic Weltanschauung of his peers so starkly set forth by Francis Crick:  “You, your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules.  Who you are is nothing but a pack of neurons.”  Taking issue with such reductive materialism and publishing a slender treatise questioning its guiding assumptions has elicited outrage and abuse from his erstwhile colleagues, but in doing so Nagel did the real work of a philosopher—following the evidence and seeking the truth rather than tacking to the winds of opinion.  

He admits that “for a long time I have found the materialist account [given “canonical exposition” in Richard Dawkins’ The Blind Watchmaker] of how we and our fellow organisms came to exist hard to believe, including the standard version of how the evolutionary process works.  The more details we learn about the chemical basis of life and the intricacy of the genetic code, the more unbelievable the standard historical account becomes” (p. 5).  The more we understand about life and the cosmos the less adequately Neo-Darwinianism explains things.  Though personally a humanist atheist, he finds common ground with the advocates of Intelligent Design such as Michael Behe and Stephen Meyer, persuasive critics of the dominant paradigm.  He’s come to seriously consider the possibility that mind, rather than matter, shapes Reality.  Consequently:  “My guiding conviction is that mind is not just an afterthought or an accident or an add-on, but a basic aspect of nature” (p. 16).  This is particularly evident when we turn our attention to what we know best—ourselves!  “Something more is needed to explain how there can be conscious, thinking creatures whose bodies and brains are composed of those elements.  If we want to try to understand the world as a whole, we must start with an adequate range of data, and those data must include the evident facts about ourselves” (p. 20).  Unfortunately:  “Evolutionary naturalism implies that we shouldn’t take any of our convictions seriously, including the scientific world picture on which evolutionary naturalism itself depends” (p. 18).  

The mysterious and absolutely indubitable reality of human consciousness highlights the inadequacy of evolutionary naturalism.  “Organisms such as ourselves do not just happen to be conscious; therefore no explanation even of the physical character of those organisms can be adequate which is not also an explanation of their mental character.  In other words, materialism is incomplete even as a theory of the physical world, since the physical world includes conscious organisms among its most striking occupants” (p. 45).  Scholars like Dawkins and Crick, who reduce consciousness to material entities, fail to properly distinguish between description and explanation; observing neurons firing in the brain does not begin to adequately explain the phenomenon of consciousness.  Far better, Nagel says, is the ancient Aristotelian conception of “teleological laws” guiding natural processes.  In addition to matter-in-motion, there may well be “something else, namely a cosmic predisposition to the formation of life, consciousness, and the value that is inseparable from them” (p. 123).  Old Aristotle may well have erred, but he now appears wiser than his modern antagonists!  As for theists, a creative God certainly provides a satisfactory explanation.  No final explanation for consciousness fully persuades Nagel, but he knows that the Neo-Darwinian answer lacks cogency.  What we must seek, he argues, is “a form of understanding that enables us to see ourselves and other conscious organisms as specific expressions simultaneously of the physical and mental character of the universe” (p. 69).  

What’s true for consciousness is even truer for cognition—our incredible ability to reason.  We are not only aware of ourselves as thinking beings but we can transcend our personal perspectives and objectively discover momentous realities such as the law of gravity.  Evolutionary naturalism fails, abysmally, to explain the existence and unique mental powers of our species, so properly labeled homo sapiens.  “Rationality, even more than consciousness, seems necessarily a feature of the functioning of the whole conscious subject, and cannot be conceived of, even speculatively, as composed of countless atoms of miniature rationality” (p. 87).  

Then add to cognition conscience!  Add to speculative reason practical reason.  We do, countless times a day, evaluate things, judging them good and evil, right and wrong.  And such judgments range far beyond our individual feelings or interests.  I may very well be more outraged by the former San Diego Mayor Bob Filner’s abusive behavior than by an undeserved personal insult.  I may very well be more concerned with the national debt’s impact on future generations than by the sharp increase of my electric bill, though both result from irresponsible politicians’ decisions.  To Nagel, only the “moral realism” expounded by traditional thinkers such as Aristotle and C.S. Lewis enables us to craft ethical principles and render moral judgments; and “since moral realism is true, a Darwinian account of the motives underlying moral judgment must be false, in spite of the scientific consensus in its favor” (p. 105).  

Inasmuch as consciousness, rationality and morality define us as human beings—and inasmuch as evolutionary naturalism cannot explain these fundamental realities—we must, Nagel says, open our minds to better ways of thinking and understanding the universe, taking “the appearance and evolution of life as something more than a history of the development of self-reproducing organisms, as it is in the Darwinian version” (p. 122).  A better version is wanted!  For, Nagel concludes:  “I would be willing to bet that the present right-thinking consensus will come to seem laughable in a generation or two” (p. 128).  No wonder “the present right-thinking” guardians of secular orthodoxy turned venomous when confronted with Nagel’s intellectual rigor and incisive logic!  

* * * * * * * * * * * * * * * * * *

In many of his writings C.S. Lewis trenchantly critiqued the philosophical naturalism masquerading as “science” in the modern world.  This he labeled “scientism,” carefully differentiating it from authentic “science,” with its rigorous methodology and tentative hypotheses.  The intrinsic nihilism and potential brutality of “scientism” was philosophically exposed in Lewis’s The Abolition of Man and memorably portrayed in his That Hideous Strength, one of the great dystopias of the 20th century.  The same message is manifest (though without Lewis’s theistic foundation) in Raymond Tallis’ recent Aping Mankind:  Neuromania, Darwinitis and the Misrepresentation of Humanity (Durham, U.K.:  Acumen Publishing Limited, c. 2011).   As a medical doctor (and “atheistic humanist”) who taught for many years at the University of Manchester, devoting himself to brain science, he is thoroughly aware of neuroscience and its implications for understanding human nature.  But he has become increasingly distressed by the unwarranted supposition (what he dubs “neuromania”) that we are no more than our brains, ignoring the importance of common sense, consciousness and culture, art and religion.  As widely propounded in both scholarly and popular circles:  “The neurophysiological self is at best the locus of ‘one damn thing after another’, which hardly comes near to the self of a human being who leads her life, who is a person reaching into a structured future with anticipations, aims and ambitions, that are themselves rooted in an almost infinitely complex accessible past that makes sense of them” (p. 135).   

Even on a purely material level man’s brain eludes easy analysis.  Though specific neurological sections clearly do specific things (e.g. seeing; hearing), they are capable of alternative and adaptive roles.  Rather than being “hard-wired” like a computer, the brain has a beguiling “plasticity” enabling it to reorganize under certain conditions.  The brain is clearly necessary for us to think—but it is not necessarily a sufficient explanation of our thinking.  Neurologists may chart correlations between neurons firing and mental activity, but as elementary logic reminds us a correlation must never be equated with causation.  “The errors of muddling correlation with causation, necessary condition with sufficient causation, and sufficient causation with identity lie at the heart of the neuromaniac’s basic assumption that consciousness and nerve impulses are one and the same, and that . . . ‘the mind is a creation of the brain’” (p. 95).  Quite the contrary, Tallis argues:  “mental events are not physical events in the brain” (p. 133).  

Undergirding the notion that the mind is the creation of the brain is the evolutionary assumption that we are nothing but the clever animals Daniel Dennett declares as part and parcel of  “Darwin’s Dangerous Idea,” the “universal acid” that cuts away all confidence in what philosophers call qualia—intentionality and meaning,  morality and justice, freedom and responsibility, beauty and love.  To Tallis, any theory that discounts such qualia (intensely felt personal realities basic to human experience) demands disbelief!  Obviously “nerve impulses are not at all like qualia” (p. 95) and any attempt to explain away the latter by describing the former cannot but miscarry.  Indeed, “we shall find, again and again, that we cannot make sense of what the brain is supposed to do—in particular postulating an intelligible world in which it is located—without appealing to talk about people who are not identical with their brains or with material processes in those brains” (p. 111).  

The “Darwinitis” infecting “neuromaniacs” is similarly suspect to Tallis.  “If they only looked at what was in front of their noses they would not have to be told that there are differences between organisms and people:  that a great gulf separates us from even our nearest animal kin” (p. 147).  In almost every significant way we differ from other animals!  “Many of our strongest appetites—for example, for abstract knowledge and understanding—are unique to us” (p. 151).  Our finest endeavors—writing and reading books, composing and listening to symphonies—have no parallel in the animal kingdom.  Importantly, to Tallis, embracing Darwinism as an explanation for human origins does not necessarily entail accepting “Darwinitis (which purports to explain everything about people in terms of biological evolution)” (p. 153).  Especially problematic is any Darwinian explanation of human consciousness, the fundamental reality known to us.  “In short, if it is difficult (although not in principle impossible) to see how living creatures emerged out of the operation of the laws of physics on lifeless matter, it is even less clear how consciousness emerged or why it should be of benefit to those creatures that have it.  Or, more precisely, why evolution should have thrown up species with a disabling requirement to do things deliberately and make judgments” (p. 179).  

Whether humanizing animals or animalizing man, Darwinitis demands its disciples deny non-material realities of any sort.  Consequently they remain “bewitched” by figures of speech comparing us with computers or machines, dolphins or chimps.  Unlike computers, however, we think and skillfully program computers, which are not conscious and cannot reason.  Even the most sophisticated supercomputers “are as zombie-like as pocket calculators” (p. 195).  We, conversely, uniquely use languages that are not at all computational!  In our languaging we reveal our freedom and dignity (realities necessarily denied by neuromaniacs) as human beings, and in our literature we revel in our uniquely human creativity.  

Sadly enough, Tallis says, even our current humanities (history, philosophy, art and music) have fallen captive to Darwinitis and neuromania, reducing literally all our activities to “animalities,” i.e. matter-in-motion.  Thus we find Shakespearean scholars studying Macbeth’s grasping for an imaginary dagger and declaring:  “‘when moving his right hand, an actor playing Macbeth would activate the right cerebellar hemisphere and the left primary cortex’” (p. 294)!  Such “scholarship,” relentlessly marching through academia, should give us pause, Tallis says, because it ruthlessly destroys all that grants grandeur to our literary treasures.  Similarly, we must be alerted to the flourishing academic discipline of “neuro-evolutionary ethics” espoused by thinkers such as Patricia Churchland, who insists that “‘it is increasingly evident that moral standards, practices and policies reside in our neurobiology’” (p. 317).  Thus, as Albert Einstein asserted in 1932, in our “thinking, feeling, and acting” we do nothing freely “‘but are just as causally bound as the stars in their motion’” (p. 312).  

Such views, coming to the foreground in our world, lead Tallis to warn:  “Be afraid, be very afraid.”  Indeed, Tallis is sufficiently afraid to look favorably on “at least the idea” of God (p. 325).  The consequences of the atheism he embraces embarrass him!  Though irreligious himself, he finds the traditional notion of God preferable to the simplistic “biologism” espoused by prominent atheists such as Richard Dawkins, whose “devastating reductionism . . . disgusts even an atheist like me.  In defending the humanities, the arts, the law, ethics, economics, politics and even religious belief against neuro-evolutionary reductionism, atheist humanists and theists have a common cause and, in reductive naturalism, a common adversary:  scientism” (p. 336).  

* * * * * * * * * * * * * * * * * 

For many years Alvin Plantinga has effectively represented the Christian perspective among academic philosophers.  Illustrating his prestige among his peers, he was invited to deliver the Gifford Lectures in 2005.  In print the lectures are titled:  Where the Conflict Really Lies:  Science, Religion, and Naturalism (New York:  Oxford University Press, c. 2011).  Unlike some Gifford lecturers (e.g. William James, in The Varieties of Religious Experience), Plantinga writes almost exclusively for his peers, and this treatise will be accessible only to folks with ample background in the denser realms of science, philosophy and theology.  His thesis, in short, claims:  “there is superficial conflict but deep concord between science and theistic religion, but superficial concord and deep conflict between science and naturalism” (#89 in Kindle ed.).  Theists such as himself need not deny evolutionary evidence, but they cannot abide a naturalistic “add-on to the scientific doctrine of evolution:  the claim that evolution is undirected, unguided, unorchestrated by God (or anyone else)” (#142).  Though vociferously denied by its secular proponents, scientific “naturalism” assumes a religious role in their thinking and may be understood as a “quasi-religion” whose presumptions clearly conflict with the data of consciousness and cognition.  Unfortunately, as Paul Feyerabend wisely noted years ago:  “Scientists are not content with running their own playpens in accordance with what they regard as the rules of the scientific method; they want to universalize those rules, they want them to become part of society at large” (Against Method, p. 220).  

We Christians especially should take seriously the scientific discoveries and insights of our time.  Created in God’s image, we are uniquely equipped “to know and understand something of ourselves, our world, and God himself” (p. 4).  All truth is God’s truth and we can (in part, looking through a dark glass) know it.  Unfortunately, all too many modern “scientists” abandon their circumscribed vocation and become amateur philosophers when promoting their naturalistic (and generally atheistic) convictions.  By carefully reading Richard Dawkins’ The Blind Watchmaker and The God Delusion—and demanding such necessities as demonstrable evidence and cogent explanation, logical rigor and unambiguous definitions— Plantinga effectively illustrates Dawkins’ sophomoric shortcomings.  Dispatching Dawkins, he then dissects Daniel Dennett’s Darwin’s Dangerous Idea—“a paradigm example of naturalism” (p. 36).  Though by profession a philosopher (whereas Dawkins is a biologist dispensing philosophy), Dennett apparently has failed to do the honest intellectual toil necessary to actually engage the great theists of the past!  Consequently, many of his arguments (like Dawkins’) prove jejune to a first-rate thinker such as Plantinga—“about as bad as philosophy (well, apart from the blogosphere) gets” (p. 45).  

Much the same can be said of “evolutionary psychologists” who “explain distinctive human traits—our art, humor, play, love, sexual behavior, poetry, sense of adventure, love of stories, our music, our morality, and our religion itself—in terms of adaptive advantages accruing to our hunter-gatherer ancestors” (p. 131).  Thus Harvard’s Steven Pinker devoted only eleven pages of his 660-page How the Mind Works to music; he explained that music “was useless” in terms of human evolution and development and should be regarded “as ‘auditory cheesecake,’ a trivial amusement that ‘just happens to tickle several important parts of the brain in a highly pleasurable way, as cheesecake tickles the palate’” (p. 132).  Such Pinkerian statements simply illustrate the intellectual vacuity of celebrated academics!  

On a more constructive level, after meticulously answering objections to the possibility of God’s intervention in the world, Plantinga suggests that God could easily work through both the “macroscopic” and “microscopic” realms, exercising providential guidance over both “cosmic” and “evolutionary” history and doing so “without in any way ‘violating’ the created natures of the things he has created” (p. 116).  Still more, Plantinga confesses:  “perhaps God is more like a romantic artist; perhaps he revels in glorious variety, riotous creativity, overflowing fecundity, uproarious activity. . . .  Perhaps he is also very much a hands-on God, constantly active in history, leading, guiding, persuading and redeeming his people, blessing them with ‘the Internal Witness of the Holy Spirit’ (Calvin) or ‘the Internal Instigation of the Holy Spirit’ (Aquinas) and conferring upon them the gift of faith.  No doubt he is active in still other ways.  None of this so much as begins to compromise his greatness and majesty, his august and unsurpassable character” (p. 107).  Equally possible, God may very well have created “a theater or setting for free actions on the part of human beings and other persons”—“a world of regularity and predictability” wherein we function in accord with our imago dei status (p. 119).  

Thus good science poses no “defeaters” for Christian faith.  There is in fact deep concord between them.  As Sir Isaac Newton said, in Principia Mathematica:  “This most beautiful system of the sun, planets and comets, could only proceed from the counsel and dominion of an intelligent and powerful being. . . .  This Being governs all things, not as the soul of the world, but as Lord over all” (p.).  Today’s cosmologists often reflect on the apparent “fine tuning” of the universe that suggests a profound teleological process culminating in a world “just right” for us human beings.  Current advocates of “Intelligent Design” such as Michael Behe have detailed persuasive evidence and argued that “irreducibly complex” structures cannot be adequately explained by Neo-Darwinians who insist the evolutionary process is absolutely unguided.  Plantinga effectively demonstrates the thoroughly rational and philosophically defensible position that there is every reason to believe in a Mind behind the Cosmos.