138 Christians and War


In Mere Christianity, C.S. Lewis explains the “just war” position I’ve come to embrace as my own:  “Does loving your enemy mean not punishing him?  No, for loving myself does not mean that I ought not to subject myself to punishment–even to death.  If I had committed a murder, the right Christian thing to do would be to give myself up to the police and be hanged.  It is, therefore, in my opinion, perfectly right for a Christian judge to sentence a man to death or a Christian soldier to kill an enemy.  I have always thought so, ever since I became a Christian, long before the war, and I still think so now that we are at peace.  It is no good quoting “thou shalt not kill.”  There are two Greek words:  the ordinary word to kill and the word to murder.  And when Christ quotes that commandment He uses the murder one in all three accounts, Matthew, Mark, and Luke.  And I am told there is the same distinction in Hebrew.  All killing is not murder any more [p. 92] than all sexual intercourse is adultery.  When soldiers came to St. John the Baptist asking what to do, he never remotely suggested that they ought to leave the army:  nor did Christ when He met a Roman sergeant-major–what they called a centurion.  The idea of the knight–the Christian in arms for the defense of a good cause–is one of the great Christian ideas” (pp. 91-92). 

Since I deal with the issue of war in some of the classes I teach, during the recent war in Iraq I read or re-read several treatises devoted to the broader issue of Christians engaging in combat.  One of the most widely-cited and most respected is Roland H. Bainton’s Christian Attitudes Toward War and Peace:  A Historical Survey and Critical Re-Evaluation (Nashville:  Abingdon Press, c. 1960).  Since Bainton is an eminent historian, the author of Here I Stand, one of the finest biographies of Martin Luther, one might expect a dispassionate, objective survey of the evidence.  I’d read the book 30 years ago and accepted it as something of a definitive survey.  Returning to the treatise, however, I realize that one should begin reading it from the end!  In the next to the last chapter, Bainton declares that “pacifism” is the only legitimate Christian position (p. 248).  This leads him, in his final chapter, to attempt to invalidate natural law arguments that support “just wars.”  Then he reveals the bias that underlies his work, for he urges the United States to “disarm unilaterally,” hoping the USSR would honor such a move, and subordinate the nation’s sovereignty to a “world government” of some sort (p. 256).  Still more:  this world government should institute a “planned economy” (p. 258) of a clearly socialist sort.  Aligning himself with leftists such as Norman Cousins and Bertrand Russell, Bainton seems to reduce the “kingdom of God” to the social gospel utopia so popular in 20th century liberal academic circles.

Read with Bainton’s bias in mind, his book certainly provides much valuable historical information.  His meticulous footnotes are especially useful in locating the sources he discusses.  He touches on concerns for war and peace in the Greco-Roman world, noting the development of “just war” thought in Cicero.  He acknowledges that the Old Testament, recording the conquest of Canaan–and more especially the deuterocanonical books detailing the Maccabees’ revolt–provided a certain basis for Medieval “crusading.”  The New Testament, he insists–especially Jesus’ teachings–provides a basis for “pacifism,” the position he argues was embraced by the pre-Constantinian Church. 

Since I’ve read most of the primary sources in this era, and since Bainton says that “the early Church” is “the best qualified to interpret the mind of the New Testament” (p. 66), I carefully scrutinized this section, reading the original sources cited in his footnotes.  What one finds, when reading the alleged “pacifists” of the Early Church, is passing references to war within passages devoted to idolatry or personal purity.  Some of the citations, put in context, are frankly irrelevant to the discussion.  Some of them merely stress the importance of loving everyone, including one’s enemies.  Clement of Alexandria is cited for opposing war–but the passages where he says otherwise are not noted!  Both Tertullian and Origen, two of the four most generally cited “pacifists” in this era, warmly supported the Empire and stressed that by praying for Roman soldiers Christians did more to protect the state than they would by joining the army.  Bainton acknowledges that writers, such as Tertullian, opposed military service, while Christian soldiers, at the same time, “were not excluded from communion” (p. 66).  He notes that pacifism “flourished” in safe enclaves like Alexandria and Carthage, where there was no war, though Christian soldiers evidently served on the frontier (in Armenia, for instance) where barbarians threatened.  He makes absolutist statements, such as “no Christian author approved of participation in battle,” followed in the very next sentence by the acknowledgment that “the position of the Church was not absolutist, however” (p. 66).  In sum:  there’s no solid evidence, but Bainton chooses to believe that pacifism better reflected the mind of the Early Church! 

The “just war” doctrine developed decisively following Constantine’s edict of toleration, when Christians increasingly assumed responsibilities for their society–magistrates, police, courts, soldiers, etc.  Understandably, apart from the vigorous monastic movement, Christians found that they could not withdraw from the world and hope non-Christians would do the “dirty” work necessary to enact and enforce laws and protect people from evil-doers.  Ambrose and Augustine, especially, justified war when fought according to Christian principles.  Augustine’s position proved to be “of extreme importance because it continues to this day in all essentials to be the ethic of the Roman Catholic Church and of the major Protestant bodies” (p. 99). 

Subsequent centuries generally supported Augustine’s position, though scattered  pacifists registered their protests.  During the Middle Ages, Christians added crusading to the just war position, and Bainton, predictably, has little good to say about these efforts to retake lands lost to the Muslims.  Some sectarian movements, such as the Waldensians and Cathari, espoused pacifism, as did some humanists during the Renaissance.  The magisterial Protestant Reformers, of course, supported “just war,” while Anabaptists made pacifism something of a rule of faith. 

During the Enlightenment, secular thinkers like Immanuel Kant, in Perpetual Peace, espoused a prudential pacifism, and during the 19th century opposition to war increased.  And to the degree Protestant Liberals promoted the “social gospel,” a commitment to peace, as well as social justice, marked their agenda.  William Ellery Channing, the famous Unitarian, for instance, inveighed against the inhumanity of all war, and the Quakers served at the front lines of the pacifist movement.  With Leo Tolstoy, increased numbers of Christians exalted the kenotic Christ, who renounced all power to lead the exemplary life we’re called to embrace.


Another pacifist manifesto, The Early Christian Attitude to War:  A Contribution to the History of Christian Ethics (New York:  The Seabury Press, 1982), by C. John Cadoux, though written in 1919 as a volume in the “Christian Revolution Series,” has remained a staple in the anti-war library.  A deeply learned examination, Cadoux’s treatise deals honestly with the sources, documenting that the Early Church allowed diverse opinions concerning war. 

He argues that Jesus’ teaching underlies the pacifist position, though he acknowledges that certain passages in the New Testament authorize military action.  While he insists that the Early Church disapproved of war, he also admits that the handful of anti-war dissenters, such as Tertullian, Hippolytus and Lactantius (primary sources for pacifism), were never accepted as teaching authorities by the established Church.  On the other hand, just war advocates, such as Augustine, were elevated to the position of Doctors whose positions were generally taken to be normative.  He also notes the Old Testament’s approval of righteous warfare and examines the various documents that indicate the presence of Christian soldiers in the period before Constantine as well as thereafter. 

Apart from my appreciation for the many sources examined and documented, I found Cadoux’s position seriously flawed on at least two counts.  First, the book seems to be written with little concern for the broader Roman world within which the Early Church flourished.  The Pax Romana, as the words indicate, ensured empire-wide peace for nearly two centuries (28 B.C.-180 A.D.).  Certainly there was warfare on the frontiers.  Obviously there were internal conflicts, such as the Jewish insurrection that led to the destruction of Jerusalem in 70 A.D.  But one must always remember that one reason Christians said very little about war was that there were few wars.  Still more, Cadoux says little about what military service involved–and fails to note that the religious commitments required of soldiers, more than fighting in wars, best explain the anti-military pronouncements of rigorists such as Tertullian.  Second, Cadoux admits that a score of highly regarded historians (e.g., Harnack, Troeltsch, Ramsey) do in fact differ with his assertions, but he fails to explain effectively why his interpretation of the evidence should be accepted–other than that it provides ammunition for the pacifist movement. 

Like Bainton, Cadoux helps guide us to the sources–and their footnotes and bibliographies are quite helpful.  Both, however, must be read with an awareness of the argument being advanced. 


Radically differing from Bainton and Cadoux, Darrell Cole, a professor at Drew University, has recently published a positive perspective on the “just war” tradition, When God Says War Is Right:  The Christian’s Perspective on When and How to Fight (Colorado Springs:  WaterBrook Press, c. 2002).  The book enjoys Chuck Colson’s commendation:  “For many years I have read about, thought about, written about, and spoken about just war.  Nothing I’ve studied, however, has taught me as much as Darrell Cole’s book.  Cole’s in-depth research and clear writing style yield what I believe will become a new classic work in the field.  The fact that our nation is attempting to prosecute a just war on terrorism makes Cole’s book both timely and an indispensable resource for policymakers and the citizens who hold them accountable.”

Cole’s goal is “to present the traditional Christian just war doctrine in a clear, accessible manner” (p. 2), accurately explaining the position finely honed by Thomas Aquinas and John Calvin in particular.  That these two theologians–arguably the greatest Catholic and the greatest Protestant thinkers–agreed in teaching the responsibility for waging a “just war” lends credence to Cole’s view that war is rightly considered a “good” endeavor when carefully implemented.  This is because Christian love, rooted in the very character of God, prompts one to use force when appropriate to protect innocent people and to establish the peace that is good for everyone.

To defend his position, he evaluates the pacifist option.  He shows where those (like Bainton and Cadoux) who argue that the Early Church was pacifist are wrong.  The best recent historical studies simply present a mixed picture.  Before Constantine the few references available to us show that some Christians opposed and some supported taking up arms and serving the state as a soldier.  Interestingly enough, they almost all admired soldierly virtues such as courage and employed military imagery in their descriptions of spiritual valor.  They also, without exception, supported the Empire’s police and military personnel–urging, as did Origen, that Christians pray for the triumph of Roman armies.  With the triumph of Constantine, of course, Christians increasingly assumed various responsibilities for secular rule, and the greatest theologians of the 4th and 5th centuries–Basil of Caesarea, Ambrose, Augustine–worked out the “just war” criteria that would subsequently shape Christian thinking on the subject.  “In Ambrose’s eyes, the Christian who stands idly by while his neighbor is attacked is no virtuous person, and perhaps not even a Christian” (p. 21). 

Defining the just war, Cole says that five criteria have generally been invoked on behalf of jus ad bellum (just reasons for going to war).  They are:  “(1) proper authority, (2) just cause, (3) right intention, (4) war as the only way to right the wrong, and (5) reasonable hope of success” (p. 78).  Added to that are the criteria for jus in bello (justly waging war), that prescribe “discrimination” (fighting without deliberately taking civilians’ lives) and “proportion” (appropriately limiting the means employed).  Cole carefully explains that one can foresee bad things happening, when one pursues a certain course, without specifically intending for them to occur.  So “collateral” casualties inevitably accompany armed conflict, but that does not negate the righteous intent with which one pursues his goal. 

Having explained what constitutes a “just war,” Cole then looks at WWII, the Vietnam and Gulf wars, the possibility of nuclear war, and the current conflict with Muslim terrorists.  It’s clear that many wars–at points at least–fail to meet just war criteria.  Even WWII, when one looks at things like saturation bombing, had its unjust aspects.  But, Cole insists, wars will erupt, and Christians must assume responsibilities for their world, including an effort to wage truly just wars. 


A handy compendium, War and Christian Ethics (Grand Rapids:  Baker Book House, c. 1975), ed. by the distinguished Wheaton College philosopher, Arthur F. Holmes, is still one of the best volumes available.  After a helpful introduction, there are selections from Plato and Cicero, illustrating the “Pagan Conscience.”  Then documents from the Early Church illustrate the “conflict of loyalties,” pitting the non-violent views of Athenagoras, Tertullian, Origen, and Lactantius against the just war positions of Ambrose and Augustine.  The Medieval and Reformation eras reveal a virtual consensus in support of just wars. 

Martin Luther’s statement is both strong and typical:  “‘For example, a good doctor sometimes finds so serious and terrible a sickness that he must amputate or destroy a hand, foot, ear, eye, to save the body.  Looking at it from the point of view of the organ he amputates, he appears to be a cruel and merciless man; but looking at it from the point of view of the body, which the doctor wants to save, he is a fine and true man and does a good and Christian work, as far as the work itself is concerned.  In the same way, when I think of a soldier fulfilling his office by punishing the wicked, killing the wicked, and creating so much misery, it seems an un-Christian work completely contrary to Christian love.  But when I think of how it protects the good and keeps and preserves wife and child, house and farm, property, and honor and peace, then I see how precious and godly this work is; and I observe that it amputates a leg or a hand, so that the whole body may not perish'” (p. 143).   

Moving to more recent times, pacifists such as Robert Drinan have argued that the Gospel mandates pacifism, whereas Reinhold Niebuhr and Paul Ramsey insisted that it does not.  To Niebuhr, pacifism is not simply an alternative Christian position.  It is, he insisted, profoundly wrong, for “there is not the slightest support in Scripture for this doctrine of non-violence” (p. 306).  Pacifists have, Niebuhr says, “reinterpreted the Christian gospel in terms of the Renaissance faith in man” (p. 307), a faith pervasive in modern Christian circles that emphasize the earthly establishment of the “Kingdom of God.”

Finally, addressing a related but somewhat different issue, is Stanley N. Gundry, ed., Show Them No Mercy:  4 Views on God and Canaanite Genocide (Grand Rapids:  Zondervan, c. 2003).  Four distinguished scholars advance their views in brief essays and respond to those of the other three, providing an open and challenging debate that nicely explores God’s role in the conquest of Canaan.  In “The Case for Radical Discontinuity,” C.S. Cowles, of Point Loma Nazarene University, argues that the God who authorized the killing and conquest described in the Old Testament cannot be harmonized with the God of love revealed in the New.  He finds inadequate all efforts to reconcile a loving God with a Warrior Lord.  Loving, not conquering, one’s enemies is the way of Jesus–and since Jesus reveals God, the Old Testament wars simply do not reveal Him truthfully.  Unwilling to affirm “the inerrancy and infallibility of all Scripture” (p. 15), Cowles takes the Old Testament as only a partial (and in parts seriously flawed) revelation of the God revealed in Christ.

Eugene H. Merrill, a professor of Old Testament at Dallas Theological Seminary, makes “The Case for Moderate Discontinuity,” arguing that the “Jahweh war” called for in passages such as Ex. 23 and Dt. 20 must be understood as a “war against spiritual darkness and wickedness in realms that transcend the human and earthly” (p. 76).  Thus the conquest of Canaan is part of God’s plan for man’s salvation, and the wars authorized therein must be restricted to that time and place and purpose.

Daniel L. Gard, a theologian at Concordia Theological Seminary in Fort Wayne, IN, argues “The Case for Eschatological Continuity” by aligning the Old Testament’s wars with the Second Advent of Christ revealed in Matt. 25 and the book of Revelation.  Then Tremper Longman III, a professor of Old Testament at Westmont, builds “The Case for Spiritual Continuity,” refusing to grant any difference between the God of Israel and the God of Jesus.  God fought, in the past, as a Warrior, and he will come as a Warrior at the end of time, sitting in final Judgment. 

The issue discussed in this book is one of the most difficult one encounters in reading the Bible.  These essays facilitate listening to the four positions, weighing the evidence, and coming to a conclusion.

137 When Character Was King


I never voted for Ronald Reagan and often criticized his policies. Towards the end of his presidency, however, it dawned on me that he was in fact an unusually gifted leader. By the mid-1980s I’d also discovered how politicians like Jimmy Carter–and propagandists for the “evangelical left” such as Jim Wallis and Sojourners Magazine–had misled me. I naively embraced Carter’s praise for Nicaragua’s Sandinistas as well as his doomsday scenarios regarding ecological destruction. By 1990, however, I began to admit that Jimmy Carter had nearly ruined the country whereas Ronald Reagan had revived it. And I began to suspect I knew very little about the real Reagan. So, better late than never, I’ve begun to rectify my knowledge by reading some studies of the man. One of the best is Peggy Noonan’s When Character Was King: A Story of Ronald Reagan (New York: Viking, c. 2001).

Noonan was a young speech-writer in the Reagan White House, and her “insider” contacts enable her to add illuminating anecdotes to her story. Others provide more scholarly studies, but she brings a journalist’s skill to portraying him in a compelling manner. Her chapter on his ranch, for example, explains much about Ronald Reagan. He loved the land, the hard work involved in clearing trails, the opportunity to ride horses, the simple lodging, the beauty of the natural world. “The people who came to this house always described it the same way: humble, basic, simple, plain, unpretentious. And then they’d always say: Like him” (p. 109).

Her favorite story confirming this occurred in the hospital following the assassination attempt just two months into his presidency. Vice President Bush visited him and found him on his knees mopping up water around the sink. Amazed to see the President in such a posture, Bush asked Reagan what he was doing. He replied that they wouldn’t let him take a bath so he’d given himself a sponge bath and slopped water on the floor, so he was cleaning up his mess. Bush reminded him that nurses did that sort of thing. But Reagan insisted that he wouldn’t think of having a nurse do the dirty work for him! “When I try to tell people what Reagan was like,” says Noonan, “I tell the bathroom story” (p. 187).

Noonan’s book takes a chronological approach, explaining the importance of Reagan’s family and Illinois youth. His devout, evangelical mother deeply influenced his moral development–and it was her Bible he used when sworn in as President. His resolve and willingness to work led to a college education at Eureka College, success in radio broadcasting during the Great Depression, and ultimately Hollywood stardom. Following WWII he became deeply involved in a labor dispute, due to his position as president of the Screen Actors Guild. He further discovered the power of Communism in certain Hollywood sectors. Months of intense negotiations, multiplied physical threats, the collapse of his first marriage–all thrust Reagan into a different, demanding arena. But he learned. And much that served him well in politics was learned from these Hollywood conflicts. His movie career declined, but fortunately he managed to make a living giving speeches and hosting TV series. He married again, finding in Nancy a woman who singularly devoted herself to him.

Doors opened for him to enter politics, especially following his 1964 speech for Barry Goldwater that was telecast and brought him to the attention of the nation. Reagan made the case for Goldwater “that Goldwater had never managed to make for himself. And in making the case for Goldwater, he made the case, in effect, for modern political conservatism” (p. 87). Two years later he was elected Governor of California and served two successful terms. Then he unsuccessfully challenged Gerald Ford for the Republican nomination in 1976. Four years later he was elected President.

As President, Reagan demonstrated his character by carrying through on his promises. He promised to cut inflation, and it fell from 14 to 3 percent. He promised to cut taxes, and the top tax rate fell from 78 to 35 percent. He promised to get the economy going, and the Dow Jones soared from 800 to 2400. He promised to reduce unemployment, and he did. He promised to lower interest rates, and by 1989 they were less than half what they had been in 1980. He promised to constrict federal regulations, and the Federal Register shrank from 87,000 pages of rules and regulations to 47,000. And, except for the first two years, he had to work with a Democratic majority in the House of Representatives!

Dinesh D’Souza, born in India and educated in the United States, served as Senior Domestic Policy Analyst under Reagan from 1987 to 1988. His Ronald Reagan: How an Ordinary Man Became an Extraordinary Leader (New York: The Free Press, c. 1997) provides an admiring analysis of the President’s political accomplishments. Like many conservatives, he early admired Reagan the man but underestimated him as a statesman. He now ranks him along with Washington and Lincoln as one of the nation’s greatest presidents, preeminently in two areas: foreign policy and domestic economy.

He begins with some arresting vignettes contrasting the alleged wisdom of the Harvard elite with that of Reagan. Arthur Schlesinger, Jr., for example, asserted in 1982 that folks imagining the Soviet Union would soon collapse were only “kidding themselves.” Economist John Kenneth Galbraith, highly revered within the Democratic Party, praised the USSR’s “great material progress” and asserted ordinary Russians were prosperous and happy. Furthermore, he asserted, in 1984: “the Russian system succeeds because, in contrast with the Western industrial economies, it makes full use of its manpower'” (p. 2). Lester Thurow, another eminent economist, praised, in 1989, the “remarkable performance” of the USSR, equaling that of the US.

The intellectual elite, of course, loved to ridicule President Reagan. Clark Clifford dismissed him as an “amiable dunce” (p. 14). But from the beginning of his presidency he insisted that “The Soviets can’t compete with us” (p. 4). He deeply believed that America was, in all ways, superior to the USSR. Marxist ideology, he believed, was profoundly wrong and could not stand the light of truth. “In 1987, Reagan spoke at the Brandenburg Gate in West Berlin. ‘In the communist world,’ he said, ‘we see failure, technological backwardness, declining standards. . . . Even today, the Soviet Union cannot feed itself.’ Thus the ‘inescapable conclusion’ in his view was that ‘freedom is the victor.’ Then Reagan said, ‘General Secretary Gorbachev . . . . Come here to this gate. Mr. Gorbachev, open this gate. Mr. Gorbachev, tear down this wall'” (p. 4).

The wall came down two years later, largely because of Reagan’s steely resolve and policies. In Margaret Thatcher’s opinion, “‘Ronald Reagan won the cold war without firing a shot'” (p. 23). To Henry Kissinger, his success was “‘the greatest diplomatic feat of the modern era'” (p. 134). He did so, in part, through his words. His “evil empire” speech in 1983 angered the dovish Washington establishment (especially State Department functionaries), but it brought hope to millions suffering Soviet oppression. That speech demonstrated “what Vaclav Havel terms ‘the power of words to change history'” (p. 135).

He was also committed to “peace through strength,” the main plank of the “Reagan Doctrine.” Whereas Jimmy Carter vitiated the armed services, Reagan initiated a massive rebuilding of America’s military. Whereas Carter stood meekly aside while dictatorships and Communism dramatically advanced around the globe, Reagan steadfastly resisted them at every turn. In Iran, in Nicaragua, Reagan refused to appease regimes he considered “evil.” Supporting (often covertly) Solidarity in Poland and the Contras in Nicaragua, Reagan’s efforts ultimately enabled lovers of freedom to triumph in various places. Though highly controversial, his invasion of Grenada, where thousands of Cubans were helping to establish a Marxist regime, proved a highly significant move, restoring democracy to a troubled island. Responding instantly to Muslim terrorism, he bombed Libya into reticence.

In addition to his accomplishments in foreign affairs, Ronald Reagan helped ignite economic developments in the 1980s. His hostility to big government moved him, in the early ’60s, from FDR’s Liberalism to Goldwater Conservatism. He realized that “we are all eager to play the role of the selfish looter; we all like to get money for nothing. Thus we are always tempted to support government measures that impoverish other citizens to enrich ourselves. The principle was stated by that Fabian socialist George Bernard Shaw: ‘A government which robs Peter to pay Paul can always depend on Paul’s support'” (p. 100). Disillusioned with the galloping centralization of power–manifestly evident under LBJ’s “Great Society”–he pilloried its strategy: “‘If it moves, tax it. If it keeps moving, regulate it. And if it stops moving, subsidize it'” (p. 53). “He once likened the government to a baby: ‘It is an alimentary canal with an appetite at one end and no sense of responsibility at the other'” (p. 67).

Arriving in Washington, though unable to implement his deepest convictions, he did shift the nation’s course in several significant areas. What’s now called “Reaganomics” took shape–a combination of Milton Friedman’s monetarism and Arthur Laffer’s supply side thinking. Few folks thought it would work, and spirited discussions ensued even within conservative circles. Robert Reich, President Clinton’s labor secretary, dismissed it as “hopelessly contradictory” (p. 92). Kevin Phillips laughed at those who imagined “that Reagan could solve the nation’s serious problems with policies based on ‘maxims out of McGuffey’s Reader and Calvin Coolidge'” (p. 106). But the newly-elected Reagan was undeterred. Demonstrating its truth took time, and he had to “stay the course” through a recession at the beginning of his first term.

But as a candidate he’d “predicted that if his program was implemented, the economic woes of the Carter era would end and the United States would enjoy lasting economic growth and prosperity” (p. 85). And he was right! Inflation, raging at 21% in 1980, fell to three percent within eight years. Tax cuts actually led to increased federal revenues! Unemployment went down and productivity soared. The Reagan years proved to be some of the “biggest peacetime economic boom in U.S. history” (p. 109). Critics, such as the Clintons, derided the 1980s as a “decade of selfishness,” but in fact the decade witnessed “the greatest outpouring of private generosity in history” (p. 116).

Ronald Reagan and his first wife, Jane Wyman, adopted a boy and named him Michael. Though divorce disrupted normal family ties, Michael admires his father and has compiled (with the assistance of Jim Denney) a worthy book of quotations entitled The Common Sense of An Uncommon Man: The Wit, Wisdom, and Eternal Optimism of Ronald Reagan (Nashville: Thomas Nelson Publishers, c. 1998). One can easily locate the “Great Communicator’s” views on topics ranging from Acting to the Welfare State, from Communism to Prayer, from Horses to Taxes. Each topic is introduced, in illuminating ways, by Michael Reagan.

Concerning acting, Michael says “Both of my parents saw acting as a process of revealing truth, not creating illusions” (p. 1). His father, responding to critics who dismissed him as merely an actor who knew how to speak, defended his craft: “Because an actor knows two important things–to be honest in what he’s doing and to be in touch with his audience. That’s not bad advice for a politician either. My actor’s instinct simply told me to speak the truth as I saw it and felt it” (p. 3).

A great patriot, Ronald Reagan frequently cited Alexis de Tocqueville’s assessment of America’s strength: “Not until I went into the churches of America and heard her pulpits flame with righteousness did I understand the secret and genius of her power. America is great because she is good, and if America ever ceases to be good, America will cease to be great.” In his own words, Reagan said: “I believe this blessed land was set apart in a very special way, a country created by men and women who came here not in search of gold but in search of God. They would be a free people, living under the law, with faith in their Maker and their future” (p. 12).

Resolutely opposing Communism, he believed it would collapse because it was rooted in deceit. “The Marxist vision of man without God,” he said, “must eventually be seen as an empty and a false faith–the second oldest in the world–first proclaimed in the Garden of Eden with whispered words of temptation: ‘Ye shall be as gods'” (p. 37). He looked at this false faith as “evil” and dared to say so. “We know that living in this world means dealing with what philosophers would call the phenomenology of evil or, as theologians would put it, the doctrine of sin. There is sin and evil in the world, and we’re enjoined by Scripture and the Lord Jesus to oppose it with all our might” (p. 101). More pungently, he said: “Evil is powerless if the good are unafraid” (p. 102).

The quotations are short, as is the book. But they reveal Ronald Reagan as witty, wise, and optimistic, as the subtitle suggests.

Reagan’s quips and phrases were often attributed to gifted speech writers, and certainly they provided much of the material he used while he was President. Recently, however, the Reagan archives provided researchers an unexpected treasure: handwritten speeches revealing how truly gifted he was as a writer, how much he read, how deeply he thought about public policy before he became President. Reagan In His Own Hand, ed. Kiron K. Skinner et al. (New York: The Free Press, c. 2001) reveals a man largely unknown to the public. The editors give us a thoroughly scholarly work, including the original text with its emendations, suitable footnotes, and selected photocopies of the 670 handwritten drafts Reagan wrote for radio talks he gave from 1975 to 1979.

Even George Shultz, Reagan’s secretary of state and long-time friend, was surprised by the cache of papers discovered in his archives. He was not surprised, however, that the papers reveal an intelligent man who read and thought and wrote throughout his life. “I was always struck by his ability to work on an issue in his mind,” says Shultz, “and to find its essence, and by his depth of conviction” (p. ix). He also appreciated the President’s “intense interest and fondness for the spoken word, for caring very deeply about how to convey his thoughts and ideas to people–not only to the American people, but to people living all over the world” (p. ix).

The person who knows Reagan best, his wife Nancy, also remembers his life-long commitment to writing out his ideas. While many of his critics scoffed at the movie actor reading others’ scripts, Nancy says that the President “continued to write in the White House. He wrote speeches in the Oval Office, and he had his own desk in the living quarters of the White House. He was always sitting at his desk in the White House, writing” (p. xiii). Time constraints, of course, demanded that he use speech writers for many of his addresses, but he invariably played a role in their composition, at times stubbornly insisting on his own language even when more cautious diplomats sought to smooth over his rhetoric.

Reagan also read a lot! Referring to the years when the radio speeches were delivered, Nancy says: “Nobody thought that he ever read anything either–but he was a voracious reader. I don’t ever remember Ronnie sitting and watching television. I really don’t. I just don’t. When I picture those days, it’s him sitting behind that desk in the bedroom, working” (p. xv). Reading his radio talks–and noting the footnotes that cite the books and policy papers he drew upon–persuades the reader that during the late ’70s Reagan studied diligently as he prepared for his 1980 presidential campaign. The editors note: “We have checked dozens of references in his writings and, in virtually all cases, Reagan correctly cited or quoted his sources” (p. xxii).

The editors arrange the materials thusly: Reagan’s Philosophy; Foreign Policy; Domestic and Economic Policy; and Other Writings, which include items from 1925-1994. The policies he pursued during his eight years as president are clearly set forth in the things he wrote. His opposition to Communism, which he likened to a “disease,” and his resolve to oppose every expansive move of the Soviet Union were oft-expressed. He believed: “The ideological struggle dividing the world is between communism and our own belief in freedom to the greatest extent possible consistent with an orderly society” (p. 13). Winning the Cold War–something all the “experts” decried during the ’70s–was doable, Reagan believed. And he was, of course, right. His analyses of communism in Vietnam, Cambodia, Nicaragua, Cuba, and Africa prove, 25 years later, remarkably prescient.

In domestic and economic affairs, Reagan continually decried needless government interference in the free enterprise system and called for a reduction in taxation. He believed in sound monetary practices as an antidote to the rampant inflation that marked the Carter administration. As President, he brought about a dramatic reduction in inflation–and interest rates have never soared as they did in the ’70s. His belief that reducing taxation would stimulate economic development actually harked back to what he’d learned as an economics major in college. So the “Reagan Revolution” of the ’80s was hardly a new notion for the president. Rather he believed that ordinary people, free to pursue their own goals without undue interference, would creatively shape a booming economy.

What one finds, reading this book, is that Ronald Reagan was not merely “The Great Communicator.” He had something to communicate!

136 The Sword of the Prophet


Educated in England, receiving a PhD at the University of Southampton, and doing postdoctoral research at the Hoover Institution, Serge Trifkovic has worked as a broadcaster for BBC and a reporter in southeast Europe for U.S. News & World Report and The Washington Times.  In The Sword of the Prophet (Boston:  Regina Orthodox Press, Inc., 2002), he sets forth a “politically incorrect” perspective on Islam, its “history, theology, and impact on the world.”  He sees today’s conflicts as simply a recent manifestation of an ancient religious struggle. 

In his Foreword to the book, a former Canadian Ambassador to Yugoslavia, Bulgaria and Albania, endorses Trifkovic’s position, noting that “something is wrong in the Muslim world” (p. 3).  It’s a poverty-stricken, dictator-dominated realm, and the recent resurgence of militant Islam poses “the greatest danger to ‘Western’ values since the end of the Cold War” (p. 4).  “This is a book,” Ambassador Bissett says, “that deals with what many consider to be the major issue of our time–the question of whether the Western and Muslim civilizations can live together in peace” (p. 5).  Sadly enough, he admits, it seems “that, just as the Western democracies refused to acknowledge the danger inherent in the rise of Nazi and Communist ideologies, our refusal to confront militant Islam may cost us dearly” (p. 5). 

Trifkovic begins his book with the assertion that the Muslim attack on the United States on September 11, 2001, demonstrated an antipathy against the Christian world deeply rooted in Islam.  That so many refer to Islam as a “religion of peace” shows that “the problem of collective historical ignorance–or even deliberately induced amnesia–is the main difficulty in addressing the history of Islam in today’s English speaking world, where claims about far-away lands and cultures are made on the basis of domestic multiculturalism assumptions rather than on evidence” (p. 8).  Just as pro-communist publicists long avoided condemning the evils of Stalinist Russia, pro-Muslim “experts” have spread skillful propaganda to gloss over the truth concerning Islam.  To set forth the facts–to counteract the propaganda–this book was written.

First, we learn much about Muhammad.  Born in Mecca in 570 A.D., early orphaned, he spent his early years working at utterly menial jobs, including tending sheep.  Then, fortuitously, he met a wealthy widow, Khadija, 15 years his senior, for whom he worked and whom he ultimately married.  Freed from survival concerns, he began to spend time in solitude, especially in some caves near Mecca, and, in A.D. 610, received a message from an angel designating him as “the Messenger of God.”  When he shared his message, he won as converts only his wife and a few kinsmen.  Most of the folks in Mecca merely scoffed at the new zealot.  But his visions continued, and his wife and a politically powerful uncle protected him from persecution for a few years. 

In A.D. 622, however, opposition in Mecca escalated to the point that Muhammad and 70 followers fled north to the more hospitable city of Medina.  This event–the hijrah–marks Islam’s true beginning point.  Here, importantly, Muhammad shifted his emphasis from religion to politics, from persuasion to coercion.  His followers became bands of brigands, and as they raided camel caravans they brought money to the “prophet” and his movement.  Small-scale skirmish victories brought admiration and acclaim from the warrior-minded Arabs, and a battle at Badr in 624 proved particularly momentous, for the principles of jihad came to the fore.  No mercy was extended to unbelievers or captives.  “The Kuran contains the accompanying revelation from on high:  ‘It is not for any Prophet to have captives until he hath made slaughter in the land.’  Fresh revelations described the unbelievers as ‘the worst animals.’  The Prophet was now the ‘enemy of infidels.’  Killing, or in the case of Jews and Christians, enslaving and robbing them, was not only divinely sanctioned but mandated” (p. 38).  Within a decade, at the head of a victorious army, swollen by warriors fattened by plunder and power, Muhammad conquered Mecca, dying there in A.D. 632.

Evaluating the prophet’s career, Trifkovic says:  “Muhammad’s practice and constant encouragement of bloodshed are unique in the history of religions.  Murder, pillage, rape, and more murder are in the Kuran and in the Traditions ‘seem to have impressed his followers with a profound belief in the value of bloodshed as opening the gates of Paradise’ and prompted countless Muslim governors, caliphs, and viziers to refer to Muhammad’s example to justify their mass killings, looting, and destruction.  ‘Kill, kill the unbelievers wherever you find them’ is an injunction both unambiguous and powerful” (p. 51). 

Here Alexis de Tocqueville’s appraisal seems remarkable:  “‘I studied the Kuran a great deal. . . .  I came away from that study with the conviction that by and large there have been few religions in the world as deadly to men as that of Muhammad.  As far as I can see, it is the principal cause of the decadence so visible today in the Muslim world, and, though less absurd than the polytheism of old, its social and political tendencies are in my opinion infinitely more to be feared, and I therefore regard it as a form of decadence rather than a form of progress in relation to paganism itself’” (p. 208). 

Muhammad’s example and teachings led quickly, following his death, to “jihad without end.”  Trifkovic insists:  “The view of modern Islamic activists, that ‘Islam must rule the world and until Islam does rule the world we will continue to sacrifice our lives,’ is neither extreme nor even remarkable from the standpoint of traditional Islam” (p. 87).  The century following Muhammad’s death (A.D. 632-732) witnessed the success of Muslim armies, conquering much of the known world, creating “an Arab empire ruled by a small elite of Muslim warriors who lived entirely on the spoils of war, the poll and land taxes paid by the subjugated peoples” (p. 89).  Lush agricultural lands, under Muslim rule, slowly turned to deserts.  Thriving economies, subordinated to Muslim dictates, slowly sank into impoverishment.  “The periods of civilization under Islam, however brief, were predicated on the readiness of the conquerors to borrow from earlier cultures, to compile, translate, learn, and absorb.  Islam per se never encouraged science, meaning ‘disinterested inquiry,’ because the only knowledge it accepts is religious knowledge” (p. 196). 

The primary victims of Muslim oppression were Christians, even in Spain, the alleged “jewel of supposed Islamic tolerance” (p. 108).   The oft-denigrated, and ultimately unsuccessful, Crusades were but “a belated military response of Christian Europe to over three centuries of Muslim aggression against Christian lands, the systematic mistreatment of the indigenous Christian population of those lands, and harassment of Christian pilgrims” (p. 97).  As a modern parallel, Trifkovic notes that the Crusades were designed as “a reconquest of something taken by force from its rightful owners, ‘no more offensive than was the American invasion of Normandy’” (p. 102).  Though less well-known in the West, the Muslim conquest of India led to what Will Durant called “probably the bloodiest story in history” (p. 111).  It was far worse than the Holocaust, worse than the killings of American Indians by the Spanish and Portuguese.  Muslims slaughtered Indians indiscriminately–killing 50,000 Hindus in a temple in Somnath, for example.  The Ottomans did the same in the Balkans, as did the Turks in Armenia.  In sum:  “Islam is and always has been a religion of intolerance, a jihad without an end” (p. 132).  Indeed, it resembles, in many ways, Bolshevism and National Socialism in the 20th century. 

Turning to the “fruits” of Islam, Trifkovic discusses such things as the absolute lack of religious liberty, the subjugation of women, the widespread practice of enslaving non-Muslims.  He also shows how deeply embedded is the hatred for Jews in the Muslim world.  For example, during WWII the Mufti of Jerusalem and former President of the Supreme Muslim Council of Palestine, Haj Mohammed Amin al-Husseini, urged Muslims to support Hitler.  In a radio broadcast from Berlin, he said:  “‘Arabs!  Rise as one and fight for your sacred rights.  Kill the Jews wherever you find them'” (p. 186).  He supported the extermination of European Jewry.

Trifkovic concludes his treatise with an examination of “Western Appeasement,” showing how in Bosnia and Kosovo, Indonesia and Africa, leaders in the West have been so subservient to the economic power of Mid-Eastern oil and so conflicted concerning their own cultural traditions that they failed to resist militant Muslims.  “The West,” he insists, “cannot wage ‘war on terror’ while maintaining its dependence on Arab oil, appeasing Islamist aggression around the world, turning a blind eye to the Islamic destruction of peoples who are animists, Hindus and Christians, and allowing mass immigration of Muslims into its own lands” (p. 260).  Added to his concern is “Jihad’s Fifth Column” alive and well in the U.S. and other Western nations. 


For those who find fiction a more palatable vehicle for historical and cultural information, Craig Winn and Ken Power’s Tea with Terrorists:  Who They Are; Why They Kill; What Will Stop Them (CricketSong Books, c. 2003) provides an engrossing study of Islam’s threat to the world.  There’s adventure, romance, suspense–plus Christian apologetics, countless quotations from Islam’s primary texts, and an unrelenting warning that we Americans are just beginning a life-and-death struggle against an implacable foe.  The story centers upon a heroic Navy Seal, Thor Adams, who leads a disastrous expedition into Afghanistan, following which he launches an intellectual search to understand what motivated the Muslims he’d encountered.  The publicity he garnered gave him a podium from which he told the nation the truth he had discovered about Islam.  That led to political acclaim and success.  (Since I don’t want to spoil the book for anyone who wants to read it, I’ll say no more about the adventure and romance, but it is engrossing enough to sustain interest for 600 pages!)

The more historical and philosophical sections of the book, however, deserve attention, for they stress many of the same points earlier discussed in Trifkovic’s The Sword of the Prophet.  World War III began, Winn and Power make clear, on September 11, 2001.  The Muslims who steered the planes into American landmarks represented millions of Muhammad’s modern disciples who are deeply committed to jihad–the holy war that will end only when Islam rules the globe.  Terrorist acts, at the moment, are the strategies of choice, as they have been from the beginning, for “Muhammad was a terrorist” (p. 116).  Indeed a careful reading of Islam’s sacred texts reveals a disturbing celebration of death, a continual call to kill all “infidels,” a justification of any crime or enormity if it furthers the sacred cause.   “Islamic scriptures promote war, lying, thievery, rape, bigamy, genocide” (p. 434).  Just as Muhammad’s success followed his decision to shift from preaching a “religious” message to leading an armed band of brigands and killers that ultimately conquered Mecca, today’s Muslims resort to treachery and intimidation in their quest for pleasure and power.

Today’s Muslim terrorists are not, Winn and Power insist, abnormal.  Islam is not, and never has been, a “religion of peace.”  Peaceful Muslims, in the past, have often been coerced converts, not true believers, generally living in lands far from Arabia.  Lots of less-than-fervent faithful clearly crave normal routines, free from violence.  But devout Muslims, seeking to live out Muhammad’s precepts, have always embraced a violent agenda.  They reveal the fact that “Allah is as different from God as Muhammad is from Jesus” (p. 354).  Indeed, they simply carry on the most ancient Islamic agenda, and the cancer cells of al-Qaeda, Islamic Jihad, and Hamas are wildly metastasizing offshoots of “the cancer [that] is Islam itself” (p. 467). 

The authors argue that Islam is a “perverted religion” most nearly akin to dictatorial ideologies such as Communism and Nazism.  They even devote 10 pages to showing some amazing resemblances between the “Messenger” Muhammad and the “Leader” Adolf Hitler.  “Violence was the key to victory for both men” (p. 436).  Both gained and maintained political power through intimidation and manipulative rhetoric.  Both “became anti-Jew and anti-Christian” (p. 443).  Both led movements that led to the deaths of multiplied millions of innocent people. 

135 No Good Men Left?

Barbara Dafoe Whitehead (the author of The Divorce Culture) is one of the premier scholars writing about marriage and family.  Her most recent treatise, Why There Are No Good Men Left:  The Romantic Plight of the New Single Woman (New York:  Broadway Books, c. 2003) explores why so many highly successful career women–particularly in their 30s–fail so frequently to find a good man who will settle down and make a lasting commitment to marriage.  As she puts it:  “This book is about a contemporary crisis in dating and mating.  It explains why some of the best educated and most accomplished young single women in society today are discontented with their love lives, why romantic disappointment has emerged as a generational theme, and why many of these women have come to believe that ‘there are no good men left'” (p. 2). 

The women she interviewed, researching the book, reveal their cultural milieu in the language they use to describe the loss of romance.  The poetry and song of traditional courtship has disappeared.  The traditional system, rooted in the concept of covenant, maintaining vows for a lifetime, has been replaced by a libertarian system, characterized by momentary interests.  One no longer “falls in love” or finds the “love of my life.”  Instead, there is much talk–using the more cerebral and “scientific” jargon of psychology–about “relationships,” about “being in a relationship.”  The “M” word, marriage, is rarely mentioned–”perhaps because they’ve been warned that talk of marriage can seem needy or desperate” (p. 4).  And that’s precisely what the young career women resist being!  To admit one actually needs a man, that one cannot live a fully satisfactory life on one’s own, rubs against all the feminist ideology most of them have absorbed. 

They illustrate the enormous success of the “Girl Project” launched in the ’70s and symbolized by the application of Title IX to athletics.  “Rather than prepare girls for future adult lives as wives and mothers, the Girl Project’s aim has been to prepare them for adult lives without dependence on marriage” (p. 77).  So girls began studying harder and now constitute a majority of students in colleges and universities.  Rather than look for husbands, increasing numbers of them focus singularly on preparation for work.  They have successfully moved into medical schools, law schools, and business schools, and in some of these graduate programs now constitute a majority of students.  They engage in athletics and serve in the military.  Success in the workplace has come, with bewildering speed, to America’s young women. 

Yet, when the truth is told, most of these young women really want to marry and have children.  Indeed, a 2001 Gallup Poll indicated that 89 percent of them thought it “extremely important” to do so (p. 6).  The novels they voraciously buy and read reveal the depth of these young women’s hunger for a spouse.  Whitehead seriously studied the “Chick Lit” which has proven so popular in recent years.  Great literature it is not.  But it does demonstrate the indestructible desire in the heart of most women.  Thirty years ago, when nearly 90 percent of the nation’s women married before they reached the age of 30, such aspirations were obviously satisfied.  Today, nearly one-fourth of all women are unmarried at that age.  There are today 2.3 million college-educated single women in the 25-34 age group–compared with 185,000 in 1960 (p. 25). 

Not finding a husband, however, does not mean these women are sexually chaste!  The average age of their first sexual intercourse is 17, and “the majority of young women today will live with a boyfriend before they live with a husband” (p. 11).  Cohabitation has become a widely practiced–and socially acceptable–pattern for folks in their 20s.   It’s the “signature union of the emerging relationships system” (p. 116), and more young women first live together with a man than marry one.   “Women often have sex with their boyfriend before they get to know him well as a human being” (p. 29).  Though initially exciting and satisfying, women (unlike men) ultimately find cohabitation a dead-end road.  All too often, what they thought was a commitment that would merge into marriage was, from the man’s perspective, simply an attractive arrangement providing free sex and homey comforts.  Understandably, “the benefits of cohabitation for men help to explain why there is no courtship crisis for high achieving young men” (p. 124).   Indeed, Whitehead laments:  “If a corps of mischievous social engineers had deliberately set out to create confusion and uncertainty in the new single woman’s search for love, they couldn’t have come up with a more effective device than cohabitation-as-courtship” (p. 127).

Single women frequently find themselves dating–or living with–”Mr. Not Ready.”  After investing much energy and attention to a series of “Mr. Not Readys,” they remain unwed in their 30s.   Conversely, the family-oriented single men, rather than courting career women, more often select “younger women who are not as committed to serious careers or not as far along in their careers as she is” (p. 36).  They discover, as a greeting card says:  “‘Why are men like parking spaces?  All the good ones are taken'” (p. 40).   Putting marriage on hold while you pursue a career until you’re 30 may very well mean losing the opportunity to marry and have children.

By consenting to cohabit women discourage their “lovers” from marrying them.  “Because men see marital commitment as a status, they take seriously the formal, legal, and public events, ceremonies, and rituals that mark the change in their status from ‘not married’ to ‘married.’  They assign far less weight to the informal, intimate, and private gestures and understandings that serve, for a woman, as benchmarks along the way to marriage” (p. 143). 

Having described and explained the plight of the women she’s studied, Whitehead has little to say concerning a solution.  She–like her two unmarried daughters, now in their 30s–resolutely defends the career track that seems to create the very problem she laments.  Though she observes the problems associated with cohabitation, she makes no moral judgment concerning it.  If what Christians traditionally called “living in sin” is wrong to Whitehead, it’s only because it fails to lead to a more permanent “relationship” wherein children can be born and reared.  Her morally indifferent social science simply fails to provide any reason to condemn the very social patterns so manifestly harmful to both men and women.  Yet, for painting an honest portrait of women without “good men” she must be praised.

* * * * * * * * * * * * * * * *

Providing a very personal perspective, bringing to the discussion the wisdom of a lifetime, incorporating insights derived from rearing four children and taking delight in 10 grandchildren, Midge Decter has written a “memoir of my life as a woman,” An Old Wife’s Tale:  My Seven Decades in Love and War (New York:  ReganBooks, c. 2001).  Born to a Jewish family in St. Paul, MN, during the Depression, she stands rooted in an America largely vanished but still worth remembering and emulating.  As a teenager, of course, she would hardly have agreed, for she left home as quickly as possible (dropping out of the University of Minnesota) and moved to New York to “make her way in the world.”  There she met and married her first husband, with whom she had two children.  Subsequently, when that marriage dissolved, she married Norman Podhoretz, the lasting “love of her life” and bore him two children.  In the midst of all this, she worked at various jobs, lived in both suburbs and the inner city, and thought much about the society surrounding her. 

She especially pondered “the true Woman Problem.  Not the oppression of women, to say the least a laughable proposition in the United States of America, nor the glass ceiling that so many have been relentlessly calling attention to, but rather a seemingly never-to-be-mediated internal clash of ambitions:  the ambition to make oneself a noticeable place in the world and the ambition to be a good mother” (p. 51).  She began to discern the problem when highly successful women, in private conversation, overflowed, like broken dams, with assorted grievances regarding their husbands (or their lack thereof).  She then began to study women’s literature, such as Betty Friedan’s The Feminine Mystique, a best-seller that she found “both intellectually and stylistically very crude.  It was also unbelievably insulting to ordinary housewives, written on the level and in exactly the kind of lingo previously used by a number of pop sociologists to denigrate the postwar lives of the ordinary people” of America (p. 69).  At the time, Decter failed to see that Friedan’s book was more than simply “another in the series of generally left-inspired attacks on the nature of American society” (p. 71).  It was, in fact, along with the anti-war protests and other manifestations of the rebellious ’60s, a thoroughly pernicious attack on the culture only strong families can sustain. 

Decter sensed that as women following Friedan became more vitriolic and aggressive, men retreated into silence, lest they be judged anti-female.  And yet, ironically, “the movement that began with the claim that it was out to make a real revolution in women’s lives began to define the various forms of male withdrawal from combat as victories, whereas the truth was they were for the most part expressions of the deepest (and in most cases to this day unrecognized) contempt” (p. 90).  As she read and thought and observed, staying at home with her youngsters, she decided to write a book, titled The New Chastity, “in which I faithfully stuck to the movement’s own sources and then compared it with the truths I knew on my own pulse about what women want and how they feel” (p. 93).  Published in 1972, calling into question the most passionately held articles of faith in the feminist movement, this book instantly catapulted Decter to something of a celebrity status–a woman willing to dispute the claims of the women’s movement!  For her efforts NOW gave her its “Aunt Tom” award, a badge of honor for her in the culture war just begun!

Her militancy solidified during the following decades as she watched the children of her “liberal” friends suffer under their parents’ ideological fantasies.  “It is,” she laments, “harrowing to remind oneself of the wreckage visited upon the children of the famous baby boom who grew up among the so-called enlightened classes” (p. 106).  Drug addictions, psychiatric treatments, lesbian experiments, diet disorders–all symptoms of something seriously awry in the nation’s homes.  Summing up her views, she wrote Liberal Parents, Radical Children, an indictment of those who give children everything they need, of a material sort, and neglect the most important things, such as teaching them manners, how to treat members of the opposite sex, how to live rightly. 

At the same time she began to critique feminism, she and her husband slipped away from the liberal political ideology they had long espoused.  The McGovern presidential campaign in 1972 signified the “capture of the Democratic party by the hard Left” (p. 122).  A personal conversation with President Jimmy Carter revealed an intransigent opposition to the moderate views of Decter and her husband.  So in the ’70s they left the Left!  They “had to rethink most of what we had once thought, not only about politics but about a whole slew of things that fall under the category of what you might call the Nature of Man and God” (p. 125).  Consequently, Norman Podhoretz’s “neo-conservatism,” articulated in Commentary, helped guide the ascendant conservative movement that triumphed with Ronald Reagan. 

Because she’s known both the satisfactions of professional success and family bonds, she weeps to watch young women choosing careers rather than marriage:  “How sad it is,” she says, in a passage that rather sums up her treatise, “that the movement claiming to liberate women and give them control over their own lives should have adopted a program in which they deprive females of one of the most significant means of tasting power and control.  All the law and medical degrees in the world will not make up for what women have been losing in their relations with men, for to become tough and demanding as feminism has defined the process of their taking control is as nothing compared with being hungered for and, later on in life, indispensable” (p. 196).  

Astute, engaging, filled with the wisdom of maturity, An Old Wife’s Tale could help young women hoping to discover how to become one!


If you need to be alarmed about the future of the nation’s youths, read Meg Meeker’s Epidemic:  How Teen Sex Is Killing Our Kids (Washington:  LifeLine Press, c. 2002).  A medical doctor who practiced pediatric and adolescent medicine for 20 years, a fellow of the American Academy of Pediatrics and a fellow of the National Advisory Board of the Medical Institute, she brings to this treatise both the data and the passion needed to alert us to a momentous problem.  In Elayne Bennett’s judgment, “I truly believe Epidemic is the most important book that anyone who lives or works with teenagers should read, and read now.  Not only does Meg Meeker vividly explain the problem, she explains the solution.”

In Part One Meeker declares “the epidemic is here.”  She blends both personal anecdotes and statistics to point out the pervasiveness of Sexually Transmitted Diseases.  In 1960, syphilis and gonorrhea were the two STDs that concerned physicians, and both of them could be treated if detected early.  Forty years later, there are dozens of them–perhaps 100!–and some have no known cure.  “Every day, 8000 teens will become infected with a new STD” (p. 3).  Of the sexually active teens, fully one-fourth carry an STD.  A British study indicates “that almost half of all girls are likely to become infected with an STD during their very first sexual experience” (p. 12). 

More than 45 million Americans carry an incurable herpes virus!  And kids engaging in oral sex, taking it to be “safe” since no pregnancy results, easily spread herpes throughout the population.  Sadly, President Bill Clinton’s affair with Monica Lewinsky “gave new meaning to the word ‘sex,’ and taught an entire nation of teenagers that as long as you didn’t have ‘vaginal penetration,’ you really weren’t having ‘sex’” (p. 145).  “HPV, one of the most prevalent sexually transmitted diseases,” directly causes “99.7% of cervical cancer cases and the deaths of nearly 5000 women each year” (p. 16).  Some 75% of sexually active people now carry HPV!  AIDS continues to haunt us, and increasing numbers of women now carry the HIV infection. 

Accompanying the physical problems, STDs also inflict grave emotional harm on youngsters.  Amazingly, Meeker has “asked hundreds of teenage girls whether or not they like having sex, and I can count on one hand those who said they did” (p. 78).  Severing the act from the lasting context of love and marriage renders it heartbreaking, for sex is ultimately “a spiritual experience” (p. 81).  Consequently, one of the main reasons for teen depression is sex severed from its proper context (p. 63).  Suicide now ranks as the third leading cause of young people’s deaths.  Fully one-third of our teens have contemplated suicide!  Rather than a joyous experience, sex has become a source of incredible pain!

“One classic example of how kids turn this rage inward is the preponderance of body piercing.  Punching holes in intimate parts of their bodies, such as their lips, tongue, belly button, or even vagina, sends a message to the world:  ‘I am hurting this intimate part of myself because I don’t like who I am.’  When girls pierce the sexual parts of their bodies, their labia and nipples (some so severely that they’ll never be able to nurse a baby), they’re saying:  ‘I am cutting on my womanhood.  This is anger turned upon the self'” (p. 72). 

All of this results, Meeker declares, from the sexual revolution birthed in the ’60s.  “With the coming of that revolution, my own generation demanded previously unheard-of sexual freedom and promiscuity.  We may have gotten what we thought we wanted, but the ride wasn’t free.  Countless children are now paying the price” (p. 33).  Yet it’s reinforced by the dominant powers of our culture.  Young women are “encouraged to expose every inch of skin they can get away with,” but “in doing so, girls are taught that their bodies are not worth protecting” (p. 73).  This, Meeker says, violates one of the most basic feminine instincts, for like self-preservation the preservation of one’s virginity is “hard-wired into our psyches” (p. 73).

Television, arguably our most influential medium, broadcasts highly sexualized programs, with men and women sexually active, but only one percent of folks having sex on TV are married! (p. 126).  “On television today, teens are exposed to homosexual sex, oral sex, and multiple partner sex” (p. 126).  All this is done under the artifice of “artistic expression” or “free speech.”  How ironic, Meeker notes, for “Selling sex to teens is just as bad as selling them cigarettes and alcohol.  Can you imagine the public outrage of parents if movies, magazines, and music incorporated glamorous smoking imagery to the same degree they do sexual content?” (p. 140).

The sexual revolution, of course, has been fueled by birth control devices.  For years Dr. Meeker cheerfully prescribed contraceptives for teenagers, thinking they would ensure the vaunted “safe sex” encouraged by the culture.  She failed to envision how contraceptives would contribute to the proliferation of STDs.  “While we physicians handed out oral contraceptives, chlamydia rates rose.  While we gave injections of Depo-Provera, the numbers of HPV rose.  And while we handed out condoms to teens, we saw syphilis outbreaks and genital herpes climb” (p. 95).  The proverbial law of unintended consequences seems demonstrated herein.  For though contraceptives prevent births, they routinely fail to provide even minimal protection against STDs.  Condoms, especially, though they may protect against some diseases, have little value in preventing the spread of many of them.  Importantly, in giving teenagers condoms, adults have informed them that they aren’t expected to control their desires.  Since they’re going to “do it,” just make sure their activities cause the least harm!

Epidemic paints a bleak portrait!  What little hope there is for our kids, as one might expect, comes from better parenting.  Kids need trustworthy parents who know what they believe and live in accordance with their beliefs, who care for them, who insist on good behavior.  Importantly, kids want strong family structures.  “Kids like having someone they love set high standards because it demonstrates faith that they can meet these standards” (p. 220).  Other adults–in family, church, school, or neighborhood–also help.  Sexually active teens are seeking something lacking in their lives.  If that something is satisfied by loving adults, they are less likely to go astray.  And they will be spared the anguish of the epidemic sweeping the nation. 

134 The Battle for the Trinity


  We are, by nature, word-shaped and word-shaping creatures.  Consequently, we define and debate the meaning of words.  “What is the good of words,” asked G.K. Chesterton, “if they aren’t important enough to quarrel over?  Why do we choose one word rather than another if there isn’t any difference between them?  If you called a woman a chimpanzee instead of an angel, wouldn’t there be a quarrel about a word?  If you’re not going to argue about words, what are you going to argue about?  Are you going to convey your meaning to me by moving your ears?  The Church and the heresies always used to fight about words, because they are the only things worth fighting about” (The Ball and the Cross {NY:  John Lane Co., 1910}, p. 96).

            Above all, words regarding the Trinity are worth fighting about!  As one of the giants of 20th century theology, Emil Brunner, said:  “We are not concerned with the God of thought, but with the God who makes His Name known.  But He makes His Name known as the Name of the Father; He makes this Name of the Father known through the Son; and He makes the Son known as the Son of the Father, and the Father as Father of the Son through the Holy Spirit.  These three names constitute the actual content of the New Testament message” (The Christian Doctrine of God, trans. Olive Wyon {Philadelphia:  Westminster Press, 1974}, p. 206).  No doctrine has been more essential–or more disputed–for 2000 years, for on it the Christian faith stands or falls.

          As we enter the 21st century, one of the strongest challenges to traditional trinitarianism is feminist theology, and probably the best assessment of that threat is found in Donald Bloesch’s The Battle for the Trinity:  The Debate over Inclusive God-Language  (Ann Arbor:  Servant Publications, c. 1986).  In her foreword to the book, the distinguished biblical scholar Elizabeth Achtemeier noted that feminist theology–much of which she contends is a return to “Baalism”–had impacted “the liturgy and worship of the church, its governing bodies, its witness, its doctrine, and its sacred literature” (p. xi).  She believes that “several feminist theologians are in the process of laying the foundations for a new faith and a new church that are, at best, only loosely related to apostolic Christianity” (p. xi).  Anticipating Achtemeier’s concern for Baalism, the great Jewish biblical scholar and philosopher Martin Buber noted that ancient Israel’s prophets forever struggled against the pagan religions of Canaan that featured mother goddesses in “which the inherent dynamism of nature is worshipped as the force which procreates life, and always more life.”  Such worship contradicted the way of Jahweh, and such female deities, Buber said, threatened both “the purity of the faith” and “the humanity of women” (p. 40).

While Elizabeth Achtemeier sympathizes with some of the pain responsible for the feminist movement, she refuses to justify its theology and warns that some of its most significant proponents seek to replace the Christian God with “a god or goddess” of their own making.  Consequently, she wonders why Christian scholars have “neither admitted any responsibility for current feminist misinterpretations of the Bible nor mobilized any effort to correct those misinterpretations.  On the contrary, many educators seem simply to accept feminists’ positions without questioning the fundamental theological issues involved” (p. xiii).

            Donald Bloesch–Achtemeier notes–had the courage and conviction to question such issues in The Battle for the Trinity.   Re-reading his treatise 20 years after it was written, reflecting upon developments during intervening years, one is struck by the prescience of his insights.  He listed some of the changes then proposed for mainline churches.  For example, a “United Church of Christ document says that we should ‘avoid use of masculine-based language applied to the Trinity as in ‘Father, Son, and Holy Ghost.'”  We are also instructed to avoid the use of masculine pronouns and adjectives in reference to God, such as he or his.  We are even asked to abandon masculine role names for God including “Lord,” “King,” “Father,” “Master,” and “Son” (p. 2).

“At a United Methodist General Conference in Baltimore in May of 1984, Methodists were urged to begin finding new ways of referring to deity, such as alternating male and female pronouns or using genderless terms” (p. 2).  Inclusive language reformers especially targeted biblical translations.  Thus Princeton Theological Seminary’s Bruce Metzger, one of the world’s greatest scholars and the chairman of the RSV translation committee, disavowed such tinkering with the wording of the New Revised Standard Version, declaring:  “The changes introduced in language relating to the Deity are tantamount to rewriting the Bible.  As a Christian, and as a scholar, I find this altogether unacceptable” (p. 4).  The illustrative list need not be extended, for a visit to most any mainline church–or a reading of various contemporary scholars–will document the success of the terminological turn.  The recent controversy over quite modest moves to embrace “inclusive language” in the Revised New International Version indicates that the issue is now moving from mainline to evangelical circles.

Probing beneath the new language, Bloesch explained the theological developments responsible for it.   This is what makes his treatise one of the best available.   Despite its modern expressions, an ancient tendency infuses feminism–the effort to shift from a Trinitarian to a Unitarian understanding of God, to envision Him as an immanent (virtually pantheistic) Power rather than a transcendent Person.  This becomes clear when reading the American women (e.g. Mary Daly; Rosemary Reuther; Nancy Hardesty; Elisabeth Schussler-Fiorenza) and men (e.g. Paul Jewett; Robin Scroggs; John Cobb) who have supported and contributed to feminist thought.  Among these thinkers, the German theologian Jurgen Moltmann (a major architect of “liberation theology”) has been especially influential, for he envisioned God as “bisexual,” contending the Shekinah denotes a “feminine principle within the Godhead” (p. 6).  Even earlier, Paul Tillich, who deeply influenced great numbers of 20th century theologians, espoused what he called an “‘ecstatic naturalism,'” and portrayed God “‘as Spirit or Spiritual Presence, and Spirit, it seems, is conceived basically as feminine rather than masculine'” (p. 7).

While some feminists seek simply to do theology in a “different voice,” many more use it as a weapon, taking up arms as partisans, waging political battles against the Church and her traditions, seeking to establish a new religious regime.  Somewhat representative of the latter is Harvard University Divinity School Professor Elisabeth Schussler-Fiorenza, who boldly inveighs “‘against the so-called objectivity and neutrality of academic theology.'” She espouses a theology that “‘always serves certain interests'” and pledges “‘allegiance'” to a “‘partisan'” theology that becomes “‘incarnational and Christian'” by championing the cause “‘of the outcast and oppressed'” (p. 84).  (Her baneful influence has been recently evaluated by Eamonn Keane in A Generation Betrayed:  Deconstructing Catholic Education In the English-Speaking World.)

Consequently–and most importantly–says Bloesch, “The debate in the church today is not primarily over women’s rights but over the doctrine of God.   Do we affirm a God who coexists as a fellowship within himself, that is, who is Trinitarian, or a God who is the impersonal or suprapersonal ground and source of all existence?  Do we believe in a God who acts in history, or in a God who simply resides within nature?  . . . .  Do we believe in a God who created the world out of nothing or in a God whose infinite fecundity gave rise to a world that is inseparable from his own being?” (p. 11).  Bloesch believes that the new feminist gospel is a resurgent Gnosticism, a refurbished Neoplatonic mysticism, that allows one to portray God in accord with one’s own desires rather than taking Him as revealed in Scripture and Christ.   Our God-talk either reveals to us truths concerning Him, or it’s nothing but man-made symbols ever groping for more satisfactory images of Him.

After a probing discussion of what “symbol” means, Bloesch concludes that “Symbols may be either metaphors or analogies, and these are not the same.  I agree with Thomas [Aquinas] and [Karl] Barth that analogical knowledge is real knowledge, whereas metaphorical knowledge is only intuitive awareness or tacit knowledge” (p. 21).  Many biblical words are obviously metaphors–thus God is a Rock, in the same sense that my wife is my anchor.  Other words are analogies–God is Father or Lord, in the same sense that Roberta is my soul-mate.  A true analogy is not figurative but real.  So, as Hendrikus Berkhof said:  “‘When certain concepts are ascribed to God, they are thus not used figuratively but in their first and most original sense.  God is not ‘as it were’ a Father; he is the Father from whom all fatherhood on earth is derived” (p. 25).  Importantly, we must “understand that it is not we who name God, but it is God who names himself by showing us who he is'” (p. 25).  So when we refer to God as “Father” we are using a symbol appropriate to Him.  “Such words as Father, Son, and Lord, when applied to God, are analogies,” Bloesch says, “but they are analogies sui generis.  They are derived not from the experience of human fatherhood or sonship or lordship, but from God’s act of revealing himself as Father, Son, and Lord.  They are therefore more accurately described as catalogies than analogies insofar as they come from above” (p. 35).

            Upholding the Church’s traditional language preserves her confidence in God the Father, Creator of all that is.  Fathers bring into being beings apart from themselves.  In a sense, they are separate from and transcend the creative process.  Mothers, of course, bring into being beings conceived within their wombs.  Consequently, “Goddess spirituality is a perennial temptation in the life of the church, but it must be firmly rejected, for it challenges the basic intuitions of faith–that God is Lord and King of all creation, that the world was created by divine fiat rather than formed from the being of God as an emanation of God, that God utterly transcends sexuality.  Whenever biblical theism is threatened by philosophical monism, whether this takes the form of pantheism or panentheism, theologians must be vigilant in reaffirming the biblical principle of the infinite qualitative difference between God and the world . . . and the absolute sovereignty of God over his creation” (p. 41).

* * * * * * * * * * * * * * * * * *

            Mary A. Kassian writes as an Evangelical woman initially attracted to the feminist position but ultimately disillusioned by its radical and ultimately non-Christian implications.  In her work, The Feminist Gospel:  The Movement to Unite Feminism with the Church (Wheaton:  Crossway Books, c. 1992), she provides an overview of feminist thought and a distinctly evangelical response.  She notes, for example, that “The major thesis [i.e. the equality of the sexes, as asserted by Margaret Mead] proposed by Christian feminists in the early 1960’s was identical to the thesis of secular feminism” (p. 31).  They sought liberation through “a castrating of language and images that reflect and perpetuate the structures of a sexist world.”  By “cutting away the phallocentric value system imposed by patriarchy,” Mary Daly said, they could design a better world in their own image (p. 70).   Whereas traditional Christians tried to glorify God and serve Him, feminists “shifted the emphasis:  God’s purpose was to assist humans to realize liberation, wholeness, and utopia for themselves” (p. 95).  Embracing feminism freed women from the “hisstory” that ignored them.  They could declare their own truth–write “herstory.”  Accordingly everything can be questioned, there are no absolutes, meaning is socially constructed, and allegedly “natural” realities or ethical standards need not be heeded.

            The process began incrementally, slowly and subtly so as to elicit a minimum of opposition.  First it was suggested that inclusive language for human beings be changed.  Thus the generic “man” was anathematized.  “Mankind” was replaced by “humankind.”   A committee chairman must be called a “chair.”  Victorious in changing terms regarding fellow humans, the inclusivists then shifted to loftier terrain and proposed that pronouns referring to God could be changed without seriously changing one’s understanding of God.  You could, in fact, simply use the word God incessantly, abolishing the need for pronouns.  Gaining ground, they then “began to take greater liberties with interpretive hermeneutic methods, using women’s experience as the norm” (p. 171).  So Letty “Russell concluded that experience equals authority.  She stated that ‘the Bible has authority in my life because it makes sense of my experience and speaks to me about the meaning and purpose of my humanity in Jesus Christ.'”  Accordingly, the biblical “text only has authority as I agree with it and interpret it to my experience'” (p. 171).

            Such women then felt free to re-vision and re-write reality in accord with their own experiences.  Re-casting reality in accord with themselves, they felt free to re-name God as well.  Hinting at things to come, a popular musician, Helen Reddy, accepting a Grammy Award for her 1972 song, “I Am Strong, I Am Invincible, I Am Woman,” said:  “I’d like to thank God because She made everything possible” (p. 135).    Such re-naming efforts, Kassian found, “logically led to an erosion of God’s independent personality.  God became a ‘force.'”  This was manifestly clear when an erstwhile “evangelical feminist,” Virginia Mollenkott, moved from calling God ‘He/She’ to ‘He/She/It'” (p. 145).  Or, one might add, “Whatever”!

            Ultimately Kassian concluded that “Feminism and Christianity are like thick oil and water:  their very natures dictate that they cannot be mixed” (p. 217).  She fully agrees with Virginia Mollenkott that “The language issue is anything but trivial” (p. 237).  But because that’s true she rejects Mollenkott’s conclusions.  Indeed, Kassian says:  “It is important to understand that it is not we who name God, but it is God who names Himself by showing us who He is.  In the book of Exodus, God calls Himself “I am who I am” (Exod. 3:14).  He also reveals Himself as Lord and Master (Adonai), Self-existent One (Jehovah Yahweh), God Most High (El Elyon), and the Everlasting God (El Olam).  In the New Testament Jesus Christ is revealed as Lord (Kyrios) and Son, and the first person of the Trinity is called Father and Abba (dear Father).  The names of God are God’s self-designation of His person and being.  Such names do not tell us who God is exhaustively, but they are informative symbols having a basis in revelation itself” (p. 243).

            Still more:  “God has a name, ‘I AM who I AM’ (Exod. 3:14).  The name of God is important.  The  symbols of faith that compose the Biblical witness–in the form of God’s own name–have been elected by God as means of revelation and salvation.  To challenge or change the name of God as God has revealed it is a denial of God.  It is a denial of who God is.  It is by God’s name that we know Him, it is by His name that we are saved, and it is by His name that we are identified.   Feminism’s attempt to rename God is a blasphemy that comes out of the very depths of Hell.  We have no right to name God.  The only right we have, as created beings, is to submit to addressing God in the manner He has revealed as appropriate.  It is not we who name God, it is God who names Himself” (p. 244). 


            Another recent critique of feminist theology, from a Roman Catholic and European perspective, is Manfred Hauke’s God or Goddess?  Feminist Theology:  What Is It?  Where Does It Lead?  (San Francisco: Ignatius Press, c. 1995).  To explain some of the latent assumptions of feminism, Hauke takes us back to the Utopians of the 19th century, Saint-Simonist socialists, who referred to God “as both Father and Mother, as ‘Mapah'” (p. 21).  They further imagined that in the beginning “there was a mixed male and female being” (p. 21), and reasoned that there is no rigid difference between the sexes, postulating an androgynous ideal still embraced by radical feminists.  Consequently, modern thinkers, such as Simone de Beauvoir, assert:  “‘One does not arrive in the world as a woman, but one becomes a woman.  No biological, psychological, or economic fate determines the form that the female human being takes on in the womb of society'” (p. 29).   This androgynous assumption concerning male and female leads “Christian feminists” to insist that God is likewise androgynous.  So “Father” must be instantly coupled with “Mother” to fully name God.

These utopian socialist roots of the feminist movement are clear in the work of Rosemary Reuther, one of feminism’s most highly regarded theologians, who praised the work on the family by Friedrich Engels, calling it “the fundamental text for consistent feminists” (p. 50).  She also spoke highly–perhaps influenced by Betty Friedan’s Stalinist views–of Russia’s Communist revolution “and praised the China of Mao Tse-Tung” (p. 50).  The revolution Reuther envisions, of course, is theological and ecclesiological, but the same Marxist antipathy to all forms of hierarchy is clear.  Whereas Mao overturned traditional Chinese society, she seeks to destroy the “hegemony” of the patriarchal Church and replace it with a kinder, gentler version.  To do so, one must destroy the “one-sidedly masculine symbols like ‘Father,’ ‘Lord’, and ‘King’.  Only then will the ‘male Church’ disappear.  Alongside the ‘our Father’, some then place an ‘our Mother’; ‘Jesus Christ’ is supplemented by ‘Jesa Christa’; while the third Person appears as the ‘Holy Spiritess'” (p. 49).

            More radical than Reuther, more deeply rooted in Marxism, Mary Daly, long a professor at Boston College, wrote the influential Beyond God the Father and rejected all hierarchical structures.  Neither God, nor any man, would stand above her.  (After successfully refusing, for 20 years, to admit men to her classes, Daly was recently dismissed from BC as a result of a lawsuit brought against her for such discrimination!)  “For Daly, God’s Incarnation as a male human being is the decisive reason for rejecting Christianity.  ‘Christ-worship’, Daly says, ‘is idol-worship'” (p. 193).    Equally radical, if not more so, is Jurgen Moltmann’s wife, Elizabeth Moltmann-Wendel, who tries to blend Israel’s Yahweh with a Canaanite fertility goddess, Astarte–encouraging the worship of “Yahweh/Astarte.”  To Moltmann-Wendel, a loving God, acting like a mother, “would ‘unconditionally’ accept even the immoral person.”  Rather than worry about or confess our sins, we can simply affirm:  “‘I am good.  I am whole.  I am beautiful'” (p. 169).

            Having rather exhaustively studied the works of the most prominent modern feminists, Hauke concludes that they have clearly departed from the Christian faith.  He shares the view of Elke Axmacher, a theological professor at Berlin University, who says:   “A feminist approach to language about God has as much chance of success as an atheist approach to belief in God” (p. 60).  To the extent that the Church tolerates it, he warns, a new religion will develop.

133 Real (Christian) Ethics

 

                Most folks readily give “opinions” on various moral topics ranging from capital punishment to terrorist attacks.  Like the ancient Sophist Protagoras, they take man to be “the measure of all things,” and every man supplies his own measuring rod.  To make decisions, many adopt pragmatic or utilitarian positions–assenting to democratic decisions favoring the greatest good for the greatest number, or whatever enables one to live more “successfully.”  But few of them venture to defend their “opinions” as moral absolutes, timelessly true.  Under the guise of tolerance, only a few evils–such as racism–elicit condemnation.  There are no absolutes, no “objective” truths.  So Hermann Goering quipped:  “I thank my Maker that I do not know what objective is.  I am subjective” (quoted in J.C. Fest, The Face of the Third Reich {Penguin, 1983}, p. 121).  Goering became a Nazi, he said, not because he took Hitler’s ideology seriously; rather, he found it a vehicle whereby to vent his revolutionary passion, his nihilistic feelings, his hunger for vandalism and destruction.  If it feels good, do it!

                Such popular positions closely mirror views advanced in academia, where relativism reigns.  Fashionable “postmodernists” glibly insist that there are no universal “truths,” only interesting perspectives.  To John Rist, “the surcease of ethics can be seen to be parallel to and inextricable from the replacement of truth by assertion” (p. 151).  Learned professors, refusing to discriminate between good and evil, propound a fashionable nihilism that denies “reality” exists–only one’s interpretation of it.  Such intellectuals, as Tom Wolfe shrewdly observed two decades ago, personify a “radical chic,” hosting terrorists and murderers at Manhattan cocktail parties.  They easily become “downwardly mobile,” taking seriously the criminal underclass or “gangsta rap” music, pretending to identify with the world’s “poor and oppressed,” defending despots like Fidel Castro or Palestinian assassins.

                In the midst of such moral nihilism, an eminent classicist and philosopher, John M. Rist, now Professor Emeritus at the University of Toronto, argues that of all the ethical “theories” advanced across the centuries, the moral realism of Platonism provides the “only coherent account” yet designed, and he defends that stance in Real Ethics:  Rethinking the Foundations of Morality (Cambridge:  Cambridge University Press, c. 2002).  Responding to the widely-held notion that there is no metaphysical foundation, no higher source, for moral beliefs, Rist shows how Plato–who perennially pondered “How should I live?”–dealt with the same issue.  “He came to believe that if morality, as more than ‘enlightened’ self-interest, is to be rationally justifiable, it must be established on metaphysical foundations, and in the Republic he attempted to put the nature of these foundations at the centre of ethical debate” (p. 2).

                The struggle between Socrates and Thrasymachus, in Plato’s Republic, illustrates the enduring difference between foundationalism and perspectivism.  Widely espoused by modern academics, perspectivism holds that “truth” cannot be found, so we must construct theories to realize our desires.  Following Thrasymachus, perspectivists like Nietzsche declare that we’re free to devise our own morality and seek, whenever possible, to subject others to our machinations.  There is no higher law, no transcendent source, for ethics.  In Nietzsche’s phrase, we may go “beyond good and evil,” devising our own rules for life.  The debate between Socrates, who insisted that morality is given to us from a higher source, and Thrasymachus, writes Rist, “is a debate between a transcendental realist and an anti-realist who disagree about the possibility of morality” (p. 19).  To Socrates and Plato, morality has a metaphysical basis.  Working within this tradition, theistic Platonists (especially Augustine) discerned that “God can create trees and men, men can make tables, but goodness and justice are not created by God (nor, it follows, by man), but subsist in God’s being or nature” (p. 38). 

                Having established his benchmark in Plato and Platonic theism, Rist then compares a variety of ethicists in the history of philosophy who have sought to establish other bases for morality.  Epicurus and Machiavelli, Hobbes and Kierkegaard and Kant are carefully scrutinized, and Rist shows that despite many differences they all share a potent anti-realism.  Interestingly enough, all their “alternatives to ‘Platonic‘ objectivity in ethics may be forms of the claim–becoming explicit only after Kierkegaard but much indebted to him–that what matters in the world is what we prefer, what we choose to be ‘natural’, what we choose as our own–and precisely because we autonomous beings choose it as our own” (pp. 59-60).  Everything reduces itself to what I desire, what I know, what I can be.  Whether in self-help seminars or self-esteem educational publications, it’s clear that a fixation upon the self reigns in modern culture.  In its Nietzschean version (given a “Christian” twist by Paul Tillich), we’re encouraged to accept “ourselves as we are now, not in any responsibility for our actions, but simply in the being what we are” (p. 220).  Consequently, there has emerged–as is evident in various lawsuits and political appeals–“a choice-based, rights-claiming, largely consequentialist individualism, usually dressed up in democratic clothes” (p. 241).

                In contrast, “For Plato what matters most about human beings is less that they can reason (though to some degree they can and that is important) than that they are moral or ‘spiritual’ subjects, capable of loving the good . . . and hence determining for themselves what kind of life they should live:  that is, whether we should live in accordance with a transcendent (and in no way mind-dependent) Form of the Good . . . or whether we should opt for the alternative life of force, fraud and rationalization, with, as its theoretical counterpart, the denial of metaphysical truths and concentration on the maximization of our desires:  a life in which reason is and ought only be . . . the servant of the passions, tyrannical as those passions will be over both ourselves and others” (pp. 86-87).   So too Aristotle, though he differed with his teacher in important ways, is a Platonist insofar as he emphasized the metaphysical foundations for ethical principles.  However foreign it may seem to modern philosophers, Aristotle thought that “there is something godlike about man,” namely his contemplative potential.   So endowed, despite our imperfections we can behold a transcendently perfect realm, “perfectly existing outside of man and independent of man’s control.  Man is not for Aristotle ‘the measure of all things’, but a variable creature” who finds his greatness through his awareness of and submission to “a superior being” (p.  145).  And Thomas Aquinas, Rist argues, was doubtlessly “a Platonist in that he believes in an ‘eternal law’ which is roughly the Platonic Forms seen in an approximately Augustinian manner as God’s thoughts” (p. 151). 

Though Rist’s placing of Aristotle and Aquinas alongside Plato may initially jar those of us who stress their differences, he makes his point persuasively, and I think he rightly insists that they are all moral realists.   Similarly, it makes sense that he insists that the only answer to our need for ethical direction lies in a recovery of Platonic Realism.  What is good for us, as individuals, is what is good for mankind.  The common good, not the individual good, should concern us.  Ultimately, as Plato held, the “common good will itself depend on the fact of God as a transcendent Common Good, who has made man with his specific needs and limitations and thus gives intelligibility to a common good which is (or should be) the object of human striving in social and political life” (p. 241).

                More importantly, following Plato leads us to God!  We cannot live rightly apart from God.  Knowing what we ought to do does not imply we can do it.  As the Roman poet Ovid lamented, we generally “know the better and follow the worse.”  Without the empowering presence of God, we routinely fail to attain goodness.  To put it “in traditional terms, for morality to function God must function both as final and (at least in great part) as efficient cause of our moral life” (p. 257).  Those philosophers who construct moral systems based upon prudence or self-interest merely dream utopian dreams.  Theists who hope to establish links with atheists, sharing ethical principles, fail to grasp the fact that purely natural norms ultimately collapse into those of Protagoras and Thrasymachus.

                What Plato shows us Christians is that real ethics must be rooted in Reality.  With him, we must realize “that to be moral is not only to be rational, but also, far more importantly, to be godlike insofar as we are able–as Plato also said, agreeing with the Old Testament’s ‘You shall be as gods'” (p. 276).

* * * * * * * * * * * * * * * * * * * *

                When Professor Rist was asked to deliver The Aquinas Lecture, at Marquette University, in 2000, he titled his presentation On Inoculating Moral Philosophy Against God (Milwaukee:  Marquette University Press, 2000), reducing to a few pages some of the more detailed views he set forth in Real Ethics.  As an Aquinas lecturer, he joins some of the most distinguished philosophers of the 20th century, including Mortimer J. Adler, Yves Simon, Jacques Maritain, and Werner Jaeger.  The event provides a pulpit for distinguished philosophers, an opportunity to amplify their convictions.  Rist’s desire, in this lecture, was “to expose and attempt to correct a rather mysterious phenomenon, that of a group of theistic, indeed Christian, philosophers who act as though it makes no great difference in ethics whether God exists at all, who seem inclined to assume that the question of whether there can be moral truths at all in his absence can be lightly put aside” (p. 96).  To use St. Paul’s terms, addressing Christian philosophers, “be not conformed to this world, but be transformed by the renewing of your minds.” 

                Eminent ethicists today, such as Harvard University’s highly influential John Rawls, have openly sought to establish a public, political morality in purely secular, implicitly atheistic terms.  Thus J.L. Mackie titled his influential textbook Ethics:  Inventing Right and Wrong.  Such thinkers are working out the legacy of Immanuel Kant’s “Copernican Revolution”–the declaration “that theoretical reason was essentially impotent, and certainly has nothing to contribute to ethics” (p. 84).  To Kant, and his multiplied heirs, “practical reason” constructs morality in accord with human autonomy.  Nothing metaphysical, nothing supernatural, is knowable.  So man designs his own rules.  Whoever persuades–or coerces–others to follow him writes the laws of the land.  The Kantian “revolution” in philosophy, Nietzsche recognized, forces one to acknowledge that “morality either depends on God or it depends on the will and rationality of man.  We either find it or invent it; it rests either on fact or on choice” (p. 94).  Without God, whom Nietzsche declared to be “dead,” all things are possible.

                Rist’s concern is the split that has severed philosophy from theology–a disastrous dichotomy underlying much that’s wrong with the modern world.  What he wants to recover is an Augustinian approach that envisions theology as “an advanced form of philosophy, a philosophy, that is, in which more data is available (even though by ‘belief’ and ‘in hope’ rather than by knowledge)” (p. 19).  Though he calls this position “Augustinian,” it is more broadly “the mind of the early Church at least from some time in the second century, in the days, let us say, of Justin Martyr” (p. 19).  Augustine incorporated Plato’s philosophy into his theology, but he pushed beyond Plato and relied upon divine Revelation for ultimate truths.  In Plato Augustine found reference to God and the Logos–but only in Christ did he behold the Logos incarnate.  “It was above all the Platonist picture of God,” says Rist, “as transcendent and as the source and nature of value, which appealed to the developing Christian thinkers, especially when coupled with a theory of the return of the soul through love to God” (p. 87). 

                The same needs to be done by Christian ethicists today, says Rist, for “believers in the Christian religion must propose an understanding of the virtues which is impossible for pagans:  which indeed is only possible for those who believe in a God” (p. 37) Who is the loving Creator of all that is.  To see God as revealed Love, Augustine thought, enables one to “claim that love itself is the basis of the other virtues, which thus become, in his language, ‘nothing other than forms of the love of God'” (p. 38). Augustine’s position, anchored in Plato’s metaphysics, provides modern Christians a way to respond to modern moralists.  Christians must clearly set forth and defend an alternative to the secular model.  Indeed:  “The theistic tradition of which some of us believe that Christianity is the developing fulfillment, started, as Augustine recognized, with Plato.  It is not just any metaphysics which can provide an adequate philosophical framework for the truths of Christianity, but a Platonizing framework” (p. 85). 

* * * * * * * * * * * * * * * * * * * *

                Rist’s roots in the thought of St. Augustine stand clear in his Augustine:  Ancient Thought Baptized (Cambridge:  Cambridge University Press, c. 1994), written to describe “the Christianization of ancient philosophy in the version which was to be the most powerful and the most comprehensive” (pp. 1-2).  In Augustine one finds a unique thinker, fully open to the truth of philosophy and devoutly committed to the authority of Scripture and Church.  Many “truths about the Truth had been discovered by the Platonists” (p. 62).  But the “Truth” can be nothing less than God!  The “forms” Plato discerned by reason are “illuminated” by God for Augustine, lifted to a level of clarity and certainty through Revelation.  “When we learn something, he observes . . . our sources are intelligence (ratio) and ‘authority’; he himself has determined never to depart from the authority of Christ” (p. 57).  Only He “is the way, the truth, and the life.”

                To Plato, first-hand knowledge (episteme) excels second-hand knowledge (doxa), beliefs which may or may not be true.  To move from beliefs about, to knowledge of, what is, is the calling of truth-seekers.  To Augustine, so long as “we long for truth or wisdom we remain ‘philosophers,’” but the happiness we more deeply desire results from a rightly ordered love (p. 51).  To see truths about God may satisfy our minds, but to love God, with heart, mind, and soul, satisfies the soul.  And the reality and nature of the “soul” certainly concerned St. Augustine.  To know God and the soul, he thought, were the two great goals of man.  As Rist devotes a chapter to “soul, body and personal identity,” he makes clear that Augustine refused to reduce the soul to the body, ever insisting that there is a non-material essence to a human being, denouncing the “mad materialism” of Tertullian.  By nature we are, he said, both body and soul, mysteriously, indeed miraculously, “blended” together.  The body is the temple of the soul, worthy of reverence, and the body will be resurrected in the end, fulfilling God’s design for us. 

                Failing to fulfill that design–the imago dei–results, Augustine held, from a weakness of our will, the result of Adam’s fall.  As he expressed it in his Confessions, “it is I who willed it, and I who did not–the same I” (8.10.22).  Impaired by sin, the lack of love, “man is unable to choose the good (non posse non peccare), either in the sense that his good actions are never ‘wholly good’, because not motivated by pure love . . . or in the sense that at some point the will certainly choose evil” (p. 132).  Thanks be to God, however, His grace rescues us, restoring the freedom of the will, granting us sufficient strength to rightly respond to His love. 

                And it’s for love we are designed, to love we are called.  “The whole life of a good Christian is a holy desire,” he said (On John’s Epistle, 4.6).  A good man is impelled by “blazing love” (The Happy Life, 4.35).  Love, of course, may be perverted–loving self or things rather than God.  But rightly ordered, informing the virtues, love for God and neighbor constitutes the good life.  “When Augustine wishes to express the goal of the good life, he often speaks of the need to be ‘glued’ to God or ‘to cleave to God in love’” (p. 162).  Rightly loving God enables one to love others.  “For we are justified in calling a man good,” Augustine wrote in The City of God (11:28), “not because he knows what is good, but because he loves the good.”  And he is able to love because God’s grace enables him to.  Consequently, Augustine’s famous injunction, Dilige et quod vis fac (“Love and do what you will”), has little in common with ancient antinomianism or contemporary “situation ethics.”  Rather, real love impels one to “wish to want what God wishes, loves and commands, and God wishes, loves and commands only what is constitutive of his own nature.  God’s nature is by definition unchanging; hence God’s love will be ‘eternal’, and hence we have an ‘eternal law’” (p. 191). 

                After dealing, in detail, with other aspects of Augustine’s thought, Rist concludes his book with a chapter entitled “Augustinus redivivus.”  Granting that Augustine has been misread and misinterpreted–note Calvin’s take on his view of predestination, for instance–we should seriously study and courageously proclaim “the power and persuasiveness of many of Augustine’s ideas, and the perspicacity of many of his observations” (p. 292).  Reading Rist enables one to understand how this should be done.  He writes for scholars, and his works require disciplined attention.  But the rewards are worth the effort.  Few philosophers offer meatier material for Christians seriously committed to the truth and its proclamation.

132 Dissolving Materialism

Materialism, both scientific and philosophical, undergirds modernity.  The physical world, ourselves included, must be reduced to simple material entities, and if we understand them we understand everything.  This was proclaimed in the 18th century by Julien Offray de la Mettrie, who asserted in L’homme machine (1747) that the mind and the brain are simply two words for a single material entity.  Essentially the same is declared by “evolutionary psychologists” such as MIT’s Steven Pinker today.  Man himself can be fully explained in terms of cells and neurons, following mechanical biological and chemical laws.  There are but slight differences of degree separating man from other animals, and to understand him the empirical sciences alone provide the key.  Reducing man to a machine, portraying the mind as a purely material entity–akin to the clockwork universe derived from Newton’s Laws–provides the foundation for secularism.

Countering such a worldview with the best of recent scientific research stands Jeffrey M. Schwartz, a professor of psychiatry at the UCLA School of Medicine, who with Sharon Begley has written a fascinating and persuasive treatise, The Mind and the Brain:  Neuroplasticity and the Power of Mental Force (New York:  HarperCollins Publishers, c. 2002).  This book builds upon the research he’s engaged in for 20 years, blending it into far-reaching philosophical conclusions, for “If materialism can be challenged in the context of neuroscience, if stark physical reductionism can be replaced by an outlook in which the mind can exert causal control, then, for the first time since the scientific revolution, the scientific worldview will become compatible with such ideals as will–and, therefore, with morality and ethics” (pp. 52-53).  Armed with recent research breakthroughs, he argues for a view earlier advocated by noted neurologists such as Wilder Penfield, Charles Sherrington, and John Eccles–impeccably qualified scholars who (generally after a lifetime of study) concluded that there’s simply something more to the mind than the brain.  As Penfield said, in 1975, “‘Although the content of consciousness depends in large measure on neuronal activity, awareness itself does not . . . .  To me, it seems more and more reasonable to suggest that the mind may be a distinct and different essence’” (p. 163).

Materialistic assumptions–not accurate scientific data, Schwartz says–explain the deeply rooted belief that the brain, as a biological entity, fully explains our thinking processes.  Fleshed out in the highly influential writing of behaviorists such as John Watson and B.F. Skinner, or of psychoanalysts such as Sigmund Freud, materialism scoffed at free will and any alleged ability of the person thinking to transcend the mechanical activities of his brain.  To materialists, reference to any immaterial “mind” denotes the superstitions of a pre-scientific era.  Taking their position, of course, eliminates the possibility of consciousness (“knowing that you know” [p. 26]), free will, and moral responsibility.  Indeed:  “The rise of modern science in the seventeenth century–with the attendant attempt to analyze all observable phenomena in terms of mechanical chains of causation–was a knife in the heart of moral philosophy, for it reduced human beings to automatons” (p. 52).

Early enchanted by the mysterious inner workings of the thought processes, Schwartz began to do research with people suffering obsessive-compulsive disorders (e.g. repetitively washing one’s hands).  Drawing upon the Buddhist notion of mindfulness, he taught them to learn how to stand apart from their compulsive thoughts, to evaluate and consciously correct them, allowing their “minds” to give directions to their “brains.”  Such therapy did more than help his patients, however, for with the assistance of PET data Schwartz began to document the amazing plasticity of the brain.  “This was the first study ever to show that cognitive-behavior therapy–or, indeed, any psychiatric treatment that did not rely on drugs–has the power to change faulty brain chemistry in a well-identified brain circuit.  What’s more, the therapy had been self-directed, something that was and to a great extent remains anathema to psychology and psychiatry” (p. 90).  The conscious mind, supervising brain activities, actually re-wires the brain!

Schwartz’s neurological research linked him up with Henry Stapp, an eminent physicist at the Lawrence Berkeley National Laboratory who has devoted his scholarly career to the study of quantum physics.  Stapp has worked out the implications of quantum theory as enunciated by the great Hungarian mathematician John von Neumann, who said “‘that the world is built not out of bits of matter but out of bits of knowledge–subjective, conscious knowings’” (p. 31).  His research, fortuitously, paralleled Schwartz’s, and he had concluded, as is evident in Mind, Matter and Quantum Mechanics, that quantum physics also reveals how the immaterial “mind” shapes the material world.  “‘The replacement of the ideas of classical physics by the ideas of quantum physics,’” says Stapp, “‘completely changes the complexion of the mind-brain dichotomy, of the connection between the mind and the brain’” (p. 48).  The reigning assumption, entrenched since Descartes, that only material entities could causally affect other material entities, dissolves in the world of quantum mechanics.  This is illustrated by the phenomenon of nonlocality, perhaps one of the most important breakthroughs in the history of science.  Quantum physics shows that a specific “action here can [instantly] affect conditions there” (p. 348), even though here and there are light years apart!  Physical causation requires no material medium!  So Stapp and Schwartz both now believe that the power of the will, freely exercised, independent of physical stimuli, “generates a physical force that has the power to change how the brain works and even its physical structure” (p. 18).

In the process of building his philosophical case, Schwartz provides an extensive and fascinating discussion of what we know about the brain, a truly marvelous and mysterious three-pound ball of neurons.  He details how the brain develops, how it responds to various stimuli, how experiments with monkeys have opened for us deeper understandings of how it functions.  Virtually all the studies he discusses–and the high-level scholarly conferences he’s attended–have taken place during the past decade, and one easily grasps how up-to-date and pertinent his presentation is.  Within the past five years, for example, important and encouraging work has been done with small groups of stroke victims, who were once thought permanently disabled.  A new kind of therapy, constraint-induced (CI), reveals, for “the first time,” a demonstrable “re-wiring of the brain” following a stroke (p. 195).  Children suffering specific language impairment (SLI) may hope, given recently developed therapies, to overcome their affliction.

What’s being proved in such experiments is what researchers a decade ago widely doubted:  the reality of neurogenesis, neuroplasticity–consciously directed brain developments.  This further means we are truly free to think and to act.  Locked into classical physics, even Einstein in 1931 declared that it is “man’s illusion that he [is] acting according to his own free will” (p. 299).  Ever resisting quantum theory, with its indeterminism, Einstein represents a worldview in the process of dissolving, Schwartz believes.  And he cites recent, carefully crafted experiments, documented in a special 1999 issue of the Journal of Consciousness Studies devoted to “The Volitional Brain:  Towards a Neuroscience of Free Will,” that demonstrate the growing openness to human freedom in the brain research community.  Much of this is to say that William James was right, a century ago, when he insisted that “Volitional effort is effort of attention.”  What we freely attend to, in our consciousness, shapes us.  “The mind creates the brain” (p. 364).  Obviously the brain is the material with which the mind works.  But mind is more than the brain.  As Anthony Burgess wrote, in A Clockwork Orange, “‘Greatness comes from within . . . .  Goodness is something chosen.  When a man cannot choose he ceases to be a man’” (p. 290).

The Mind and the Brain is one of the most fascinating books I’ve read in some time.  Dealing with some of the most difficult theoretical issues imaginable, the authors succeed in making clear the implications of the most recent scientific research.  And, equally important, they understand the philosophical implications of their study and develop them persuasively.

* * * * * * * * * * * * * * * * * *

Coming at the same issue from a very different perspective is Benjamin Wiker’s Moral Darwinism:  How We Became Hedonists (Downers Grove:  InterVarsity Press, c. 2002).  The book’s plot, as William Dembski says, is this:  “Epicurus set in motion an intellectual movement that Charles Darwin brought to completion” (p. 9).  Still more:  “Understanding this movement is absolutely key to understanding the current culture war” (p. 9).  Underlying both the ancient and the modern versions of hedonism is an anti-supernatural cosmological materialism.  Consequently, theists who see God at the center of their worldview cannot but do battle with Epicureans of every century, and Wiker wants to help arm us for active combat.

Materialism pervades virtually all branches of science, ranging from astronomy to microbiology, as naturalistic thinkers insist that everything that exists can be reduced to simple material entities.  The basic reason for this, Wiker says, is that “modern science itself was designed to exclude a designer.  Even more surprising, modern science was designed by an ancient Greek, Epicurus,” who lived three centuries before Christ (p. 18).  “The argument of this book, then, is that the ancient materialist Epicurus provided an approach to the study of nature–a paradigm, as the historian of science Thomas Kuhn called it–which purposely and systematically excluded the divine from nature, not only in regard to the creation and design of nature, but also in regard to divine control of, and intervention in, nature.  This approach was not discovered in nature; it was not read from nature.  It was, instead, purposely imposed on nature as a filter to screen out the divine” (p. 20).  To support his hedonistic ethics, to feel at ease with his lifestyle, Epicurus set forth a materialistic cosmology.  Centuries later, “Modernity began by embracing his cosmology and ends by embracing his morality” (p. 23).

Wiker develops his argument by tracing historical developments of Epicurean thought.  Embracing Democritus’s scientific hypothesis–that nothing exists but atoms-in-motion–Epicurus developed a consistent materialism that reduces moral questions to preferences of pleasure rather than pain.  Good is what feels good.  Evil is what feels bad.  So do whatever feels good, however much it may change from time to time and place to place.  Epicurus’s ideas were picked up and given poetic expression by Lucretius, one of the great Latin stylists.  Though Hedonism certainly impacted the ancient world, it wilted under the philosophical weight of Platonic and Aristotelian philosophy and the dynamic growth of Christianity.  The world is as it is, Christians insisted, because God designed it.  The godless cosmos and normless ethos of Epicurus slipped into the cellar of discarded errors as Christians shaped Western Christian Culture during the Medieval Era.  But errors are often dragged back to light, dressed up in new clothes, and such happened to Epicureanism.  During the late Middle Ages the authority of Aristotle was questioned and nominalism made powerful inroads in key quarters.  As the Renaissance developed, Lucretius was rediscovered, along with other classical texts, paving the way for the “scientific revolution” of the 17th and 18th centuries.  “We are materialists in modernity,” Wiker says, “in no small part, because we were lovers of Lucretius at the dawn of modernity” (p. 59).

Shaping modernity were gifted scientists such as Galileo and Newton, in whom Wiker sees “the vindication of atomism through the victory of mathematics” (p. 112).  Consequently, under the guidance of increasingly irreligious scientists, a triumphant worldview is established which demonstrates “the complete theoretical victory of Epicurean materialism:  all the essential elements of Epicurus’s system–the eternal and indestructible atoms, the infinite universe with the unlimited number of worlds, the banishment of the creator God, the rejection of miracles, the displacement of design in nature by chance and material necessity, and the elimination of the immaterial soul–fell into place during the eighteenth and nineteenth centuries” (p. 112).  Laplace’s answer to Napoleon’s question concerning the place of God in his scientific work sums up the consummation of this process:  “Sire, I have no need of that hypothesis.”

Without God, objective morality disappears as well.  Such is starkly evident in the work of Thomas Hobbes, one of the architects of modern thought.  By nature, we war against each other; only the fittest survive–nothing is naturally right or wrong.  To secure a peaceful society, however, we assent to the rule of a sovereign, who prescribes the rules.  Hobbes also helped subvert the authority of any divinely inspired Scripture, devising an approach to interpretation consonant with his Epicurean materialism, denying the reality of the immaterial, immortal soul, questioning the possibility of miracles and of heaven and earth.  Benedict Spinoza picked up on such ideas, and the corrosive acid of biblical criticism gained momentum.  So it follows that Thomas Jefferson, who “considered himself an Epicurean and studied Epicurus in Greek” (p. 207), put together his own sacred text, entitled The Life and Morals of Jesus of Nazareth. 

Importantly, Wiker concludes, Epicureanism shaped Darwinism.  A materialistic metaphysics, evident in both positions, cannot but shape the ethical views it dictates.  Neither Epicurus nor Darwin had demonstrable evidence for their theories, but they both had a solid faith in their explanatory powers.  Eminent scientists, such as Lord Kelvin (relying on statistical probability) and Louis Agassiz (the reigning expert on fossils), resolutely critiqued the theory of evolution through natural selection.  But philosophers (Spencer and Marx) and publicists (Huxley) found it perfectly designed for their moral and social agendas.  Importantly, Wiker says, “We must always keep this in mind:  for Darwin nature did not intend to create morality, any more than nature intended to create certain species; morality was just one more effect of natural selection working on the raw material of variations in the individual” (p. 244).

In an amoral cosmos, of course, anything goes.  Thus Darwinian science has incubated Epicurean Hedonism.  Here Wiker guides us through the development of eugenics, from Darwin through Haeckel (whose books sold hundreds of thousands of copies in Germany) to Hitler himself.  Eugenics easily justifies abortion and euthanasia, also proposed by Haeckel as ways whereby to purify the race and later employed by Hitler’s henchmen.  Nearer home, Margaret Sanger embraced Darwinism and promoted various eugenic measures.  She championed birth control, for example, in order “‘To Create a Race of Thoroughbreds'” (p. 266).  Sexual activity itself, Sanger believed, should involve anything that feels good, for nothing is moral in the world of evolution through natural selection.

Even more abandoned to amorality was Alfred Kinsey, long regarded as an eminent man of science, a “sexologist” who allegedly informed the nation how people actually behaved.  Recent studies reveal that Kinsey was an incredibly perverted man, engaging in various forms of deviant behavior, including pedophilia.  His allegedly “scientific” studies were, in fact, fraudulent screeds designed to encourage the breakdown of sexual restraint.  However untrue, his views entered the nation’s textbooks and journalistic assumptions, powerfully evident in an episode on the recent PBS Evolution series, where viewers were encouraged to see the similarities between the sex life of humans and some primates called “bonobos,” who engage in all sorts of sexual activity (heterosexual and homosexual, adults with juveniles) simply for pleasure.  Consequently:  “Just as Kinsey’s views on the naturalness of premarital sex and homosexuality became the scientific foundation for the transformation of sexual morality from a Christian natural law position to that of the Epicurean, so also Kinsey’s views on the naturalness of pedophilia have become the foundation of the slow but sure revolution going on right now pushing adult-child sex as natural” (p. 285).  And, according to Darwinian principles, anything that feels good is natural and thus allowed.

Wiker sets forth a fascinating historical thesis.  To see modernity in the light of Epicurus certainly clarifies the deeply philosophical premise that shapes our culture.  To do as well as our ancient Fathers in the Faith, responding to hedonism, is clearly our challenge.

131 Islam: Past & Present

A widespread scholarly consensus exists concerning the Middle East:  to understand it historically one must read the works of Bernard Lewis, Professor of Near Eastern Studies Emeritus at Princeton University.  Having written over two dozen scholarly studies, he is well qualified to explain, in his most recent publication, What Went Wrong?  Western Impact and Middle Eastern Response (New York:  Oxford University Press, c. 2002).  For three centuries, he says, Muslims have asked this question, and it underlies much of the anger and envy evident in the terrorism that now haunts the West.  Indeed, “In the course of the twentieth century it became abundantly clear in the Middle East and indeed all over the lands of Islam that things had indeed gone badly wrong.  Compared with its millennial rival, Christendom, the world of Islam had become poor, weak, and ignorant” (p. 151).

This reversed the conditions of the world Muslims once ruled.  Following Mohammed’s death in 632 A.D., his followers rapidly conquered much of the formerly Christian world–Syria, Palestine, Egypt, North Africa, Spain, Sicily.  By 732 they were in central France, and in 846 “a naval expedition even sacked Ostia and Rome” (p. 4).  In 1453 Muslims conquered Constantinople, finally burying the last remnants of the once powerful Byzantine Empire, and added the Balkans to their hegemony.  By 1529, as Luther was orchestrating his Reformation in Wittenberg, Muslim armies threatened Vienna, only to be repelled by Charles V.  Indeed, “Islam represented the greatest military power on earth,” Lewis says, and sustained it with a sophisticated (albeit exploitative) economic system (p. 6).

Then, abruptly, things changed.  Europeans, after a millennium of defending themselves against Islam, took the offensive and rapidly overwhelmed their oppressors.  Incubated by the Renaissance and Enlightenment, new technologies provided Europeans the means with which to outmaneuver and overwhelm their foes.  Portuguese and Spanish explorers bequeathed colonies to their monarchs, encircling the Muslims and disrupting their trade monopolies, funneling gold and silver and agricultural products into Europe.  Whereas a Muslim army had merely been repulsed at Vienna in 1529, the second siege of Vienna, in 1683, resulted in a disastrous defeat, followed by a rout.  In the words of an Ottoman chronicler:  “This was a calamitous defeat, so great that there has never been its like since the first appearance of the Ottoman state” (p. 16).  Further east, Russia’s tsars, recovering lands lost during the Mongol invasions and occupation, began pushing south and east, challenging Muslim dominance.  By 1696, Peter the Great had occupied Azov, providing Russia a port on the Black Sea.

For the next three centuries, Muslims struggled to cope with their new, largely inferior status vis-a-vis Europe, trying to understand “what went wrong.”  One lesson, Lewis says, was learned on the battlefield.  Once almost omnipotent in battle, Muslims found themselves shocked by Europe’s military superiority.  Technically, whether considering naval vessels or soldiers’ arms, the West had advanced in military equipment whereas Muslims still tended to rely upon their swords and personal valor.  By 1798, when Napoleon and a small corps of French soldiers invaded and occupied Egypt, the disparity was clear, and during the 20th century most Arab lands were reduced to the humiliating status of European colonies.

Muslim inferiority was similarly evident in trade and commerce.  During the Renaissance and Enlightenment, Europeans began to study other languages and understand other cultures, whereas Muslims (elitists who disdained lesser cultures) rarely bothered to learn about their Christian foes.  To travel outside Muslim realms, to study under infidels, to acknowledge the achievements of non-Muslim peoples, was discouraged.  Though certain Western technologies were coveted and appropriated, the widespread resistance to everything associated with the Christian world prevented Muslims from assimilating many of the “modern” developments that transformed the world.  Illustrating the outcome of this process, Lewis says that today:   “the total exports of the Arab world other than fossil fuels amount to less than those of Finland, a country of five million inhabitants.  Nor is much coming into the region by way of capital investment.  On the contrary, wealthy Middle Easterners prefer to invest their capital abroad, in the developed world” (p. 47).

Turning to “social and cultural barriers,” Lewis focuses on three oppressed groups within Islam:  unbelievers, slaves, and women.  Though unbelievers enjoyed a degree of “tolerance,” economic restrictions and social pressures severely reduced their standing.  While Europeans largely outlawed slavery in the 19th century, the institution still persists in Muslim circles.  And virtually every Westerner visiting Muslim lands immediately notices the subordinate status of women under Islam.  Resurgent Islam, directed by radicals like the Ayatollah Khomeini, insists that “the emancipation of women–specifically, allowing them to reveal their faces, their arms, and their legs, and to mingle socially in the school or the workplace with men–is an incitement to immorality and promiscuity, and a deadly blow to the very heart of Islamic society, the Muslim family and home” (p. 70).

However embedded in Muslim traditions, such social and cultural factors contributed to the isolation and progressive impoverishment of their nations.  So they fell victim to European superiority.  Yet while Europeans–and now Americans–flexed their muscles in Arab countries, an abiding resentment boiled within Arab hearts.  So too, as Israel attained statehood–and developed a flourishing society in an area long reduced to a desert under Arab rule–a virulent anti-Semitism boiled to the surface.  Prophetically, writing this book in 1999, Lewis noted:   “If the peoples of the Middle East continue on their present path, the suicide bomber may become a metaphor for the whole region, and there will be no escape from a downward spiral of hate and spite, rage and self-pity, poverty and oppression, culminating sooner or later in yet another alien domination; perhaps from a new Europe reverting to old ways, perhaps from a resurgent Russia, perhaps from some new, expanding superpower in the East” (pp. 159-160).

*************************************

For anyone interested in a more detailed history, Bernard Lewis’s The Middle East:  A Brief History of the Last 2,000 Years (New York:  Simon & Schuster, c. 1995) is probably the best available.  Accurate, analytical, up-to-date, readable, it deserves accolades such as “masterpiece” routinely given it.

After sketching the pre-Christian societies in the Middle East, explaining the various peoples living therein, Lewis charts Christianity’s effective expansion and establishment–from Ethiopia to Persia, from Macedon to Arabia–during the first six centuries of the Christian Era.  Then came Mohammed!  His teachings inspired devotees to conquer much of the world in the seventh century.  More importantly, Lewis says:  “It is the Arabization and Islamization of the peoples of the conquered provinces, rather than the actual military conquest itself, that is the true wonder of the Arab empire” (p. 58).  Amidst the success of Arab armies, however, the empire developed internal tensions.  Mohammed’s immediate successors, the “caliphs,” quarreled among themselves.  Indeed, during the “golden age” of Islam three of the four caliphs were assassinated.  Mohammed’s blood relatives struggled against those who claimed to better represent the prophet.  So factions developed–Shi’ite battling Sunni–that still divide the Muslim world.

Despite internal turmoil, however, the Arab Empire prevailed, dominating much of the globe for 1,000 years.  Providing accurate information, without getting buried in the details, Lewis gives a cogent overview of the ‘Abbasid Caliphate, then charts the “steppe peoples’” invasions from the north and east, including the conquests of Jenghiz Khan’s Mongol warriors.  First absorbing the blows of the invaders, then slowly converting them to Islam, Muslims preserved the essential character of Islam, though the center of power constantly shifted as the dominance of one group (e.g., Egypt or Persia) dictated its trajectory.

Following a chronological overview, Lewis discusses various aspects of Muslim culture, explaining such things as politics, economics, the elites, religion, and law.  To Muslims, he explains, there is no clear distinction between politics and religion.  In accord with Mohammed’s teaching and example, “the choice between God and Caesar, that snare in which not Christ but so many Christians were to be entangled, did not arise.  In Muslim teaching and experience, there was no Caesar.  God was the head of the state, and Muhammad his Prophet taught and ruled on his behalf” (p. 138).  Since Muhammad himself was a trader and warrior, and his Arab followers were nomadic herdsmen and warriors, they tended to have little interest in agriculture.  Consequently, as the great Muslim historian Ibn Khaldun noted in the 14th century, under Islam “‘ruin and devastation prevail’ in North Africa, where in the past there was ‘a flourishing civilization, as the remains of buildings and statues, and the ruins of towns and villages attest'” (p. 166).  Warriors from the Arabian desert generally made deserts wherever they settled!

Lewis clearly explains Islam’s core elements, such as its “five pillars.”  Given the current world scene, his discussion of “jihad” (holy war) clarifies the perennially militant stance Muslims assume, for they embrace a sacred obligation to conquer the world and bring all peoples into submission to Islamic law (and thence, encourage conversion to the Islamic faith).  Consequently, “the Christian crusade, often compared with the Muslim jihad, was itself a delayed and limited response to the jihad and in part also an imitation.  But unlike the jihad it was concerned primarily with the defense or re-conquest of threatened or lost Christian territory” (p. 233).  Muslims, Lewis shows, were preoccupied with internal controversies and paid little attention to the Christian crusades.  And they certainly did not condemn them as do modern Westerners who wield the Crusades as a bludgeon with which to batter Christianity.

**********************************************************

Far more critical of Islam, Bat Ye’or, an Egyptian-born scholar living in France, recounts what Christians suffered under Muslim rule in The Decline of Eastern Christianity Under Islam:  From Jihad to Dhimmitude (Cranbury, NJ:  Associated University Presses, c. 1996).  In an enlightening foreword to the book, Jacques Ellul notes that there exists in the West a “current of favorable predispositions to Islam,” notably evident in the many euphemistic discussions of jihad.  By setting forth the historical facts, Bat Ye’or dares to contradict the prevailing assumptions regarding Islam.  “Historians,” Bat Ye’or says, “professionally or economically connected to the Arab-Muslim world, published historical interpretations relating to the dhimmis, which were either tendentious or combined facts with apologetics and fantasy.  After World War II, the predominance of a left-wing intelligentsia and the emergence of Arab regimes which were ‘socialist’ or allied to Moscow consolidated an Arabophile revolutionary internationalism” that remains strong in much of the contemporary world (pp. 212-213).

Jihad, in fact, helps constitute Islam, Ellul says, for it is a sacred duty for the faithful.  Indeed “it is Islam’s normal path to expansion.”  Unlike the “spiritual” combat imagined by some pro-Islamic writers, jihad  advocates “a real military war of conquest” followed by an iron-handed “dhimmitude,” the reduction of conquered peoples to Islamic law (p. 19).  Muslims divide the world into two–and only two–realms:  the “domain of Islam” and “the domain of war” (p. 19).  At times, strategy dictates tactical concessions and “peaceful” interludes.  But ultimately, Muslims are committed to conquer and control as much of the globe as possible.  Ellul stresses this “because there is so much talk nowadays of the tolerance and fundamental pacifism of Islam that it is necessary to recall its nature, which is fundamentally warlike!” (p. 20).  Writing presciently, in 1991, Ellul declared:  “Hostage-taking, terrorism, the destruction of Lebanese Christianity, the weakening of the Eastern Churches (not to mention the wish to destroy Israel) . . . all this recalls precisely the resurgence of the traditional policy of Islam” (p. 21).

Turning from Ellul’s remarks to Bat Ye’or’s treatise, we enter into a carefully crafted description of what happened to non-Muslim peoples under the yoke of Islam in the Mediterranean basin, Turkey, Armenia, Mesopotamia, and Iran, a subject heretofore distinguished by a paucity of reliable studies.  She meticulously defines jihad, noting that it may be waged through both overt war and more covert means:  “proselytism, propaganda, and corruption” (p. 40).  Whatever means prove necessary for Muslims to conquer and control lands and non-Muslim peoples find justification as jihad.  Thus motivated, Muslims established an enormous empire by the time of Charlemagne (ca. 800 A.D.), though in truth Muslim warriors were often brutal and booty-hungry pillagers, driven more by greed than holy zeal.

So too, when Muslims ruled a region, reducing all non-Muslims to dhimmitude, they exploited and oppressed (especially through onerous, discriminatory taxation) their subjects.  Forcibly occupying highly-civilized realms such as Egypt, Muslim rulers slowly and surely reduced them to wastelands, economically and culturally depressed shadows of ancient glory.  Everywhere the Muslims went, there resulted “the agricultural decline, the abandonment of villages and fields, and the gradual desertification of provinces–densely populated and fertile during the pre-Islamic period” (p. 102).   All the land under Muslim rule was “administered by Islamic law for the benefit of Muslims and their descendants” (p. 70).  More systematically and thoroughly than Europeans appropriating American Indian lands, the Muslims impoverished conquered peoples.  Even the much-vaunted “Islamic civilization” was derived, sucked out of dying corpses, not created.  “Islamic literature, science, art, philosophy, and jurisprudence,” Bat Ye’or says, “were born and developed not in Arabia, within an exclusively Arab and Muslim population, but in the midst of conquered peoples, feeding off their vigor and on the dying, bloodless body of dhimmitude” (p. 128).

Theoretically, Jews and Christians had religious freedom, but in fact “at no period in history was it respected” (p. 88).  Theoretically, conversions to Islam were to be voluntary.  In fact, massacres, torture, slavery and intimidation punctuated the process.  In Spain, two centuries after occupation, “in 891 Seville and its surrounding areas were drenched in blood by the massacre of thousands of Spaniards–Christian and muwallads.  At Granada in 1066, the whole Jewish community, numbering about three thousand, was annihilated” (p. 89).   To understand the much-maligned Christian Crusades, one must see them as defensive, just wars designed to relieve the suffering of oppressed and enslaved believers.  Centuries later, in 1915, “the genocide of the Armenians was a combination of massacres, deportations, and enslavement.  In the central regions of Armenia, the male population over the age of twelve was wiped out en masse:  shot, drowned, thrown over precipices, or subjected to other forms of torture and execution” (p. 196).

In short, Bat Ye’or says, “irrefutable historical and archaeological sources confirm” that the “process of Islamization” in conquered lands, “was perhaps the greatest plundering enterprise in history” (p. 101).  Reading this book certainly sobers one!  She supports her presentation with extensive footnotes and 175 pages of illustrative documents and finds little admirable in Islamic rule.  The weight of the evidence, the factual refutation of Arabophile histories, persuades one that the terrorists operating in the world today are hardly an aberration of Islam!

******************************************

For a brief, handy overview of the subject, James L. Garlow’s A Christian’s Response to Islam (Tulsa:  RiverOak Publishing, c. 2002) sets forth a pastor’s response to 9/11, including a clear critique of the gushy universalism that “referred to every deceased person as ‘being in Heaven'” (p. 83).  Such sentimentality was further evident when a “United Church of Christ fellowship announced it would substitute readings from the Koran for Bible readings for eight consecutive Sundays.  The pastor of one of the nation’s largest Methodist churches declared in a magazine article that God is the same one worshipped in ‘mosques, synagogues, and churches'” (p. 72).  Against such notions Garlow protests, for his concern is not so much with fully understanding Islam as with rightly responding as committed Christians to the contemporary scene.  The book began as a series of ever-expanding e-mailings to friends following the terrorists’ attacks, and, without pretending to be the definitive study of Islam or to provide a scholarly appraisal of its history, “it has one agenda:  to increase love and boldness for Christ with the result that we more effectively share Him with all (including Muslims), rather than simply ‘blending in’ with our multireligious culture” (p. 6).

Garlow roots his presentation in the ancient biblical account of Ishmael and Isaac, then explains how Mohammed and the Muslims, following the Koran’s message, have impacted the world.  In response, Christians must avoid either “Muslim-bashing” or “the knee-jerk reaction of platforming Muslims in Christian churches, thus implying that ‘We all worship the same God’ or buying into the politically correct line that ‘Islam is a religion of peace'” (p. 85).  There is, for example, a distinctive difference between Jehovah, revealed in the Old Testament, and Allah, highlighted in the Koran.  Jesus, to the Muslim, is merely one of 25 prophets, with Mohammed the last and most important.  To Christians, of course, He is the Eternal Son of God.  Consequently, Christians should take the opportunity to proclaim ever more vigorously that Jesus is the name above all names, the sole Savior of all mankind!  Without compromising their faith, Christians must also extend the hand of friendship to Muslims, building good relationships with them, learning the truth about their faith and their culture.  Having established a position of trust, dealing with them in very personal ways, Christians can bear witness to the faith that is within them, especially emphasizing the centrality of Christ.

# # #

130 Laws of Leadership


For many years John Maxwell has both exemplified and written about “leadership.”  Though his concern has always been the local church, having long pastored San Diego’s Skyline Wesleyan Church, his influence now includes the corporate world as well.  His The 21 Irrefutable Laws of Leadership:  Follow Them and People Will Follow You (Nashville:  Thomas Nelson Publishers, c. 1998) contains, he says, a “short list” of all he has learned.  The book became quite a “best seller,” garnering plaudits from diverse corners. 

Such plaudits include these words from Tom Landry, former coach of the Dallas Cowboys:  “John Maxwell understands what it takes to be a leader, and he puts it within reach with The 21 Irrefutable Laws of Leadership.  I recommend this to anyone who desires success at the highest level, whether on the ball field, in the boardroom, or from the pulpit.”  The founder of Promise Keepers, Coach Bill McCartney, agrees:  “In typical Maxwell style, filled with wisdom, wit, and passion, John provides a wealth of practical insights on what it takes to be a successful leader.”

            Let me simply list Maxwell’s “laws”:

1.  THE LID.  Leadership Ability Determines a Person’s Level of Effectiveness.
2.  INFLUENCE.  The True Measure of Leadership Is Influence–Nothing More, Nothing Less.
3.  PROCESS.  Leadership Develops Daily, Not in a Day.
4.  NAVIGATION.  Anyone Can Steer the Ship, But It Takes a Leader to Chart the Course.
5.  E.F. HUTTON.  When the Real Leader Speaks, People Listen.
6.  SOLID GROUND.  Trust Is the Foundation of Leadership.
7.  RESPECT.  People Naturally Follow Leaders Stronger than Themselves.
8.  INTUITION.  Leaders Evaluate Everything with a Leadership Bias.
9.  MAGNETISM.  Who You Are Is Who You Attract.
10.  CONNECTION.  Leaders Touch a Heart Before They Ask for a Hand.
11.  INNER CIRCLE.  A Leader’s Potential Is Determined by Those Closest to Him.
12.  EMPOWERMENT.  Only Secure Leaders Give Power to Others.
13.  REPRODUCTION.  It Takes a Leader to Raise Up a Leader.
14.  BUY-IN.  People Buy Into the Leader, Then the Vision.
15.  VICTORY.  Leaders Find a Way for the Team to Win.
16.  BIG MO.  Momentum Is a Leader’s Best Friend.
17.  PRIORITIES.  Leaders Understand That Activity Is Not Necessarily Accomplishment.
18.  SACRIFICE.  A Leader Must Give Up to Go Up.
19.  TIMING.  When to Lead Is as Important as What to Do and Where to Go.
20.  EXPLOSIVE GROWTH.  To Add Growth, Lead Followers–To Multiply, Lead Leaders.
21.  LEGACY.  A Leader’s Lasting Value Is Measured by Succession.

            Given the appeal of Maxwell’s work, the current pastor of Skyline Wesleyan, Jim Garlow, decided to illustrate its principles through a survey of historical leaders, titling his spin-off The 21 Irrefutable Laws of Leadership Tested by Time:  Those Who Followed Them . . . And Those Who Didn’t (Nashville:  Thomas Nelson Publishers, c. 2002).  To help him with the research, Pastor Garlow asked me to join him in the project, and he graciously credits me, on the title page, for my assistance, so I confess a vested interest in the publication.

            Prior to his pastoral ministry, Garlow earned an M.Th. from Princeton University and a Ph.D. in church history from Drew University.  He has an absorbing interest in history and believes that “history is a great teacher.  By looking at the successes and failures of those who have gone before us, we can hopefully avoid their errors and gain from their strengths” (p. 2).  During one’s lifetime critics and lapdogs easily err, but judicious historians more accurately appraise a man’s true worth.  To them it becomes clear that some folks sacrifice their lives for “things that do not retain value.”  Conversely, others loom large for wisely investing in those permanent things that matter most.  Looking to the past, we discern those “who understood the principles of leadership” as well as those who tragically failed. 

            Maxwell’s first law, “The Law of the Lid,” insists that “leadership ability determines a person’s level of effectiveness.”  This law stands revealed in men who had great talents, unusual potential, but failed for lack of leadership skills.  “Leadership skill,” notes Garlow, “is the difference between success and failure; it is the difference between creative vitality and mediocre maintenance” (p.2).   This is dramatically illustrated in one of the two father-son teams that served as presidents of the United States, John Adams and John Quincy Adams. 

            “The second and sixth presidents of the United States came to that position thoroughly gifted and prepared–or so it seemed” (p. 7).  When elected President in 1796, John Adams enjoyed great prestige.  He’d excelled in virtually every previous endeavor, serving as a leader in the Continental Congress and as George Washington’s Vice President.  Furthermore, he was widely respected for his integrity.  He was, however, somewhat egotistical and bullheaded, adept at alienating both friends and foes.  Benjamin Franklin, who knew him well, quipped that he was “always honest, often great, but sometimes mad.”

            Taking up the reins of the presidency in 1797, Adams quickly showed how a gifted man fails as a leader.  Like many who personally perform well, he “was unable to delegate” (p. 8).   Like the Lone Ranger, “he tried to do most everything himself” (p. 8).  Compounding the problem, he frequently absented himself from his office!  “He loved his home in Quincy, Massachusetts, and was unusually unhappy in Philadelphia,” so he  “spent a shocking one-fourth of his presidency away from the nation’s capital, in Boston, in an era without phones, faxes, computers, or any other means of communication faster than horse travel!  He was an absentee president” (p. 8).  As is typical of highly intelligent men, Adams often saw too many sides of various issues and failed to act when crises demanded it.  Consequently, Adams lost the election of 1800.  “Inability to delegate, absenteeism, communication deficiencies, indecisiveness, and lack of discernment have one thing in common: lack of leadership skills,” Garlow says.  “Was he honest? Yes. Was he bright? Yes. Was he good?  Yes” (p. 10).   

            In 1824, John Adams’s son, John Quincy Adams, was elected the nation’s sixth president.  No one could ask for better parents!  “He had a loving father who guided him. His mother, Abigail Adams, was one of the most outstanding colonial women. Son John inherited much of his parents’ intellectual brilliance and Puritan ethic” (p. 10).   He was an unusually gifted man, obviously one of the most intelligent and most experienced of America’s presidents.  But despite his “uninterrupted success” in earlier assignments, he almost immediately failed.  Like his father, he had poor “relationship skills,” proving himself “exceptionally able to offend and alienate people.”  When he met Andrew Jackson, who had received more votes than Adams in the election, he refused to shake hands with the general,  “who graciously greeted him and offered his hand. Petulantly, Adams stood immobile, disdaining Jackson’s gesture, and replied in a manner designed to offend” (p. 11). 

            When he addressed the nation as President, Adams spoke apologetically, inviting criticism through his own lack of confidence in his abilities.  He was, without question, a good man, dedicated to his work.  “But he failed as a leader” (p. 12).  In Samuel Eliot Morison’s appraisal, he “‘was a lonely, inarticulate person unable to express his burning love of country in any manner to kindle the popular imagination.'”  As John Maxwell so often says, “He who thinks he is a leader, but has no followers, is only taking a walk” (p. 13).  “Much like his father,” Garlow says, “John Quincy Adams illustrates the ceiling principle. Utterly competent on one level, he failed to grow with his opportunities and failed to effectively serve as president. And that effectiveness hung on one thing: leadership” (p. 13).

            In contrast to the two Adamses, another President, Theodore Roosevelt, provides a pattern for great leaders.  He illustrates the second “irrefutable law,” The Law of Influence.   “Leadership ultimately is influence” (p. 22).  “In 1910, at the Sorbonne in Paris, Roosevelt gave a speech that has been quoted by leaders ever since. It depicts his vigorous view of life and contains a profound challenge to everyone who reads the words today:

         “It is not the critic who counts: not the man who points out how the strong man stumbles or where the doer of deeds could have done better. The credit belongs to the man who is actually in the arena [italics the author’s], whose face is marred by dust and sweat and blood, who strives valiantly, who errs and comes up short again and again, because there is no effort without error or shortcoming, but who knows the great enthusiasms, the great devotions, who spends himself for a worthy cause; who, at the best, knows in the end the triumph of high achievement, and who, at the worst, if he fails, at least he fails while daring greatly [italics the author’s], so that his place will never be with those cold and timid souls who knew neither victory nor defeat” (p. 23).  

Garlow challenges readers to note TR’s “words: ‘the man who is actually in the arena,’ ‘at least he fails while daring greatly.’  Those words ignite human hearts. That is the language of a leader. Those are the concepts of an influencer” (p. 23).  

            Roosevelt’s exploits, from the Spanish-American War through his years as President, reveal his ability to influence others.  The men he recruited for his famous “Rough Riders” followed him because he inspired them.  He truly cared for them and they loved him for it.  “Leaders draw others to themselves and their causes, even when the cause is difficult,” Garlow notes.  “Roosevelt’s cause was one that demanded a tough love, which calls men to risk their very lives in serving a higher good. Only leaders can inspire others to that level. There’s a name for it: influence” (p. 27).  His influence stemmed, in part, from his infectious courage.  In his Autobiography, he confessed, “There were all kinds of things I was afraid of at first, ranging from grizzly bears to ‘mean’ horses and gun-fighters; but by acting as if I was not afraid I gradually ceased to be afraid” (p. 27).  Whether leading soldiers or declaiming from the “bully pulpit” in the White House, TR inspired men by his courageous confidence. 

            Moving to the third “irrefutable law of leadership,” The Law of Process, we discover that “Leadership Develops Daily, Not in a Day.”  Here Pastor Garlow provides some personal background, saying:  “I am uniquely qualified to write this book. Of the six billion persons on earth, I am the only one who had to follow John Maxwell in a leadership position since he has become so knowledgeable on leadership.”  Maxwell pastored San Diego’s Skyline Wesleyan Church for 14 years, and when he resigned Jim Garlow was asked “to consider coming to Skyline as the new senior pastor.  I immediately declined, saying, ‘Anyone who tries to follow John Maxwell is a fool.’ (Several years have passed since I made that comment. I think the statement might still be true!) Four months later, I found myself accepting the senior pastoral role at Skyline Church.  I did follow–or attempted to follow–Maxwell. And it has been a challenge” (p. 36).

            The challenge came from trying to succeed (and then succeed) a highly gifted pastor.  Garlow had much to learn!  And learn he did, as the church’s continued growth and ministry testifies.  Learning “process” skills, however, stretched him.  He “underestimated” its importance.  In part this stemmed from the fact that he tends to be “event driven.”  As he confesses,  “I was an ‘event king.’  In fact, I can ‘out event’ anybody.  At ‘eventing,’ I’m good!  But leaders are not produced in events. They are made in process.  So I have been on a huge learning curve for the past few years.  I wish I could say that I have changed, and that I have conquered the process concept.  I haven’t. But I’m growing. I’m not where I want to be. But I’m not where I used to be. And while I see how far I have to go, I am thankful for the progress” (p. 37).  

            The importance of process appears in a careful study of the difference between the followers of two 18th century “exceptionally gifted” evangelists, George Whitefield and John Wesley.  “Both commanded enormous respect. Tens of thousands followed them” (p. 37).  They had “much in common, but they had one noticeable difference. As the years went by, Whitefield’s followers dissipated.  His organization faltered.  Wesley’s did not. What was the difference? Both men were brilliant. Both were winsome and compelling communicators. Both experienced phenomenal success in their lives. But Wesley understood process. Whitefield, it would appear, did not” (p. 38).  

            Whitefield was a powerful orator, probably the greatest of his generation.  He preached some 18,000 times, both in England and the American Colonies.  He helped ignite the Great Awakening in America.  “Thousands responded to his booming voice, which could be heard by a crowd of 20,000 (some have dared to say 40,000) without present-day public address systems” (p. 38).  He received generous financial support and established charitable foundations, especially orphanages.  Many gave of their finances to help support the orphanage that his wife operated in the Georgia Colony. 

            Wesley, like Whitefield, attended Oxford University and became a priest in the Church of England.  Transformed by his Aldersgate experience in 1738, where his “heart was strangely warmed,” he joined his friend Whitefield in an innovative technique, preaching in open fields.   His preaching (some 40,000 times!) helped launch the “Evangelical Revival” which renewed religious life in England.  He continually traveled and preached.  “His energy level was amazing. He arose every morning at four o’clock, working eighteen-hour days. He rode on horseback a quarter of a million miles. He stopped riding a horse when he reached about seventy years of age, but he continued the rigorous travel schedule by horse and buggy. He traveled 4,000 to 5,000 miles a year, as many as 80 miles a day! It is believed that Wesley may have spent more time in the saddle than any other man who ever lived, including Bonaparte and Caesar. Equally amazing was his ability to convert the saddle to a library chair, reading literally hundreds of books while riding on horseback” (p. 40).

            In addition to preaching he wrote or edited some 233 books.  “At the time of his death in 1791, he led an enormous organization: 120,000 members in the Methodist movement, with some suggesting that the total adherents numbered one million” (p. 40).  More importantly, “Wesley’s Methodist movement flourished globally after Wesley’s death. Today there are scores of denominations that point to Wesley as their inspiration. There are millions of believers who see him as father of their denominations. In contrast, George Whitefield’s denomination, the Calvinist Methodists, had insignificant impact, eventually ceasing to exist. Why? What was the difference between Wesley’s leadership style and Whitefield’s leadership style?” (p. 40). 

            This happened because “Wesley understood the Law of Process.  He quickly saw that gaining followers was not the key issue; sustaining them was the real challenge. To that end, Wesley began to organize his new converts” (p. 41).   He organized “classes” and “bands” and “societies.”  Local leaders accepted responsibility for guiding, and holding accountable, fellow Methodists.  Lay preachers were encouraged to exercise their gifts.  Conversely, “Whitefield’s followers had no such structure to assist them in their personal growth.  Once converted, they were simply to gather in churches.  But that did not happen. What was lacking was a process, a system or device by which a person is enabled to go to the next level of growth” (p. 42).  Both men were gifted.  Both were devout.  But only one, Wesley, left a lasting imprint.  Wesley understood the importance of process!

            For purposes of illustration, I’ve focused on only three of the twenty-one “laws.”  Since I helped research and write the book I obviously recommend it!  And I think it’s worth perusing because I share Pastor Garlow’s conviction:  the study of the past reveals how significant leaders have responded to the challenges of their day, providing time-tested principles well worth heeding.

###

129 The Question of God


                For more than two decades Armand M. Nicholi Jr., a psychiatrist and professor at Harvard Medical School, has taught a course at Harvard College and Medical School.  Through assigned readings, lectures and class discussions, he engaged students in a dialogue between Sigmund Freud and C.S. Lewis.  Freud, though his luster has dimmed considerably as his theories seem increasingly suspect, certainly helped shape the “therapeutic culture” which now reigns throughout the West.  Lewis, resolutely defending the “permanent things” at the heart of classical Christian culture, stands permanently enshrined as their great apologist.  The core of his course at Harvard has been put in print by Professor Nicholi in The Question of God:  C.S. Lewis and Sigmund Freud Debate God, Love, Sex, and the Meaning of Life (New York:  The Free Press, 2002).  He tries to present both men’s views on important subjects, accurately portraying both men, interjecting his own explanations and interpretations and final evaluations in the process. 

                After short biographical introductions to the two men, Nicholi presents their views on “The Creator.”  Freud, who emphatically embraced philosophical materialism, acknowledged no Creator and judged all religions illusory.  Reworking Feuerbach’s famous thesis, Freud thought that “believers” simply project deeply-held desires into outer space and fantasize, like children, notions such as a loving Heavenly Father.  He did, however, at times admit to a deep longing–a Sehnsucht, a hunger for something beyond earthly things–that haunted him all his life.  He attributed it to memories of long-lost days when he escaped from his father, finding solace in some woods near his boyhood home. 

                Lewis, on the other hand, after espousing atheism for more than 15 years, underwent a profound conversion at the age of 31 and defended theism for the rest of his life.  Believing in a Creator, he insisted, brought one into contact with the ultimate Reality of the universe, morally demanding and fearsomely holy–hardly the kind of “god” we would conjure up if we wanted to comfort ourselves.  Still more, perhaps our inner longings (for which he used the same German word that Freud used, Sehnsucht), our hungers, accurately orient us to realities that will satisfy them.  Lewis said:  “If I find in myself a desire which no experience in this world can satisfy, the most probable explanation is that I was made for another world” (p. 47).  Lewis’s conversion, those who knew him testified, wrought deeply rooted changes in him.  “A buoyant cheerfulness replaced his pessimism and despair.  On the last days before he died, those who were with Lewis spoke of his ‘cheerfulness’ and ‘calmness'” (p. 77).  Freud, conversely, though he often cited Scripture in his letters and dealt with spiritual themes in his books, apparently never tasted a religious experience.  Styling himself an “infidel Jew,” he discounted reports detailing life-changing spiritual breakthroughs, judging them a form of “hallucinatory psychosis.” 

                Both men sought to understand and explain man’s “Conscience,” wondering if any Universal Moral Law existed.  No! said Freud.  One’s conscience, engrafted into him by parents and culture as a “superego,” obviously regulates behavior.  But it certainly contains no timeless truths.  Behavioral rules are crafted to lubricate social relationships and change continually as cultures evolve.  This position enabled Freud to consider himself a “very moral person” who compared favorably with the rest of mankind.  Yet, paradoxically, in one of his letters Freud claimed to “subscribe to the excellent maxim of Th. Visher:  ‘What is moral is self-evident'” (p. 66). 

                Freud’s admission that there is “self-evident” truth that gives moral guidance would have pleased C.S. Lewis.  Such an admission, he reasoned, ultimately leads one to acknowledge an ultimate Source, a Lawgiver, who prescribes righteous behavior for us.  Moral laws, like mathematical laws, Lewis believed, are discovered when we honestly investigate the manifold structures of the cosmos.  They reveal themselves to us.  We cannot “create” them.  Consequently, as he noted in Mere Christianity, two phenomena stand out in human history:  “First . . . human beings, all over the earth, have this curious idea that they ought to behave in a certain way, and cannot really get rid of it.  Secondly . . . they do not in fact behave in that way. . .  These two facts are the foundation of all clear thinking about ourselves and the universe we live in” (p. 61). 

                To live well, Lewis thought (sharing Aristotle’s view), makes one happy.  Freud also noted that one can hardly deny that almost everyone seeks “happiness; they want to become happy and to remain so” (p. 99).  Nevertheless, he despaired of its attainment.  Imbibing early of Schopenhauer’s pessimism, he apparently believed that “Man is never happy, but spends his whole life striving after something he thinks will make him so,” as Schopenhauer said (p. 98).  What satisfaction there is, he thought, comes from control of things, much as Friedrich Nietzsche, another of Freud’s mentors, insisted.  To Nietzsche one answers the “happiness” question by defining it as:  “The feeling that power increases–that resistance is overcome” (p. 98).  Nevertheless, Freud was frequently depressed, resorted to drugs like cocaine to numb his mind to his despair, and declared that as soon as you think happiness is “in your grasp” it slips away (p. 109). 

                In his atheist years, Lewis shared Freud’s morose assessment of life.  “I was at that time living, like so many Atheists,” he wrote, “in a whirl of contradictions.  I maintained God did not exist.  I was also angry with God for not existing.  I was equally angry with Him for creating a world” (p. 113).  With his conversion, however, came unexpected happiness, pure joy.  Conversion turned his attention from himself to God and others, and he began to take delight in the good times he enjoyed with his friends and, late in life, with his wife.  One of his best friends for 40 years, Owen Barfield, remembered Lewis as “unusually cheerful,” taking “an almost boyish delight in life” (p. 115).  Miraculously, Nicholi says, following his conversion, Lewis “changed from an introvert who, like Freud, was highly critical and distrustful of others, to a person who reached out and appeared to value every human being” (p. 185). 

                Turning to the topic of sex, Nicholi stresses its centrality in the thought of Freud.  Unfortunately, popular misrepresentations have portrayed him as an advocate of libertine “free love.”  What he wanted to do freely was talk about sex and understand its importance.  “To believe that psycho-analysis seeks a cure for neurotic disorders by giving a free rein to sexuality,” he wrote, “is a serious misunderstanding which can only be excused by ignorance.  The making conscious of repressed sexual desires in analysis makes it possible, on the contrary, to obtain a mastery over them which the previous repression had been unable to achieve.  It can be more truly said that analysis sets the neurotic free from the chains of his sexuality” (p. 132).  Freud himself apparently lived according to the restrained “Victorian” ethos of his era.  He married at the age of 30, and both he and his wife were virgins at the time.  They had six children in the next eight years, whereafter he apparently discontinued sexual relations with his wife.  Amazingly enough, to those who see him as a libertine, he said, in a 1916 lecture:  “We . . . describe a sexual activity as perverse if it has given up the aim of reproduction and pursues the attainment of pleasure as an aim independent of it” (p. 149). 

                Lewis, though he married quite late in life, thought much about sex as part of the human condition.  Contrary to Freud, Lewis found “love” to be vaster than “sex.”  Rooted in the great works of literature, he “thought Freud’s understanding of love and relationships was incomplete” (p. 165).  Couples in love certainly taste the delights of “Eros,” but this must not be reduced to “Venus,” the sex act itself.  “Perhaps the greatest contribution Lewis makes to understanding sexuality and love,” Nicholi says, “is his clear distinction between being in love and love in its deeper, more mature form” (p. 141).  Love, even love between the sexes, is more than sublimated sexual desire. 

In a profound analysis, The Four Loves, Lewis distinguished between “Gift-love” and “Need-love.”  In Lewis’s words:  “Need-love says of a woman ‘I cannot live without her’; Gift-love longs to give her happiness, comfort, protection–if possible, wealth” (pp. 165-166).  Still more, Lewis utilized four Greek terms to indicate more fully the ramifications of love:  “(1) Storge, affection between members of a family; (2) Philia, friendship; (3) Eros, romantic love between people ‘in love’; and (4) Agape, the love one has toward God and one’s neighbor” (p. 166).  As Nicholi discusses these distinctions, he clearly prefers Lewis to Freud who scoffed at the biblical injunction to “love your neighbor as yourself” and rejected the possibility of loving one’s enemy.  Speaking personally, Nicholi writes, “As a clinician, I have observed that Agape is the key to all successful relationships, even those within groups and organizations” (p. 177).  Importantly, as Lewis clarified the meaning of Agape, he insisted that “Love is something more stern and splendid than mere kindness.”  This is because “love, in its own nature, demands the perfecting of the beloved” whereas “mere ‘kindness’ which tolerates anything except suffering in its object is, in that respect, at the opposite pole from Love” (p. 211).   This enabled him to deal effectively with another of life’s great questions:  the reality of pain and suffering. 

                Both Freud and Lewis personally suffered, and both thought deeply about it.  Freud, the atheist, routinely railed against God for making an anguished world, though a consistent atheist, of course, can hardly complain about pain since there is only a deaf, irrational, unfeeling cosmos responsible for it. Lewis, on the other hand, set forth, in one of his early books, The Problem of Pain, persuasive intellectual reasons as to how a good God could allow suffering:  it “is not good in itself.  What is good in any painful experience is, for the sufferer, his submission to the will of God, and, for the spectators, the compassion aroused and the acts of mercy to which it leads” (p. 203).  Though intellectually persuasive, however, such words failed to fully comfort Lewis in the midst of his wife’s dying.  Here we read, in A Grief Observed, the heart cry of a broken man at a loss for answers.  He raged and he doubted.  But (contrary to the impression left by the movie Shadowlands) he emerged from his sorrow with an even deeper faith, knowing that even death cannot destroy the soul that trusts in God. 

                Dealing with death further polarizes the two thinkers.  Agreeing with Schopenhauer, Freud quoted him to the effect that “the problem of death stands at the outset of every philosophy.”  Freud feared it all his life, finding each birthday a painful event, reminded thereby of his mortality.  When his own mother died he refused to attend her funeral.  When possible, he seemed to avoid thinking about it!  During his final days, he read and pondered Balzac’s The Fatal Skin, a story (much like Faust, Goethe’s classic that he often cited) about a “young scientific man” selling his soul to the devil.  Then, following instructions Freud had given him, his doctor injected Freud with a lethal dose of morphine, and he died a (physician-assisted) suicide. 

                But Lewis, sustained by his Christian faith, believed, Nicholi says, that “the only person to decide the time of one’s death was the Person who gave one life” (p. 230).  He fully enjoyed each passing year, apparently relishing the very process of aging.  “Yes,” he wrote, “autumn is the best of the seasons; and I’m not sure that old age isn’t the best part of life” (p. 232).  He spent his final days contentedly, reading favorite authors, including Homer (in Greek), Virgil (in Latin), and other classic works of literature.  “Never was a man better prepared” to die, said a man who lunched with Lewis shortly before his death.  His brother, Warren, reported that Lewis said to him, a week before he died:  “I have done all that I was sent into the world to do, and I am ready to go.”  Warren added:  “I have never seen death looked in the face so tranquilly” (p. 239). 

                Two men.  Two ways to live.  In his Epilogue, Nicholi emphasizes that the great difference between Freud and Lewis was God.  The book’s final paragraph merits repeating as Nicholi’s position:  “The answer to the question of God has profound implications for our lives here on earth, both Freud and Lewis agree.  So we owe it to ourselves to look at the evidence, perhaps beginning with the Old and New Testaments.  Lewis also reminds us, however, that the evidence lies all around us:  ‘We may ignore, but we can nowhere evade, the presence of God.  The world is crowded with Him.  He walks everywhere incognito.  And the incognito is not always easy to penetrate.  The real labor is to remember to attend.  In fact to come awake.  Still more to remain awake'” (p. 244).  Looking for some direction in life?  Try Lewis, says Nicholi!


                In God:  The Evidence:  The Reconciliation of Faith and Reason in a Postsecular World (Rocklin, CA:  Forum, c. 1997, 1999), Patrick Glynn explains how he recently came to believe in God and the immortality of the soul.  In the book’s first chapter, “The Making and Unmaking of an Atheist,” he explains his early embrace of atheism.  Attending a Catholic grade school, he encountered Darwin’s theory of evolution.  “It immediately occurred to me,” he says, “that either Darwin’s theory was true or the creation story in the Book of Genesis was true” (p. 3).  Siding with Darwin, he declared his position by standing up in class and making his case.  Though still a child, Glynn saw clearly the ultimate import of Darwin, for his theory “breathed fresh life into the atheist position–a fact immediately recognized across the globe.  Notably, that other famous nineteenth-century atheist, Karl Marx, asked Darwin if he could dedicate the English translation of Capital to the great naturalist” (p. 37).  Darwin demurred, but Marx rightly saw Darwin as an asset to his agenda. 

                Entering Harvard in 1969, Glynn fell in with the “New Left” and its Marxist views, solidifying his adolescent agnosticism.  Ultimately earning a Ph.D. in philosophy from Harvard, he settled into a deeply-entrenched atheism.  “Ironically,” he writes, “at the very time I was plumbing the depths of philosophical nihilism, science itself, unbeknownst to me and to many other people, was taking a surprising new turn” (pp. 6-7).  Physicists, acknowledging the reality of the “Big Bang,” were working out some of its implications, including the “anthropic principle,” the notion “that all the myriad laws of physics were fine-tuned from the very beginning of the universe for the creation of man” (pp. 22-23).  Rightly understood, this involves “a refutation of the original premise of the overarching modern philosophical idea:  that of the ‘random universe'” (p. 7).  Dealing honestly with this new evidence, Glynn began to ponder the implications of “A Not-So-Random Universe.”  Amazingly enough, “the picture of the universe bequeathed to us by the most advanced twentieth-century science is closer in spirit to the vision presented in the Book of Genesis than anything offered by science since Copernicus” (p. 26).  Design, not random material developments, better explains the way things really are!  Glynn laces his discussion with clear explanations of the most recent scientific discoveries, further indicating their philosophical importance by placing them within a historical framework.  Cracks are appearing in the foundations of the scientific-secularism that has reigned in the West for more than two centuries.

                Something similar, Glynn says, is transpiring in the inner world.  In a chapter entitled “Psyche and Soul:  Postsecularism in Psychology,” he documents the ebbing away of Freud’s substitute religion, psycho-analysis.  Awakening from its naturalistic slumbers, “Slowly but surely, modern psychology is belatedly rediscovering the soul” (p. 63).  Spirituality seems resurgent.  Witness the massive success of books such as Scott Peck’s The Road Less Traveled!  “It is more than a little ironic,” Glynn says, “that after its long odyssey into the unconscious and its multiplication of dark modernistic concepts of mental life, modern psychology at the end of the twentieth century should have arrived at a formula for mental well-being and happiness hardly distinguishable from that of traditional religion–faith, hope, love, self-discipline, and a life lived in conformity with solid, traditional moral principles” (p. 74).  The Ten Commandments make more sense than the Oedipus Complex!   

                In yet another realm, Glynn finds a growing bond between “faith and the physicians.”  In the words of a Harvard Medical School professor, Herbert Benson, we’re “wired for God” (p. 80).  With that come some “intimations of immortality,” the accumulating data from near-death testimonials that something at the heart of us survives the body’s demise.  Careful studies indicate that people see out-of-body details, while apparently “dead,” that cannot be naturalistically explained apart from the reality of a “spirit.”  Indeed, Glynn holds that the ancient view, preeminently the New Testament view, that we by nature are primarily spiritual, still holds.  This means that the Enlightenment apotheosis of Reason must dissolve.  For centuries men have sought to replace God with human Reason, with dismal results, including what Martin Buber perceptively called the “deactualized self.”  Discarding God, man debases himself in the process.  Glynn argues:  “Reason, freed from divine guidance, originally promised humanity freedom; but its culmination in the moral realm is postmodernism, and the spirit of postmodern thought is nothing if not the spirit of [what Buber called] ‘caprice'” (p. 146).  Taking nothing seriously, postmodern man does whatever appeals to him for a moment, taking no thought for eternity.  The chaos consequent upon the Sexual Revolution of the ’60s illustrates this pattern. 

                “What I am suggesting,” Glynn writes, with reference to these recent developments, “and what it seems to me history tends to corroborate, is this:  The knowledge of the Spirit is prior to the knowledge of reason.  Where reason follows Spirit, the results are good; where it rejects or parts ways with the Spirit, the results are invariably disastrous, whether one speaks of the political, societal, or personal spheres” (p. 166).  Indeed, he writes in his final sentence:  “If the history of this century offers any lesson, it is that goodness–and a relationship to God, to the Absolute by whatever name He is called–is not only the beginning of wisdom but the only path by which it can be attained” (p. 169). 

                Scholarly in its depth, popular in its presentation, God:  The Evidence makes a strong case, giving us a treatise ideally suited for those serious thinkers who wonder if there are, in fact, views worth considering.

###