178 Hadley Arkes’ “First Things”

One of America’s finest thinkers, Hadley Arkes (a professor at Amherst College), has for decades espoused a natural law ethic and effectively worked in the political sphere for legislation such as the “born alive infant protection” and “defense of marriage” acts. His thought was cogently set forth in First Things: An Inquiry into the First Principles of Morals and Justice (Princeton: Princeton University Press, c. 1986). He begins by stressing that Justice James Wilson, in 1793, recognized that the Supreme Court of the new United States could not cite earlier decisions. Thus it would be necessary to invoke “principles of general jurisprudence” and, furthermore, to align with the common sense philosophy of Thomas Reid in acknowledging “the validity of the laws of reason and the grounds of our moral understanding” (p. ix).

Arkes hopes, in his writing, “to restore that tradition of understanding in which Reid held such an evident, important place. That tradition of moral reflection took seriously the notion of ‘first principles’ in morals and law, as well as in physics and mathematics, because it recognized that our knowledge, in all its branches, found its common philosophic origin in the laws of reason or the ‘principles of understanding’” (p. x). Thus the first of the book’s two sections is devoted to “the groundwork of moral judgment,” which entails challenging the skepticism and moral relativism so evident today. Fortunately, Arkes says, his challenge accords with the convictions of the men who founded this nation, who often invoked John Locke, the philosopher who “had no doubt that ‘morality [stood] amongst the sciences capable of demonstration: wherein I doubt not but from self-evident propositions by necessary consequences, as incontestable as those in mathematics’” (p. 5). There are axioms—first principles—“truths in the domain of morals and law that we not only know, but that cannot be otherwise. This book is an attempt to take up that mandate by ‘reminding’ us of the things that philosophers and statesmen once knew” (p. 6).

Whereas Aristotle and the ancients held that ethics is a rational discipline, moderns such as Thomas Hobbes relied on subjective and biological desires for guidance. Thomas Reid, like Aristotle, said: “Feeling, or sensation, seems to be the lowest degree of animation we can conceive. . . . We commonly distinguish feeling from thinking” (p. 22) because we rightly grasp the nature of human nature. Arkes shows the significance of this distinction in his discussion of the 1858 Lincoln-Douglas debates. Stephen A. Douglas, championing “popular sovereignty,” was content to abide by whatever a majority of people desired. Thus laws are purely conventional, rooted in whatever a people momentarily desires. He was, in today’s lingo, a “pro-choice” advocate of “cultural relativism.” Abraham Lincoln, however, insisted there was a higher law, a moral law, an eternally and universally true standard, that decreed slavery intrinsically wrong. He articulated a natural law ethic. With Aristotle, Lincoln recognized, Arkes says, that “Polity arises from the capacity of human beings for moral judgment. The mark of a polity is the presence of law, and law (as we can see now) arises directly from the logic of morals. . . . We have law only because we have morals—only because it is possible to speak of things that are right and wrong” (p. 25).

There are in fact “necessary truths” that transcend personal “feelings,” and we cannot live without them.  “It would be possible,” Arkes says, in a persuasive paragraph, “for us to reject the existence of morals if we were indeed prepared then to live out the rest of our lives without the use of moral terms and the functions they serve.  We would have to be willing to live without complaining or showing outrage, from the smallest villainies to the most massive evils—from being shortchanged at the supermarket, to encountering the horrors of genocide. We would have to cease condemning injustices, complaining about faults; we would have to stop despising what is hateful and loving what is admirable. In short, we would have to live a life barren of those things that give human life its special character, because we would rule out the one thing that is truly distinctive about human beings: our capacity for moral judgment” (p. 74).  

Having established a groundwork for moral judgment, Arkes turns, in the book’s second part, to “cases and applications.” He considers the risky path the nation’s courts have walked by allowing “conscientious objection” to certain laws, specifically conscription. He devotes two chapters to the Vietnam War, providing a valuable historical survey and showing how the American media largely misrepresented the struggle. He notes how opposition to the war and to American intervention “had depended critically on the premises of cultural relativism” (p. 261). In fact, “only the intervention of the United States” offered the Vietnamese the “right to be ruled by a government of their own choosing” (p. 269). Abandoning South Vietnam after the hardest battles had been won, and soon witnessing the truth of the “domino theory,” the United States opened the floodgates to Pol Pot’s genocide in Cambodia. America’s involvement in Vietnam, Arkes thought, illustrated our moral “obligation to rescue” those in need. “Those who would save lives with food and medicine in all countries, those who would protest the extinction of ‘human rights’ in countries other than our own, and those who would press their humanitarian concerns even when they know they would be intervening in the politics of other countries have all acknowledged the most decisive principles that sanctioned the American intervention in Vietnam” (p. 292).

Likewise, though there is a moral justification of welfare, “redistributive justice” is quite problematic. Needy people—paraplegics, for example—are entitled to assistance from the community. Such assistance obviously requires taxing those who work to support those who need rescuing. How the taxes are generated, however, merits the moral scrutiny evoked in the 18th and 19th centuries. J.R. McCulloch, for example, “wrote in 1845, ‘The moment you abandon . . . the cardinal principle of exacting from all individuals the same proportion of their income or their property, you are at sea without rudder or compass, and there is no amount of injustice or folly you may not commit’” (p. 313). Redistributive taxation (the graduated or progressive income tax) is designed, as Marx made clear in his Communist Manifesto, to equalize wealth. That it has been so widely embraced (too often simply as an easy way to generate revenues) without “serious moral challenge” distresses Arkes, for the pieties of “redistributive justice” have “no moral ground of justification, but rather a mean, unredeeming truth: the persistence of spiteful envy” (p. 232). “The world,” he concludes, “could have been spared a large measure of misfortune—and no harmless train of moral blundering—if it had turned away from policies of redistribution in the way that the French finance minister Jacques Turgot turned away from one of the early proposals for a graduated tax on income. With his cultivated judgment, Turgot managed to sense at once that the scheme was as morally doubtful as it was economically ruinous. ‘One ought,’ he said, ‘to execute the author and not the project’” (p. 326).

Turning to “the question of abortion and the discipline of moral reasoning,” Arkes laments the illogic of many recent judicial decrees. Arkes shows how a sentence of Justice Blackmun’s in Roe v. Wade illustrates the lack of “any rigorous philosophic and moral reasoning which has become typical of the Supreme Court in our own time” (p. 360). The incoherence of being “pro-choice,” for example, is evident when one realizes that “one could be ‘pro-choice’ on the torture of children only if there were nothing in principle wrong or illegitimate about the torture of innocent people. The point is not grasped so quickly in relation to unborn children because they are not viewed as children, or ‘persons’” (p. 362). Pro-choice rhetoric regarding the unborn as “potential persons” is likewise misleading, for a “fetus may be a potential doctor, a potential lawyer, or a potential cab driver; but he cannot be considered merely a potential human being, for at no stage of his existence could he have been anything else” (p. 364).

The courts have decided, in recent decades, that what a woman wants is the only relevant question regarding abortion rights. The woman’s desire, not any truth regarding the reality of the unborn child, is sovereign. “With this kind of license,” Arkes reasons, “there would be no obstacle to carrying out abortions, not only past the first trimester, and not only up to the moment of birth: it would become clear very soon that a child who survived an abortion could legitimately be destroyed if the presence of the living child would be a cause of distress for the mother” (p. 370).

* * * * * * * * * * * * * * * * 

In Beyond the Constitution (Princeton: Princeton University Press, c. 1990) Hadley Arkes continues the careful legal and philosophical analyses evident in First Things. The book’s title encapsulates his conviction “that there is a need to move beyond the text of the Constitution, to the principles of moral reasoning that stood antecedent to the Constitution” (p. 245). He endeavors to recover, restate, and defend those natural law convictions (what Blackstone called “the law of nature and reason”) that inspired this nation’s Founders, fully aware that “our current lawyers and professors of law find it hard to speak seriously in this vein” (p. 10). He hopes to expose and refute the sophisticated silliness of the influential philosopher Richard Rorty, who quipped that “nothing interesting can be said about truth. It is almost literally not worth talking about” (p. 11). And though Rorty represents the views of the left, influential thinkers on the right, such as Raoul Berger and Robert Bork, have similarly dismissed the relevance of earlier jurists’ concerns for “natural justice.” There is, quite simply, a radical disconnect between any absolute morality and judicial decisions because “morality” has been relegated to personal opinion and the Constitution is regarded as simply an evolving consensus of the people. That position, however, departs from that of the Founders, whose jurisprudence “was built on the connection that was traditionally understood between morals and law. The Constitution they finally produced, as our second Constitution, could be understood and justified only in moral terms, only by an appeal to those standards of natural right that existed antecedent to the Constitution. My argument in this book is that the Constitution produced by the Founders cannot be understood if it is detached from those moral premises” (p. 17).

The Founders believed that “judges were not free to shape the law according to their own enthusiasms. They were obliged, rather, to move from the stipulations of the positive law to the guidance of the natural law, or what Blackstone called at different times ‘common reason,’ or ‘the law of nature and reason’” (p. 22). They stood rooted in the tradition shaped by Aristotle and Aquinas, Grotius and Reid. And they shared the judicial reasoning of the Old Testament. For the past century, however, influential legal scholars have promoted the proposition that America’s Founders embraced “the ‘modern’ notion of natural rights put forth by Hobbes” that presumed “that rights were in fact surrendered in entering civil society. But not the least of the difficulties, passed over in this interpretation,” Arkes says, “is that it fails to take seriously the Christianity of the Founders. With men like Wilson and John Jay, the understanding of Christianity pervaded their writings and sustained their convictions about natural rights. The Author of the Universe, the Author of the laws of physics, was also the Author of universal moral laws. Any serious believer in a single, universal God could of course understand that the God of the Universe would not create a separate moral law for New Jersey and France. These moral laws, then, were immanent in the universe” (p. 64). Importantly: “It was not the existence of government that created these rights; it was the existence of these rights that called forth and justified the existence of the government” (p. 64).

Nor can these natural rights be annulled by any government! A good government simply protects them. “Man has,” Supreme Court Justice Wilson said, “a natural right to his property, to his character, to liberty, and to safety”—which is to say, that he has a right to be protected, so far as practicable, from virtually all species of injustice (p. 65). Life, liberty, and property merit protection. Importantly, the First Amendment to the Constitution is “not itself the source of these rights” (p. 81). Thus the oft-repeated cliché that we enjoy certain rights “under the First Amendment” reflects the influence of legal positivism, not the view of the men who crafted it.

Illustrative of the move to legal positivism is the current “incorporationist” understanding of the Bill of Rights. Before 1925, the Supreme Court routinely held that the Bill of Rights applied only to the federal system, not the states. In Gitlow v. New York, however, “a new doctrine, or at least a new slogan, of jurisprudence would spring, namely, that the Fourteenth Amendment had ‘incorporated’ or absorbed the full inventory of provisions in the Bill of Rights and made them binding upon the states. Starting in 1947, Justice Hugo Black embraced this position emphatically and made it part of his agenda for the Court” (p. 157). Subsequently “the Court would extend to the states the provisions in the Bill of Rights, read in the most restrictive way” (p. 157). And this, quite simply, explains why the courts have issued many arbitrary decisions regarding abortion, civil rights, the separation of state and church, etc.

* * * * * * * * * * * * * * *

Hadley Arkes begins his Natural Rights and the Right to Choose (Cambridge: Cambridge University Press, c. 2002) with some “searing lines” of Justice McLean, in his dissenting opinion in the Dred Scott case. You may think that the black man is merely chattel, but “He bears the impress of his Maker, and is amenable to the laws of God and man; and he is destined to an endless existence” (p. 1). He has, in other words, a soul, which is imperishable. McLean’s moral absolutism, however, has largely evaporated in the modern world (influenced as it is by the ethical relativism of Nietzsche and Heidegger and their epigones) wherein even the “common man” generally espouses a version of “soft relativism” disguised as non-judgmentalism and tolerance. Consequently, Arkes says, “in the most affable and serene way, many Americans, and especially, members of the political class, have come to talk themselves out of the premises of the American Founders and Lincoln” (p. 7).

Though Lincoln probably never read Aristotle or Aquinas, he espoused a common sense realism and natural law ethic quite similar to theirs and “managed to bring the logic of natural rights to bear on the most vexing issue in our politics” (p. 17). His understandings no longer shape the intellectual life of the nation, especially in elite university circles, where fashionable movements such as deconstruction, postmodernism, radical feminism, and multiculturalism reign. Whatever their labels, however, they are merely new installments of an ancient philosophy: epistemological skepticism and moral relativism.

Such skepticism dramatically stamps the famous “mystery passage” in the 1992 Supreme Court decision, Planned Parenthood v. Casey, wherein the justices declared that “at the heart of liberty is the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life” (p. 43). Within a decade, this led a federal judge to explain her overturning of a New Jersey law forbidding partial-birth abortion thusly: “There was, in reality, no child to be born, and no ‘delivery’ of a baby, because ‘a woman seeking an abortion is plainly not seeking to give birth’” (p. 43). Compare these recent dicta of legal positivism with the natural law argument of Alexander Hamilton in Federalist #31: “In disquisitions of every kind there are certain primary truths, or first principles, upon which all subsequent reasonings must depend. These contain an internal evidence which, antecedent to all reflection or combination, command the assent of the mind. . . . Of this nature are the maxims in geometry that the whole is greater than its parts. . . . Of the same nature are these other maxims in ethics and politics” that provide the foundation for a good society (p. 40).

Hamilton’s certainties have disappeared in the nihilistic milieu of our postmodern times, wherein “the new jurisprudence reaches its completion by detaching itself from every premise necessary to the notion of lawfulness. It rejects the logic of natural rights; it denies that any of us has rights of intrinsic dignity because it denies that there is any such intrinsic dignity attaching to any human being, as the subject and object of the law” (p. 146). The judges who have promulgated this new “antijural jurisprudence” have made their position clear in decisions regarding “partial birth abortion.” Knowing they could not demonstrate a significant difference between “late term” and earlier abortions, they have determined that legislators “may not erect a barrier, indecorously firm, against infanticide if that legal proscription would have the effect of inhibiting abortions anywhere else in the stages of pregnancy” (p. 116). Granted, “they have not quite endorsed it in a full-throated way or proclaimed infanticide as a positive good. Yet they have made it clear, in a chilling way, that they will not be put off, or distracted, from the defense of abortion, even in the cases where abortion merges with outright infanticide” (p. 125).

This openness to infanticide was evident to Arkes when he was closely involved in drafting and promoting the “Born-Alive Infants Protection Act.” Abortion rights organizations opposed the legislation but remained largely silent because friendly politicians warned them that the people overwhelmingly supported it. But NARAL and Planned Parenthood—and supportive politicians—share Professor Peter Singer’s conviction “that human beings only become ‘persons,’ and acquire a right to life, sometime well after birth” (p. 155). In response, Professor Robert George “crystallized the matter: ‘The legitimization of infanticide constitutes a grave threat to the principle of human equality at the heart of American civil rights ideals’” (p. 155). Both Arkes and George know that the struggle over abortion is in fact “a more complicated argument over natural rights” (p. 155).

That American judges are on the cusp of implementing infanticide leads Arkes to devote a chapter to some “prudent warnings and imprudent reactions: ‘judicial usurpation’ and the unraveling of rights.” In their absolutist defense of abortion rights, federal judges have repudiated the “natural rights” that were basic to the republic designed by the Founders and defended by Abraham Lincoln. Even William Rehnquist, a notably “conservative” chief justice, justified his views in terms of legal positivism, not natural rights—siding with Nietzsche rather than Lincoln. In response, Harry Jaffa said: “To say that safeguards for individual liberty do not have any intrinsic worth is to say that individual liberty does not have any intrinsic worth. To say that individual liberty does not have any intrinsic worth is to say that the individual human person does not have any intrinsic worth. This is to deny that we are endowed with rights by our Creator. To deny that is in effect to deny that there is a Creator. This is atheism and nihilism no less than moral relativism” (p. 176).

A new generation of judges, laments Arkes, have been “fed on the notion that judges, in modern America, rule” (p. 207). Rendering impotent the Constitution, with its clear separation of powers, powerful judges have violated their rightful role as interpreters of the law. “As Alexander Hamilton had remarked in the Federalist #78, the Court had no control of the sword or of the purse—it had ‘neither force nor will, but merely judgment; and must ultimately depend upon the aid of the executive arm even for the efficacy of its judgments.’ The power of the Court would ultimately depend, then, on the force of its reasoned argument.

With that sense of the matter, Lincoln insisted that other officers of the government could not be obliged to accept any new ‘law’ created by the Court unless they, too, were persuaded of its rightness” (p. 219). To deny that, Lincoln believed, would mean that an unelected, “eminent tribunal” had seized control of the republic. And that’s what’s happened during the past century! Judges now rule.

177 Cultural Deathworks

Few thinkers have more deeply probed the currents of modernity than Philip Rieff, a professor of sociology emeritus at the University of Pennsylvania. His The Triumph of the Therapeutic (reviewed in my “Reedings” #91) is perhaps the finest analysis of one of the most profound cultural shifts in the 20th century: from objective faith and reason to subjective experience and emotion. This became evident in Christian circles, where personal experience dislodged creedal affirmation. “Religious man was born to be saved,” Rieff said, whereas “psychological man is born to be pleased. The difference was established long ago, when ‘I believe,’ the cry of the ascetic, lost precedence to ‘one feels,’ the caveat of the therapeutic” (pp. 24-25). Within the Christian tradition, this trend solidified as early as 1857, when Archbishop Temple, favorably echoing the thought of F.D.E. Schleiermacher, said: “Our theology has been cast in a scholastic mould, all based on logic. We are in need of . . . a theology based on psychology” (pp. 41-42). Today’s “therapeutic gospel” has deep cultural roots in liberal theology and cannot, perhaps, be severed from it.

In My Life among the Deathworks: Illustrations of the Aesthetics of Authority (Charlottesville: University of Virginia Press, c. 2006), the first volume of a trilogy entitled Sacred Order/Social Order, Rieff explores the fact that “cultures give readings of sacred order and ourselves somewhere in it.” Throughout human history, James Davison Hunter explains, in his helpful “Introduction,” all cultures have been “constituted by a system of moral demands that are underwritten by an authority that is vertical in its structure. . . . These are not merely rules or norms or values, but rather doxa: truths acknowledged and experienced as commanding in character” (p. xix). First (pagan) and Second (Judeo-Christian) World Cultures, to use Rieff’s categories, humbly aligned themselves with a higher, invisible Reality—the Sacred.

The modern (what Rieff labels “Third World”) culture shapers, working out the position espoused by Nietzsche’s Gay Science in 1882 (declaring that “God is dead”), have negated that ancient sacred order. Turning away from, indeed assailing, any transcendent realm, they have rigidly restricted themselves to things horizontal—material phenomena and human perspectives. Rather than reading Reality, they actively encourage illiteracy regarding it—e.g. idiosyncratic “reader responses” to “texts,” the venting of personal opinions, and the construction of virtual realities. Their relentless attacks upon the sacred are what Rieff calls “deathworks” that are both surreptitious and ubiquitous, shaping the arts and education, dominating movies and TV, journalism and fiction, law schools and courtrooms. As he says: “There are now armies of third world teachers, artists, therapists, etc., teaching the higher illiteracy” (p. 92).

Throughout the treatise, Rieff weighs the import of the raging culture war. This Kulturkampf “is between those who assert that there are no truths, only readings, that is, fictions (which assume the very ephemeral status of truth for negational purposes) and what is left of the second culture elites in the priesthood, rabbinate, and other teaching/directive elites dedicated to the proposition that the truths have been revealed and require constant rereading and application in the light of the particular historical circumstance in which we live. And that those commanding truths in their range are authoritative and remain so” (p. 17). He especially emphasizes that: “The guiding elites of our third world are virtuosi of de-creation, of fictions where once commanding truths were” (p. 4). In denying all religious and moral truths, they have established an effectually godless “anti-culture.” Rieff’s analyses of influential artistic works (many of them reproduced in the text) are particularly insightful and persuasive. What was evident a century ago only in a few artists, such as James Joyce, Arnold Schoenberg and Pablo Picasso, and psychoanalysts such as Sigmund Freud and Carl Jung, now dominates the mass media and university classrooms, where postmodern gurus Michel Foucault and Jacques Derrida are routinely invoked.

One thing these elites, “in the world-affirming immanentism of their ‘value’ conventions,” will not acknowledge: any transcendent “divine creator and his promised redemptive acts before whom and beside which there is nothing that means anything” (p. 58). Nietzsche fully understood this, propounding “a rationalism so radical that it empties itself, as God the Father was once thought to have emptied himself to become very man in the Son. Kenotic theory [pervasively evident in 20th century theologians who stress the humanity of Christ to the neglect of His deity] lives in the deadly therapeutic rationalism of the third culture. In that transferred kenosis, the human becomes not a god but an artist, a mad artist who is given an empty canvas, fills it with the likeness of panic and emptiness, and declares it his masterpiece” (p. 70).

Rieff’s grandfather, a survivor of the Nazi death camps, “was appalled to discover not only in the remnant of his family in Chicago but in the Jewish community of the family’s Conservative synagogue . . . that the Jewish sense of commanding truth was all but destroyed. Those old traditions were treated as obsolete, replaced by the phrase that horrified my grandfather most: everyone is entitled to their own opinion” (p. 82). The nihilism of the Nazis flourished in Chicago! To Rieff, Auschwitz signifies “the first full and brutally clear arrival of our third world” (p. 83). But the death camps, both Nazi and Bolshevik, were, quite simply, the logical culmination of Hamlet’s ancient view that “there is nothing good or bad in any world except thinking makes it so. M. Descartes and his progeny have a lot to answer for” (p. 83).

What was manifest in Auschwitz, Rieff says, is equally evident in the world’s abortion mills! In one of Sigmund Freud’s prophetic letters, we read his “death sentence, casually uttered, upon sacred self: ‘Similarly birth, miscarriage, and menstruation are all connected with the lavatory via the word Abort (Abortus).’ How many things,” Rieff muses, “turn before my eyes into images of our flush-away third world” (p. 104). Rejecting “pro-choice” advocates’ denials, he insists: “The abortionist movement does bear comparison to the Shoah [the Jewish Holocaust]. In these historic cases both Jews and ‘fetuses’ are what they represent, symbols of our second world God. It is as godterms that they are being sacrificed” (p. 105, ft. 31). The sacrilegious, barbarous essence of our world stands starkly revealed in these deathworks!

My Life among the Deathworks, says Hunter, “is stunning in its originality, breathtaking in its erudition and intellectual range, and astonishing in the brilliance of its insights into our historical moment” (p. xv). It is however “difficult, intentionally so,” because “Rieff wants the reader to work for the insight he has to offer; to read and then reread” (p. xvi). The book rather resembles Pascal’s Pensees—a collage of aphorisms and illustrations (many of them paintings) rather than a systematic development of a thesis. The book does, however, richly reward the reader’s persistence!

* * * * * * * * * * * * * * * * 

In a very different (and more elementary) way Ramesh Ponnuru’s The Party of Death: The Democrats, the Media, the Courts, and the Disregard for Human Life (Washington: Regnery Publishing, Inc., c. 2006) explores the same phenomena as Rieff’s My Life among the Deathworks. Both books forthrightly uphold the sanctity of life, and William F. Buckley, Jr. says “Ponnuru’s book will be accepted almost immediately as the seminal statement on human life. The Party of Death is stunning as scholarship, ingenious in its construction, passionate—but never overbearing—in its convictions. It will be read for decades, and revered as the most complete and resourceful essay on great questions that divide America.”

Ponnuru (who once supported the “pro-choice” position) announces his theme in the book’s first sentence: “The party of death started with abortion, but its sickle has gone from threatening the unborn, to the elderly, to the disabled; it has swept from the maternity ward to the cloning laboratory to a generalized disregard for ‘inconvenient’ human life” (p. 1). He notes that the alleged pro-abortion “emanations and penumbras” discovered by Justice Harry Blackmun and the Supreme Court in Roe v. Wade are in fact more vividly seen in “our law, politics, and culture” (p. 2). Though the “party of death” is a cultural, not a political, phenomenon, the Democratic Party has increasingly become “the party of abortion on demand and embryo-killing research, and is on its way to becoming the party of assisted suicide and euthanasia. And it is the party of those for whom abortion has become a kind of religion” (p. 2). California Senator Barbara Boxer’s website, for example, proudly identifies her as the “Senate’s leading defender of a woman’s right to choose [abortion]” (p. 35). She favors legislation that would force doctors (regardless of personal conscience) to perform abortions. She, along with senators Hillary Clinton, Ted Kennedy, and Joe Lieberman, has worked for the passage of the Freedom of Choice Act, which would eliminate any state or federal restrictions on taking the lives of unborn babies. They have opposed any restrictions on “partial birth” abortions. In 2004, fully three-fourths of the Democrats in Congress opposed legislation (Laci’s law) that identified the unborn child as a second victim of murder when both mother and child are killed.
Individual Democrats may very well defend the sanctity of life, but the Party itself has taken a dogmatic and intensely intolerant position regarding a “woman’s right to choose.” The Democratic Party (personified by senators John Kerry and Barbara Boxer) has virtually banished pro-life politicians from its leadership positions.

The legalization of abortion, in Roe v. Wade, in 1973, has decisively shaped our culture. That decision “has given rise to a radical challenge to human rights (radical because it denies the existence of human rights at their roots)” (p. 4). Ponnuru carefully examines Roe (as well as its companion and more far-reaching decision handed down on the same day, Doe v. Bolton) and refutes many popular misunderstandings regarding the legal status of the procedure. In fact the Court “has effectively forbidden any state from prohibiting abortion even in the final stages of pregnancy” (p. 10). Consequently, a seismic fault has divided the nation. The main difference between red and blue states is abortion. The real (if rarely mentioned) issue debated in Senate Judiciary committee hearings is abortion. Arguments about a variety of subjects are, more deeply, about abortion!

Ironically, the most fervent supporters of abortion rights rarely use the word! In this area euphemisms abound! In fact: “Abortion-on-demand has been made possible by the verbal redescription of human beings as though they were something else: ‘products of conception,’ ‘protoplasm,’ ‘a few cells,’ ‘potential life.’ The abortionist does not suck out the baby’s brains; the abortion provider evacuates the cranial contents of the fetus or, even better, ‘reduce[s] the fetal calvarium’” (p. 56). “One abortionist testified that his goal was to ‘safely and efficiently empty the uterine cavity, rendering the woman unpregnant’” (p. 61). Rather than discuss “partial birth abortion,” its defenders insist on calling it a “D&X” or “Intact D&E” because these terms “convey no information to most people” (p. 45).

Abortion rights advocates also lie, routinely and deliberately. Contrary to the wildly inflated statistics regarding the thousands of women endangered by illegal abortions cited by pro-abortionists like Bernard Nathanson (who later admitted they simply made up numbers), in 1972 there were only 41 women who died undergoing illegal abortions, while 24 died that year as a result of legal abortions. When Congress, in the 1990s, began to pass legislation banning partial birth abortions, Planned Parenthood declared that the procedure was “extremely rare,” involving only 500-600 cases per year (arbitrarily reduced by the Los Angeles Times to 200!). Before long, however, a reporter discovered one New Jersey clinic that “performed 1,500 partial-birth abortions per year” (p. 47). NARAL Pro-Choice America and Planned Parenthood routinely manufacture “facts” to sustain their propaganda.

History professors aided the cause when 400 of the nation’s distinguished scholars signed on to an influential brief submitted to the Supreme Court in Planned Parenthood v. Casey.  They “claimed that Americans had recognized the right to choose abortion at the time of the Republic’s founding” (p. 106). Opposition to abortion, they asserted, was a relatively recent development!  Taking the historians’ brief at face value, law professors (such as Harvard’s Laurence Tribe) and philosophers (such as Harvard’s Ronald Dworkin) promoted it as an accurate rendition of the past.  But Ponnuru details, with painstaking patience, the perniciously erroneous nature of the historians’ brief.  “The historians reached their false conclusions by mischaracterizing sources, misreporting facts, and supporting claims with citations that have no relevance to those claims. They ripped quotations out of context. They relied on discredited sources” (p. 116). In short: nothing dissuades abortion rights advocates from pursuing their agenda. Though I’ve focused exclusively on Ponnuru’s discussion of abortion, he also addresses, in Part II, further aspects of the “bioethics of death,” involving mercy killing, embryonic stem cell research, the sale of body parts, and infanticide. He takes seriously the ethics and influence of Princeton’s notorious Professor Peter Singer, and shows how utilitarian positions such as his now shape Holland’s “health care” system, where growing numbers of children and adults are euthanized when found unworthy of life.

In Part III, “Life and the Parties,” Ponnuru examines the party of death’s public face. He shows how energetically the media, following the lead of the New York Times, with its “tone of contempt” for pro-life folks, seek to advance what philosopher Ronald Dworkin has identified and lauded as “choices for death.”  One judicious study concluded that “97 percent of media elites” support abortion rights. Of the 217 reports on partial-birth abortion on the big three TV networks, only 18 accurately depicted the procedure.

Despite this, however, Ponnuru shows that the tide may be turning in pro-life directions. Statistical studies show a decline in support for abortion. Even Democratic leaders seem to be re-thinking their party’s stance, wondering if it has contributed to their steady loss of power during the past 30 years. If so, The Party of Death must be saluted as one of the best accounts of substantial reasons for that change.

* * * * * * * * * * * * * * * * *

That Roger Kimball is one of the nation’s premier culture-critics is evident in a collection of his essays:  Experiments Against Reality: The Fate of Culture in the Postmodern Age (Chicago: Ivan R. Dee, c. 2000).  The book’s title, he says, comes from “Hannah Arendt’s description of totalitarianism as a sort of ‘experiment against reality’—one that, among other things, encouraged people to believe that ‘everything was possible and nothing was true.'” Furthermore, “what Arendt called a ‘mixture of gullibility and cynicism'” seems amply evident, indeed triumphant, in today’s “postmodern” culture. That “nothing is true” is one of the main planks of postmodernism. (The English philosopher Roger Scruton has aptly countered: “The man who tells you truth does not exist is asking you not to believe him. So don’t.”) But postmodern thinkers, following the lead of Friedrich Nietzsche, cheerfully ignore the logical contradiction at the heart of their rhetoric and insist that “truth” is pretty much whatever one wants it to be.

In Kimball’s judgment, Nietzsche, who famously declared that there are no facts, no truths, only interpretations, indwells (like a virus) “almost every destructive intellectual movement this century has witnessed” (p. 6). Today’s university professors dispense “what we might call Nietzscheanism for the masses, as squads of cozy nihilists parrot his ideas and attitudes. Nietzsche’s contention that truth is merely ‘a moveable host of metaphors, metonymies, and anthropomorphisms,’ for example, has become a veritable mantra in comparative literature departments across the country” (p. 193). Determined to move “beyond good and evil,” Nietzsche defined “the good as that which enhances the feeling of life. If ‘to see others suffer does one good, to make others suffer even more,’ then violence and cruelty may have to be granted the patent of morality and enlisted in the aesthete’s palette of diversions. In more or less concentrated form, Nietzsche’s ideal is also modernity’s ideal. It is an ideal that subordinates morality to power in order to transform life into aesthetic spectacle. It promises freedom and exaltation. But as Novalis points out, it is really the ultimate attainment of the barbarian” (p. 213).

Kimball demonstrates, in essays dealing with significant 19th and 20th century poets (Eliot, Stevens, Auden), philosophers (Mill, Nietzsche, Sartre, Foucault), and novelists (Spark, Musil), the various ways great thinkers have approached reality. For some, like T. S. Eliot, there was “a craving for reality” that was manifestly evident in great poetic works such as The Four Quartets. Eliot understood that culture and religion cannot be severed—“‘if Christianity goes,’ he said, ‘the whole of our culture goes’” (p. 79). Contradicting one of the guiding premises of postmodernism, long before it was recognized as a movement, Eliot said: “‘Man is man . . . because he can recognize supernatural realities, not because he can invent them’” (p. 81). There is an objective Reality, and we can know truths about it.

Eliot’s contemporary, Wallace Stevens, by contrast, was a “metaphysical claims adjuster.” A disciple of William James, he determined to believe whatever appealed to him, holding that all beliefs are fictions knowingly embraced. “‘The exquisite truth,’ wrote Stevens, ‘is to know that it is a fiction and you believe in it willingly’” (p. 90). Living by fictions, however, failed him, and in Stevens’ final published poem, “As You Leave the Room,” we read: “‘I wonder, have I lived a skeleton’s life, / As a disbeliever in reality’” (p. 93). Indeed, as he elsewhere lamented: “‘A fantastic effort has failed’” (p. 93). Significantly, in the final days of his life in 1955, he entered the Roman Catholic Church through baptism.

In John Stuart Mill we find the great champion of libertarianism and feminism, “indispensable elements in the intoxicating potion that constitutes Millian liberalism and that makes much of his thinking so contemporary” (p. 161). Yet, though Mill is generally portrayed as a champion of individual freedom, his On Liberty, Maurice Cowling said, is “‘not so much a plea for individual freedom, as a means of ensuring that Christianity would be superseded by that form of liberal, rationalizing utilitarianism which went by the name of the Religion of Humanity’” (p. 166). Indeed, “‘Mill, no less than Marx, Nietzsche, or Comte, claimed to replace Christianity by “something better”’” (p. 167).

Michel Foucault, arguably the most influential of recent postmodernists, was (like the Marquis de Sade, whom he lionized) fascinated with death. He “came to enjoy imagining ‘suicide festivals’ or ‘orgies’ in which sex and death would mingle in the ultimate anonymous encounter” (p. 240). To his admirers, “Foucault’s penchant for sadomasochistic sex was itself an indication of admirable ethical adventurousness” (p. 241). Following this penchant, he plunged into (at the age of 50) the gay bathhouse scene in San Francisco, unleashing his desire for ‘”the overwhelming, the unspeakable, the creepy, the stupefying, the ecstatic,’ embracing ‘a pure violence, a wordless gesture'” (p. 247). Whether or not he knew he was dying of AIDS cannot be demonstrated, but he clearly followed, in his final years, the “Faustian pact” he celebrated in volume one of his The History of Sexuality—willingly exchanging ‘”life in its entirety for sex itself, for the truth and the sovereignty of sex. Sex is worth dying for'” (p. 252). And die he did, in San Francisco, of AIDS, at the age of 57.

As a perfect antidote to Foucault et al., Kimball brings us, in his final chapter, to Josef Pieper, the great Thomistic philosopher, who urges an openness to (rather than experiments with) reality. Pieper, following the lead of Aquinas and Aristotle, says Kimball, has the answers profoundly lacking in postmodernism. “Cardinal Newman was right when he observed that, about many subjects, ‘to think correctly is to think like Aristotle’” (p. 336). Or, Pieper would add, to think like Aquinas and know God! 

176 “Knowing the Enemy”: Jihadists

Much has been written, since 9/11, regarding the “roots” responsible for Islamic terrorism.  Marxists cite economic inequities, Anti-Americans blame U.S. imperialism, Palestinian proponents fault Israel, and Howard Dean targets George Bush.  Still others, acknowledging the religious rhetoric of the terrorists, have sought to locate reasons for their violence in the Qur’an, while Islamic apologists allege that Islamic “fundamentalists” have hijacked the holy book of a peaceful religion.  Providing a thorough and thoughtful evaluation of this complex issue, Mary R. Habeck, an associate professor at the School of Advanced International Studies, Johns Hopkins University, has written Knowing the Enemy:  Jihadist Ideology and the War on Terror (New Haven:  Yale University Press, c. 2006).  

She gets right to the point in her first chapter:  “Why They Did It.”  Obviously al-Qaida orchestrated the 2001 assault on America.  Equally obvious, not all Muslims support al-Qaida, so the terrorists’ roots are located in only a slice of Islam.  Professor Habeck argues that they “are part of a radical faction of the multifaceted Islamist belief system.  This faction—generally called ‘jihadi’ or ‘jihadist’—has very specific views about how to revive Islam, how to return Muslims to political power, and what needs to be done about its enemies, including the United States.”  The extremists differ from other Muslims in their “commitment to the violent overthrow of the existing international system and its replacement by an all-encompassing Islamic state.  To justify their resort to violence, they define ‘jihad’ (a term that can mean an internal struggle to please God as well as an external battle to open countries to the call of Islam) as fighting alone.  Only by understanding the elaborate ideology of the jihadist faction can the United States, as well as the rest of the world,  determine how to contain and eventually end the threat they pose to stability and peace” (pp. 4-5).  

In the jihadists’  historical account, devout Muslims followed Allah’s will—the “true faith”—for a millennium (or parts thereof).  Then apostasy swept away large segments—or perhaps all—of real Islam.  Christians and Jews began to dominate the world, including much of the world earlier ruled by Islam.  Some jihadists locate the apostasy quite early, following the four righteous Caliphs of the seventh century.  Others mark the final collapse in “1924, when Mustafa Kemal Ataturk abolished the Ottoman Caliphate” (p. 11).  Whatever the various explanations, it’s clear that today’s terrorists want both to make radical changes in the Islamic world and destroy all Western powers that oppose them.  To the al-Qaida terrorists, attacking the U.S. on 9/11 would “begin the ultimate destruction of falsehood around the world” (p. 14) and lead to the establishment of a world-wide Islamic state.  

Professor Habeck provides a careful historical explanation of jihadism, detailing the influence of thinkers such as Ibn Taymiyya in the 13th century, Muhammad ibn ‘Abd al-Wahhab in the 18th, and Sayyid Qutb in the 20th.  Today’s terrorists, whether in Hamas, Hezbollah, the Muslim Brotherhood, or al-Qaida, almost all justify their frenzy with interpretations of Islam given by these clergymen, all of whom championed aggressive, violent forms of jihad.  Though they could not but rely upon the Qur’an as their “constitution,” these radicals quote it quite selectively and generally rely heavily upon supplementary materials, such as the Hadith and biographies of Muhammad, for their distinctive messages.  

Their version of Islam, which they consider the only true one, is comprehensive and totalizing.  The one true God, Allah, is totally in control, and all non-Islamic religions and political institutions are evil as well as false.  No human laws deserve respect, for Allah has prescribed everything, down to tiny details, necessary for a righteous society.  All property is God’s, to be controlled by His representatives, and individual freedom is anathema.  The Muslim-controlled world (dar al-Islam) ever wars with the rest of the world (dar al-harb), and ultimately all peoples must submit to the true faith and embrace Islamic law (the Sharia).  Parts of the world once ruled by Islam, including Israel and Spain, must be reclaimed, as was Palestine from the Crusaders in the 12th and 13th centuries.  “Israel is seen as part of the military assault by the West to ‘subjugate a portion of the Muslim world permanently’” (p. 97), so it must be eliminated.  Importantly, to some jihadists, wherever an enclave of Muslims takes up residence dar al-Islam (the house of Islam) is de facto established.  In such communities, whether in England or Canada, Islamic law must be established.  Immigrants thus lay claim to pockets of the world that are later to be incorporated into the world-wide Islamic state.  Meanwhile, Muslim warriors, waging jihad against dar al-harb (those outside the house of Islam) justifiably use any means necessary to attain their righteous ends—lying, looting, suicide bombings, terror tactics designed to elicit fear and capitulation are all “good.”

Many of the jihadists take Muhammad’s life as the perfect pattern.  In Mecca he first enunciated the religious principles of Islam.  Moving to Medina, he launched an offensive jihad, waging war and leading raids and seizing booty.  There he established a state-within-a-state, building up strength until he could at last return to Mecca and lay claim to his rightful role as ruler, both religious and secular, of the world.  To extremists, such as Osama bin Laden, places such as Afghanistan under the Taliban are modern equivalents of Medina.  From there jihadist assaults are to be launched, preparing the way for the final victory of Islam.  Osama bin Laden, says Habeck, was shocked that the U.S. did not collapse following the 9/11 attacks.  He fully expected a repeat of President Reagan’s retreat from Lebanon in 1984 and President Clinton’s withdrawal from Somalia in 1993.  That President Bush attacked rather than postured proved that America was not the paper tiger the jihadists believed.  Yet the jihadists are prepared for a long struggle, much like the 200 years needed to expel the Crusaders from Palestine.  And if we are to effectively wage the war on terror, Habeck insists, we must first understand the depth of the jihadists’ convictions.  Along with military response, there must be an awareness of the theological basis for the terrorists’ resolve.  Radical preachers, as well as suicide bombers, must be countered.  And ultimately, she hopes, if the jihadists are appropriately stigmatized, a large majority of Muslims will turn away from extremism and embrace a healthy form of democracy that will resolve the conflict now endangering the world.

* * * * * * * * * * * * * * * * * * * 

Bat Ye’or, an Egyptian living in Switzerland, writing under a pen name to help ensure her survival, has devoted her life to the study of Islam.  In Islam and Dhimmitude:  Where Civilizations Collide (Madison, N.J.:  Fairleigh Dickinson University Press, c. 2002), she provides historical documentation—ample appendices, notes, and bibliography—for what happens to people (ethnic majorities treated as religious minorities) whose lands fall prey to the sword of Islam.   As was evident in her earlier work, The Decline of Eastern Christianity Under Islam (which I reviewed in “Reedings” #131), she brings a relentlessly critical, indeed hostile, approach to her subject.  But, importantly, she demonstrates a mastery of her sources and provides ample documentation for her assertions.  

The story of dhimmitude began in Medina, as soon as Mohammed seized control of the settlement in 622 A.D.  He and the Jews of Khaybar made an agreement (a dhimma) whereby Jews were allowed to continue farming “their lands, but only as tenants; he demanded delivery of half their harvest and reserved the right to drive them out when he wished” (p. 37).  In return, he promised to provide military protection.  Thus was established the pattern of dhimmitude that endures to this day:  first there is a jihadist conquest, then taking booty and seizing land, and finally the abject subservience of all unbelievers to Islamic rule.  Dhimmi were forbidden to bear arms or own land or ride horses, and they were forced to wear distinctive clothing, and pay extortionate taxes.  They were forcibly removed from the sacred soil of Arabia and were often reduced (despite clear Islamic laws forbidding it) to slavery.  Indeed, Christian slaves, in places of influence, often mitigated the harsher aspects of dhimmitude.  Though Jews and Christians were technically “free” to worship, it was a tightly limited freedom, and “[u]nder the Fatimid caliph al-Hakim (996-1021), every church and synagogue in Egypt, Palestine, and Syria was demolished” (p. 85).  

Bat Ye’or traces the pattern established in the 7th century as it was sustained over the next 14.  At times the dhimmi managed to live in a somewhat satisfactory accord with their Muslim masters.  At other times intense persecution (indeed genocidal attacks such as took place in Armenia a century ago) made it almost intolerable for Jews and Christians to survive under Islamic rule.  “The Armenian tragedy,” she concludes, “is not a unique phenomenon; it belongs to an immense historical cycle of dhimmitude that still operates—in Lebanon, Sudan, in the war against Israel, and in other Muslim-Christian conflicts.  This cycle has its own characteristics, linked to the principles and values of the civilizations of jihad” (p. 374).  Following WWI, and the instability that ensued as a result of Europe’s policies, Arab nationalism (often attuned to Nazi propaganda) became both intense and vitriolic, especially regarding the Jews.  “Anti-Zionist terrorism was merely the modern version of jihad” (p. 173).  Nearly one million Jews were removed from Muslim lands in the Middle East following the founding of Israel in 1948.  

More than historically important, jihad, dhimmitude and shari’a are essential, unchanging components of Islam.   In southern Sudan, let us never forget, some two million (largely Christian) people perished.  “Abduction, slavery, and forced Islamization of children . . . are similar to those mentioned in the historical records relating to jihad” (p. 206).  (Sadly enough, when details regarding slavery in Sudan came to light, neither the U.N., the Vatican, the World Council of Churches, nor influential NGOs dared criticize the Islamists ruling the country!)  To Bat Ye’or, “for all modern Islamists, the aim of jihad will always remain the expansion of Islam—by war or by persuasion—over the entire world, and the establishment worldwide of shari’a, the law of Allah.  The concept of dar al-harb embraces all non-Muslim countries; they constitute the empire of Evil and ignorance, the jahiliyya—the Arabic word used for the ‘paganism’ that preceded Islam.  It is the religious duty of Muslims to replace it with the empire of Good and of True Faith, which is Islam” (p. 218).  

Islamic goals are facilitated by various intra-dhimmi conflicts, especially evident in a number of Christian churches’ efforts to benignly portray and compromise with Islam.  Leaning in the direction of the heretic Marcion, who eliminated the Old Testament from his “Christian” canon in the second century, some modern churches have excised or minimized references to Jewish aspects of the Christian faith.  Others urge believers to embrace dhimmitude, to “serve” their Muslim masters under the guise of love and compassion—a move that strikes Bat Ye’or as both devious and wrongheaded.  Compounding these  problems, numbers of Christians (all too many of them high ranking ecclesiastics) court favor with the Muslims and profit from the alliance.

Equally important in assisting jihad are the Europeans and Americans who champion the Muslim (and particularly the Palestinian) cause.  For some French politicians, the reason is economic, since Middle East oil sustains the modern industrial system.  For others, especially professors in prestigious Western universities who are  committed to “multiculturalism” and tolerance, siding with Islam is a mark of modern secularism.   In the process, they rewrite history in much the same fashion as Stalinists updated school textbooks to fit the Party line.  To an alarming extent the West has legitimized dhimmitude, establishing “at all levels a dissymmetry in respect of human rights, freedom of the press, of opinion and religion, as well as of democratic rights.  The reason is that dhimmitude is not recognized as a crucially important page of world history.  Hence the West has adopted the Islamic view of history, where dhimmi nations had no history, no culture, no existence.  Indeed, dhimmi peoples have neither a cause nor history.  They do not have the right to claim any reparations for the centuries of exile, deportations, spoilations, massacres and persecution.  They do not even have the right to speak of this” (p. 398).  

But speak of this Bat Ye’or does!  

* * * * * * * * * * * * * * * * * * * 

Though Bat Ye’or has devoted her life to historical research, her most recent publication—Eurabia:  The Euro-Arab Axis (Madison, N.J.:  Fairleigh Dickinson University Press, c. 2005)—focuses almost exclusively upon current conditions.  “This book describes Europe’s evolution from a Judeo-Christian civilization, with important post-Enlightenment secular elements, into a post-Judeo-Christian civilization that is subservient to the ideology of jihad and the Islamic powers that propagate it.  The new European civilization in the making can be called a ‘civilization of dhimmitude’” (p. 9).  To grasp her message, she warns us that Islamic jihad has, during the past 1300 years, transformed “once thriving non-Muslim majority civilizations” into appalling states “of dysfunctional dhimmitude,” impoverished and oppressed (p. 9).  Jihadist tactics never change:  “Hostage taking, ritual throat slitting, the killing of infidels and Muslim apostates are lawful, carefully described, and highly praised jihad tactics recorded, over the centuries, in countless legal treatises on jihad” (p. 159).  Looking at the globe, wherever Islam has taken root earlier (and often higher) civilizations “have disappeared.  Others remain as fossilized relics of the past, unable to evolve” (p. 9).  What happened in the Middle East centuries ago could happen to Europe, Bat Ye’or warns, unless movements currently in motion are quickly reversed.  

Too weak today to mount a military invasion to conquer Europe, Islamists have devised less overt strategies.  Relentlessly trumpeting “peace and justice,” Muslim emissaries have worked to subvert the Judeo-Christian West and are successfully establishing “Eurabia.”  They have persuaded Europeans to defend radical Muslim positions, especially in support of Palestinians, in order to maintain economic ties with oil-rich Muslim states.  “The huge sums that the EU pays to Arab Mediterranean countries and the Palestinians amount to another tribute exacted for its security within the dar al-sulh.  Europe thereby put off the threat of a jihad aimed at the dar al-harb by opting for appeasement and collusion with international terrorism—while blaming the increased world tensions on Israel and America so as to preserve its dar al-sulh position of subordinate collaboration, if not surrender to the Islamists” (p. 77).  

Within a generation, this collaboration has led to the establishment of Eurabia—a process, Bat Ye’or insists, illustrating continuous Muslim demand for land in exchange for peace and security.  This “is the foundation of the Islamic jihad-dhimmitude system” (p. 104) and is manifestly evident in the relentless  attacks on Israel.  “By implicitly enlisting in the Arab-Islamic jihad against Israel—under labels such as ‘peace and justice for the Palestinians’—Europe has effectively jettisoned its values and undermined the roots of its own civilization.  It even struck a blow against worldwide Christianity, abandoning the Christians in Lebanon to massacres by Palestinians (1975-83), those of Sudan to jihad and slavery, and the Christians of the Islamic world to the persecutions mandated by the dhimma system” (p. 115).  

Within Europe itself a parallel movement has taken place within one generation.  Enormous immigration from Muslim lands has changed the demography of the continent.  Islamic cultural centers, mosques and schools, have proliferated in Europe—whereas nothing comparable has been allowed in Islamic countries.  The success of the Muslim agenda is markedly evident in the policies established by the European Union (which recently proposed a constitution omitting any reference to Christianity), entailing “six main themes:  1) the Andalusian utopia; 2) the alleged Islamic cultural superiority over Europe, and hence the inferiority of the latter; 3) the creation of a Western Palestinian cult, (Palestinocracy); 4) European self-guilt; 5) anti-Zionism/antisemitism; 6) anti-Americanism and Christianophobia” (p. 161).  

None of these themes, all of them manifest misrepresentations and falsifications of history, withstands scholarly scrutiny.  Yet the “Muslim version of history is now being taught and accepted in Europe and America, while more accurate treatments” are disregarded (p. 195).  To a large degree, today’s historians are, like pampered intellectuals in Byzantine lands conquered by Muslims, dhimmis who refuse to fight for the truth.  They are happy with a cultural dhimmitude “based on peaceful surrender, subjection, tribute, and praise” (p. 204).  The same must be said for scriptwriters working for TV and movies—nothing critical is to be said about Islam, whereas there are no limits to anti-Christian sneers and polemics.  And Christian clerics have been perhaps the worst of the compromisers and apologists for the Islamic agenda!  

All of this was exposed by America’s response to 9/11.  “The effect of America’s unmasking of Islamist terrorism, which Europe had officially denied and tried to deflect onto Israel, was profound.  The Iraqi war brought to the surface the anti-Americanism that had been simmering for years among European Arabophiles, neo-Nazis, Communists, and leftists in general” (p. 227).  It is now clear that the European Union, and the intelligentsia that supports it, “is implicitly abetting a worldwide subversion of Western values and freedoms, while attempting to protect itself from Islamic terrorism by denying that it even exists, or blaming it on scapegoats” (p. 227).  

Bat Ye’or argues her case with ample evidence.  Her bleak appraisal of Europe’s prospects cannot but dismay folks like me, rooted in respect for European culture.  But as events continue to unfold, I suspect that she, and not the champions of tolerance and negotiated peace settlements, rightly understands the truth about Islam’s unending jihad.  Anyone seriously interested in the reasons our world is rent with terrorism—and already immersed in a world war—ought not avoid a thoughtful reading of her books.  

* * * * * * * * * * * * * 

Even more ferociously anti-Islamic than Bat Ye’or is the celebrated Italian journalist Oriana Fallaci, whose The Force of Reason (New York:  Rizzoli International, c. 2006) details both the persecution, censorship, and death threats she has endured for daring to criticize Islam and her reasons for doing so.  (This book is a sequel to The Rage and the Pride, which I reviewed in “Reedings” #136.)  Muslims, demonstrating “the only art in which the sons of Allah have always excelled, the art of invading and conquering and subjugating,” are marching, and “[t]heir most coveted prey has always been Europe, the Christian world,” now rapidly submitting to Islamic aggression (p. 36).  She provides a rapid overview of history, concluding “that today’s Islamic invasion of Europe is nothing else than a revival of its centuries-old expansionism, of its centuries-old imperialism, of its centuries-old colonialism.  More underhand, though.  More treacherous” (p. 51).  Muslims today are moving against Europe through immigration, petro-diplomacy, and jihad.  Indeed, “the war Islam has declared on the West is not really a military war.  It’s a cultural war.  A war, Tocqueville would say, that instead of our body wants to strike our soul.  Our way of life, our philosophy of Life.  . . . .  Our freedom” (p. 266).

Fallaci laments her slowness in discerning developments evident in the ‘60s, when most everyone failed to see them as a greater threat to the West than the Cold War.  But now she sees clearly!  She also has nothing but contempt for the “collaborationists” and “the traitors who invented the lie of Pacifism” (p. 137).  She has interviewed many of the most powerful Islamic leaders and terrorists such as Yasser Arafat.  She has peered into the enemy’s eyes and knows we’re in a great war.  That her personal, journalistic perspective so closely resembles the scholarly stance of Bat Ye’or should give one pause.  

175 Are Men Necessary? Dowd, Mansfield, Eldridge, O’Beirne

Though I’d heard about Maureen Dowd (the Pulitzer Prize winning columnist for the New York Times), I’d read nothing of her work before picking up Are Men Necessary?  When Sexes Collide (New York:  G.P. Putnam’s Sons, c. 2005).  This book appears to be a collection of her columns, which perhaps explains its elusive structure and disconnected paragraphs, for it often reads like a series of clever, snidely satirical remarks.  She’s a hugely successful career woman but has, apparently, failed in the realm of romance and marriage, and her main message is this:  “Little did I realize that the sexual revolution would have the unexpected consequence of intensifying the confusion between the sexes, leaving women in a tangle of dependence and independence as they entered the twenty-first century” (p. 8).  She’s as confused as anyone, she says, and offers no answers—only an interested “observer’s” reflections on the issue.  

Men, she thinks, “prefer women who seem malleable and awed” (p. 42) at their mere presence.  So independent women such as Dowd, who have ascended to positions of power, fail to attract men.  Proving her point, she says, “A friend of mine called nearly in tears the day she won a Pulitzer:  ‘Now,’ she moaned, ‘I’ll never get a date!’” (p. 117).  Her friend illustrates the fact that “[i]t took women a few decades to realize that everything they were doing to advance themselves in the boardroom could be sabotaging their chances in the bedroom” (p. 42).  Consequently, Dowd laments, more and more women want, primarily, to be wives!  This is evident in the sharp decline, during the past decade, in the numbers of women who retain their maiden name when they marry.

Women still seem to delight in being sexual objects of male attention, investing greatly in clothes and appearance. “Forty years after the dawn of feminism, the ideal of feminine beauty is more rigid and unnatural than ever” (p. 229).  Sadly enough, from Dowd’s perspective:  “American women are evolving backward—becoming more focused on their looks than ever.  Feminism has been defeated by narcissism” (p. 233).  Accordingly:  “A lot of women now want to be Maxim [a men’s magazine that apparently celebrates the female form] babes as much as men want Maxim babes.  So women have traveled an arc from fighting objectification to seeking it” (p. 183).

From Dowd’s perspective, however, this does not make men particularly admirable.  She delights in heaping ridicule on President Bush and his associates—referring to them as “Bushies” or “ninnies” who lack the testosterone manifestly evident in Hillary Clinton and Condi Rice!  The President, she declares, oppresses American women, and those “who wear low-slung jeans are losing rights in an administration where faith trumps both science and facts” (p. 202).  Still more, she declaims:  “If W. hadn’t been propelled into office partly because of sex, he wouldn’t have been able to restrict sexual freedoms, such as gay marriage and women’s rights” (p. 202).  She’s obviously fixated on things sexual, and in a lengthy section she ponders the differences between the male and female chromosomes.  The male Y, she rejoices to report, is much smaller and simpler than the female X.  Thus, to her way of thinking, males are less weighty than females.  Indeed, with a few hundred “semen slaves” women could do quite nicely without men!  Though reared a Roman Catholic—“I was a serf in a feudal society where men made the rules and set the tone” (p. 193)—Dowd has clearly repudiated her Church and its male clergy, including Pope Benedict XVI, who is, she says, “God’s Rottweiler,” an evil man who condemns such things as abortion.

Bill Clinton, as one would expect, receives Dowd’s endorsement—though she certainly dispenses suitably caustic comments concerning some of his activities.  He (and Hillary’s obvious dependence upon him) put feminists like Dowd in a real dilemma.  He supported their pro-abortion agenda but violated their canon of correct behavior and compromised the movement’s credibility.  Hillary particularly causes Dowd concern, for:  “As part of their conjugal/political deal, or their ‘passionate codependence,’ as James Carville calls it, she always chose her husband and sold out her sisters” (p. 314).  Amazingly, “It was a bold hat trick; she finished off what was left of feminism, yet remained a feminist icon” (p. 314).  She succeeded because of her victim’s status.  As a martyr she appealed to the hearts of voters and was catapulted into the United States Senate.  So the question Dowd poses to Hillary is this:  “Will the ‘I am woman, see me grow’ senator ever be genuinely self-reliant from her husband?  Or are men necessary?” (p. 338).  For Hillary, apparently, the answer is yes!

* * * * * * * * * * * * * * * * * 

Harvey C. Mansfield, Professor of Government at Harvard University, has written an important book simply titled Manliness (New Haven:  Yale University Press, c. 2006).  He is troubled that our current effort to establish a “gender-neutral” society will suppress the manly aggressiveness necessary for human flourishing.  Most manifestly:  “The attempt to make our language gender-neutral reveals something of the ambition of our democracy today.  A gender-neutral language implies a gender-neutral society, marking a pervasive change in the way we live our lives.  Our society has adopted, quite without realizing the magnitude of the change, a practice of equality between the sexes that has never been known before in all human history” (p. 1).  

This gender-neutral project, however, will run aground on the rigid rock of human nature, Mansfield argues.  Following the Nietzschean agenda of Simone de Beauvoir, it fantasizes that:  “’One is not born, but becomes, a woman’” (p. 133).  Thus defying nature, gender-neutrality, not patriarchy, is the “social construction” that denies reality.  Men and women are, he insists, significantly different, and he validates his position with extensive scholarly citations.  Men, at their manly best, exemplify the Greek virtue of thumos, “a quality of spiritedness” that prompts them “to risk their lives in order to save their lives” (p. xi).  They are assertive, crave honor and independence, and enjoy taking charge.  Consequently, “every previous society, including our democracy up to now, has been some sort of patriarchy, permeated by stubborn, self-insistent manliness” (p. 58).  

This healthy thumos, however, has been derailed in modern times by thinkers such as Darwin and (especially) Nietzsche, who have advanced a “manly nihilism,” providing “a license from science and philosophy to boast and to act without restraint” (p. 83).  Rather than the chivalrous gentleman we get the “tough guy” for whom, according to Nietzsche:  “’Nothing is true, everything is permitted’” (p. 111).  Unfortunately, Mansfield argues, modern feminists, determined to gain equality at any price, embraced this Nietzschean nihilism as their gold standard, celebrating independence and autonomy.  They determined to be free from men.  “They wanted independence from morality, which kept them in subordination to men under the yoke of the ‘double standard’; and from nature, giving them wombs, compelling them to be mothers, which kept them subordinate to men; and—is the point coming into view?—from men.  In order to be free of men, these women wanted to change morality and deny nature” (p. 123).

Mansfield provides in-depth analyses of the major feminist thinkers, from Mary Wollstonecraft to Simone de Beauvoir and Betty Friedan.  He distinguishes “radical” feminists, who openly attack and would destroy the family, from “moderate” feminists, who tolerate, but do not really support, it.  He is particularly effective in exposing their reliance upon Marx and Nietzsche.  For example, they have committed themselves to “consciousness-raising,” a Neo-Marxist strategy “apparently first used in 1969 in the Redstockings Manifesto, coauthored by Shulamith Firestone when she helped found a radical feminist group in New York” (p. 15).  Betty Friedan appropriated the phrase, and it quickly permeated the feminist movement.

By embracing nihilism, modern feminists failed to discern another approach that might have proved more worthy:  Liberalism.  Mansfield finds in philosophers such as Locke, Mill, and Burke an ideal of chivalrous manliness that stresses self-discipline and commitment to traditional standards.  Similarly, the classical virtue ethics of Aristotle offer resources for reasonable arguments on behalf of women’s rights.  These thinkers take nature “as the guide for nurture” (p. 202).  They deal with reality rather than imagined utopias.  They appreciate the importance of courage and emphasize the need for authority in securing the good society.  And, importantly, they agree with Aristotle “that the sexes are not autonomous but related, hence that the ideal of autonomy put forth by the women’s movement will not work.  Women see themselves in relation to men, and men, who are spirited, have a need for women that they often do not care to admit” (p. 213).

This is a deeply informed and richly rewarding treatise.  Unfortunately, though the book seeks to address a general audience, it cannot escape the academic tone of its author.  Without sufficient background in social and political philosophy, readers will be recurrently perplexed by some of Mansfield’s arguments.  But it is, unlike many critiques of feminism, a solidly reasoned and trenchant analysis.

* * * * * * * * * * * * * * * *

John Eldredge largely anticipated Mansfield’s argument in a more reader-friendly and Christian publication titled Wild at Heart:  Discovering the Secret of a Man’s Soul (Nashville:  Thomas Nelson Publishers, c. 2001).  A famous Teddy Roosevelt declaration nicely sums up the book’s message:  “It is not the critic who counts, not the man who points out how the strong man stumbles, or where the doer of deeds could have done them better.  The credit belongs to the man in the arena, whose face is marred by dust and sweat and blood, who strives valiantly . . . who knows the great enthusiasms, the great devotions, who spends himself in a worthy cause, who at the best knows in the end the triumph of high achievement, and who at the worst, if he fails, at least fails while daring greatly, so that his place shall never be with those cold and timid souls who neither know victory nor defeat” (p. xiii).

Deep in his heart every man, Eldredge says, longs “for a battle to fight, an adventure to live, and a beauty to rescue” (p. 9).  Thus men prefer football to ballet, adventure films to “chick flicks.”  They long for mountains to climb, not beachfront boardwalks to stroll.  “The masculine heart needs a place where nothing is prefabricated, modular, nonfat, zip lock, franchised, on-line, microwavable.  Where there are no deadlines, cell phones, or committee meetings.  Where there is room for the soul” (p. 5).  Unfortunately, for the past several decades a gender-neutral society has endeavored to emasculate men.

Worse still, from Eldredge’s viewpoint, Christianity has contributed to the problem by urging men to be “nice guys.”  In fact:  “Really Nice Guys.  We don’t smoke, drink, or swear; that’s what makes us men.  Now let me ask my male readers:  In all your boyhood dreams growing up, did you ever dream of becoming a Nice Guy?” (p. 7).  Jesus is portrayed, all too often, as a supremely “sensitive,” compassionate caregiver.  As Dorothy Sayers discerned long ago, the church has “very efficiently pared the claws of the Lion of Judah,” making him “a fitting household pet for pale curates and pious old ladies” (p. 26).

Pale curates and pious old ladies would marshal men not into armies but into small groups for sharing and caring, holding hands and baring their feelings.  But, Eldredge insists:  “We don’t need accountability groups; we need fellow warriors, someone to fight alongside, someone to watch our back. . . .  The whole crisis in masculinity today has come because we no longer have a warrior culture, a place for men to learn to fight like men.  We don’t need a meeting of Really Nice Guys; we need a gathering of Really Dangerous Men” (p. 175).  Men long for the camaraderie that Shakespeare’s Henry V declared at the battle of Agincourt:  “’We few, we happy few, we band of brothers, / For he to-day that sheds his blood with me / Shall be my brother’” (p. 175).

Wild at Heart basically sets forth Eldredge’s own views, rooted in his own experiences.  But I’ve had students who’ve read and positively responded to his message.  And the book provides, I think, a healthy antidote to the therapeutic messages so dominant in today’s educational and clerical circles.

* * * * * * * * * * * * * *

Only a woman would dare write, as has Kate O’Beirne, a book titled Women Who Make the World Worse:  and How Their Radical Feminist Assault Is Ruining Our Schools, Families, Military and Sports (New York:  Sentinel, c. 2006).  O’Beirne is a lawyer, the Washington editor of National Review magazine, and a regular panelist on CNN’s The Capital Gang.  For 30 years she has witnessed, with increasing alarm, the ravages of radical feminism, and she writes this book to alert us to its harms.  Her thesis, as the book’s final sentence indicates, is this:  “All of these women who make the world worse by waging a destructive war between the sexes are at war with Mother Nature” (p. 199).

First, consider the plight of the modern family, one of the primary targets of radical feminists, who want the government to assume the role of fathers in traditional families.  Influential women, such as Penn State’s renowned sociologist Jessie Bernard, have denounced the “‘destructive nature’ of marriage” (p. 3).  Motherhood, she claims, harms women’s health.  The very desire for children, to her, is “a sexist social construction” (p. 3) rather than a natural hunger.  Textbooks written by the likes of Professor Bernard contain a litany of anti-marriage complaints.  Judy Aulette’s Changing Families, for example, devotes three chapters to marriage and says nothing positive about it.  She does, however, invoke Friedrich Engels to conclude that marriage was “created for a particular purpose:  to control women and children” (p. 6).

Next, consider day care, which assumes that hired hands can rear children better than moms at home.  Radical feminists idealize socialist systems, such as Sweden’s, and insist that women find meaning in life through making money.  Ignoring the fact that most women actually prefer to invest energy in bearing and rearing children, the leaders of the women’s movement insist they be freed from the burdens of childcare.  They frequently urge young women to put careers first, attaining financial success before considering pregnancy.  They fail, however, to disclose important truths.  For one thing, “most current studies show that female fertility begins to drop at age twenty-seven, and by age thirty can decline by as much as 50 percent” (p. 44).  Thinking, as many young women seem to, that they can start a family in their mid-thirties defies one of the more important “facts of life.”  So too it is evident that ordinary women cannot “have it all.”  One can effectively devote her energies either to a job or to children, but not both.  Third, O’Beirne exposes “lies about wages, discrimination, and harassment.”  In truth, she says:  “There is no discriminatory pay gap between working men and women” (p. 50).  Unmarried women who work full-time receive the same pay as their male counterparts (when one makes honest comparisons considering educational background and ability).  Women who stop working, or work part time, to devote themselves to their children obviously receive less pay.  But they choose to do so and apparently find much fulfillment in so doing.  “A Pew Research Center survey found that 86 percent of mothers rated their children a 10 for their importance as a source of happiness, on a scale of 1 to 10, while only 30 percent of employed women rate their job as a 10” (p. 195).  Women also choose careers in areas, such as education, that do not pay as much as more rigorous or dirty or dangerous professions.

Fourth, there is a militant sex bias in America’s schools, significantly privileging females.  Supreme Court Justice Ruth Bader Ginsburg, for example, explaining the court’s Virginia Military Institute case, asserted that all-female colleges are manifestly justifiable while all-male schools are constitutionally illicit!  Under the influence of feminists such as Carol Gilligan, teachers have “resolved to transform education to eliminate ‘male-identified attributes’ like ‘reason and logic’ in favor of ‘feminine ways of knowing’” (p. 68).  Thus O’Beirne titles her chapter:  “In the Classroom . . . Boys Will Be Girls.”  Consequently, history, once a record of heroic exploits, of battles and adventures appealing to boys, has been “rewritten to achieve the Ms.-education of our children” (p. 74).  Games involving competition or rough play are abolished on behalf of feminine “sharing” of rewards and emotions.

In chapter five, “Spoil Sports—Boys Benched,” O’Beirne details the decline of athletic opportunities for boys.  This is especially due to the implementation of Title IX of the Educational Amendments of 1972, whereby “the goalpost has been moved:  from increasing athletic opportunities for women to reducing the number of men playing sports to achieve parity between the sexes” (p. 93).  Dramatically, between 1992 and 1997, “over 20,000 male slots on team rosters were eliminated,” while “the number of female athletes increased by 5,800” (p. 98).  Major men’s programs, in swimming and wrestling, for example, were simply eliminated.  All this has transpired despite the fact that women simply don’t enjoy sports as much as men.  Only 15% of the women at elite all-women’s colleges such as Wellesley voluntarily engage in athletics.  Radical feminists, O’Beirne contends, have simply found traction in varsity athletics for a deeply political agenda:  remaking the nation into a gender-neutral society.

The military has similarly suffered.  Radical feminists have long insisted that women in the armed services enjoy all the opportunities available to their male counterparts, including combat.  Doing so has involved propagating the “woman-as-warrior myth” (p. 120).  While crying out for equality, however, they have simultaneously pled for special treatment.  Thus at West Point women are given 5:30 to complete an obstacle course that men must run in 3:20.  Women need do only 48 push-ups in a two-minute period, whereas men must do 72.  (I just did 100 in 60 seconds, and I’m 65, so the task’s not too tough!)

In a chapter entitled “Abortion—the Holy Grail,” O’Beirne says that “Modern feminism’s biggest enemies are the smallest humans” (p. 157).  Though 19th-century feminists, such as Susan B. Anthony, were staunchly anti-abortion, their successors, emboldened by Supreme Court decisions, are militantly, uniformly pro-abortion.  Though they claim to support the nation’s women, in fact, as the late pro-life Democratic Governor of Pennsylvania, Robert Casey, said, “’most women in America opposed abortion on demand, while the most avid supporters of abortion are unmarried males between 18 and 35’” (p. 171).

Unfortunately for radical feminists, they “have squared off against Mother Nature, and she’s no feminist” (p. 180).  Like defying the law of gravity, the sexual revolution, with its androgynous illusions, is designed for disaster.  There are absolutely inflexible sex differences that we must recognize and respect.  Indeed, doing so “can liberate women—from the feminist orthodoxy in conflict with their natural talents and desires” (p. 180).  Common sense sex stereotypes are, in fact, demonstrably true.  “On the first day of life, girls are more drawn to a picture of a face, and newborn boys to a mechanical mobile.  At a year, boys have shown a stronger preference for a video of cars, while girls at age one prefer a talking head” (p. 190).  “Mothers are particularly good at reading babies’ faces” (p. 190), and a “study of 186 societies found that mothers, with their superior skills, are the exclusive or primary caretakers of infants” (p. 191).  Boys prefer hierarchical structures while girls enjoy equitable, empathetic relationships.  “Women judge character and moods better than men” (p. 194).  Men are generally “more aggressive and enjoy superior math skills, and women are, on average, more nurturing, with better verbal skills” (p. 182).  Men looking for a mate mainly desire a beautiful woman; women primarily value resourceful men who will provide for them.

In short:  truth really matters and women who deny reality make the world worse!  

174 Henry Adams’ America

One of my former students—and subsequently a good friend—Allen K. Brown, an attorney in Fullerton, CA, sent me his review of Garry Wills’ recent book, Henry Adams and the Making of America, that I duplicate for you:

    While studying under Dr. Gerard Reed, I was encouraged to study the philosophy of history of Henry Adams.  A century ago Henry Adams, the grandson of John Quincy Adams and great-grandson of John Adams, had a significant influence in teaching historians, writers, and journalists; he developed the innovative method of using archival sources, interviews with eyewitnesses, and other techniques that established high standards in historical writing.  I found most of my material in The Education of Henry Adams.  I regret that I never tried to read Adams’ nine volumes of History of the United States of America During the Administrations of Thomas Jefferson & James Madison.  However, I continued to read Adams’ novels and books concerning him.

    In a recent study, Henry Adams and the Making of America, the prolific Northwestern University historian Garry Wills establishes his point that, as a historian, Adams was an original.  Indeed, he asserts, Adams’ History is “the non-fiction prose masterpiece of the nineteenth century in America.”  Wills declares that Adams’ History “turns upside down the previous consensus on the period covered, so drastically that many have missed the point of the History entirely” (p. 389).  He believes that most readers, including historians, failed to read the complete nine volumes and thus incorrectly concluded that Adams was writing a family defense and justification.

    Although Wills objects to people reading Henry Adams backward from The Education of Henry Adams to The History, he does exactly that in his book, dividing his treatise into two parts:  one, “The Making of an Historian,” and two, “The Making of a Nation.”  However, because of the great influence of Henry Adams, and of the recurring relevance of analyzing the formation of the United States, the method Wills employs in this book is justified.  Importantly, for us, anyone interested in the “original intent” of the Constitution should read Adams’ History of the United States of America During the Administrations of Thomas Jefferson & James Madison.  For Adams gives evidence of a unique intellectual development and maturity that enabled him to correctly analyze how this great nation of ours was made.  “The processes of his own development and the nation’s are mutually reinforcing.  One mirrors the other.”

    Adams would, Wills insists, as a philosopher of history, accept Tolstoy’s rule that men do not make events but events make men.  He believed that the making of the American nation was “forged on the anvil of other nations.”  Yet, in an ironic twist, Adams as a practicing historian wrote and dwelt on the great leaders of this historical era.  He believed that the America we know today was shaped by the actions and responses of two men responding to particular events:  Jefferson and Napoleon (p. 389).  Wills quotes an 1883 statement by Adams reflecting on the leading characters of his History:  “I am at times almost sorry that I ever undertook to write their history, for they appear like mere grasshoppers, kicking and gesticulating in the middle of the Mississippi River.  There is no possibility of reconciling their theories with their acts, or their extraordinary foreign policy with dignity.”

Yet, though such statements might indicate otherwise, Wills insists that Adams was not the deterministic, defeatist, and pessimistic historian some have portrayed him to be.  Adams believed that a leading man’s response to events would and did bring about a brighter and better future.  I am reminded of similarities to President Ronald Reagan when Wills explains Adams’ view of Jefferson:  “By trusting that the outcome would be glorious, he became the transmitter of forces that would make the outcome glorious—and would make him accept it almost despite himself” (p. 392).

    Thus, the “second revolution” that Jefferson spoke about was not a return to the original intent of the founders, as he claimed, but his leadership instead “led a breakout from both ideologies” of the Federalists and Republicans.  Jefferson the President was not a Jeffersonian!  Wills encapsulates this conclusion with an enlightening statement which gives us insight about Adams, about the making of history, about the making of men, and about the making of our great nation we call Home:  “This is the irony of history as Adams traces it.  It tells us how the Jeffersonians wrought better than they knew while they thought they were doing something else.  In the end, they made a nation.”

    In our own political climate, in an era with its own issues with political parties, court appointments, administrative power, legislative corruption and centralized government, we should agree with Adams who believed it made no logical sense to view American History and current political events as a continuation of the eighteenth-century feud between the Hamiltonians and the Jeffersonians.  

    After reading Wills’ book, the only thing that remains is to find and read Henry Adams’ History—all nine volumes.

* * * * * * * * * * * * * * * * * * * * 

Fortuitously, Allen Brown’s review arrived while I was in the midst of reading all nine volumes of Adams’ History!  Such reading, I confess, should have taken place at the beginning rather than the ending of my academic life, but (as the cliché says) better late than never.  Anyone interested in doing the same will find that Henry Adams’ History of the United States of America During the Administrations of Thomas Jefferson (New York:  The Library of America, c. 1986) and History of the United States of America During the Administrations of James Madison (New York:  The Library of America, c. 1986) are now available in two relatively inexpensive, nicely-bound volumes.  

Several strengths mark these volumes:  1) thorough, footnoted research in primary sources, frequently giving extended quotations from letters and official documents; 2) discerning evaluations, helping the reader grasp the real import of events; 3) a sustained commitment to placing American events within the broader European context; 4) an engaging style that encourages one to persevere in reading all 2500 pages of the history.  Above all, it’s evident that two of this nation’s premier thinkers proved to be at best second-rate presidents!  Jefferson’s greatness resides in The Declaration of Independence and the ideas he advocated—such as states’ rights and strict constructionism.  Madison’s work in forging the Constitution of the United States secured for him a bright star in the nation’s firmament.  Neither man, however, effectively implemented many of his deepest convictions as President.  Compared with George Washington, for example, neither man merits the label “great” as President.

The first six chapters (some of the best in the entire work) of Adams’ study of Jefferson’s administration are devoted to portraying the physical, economic, popular, and intellectual conditions of the new nation in the year 1800.  There were some five million Americans, two thirds of whom lived within 50 miles of the Atlantic seaboard.  Roads through the mountains had been built, however, and a mass movement into the vast Ohio and Mississippi River basins was beginning.  Drawing upon travel narratives, Adams enables us to visualize the differences between residents of the North and South, East and West.  Thus, he noted (anticipating by five years Frederick Jackson Turner’s famous frontier thesis):  “The Mississippi boatman and the squatter on Indian lands were perhaps the most distinctly American type then existing, as far removed from the Old World as though Europe were a dream” (p. 40).  

“European travelers,” Adams wrote, “who passed through America noticed that everywhere, in the White House at Washington and in log-cabins beyond the Alleghenies, except for a few Federalists, every American, from Jefferson and Gallatin down to the poorest squatter, seemed to nourish an idea that he was doing what he could to overthrow the tyranny which the past had fastened on the human mind” (pp. 119-120).  Still more:  “Greed for wealth, lust for power, yearning for the blank void of savage freedom such as Indians and wolves delighted in,—these were the fires that flamed under the caldron of American society, in which, as conservatives believed, the old, well-proven, conservative crust of religion, government, family, and even common respect for age, education, and experience was rapidly melting away” (p. 121).

Intellectually, Noah Webster (the noted New England educator and compiler of the influential dictionary) lamented, “’our learning is superficial in a shameful degree, . . . our colleges are disgracefully destitute of books and philosophical apparatus, . . . and I am ashamed to own that scarcely a branch of science can be fully investigated in America for want of books, especially original works’” (p. 45).  Clearly, Adams said, “The labor of the hand had precedence over that of the mind throughout the United States” (p. 90).  And this was even more pronounced in the South than the North, where a Southerner, Thomas Jefferson, declaimed:  “’Those who labor in the earth are the chosen people of God if ever he had a chosen people, whose breasts he has made his peculiar deposit for substantial and genuine virtue’” (p. 102).  Jefferson himself, of course, was hardly a yeoman farmer!  Adams notes that he found his “true delight” in the “intellectual life of science and art.  To read, write, speculate in new lines of thought, to keep abreast of the intellect of Europe, and to feed upon Homer and Horace, were pleasures more to his mind than any to be found in a public assembly” (p. 99).

And it was this man who was inaugurated as the third President of the United States in 1801.  He was the first President inaugurated at Washington, and the ceremony was notably simple.  “In Jefferson’s eyes a revolution had taken place as vast as that of 1776” (p. 130), and he determined to make visible his democratic convictions.  In his First Inaugural Address, Jefferson reaffirmed his deepest philosophical principles and called for “’a wise and frugal government, which shall restrain men from injuring one another, which shall leave them otherwise free to regulate their own pursuits of industry and improvement, and shall not take from the mouth of labor the bread it has earned.  This is the sum of good government, and this is necessary to close the circle of our felicities’” (p. 137).  He disliked national banks, national debts, standing armies, and centralized powers.  “Peace is our passion!” he said in an 1803 letter (p. 285), and he wanted to keep America removed from European conflicts.  He also hoped for national unity, famously declaring:  “’We are all Republicans, we are all Federalists’” (p. 136).

Jefferson’s words and ideals were quickly tested by the realities of his presidential position.  Governing effectively was quite different from writing elegantly.  Jefferson himself was temperamentally incapable of authoritative leadership, for as Alexander Hamilton noted, “’a true estimate of Mr. Jefferson’s character warrants the expectation of a temporizing rather than a violent system’” (p. 189).  He was, Henry Adams concluded, “sensitive, affectionate, and, in his own eyes, heroic.  He yearned for love and praise as no other great American ever did” (p. 220).  Still more:  “Jefferson had the faculty, peculiar to certain temperaments, of seeing what he wished to see, and of believing what he willed to believe” (p. 641).  He was, in short, ill-suited for executive effectiveness, something that was evident earlier when he served as Governor of Virginia during the War for Independence.  Jefferson quickly found himself at odds with a strong-willed Federalist, John Marshall, and the Supreme Court!  He faced not only Federalist opposition but dissensions within his own Republican ranks in Congress.  His every appointment and decision seemed to evoke criticism and calumny from various quarters.

Added to this were foreign entanglements!  Napoleon Bonaparte was restructuring Europe, and Adams goes into great detail documenting developments in that process, for it inevitably affected America.  Muslim pirates in North Africa, off the coast of Tripoli, were terrorizing commercial ships, including those sailing under an American flag.  Spain still controlled vast regions of North America, and tensions developed wherever American frontiersmen rubbed up against Spanish authorities in the Old Southwest.

Of particular import was the opportunity given Jefferson to acquire the Louisiana Purchase.  When Napoleon persuaded Spain to cede to France this vast region, he apparently envisioned an extension of his empire.  Soon thereafter, however, the successful revolt of slaves in Haiti dissuaded him, and he impulsively offered to sell the territory to the United States.  To Adams, “The sale of Louisiana was the turning point in Napoleon’s career” (p. 328), for he thereby lost credibility in France.  But he inadvertently blessed the United States!  “The annexation of Louisiana was an event so portentous as to defy measurement; it gave a new face to politics, and ranked in historical importance next to the Declaration of Independence and the adoption of the Constitution” (pp. 334-335).

In purchasing Louisiana, however, and bringing it “into the Union without express authority from the States” (p. 363), Jefferson compromised his own convictions regarding a strict construction of the Constitution.  In Adams’ judgment, “the Louisiana treaty gave a fatal wound to ‘strict construction,’ and the Jeffersonian theories never again received general support” (p. 363).  Indeed:  “By an act of sovereignty as despotic as the corresponding acts of France and Spain, Jefferson and his party had annexed to the Union a foreign people and a vast territory, which profoundly altered the relations of the States and the character of their nationality” (p. 381).  Adams deals carefully with such developments as the Yazoo land claims and the impeachment trial of Justice Chase, but it was diplomatic developments in Europe that conclusively shaped Jefferson’s first term.  “’The United States,” said a French diplomat, writing to Talleyrand, “find themselves compromised and at odds with France, England, and Spain at the same time.  This state of things is in great part due to the indecision of the President, and to the policy which leads him to sacrifice everything for the sake of his popularity’” (p. 577).

These conflicts determined the course of Jefferson’s second administration.  Like many Presidents, he found his second term less than successful!  Spain’s presence in Florida created continual problems— particularly since frontiersmen pressured their politicians to acquire it.  His former Vice President, Aaron Burr, engaged in a conspiracy designed to empower him in the Old Southwest.  Napoleon’s Berlin Decree—and Britain’s Orders in Council—embroiled him in Europe’s conflicts and precipitated his commitment to an 1807 embargo on foreign commerce!  The embargo, yet another violation of Jefferson’s states-rights philosophy, proved economically disastrous for New England especially and led to serious talk of secession in that region.  Though the embargo was repealed before Jefferson left the presidency, John Randolph declared that “’never has there been any Administration which went out of office and left the nation in a state so deplorable and calamitous’” (p. 1239).  The man who longed above all to be popular “left office as strongly and almost as generally disliked as the least popular President who preceded or followed him” (p. 1239).

Inaugurated in 1809, the fourth President of the United States was another Virginian, James Madison, who shared with his predecessor a fine philosophical mind and administrative incompetence.  Even intellectually, Adams says, Madison showed little power as President.  Indeed, he declared:  “If Madison’s fame as a statesman rested on what he wrote as President, he would be thought not only among the weakest of Executives, but also among the dullest of men” (p. 125).  Napoleon’s wars, including the struggle with England for supremacy of the seas, inexorably involved the United States in European affairs, culminating in the War of 1812, promoted primarily by “war-hawks” in the West who envisioned the annexation of Canada to the new nation.  As John Randolph lamented, following the 1811 debates in Congress, “’we have heard but one word,—like the whippoorwill, but one monotonous tone,—Canada, Canada, Canada!’” (p. 395).

Adams carefully recounts (in 800 pp.) the military and diplomatic operations in this war with England, demonstrating the woeful ineptitude of most American endeavors, especially the failed invasions of Canada—“disasters” that could have ended the nation’s brief life.  Congress often interfered with, and refused to appropriate funds for, the war Madison supervised.  There were a few isolated naval victories on the Great Lakes and Atlantic, and profit-hungry privateers enjoyed sporadic successes.  Some Indian tribes (especially the Creeks, defeated by Andrew Jackson at Horseshoe Bend in 1814) were crushed.  But on the whole the War of 1812 was a marked military failure.  Ironically, Andrew Jackson’s great 1815 victory in New Orleans (costing the British 2,236 casualties, compared with 71 Americans lost) was won after the Peace of Ghent had ended the war.

What was lost in war was regained by the peace treaty, however, because “the treaty became simply a cessation of hostilities, leaving every claim on either side open for future settlement” (p. 1218).  Both sides compromised, and the Americans may have seemed to be “the chief losers; but they gained their greatest triumph in referring all their disputes to be settled by time, the final negotiator, whose decision they could safely trust” (p. 1219).  Subsequently, Madison’s final two years in office proved far more balmy than Jefferson’s.  The economy boomed as commerce revived.  Cotton quickly proved to be “king” in the South, and tobacco and rice continued to be good cash crops.  The Federalists, strong enough to stage a secessionist Hartford Convention during the war, simply disappeared as rivals to the Republicans by 1816, though in fact Federalist principles had become large parts of the Republican system.

Culturally, the new nation vibrated with new ideas.  “Religious interest and even excitement were seen almost everywhere, both in the older and in the newer parts of the country; and every such movement offered some means of studying or illustrating the development of national character” (p. 1301).  Unitarianism replaced Calvinism wherever Harvard University shaped New England’s intelligentsia.  Eminent preachers, such as William Ellery Channing, considered “dogma” rather irrelevant to the religion of “love” and “righteousness” they defined as Christianity.  “No more was heard of the Westminster doctrine that man had lost all ability of will to any spiritual good accompanying salvation, but was dead in sin” (p. 1306).  A “deified humanity” replaced the orthodox Trinity as the focus of man’s worship.  In the West, however, a decidedly different version of Christianity flourished as the camp meeting movement, the beginnings of America’s “Second Great Awakening,” launched a spiritual revolution that would effectively christianize the nation by mid-century.

Tested and tried during the 16 years of the administrations of Jefferson and Madison, the United States had assumed a more definite character and established a deeper unity and strength, sufficient to sustain the nation in the unfolding 19th century.  To understand this, Henry Adams’ History proves invaluable.

# # #

173 The Souls of Our Young

In Soul Searching: The Religious and Spiritual Lives of American Teenagers (New York: Oxford University Press, c. 2005), Christian Smith and Melinda Lundquist Denton provide an amply documented and academically persuasive portrait of America’s youth. Smith is a professor of sociology at the University of North Carolina and the principal investigator of the National Study of Youth and Religion—a well funded, methodologically clear endeavor that relies upon both extensive surveys and personal interviews. Denton is the study’s project manager. “To our knowledge,” they say, “this project has been the largest, most comprehensive and detailed study of American teenage religion and spirituality conducted to date” (p. 7).

America’s teenagers are remarkably religious; 40 percent attend “religious services once a week or more, and 19 percent report attending one to three times per month” (p. 37). Only 18 percent have no religious involvement. Amazingly enough, “teens as a group profess to want to attend religious services not less, but actually more than they currently do” (p. 38). They praise their congregations as “warm and welcoming” (p. 61) and find adults therein reliable and trustworthy. Their parents, more than anyone else, influence them, and they reveal little hostility toward them. Such youngsters have little interest in fringe or “alternative” religions and seem to be quite conventional in almost every way. “The vast majority of U.S. teenagers identify themselves as Christians” and “regularly practice religious faith” (p. 68). The mantra of avant garde folks like Michael Lerner—”spiritual but not religious”—hardly registers with typical teenagers.

One interviewee, incidentally, was attending a Nazarene church and spoke highly of it. He liked Wednesday and Sunday night services, the youth group and Sunday school. What he found attractive in the church was this: “It’s good people, you know. And not only that, I also actually learn,” something important to him because he wanted to know how to “be a God-fearing person and go to heaven or whatever, you know?” (p. 100).

The more devout among them are thereby advantaged in “a host of ways,” making a positive difference in: “risk behaviors, quality of family and adult relationships, moral reasoning and behavior, community participation, media consumption, sexual activity, and emotional well-being” (p. 219). Whether one considers drugs and alcohol or school attendance or getting along with parents, the religious teenagers do much better. They watch less TV, fewer R rated movies, less pornography, and play fewer video games. In some categories—such as pornographic movies, where the “devoted” teens watched 0.5 a year while the “disengaged” saw 2.5—the statistics reveal dramatic differences. “Nearly all Devoted teens believe in waiting for marriage to have sex, compared to less than one-quarter of the Disengaged who believe the same” (p. 223). Devoted teens are far happier than the Disengaged and feel more closely connected with others. They craft positive plans for the future and seriously ponder “the meaning of life” (p. 226). The statistical tables delineating these differences, found on pp. 220-227, are most impressive in demonstrating the authors’ conviction that religion helps teens.

The positive news regarding the role of religion in teenagers’ lives must be balanced, however, by information regarding its doctrinally deficient nature. Our youngsters have little knowledge of any content to the Christian faith! They take a thoroughly individualistic approach to questions regarding God, man, and salvation—though they are generally quite inarticulate when asked to explain much of anything about their views. Indeed, the authors conclude: “In our in-depth interviews with U.S. teenagers, we also found the vast majority of them to be incredibly inarticulate about their faith, their religious beliefs and practices, and its meaning or place in their lives” (p. 131).

Their religion is best defined as “Moralistic Therapeutic Deism.” They believe in a rather distant (unless needed to solve one’s problems) God, who “wants people to be good, nice, and fair to each other, as taught in the Bible and by most world religions” (p. 162). “The central goal of life is to be happy and to feel good about oneself,” and “Good people go to heaven when they die” (p. 163). They believe God “designed the universe and establishes moral law and order. But this God is not Trinitarian, he did not speak through the Torah or the prophets of Israel, was never resurrected from the dead, and does not fill and transform people through his Spirit. This God is not demanding. He actually can’t be, because his job is to solve our problems and make people feel good. In short, God is something like a combination Divine Butler and Cosmic Therapist: he is always on call, takes care of any problems that arise, professionally helps his people feel better about themselves, and does not become too personally involved in the process” (p. 165).

Today’s teenagers also entertain a view of human nature quite at odds with the Christian tradition. Teens “tend to assume an instrumental view of religion. Most instinctively suppose that religion exists to help individuals be and do what they want, and not as an external tradition or authority or divinity that makes compelling claims and demands on their lives, especially to change or grow in ways that they may not immediately want to” (p. 148). While they freely acknowledge their sins, they apparently feel no condemnation as sinners! They share the broader culture’s presumption that we are autonomous individuals, free to shape our future in accord with our own desires. Religion is viewed as an enjoyable activity, but it ought not particularly influence one’s decisions. Autonomous individuals can hardly judge the behavior of others, and today’s teens are radically non-judgmental. “The typical bywords, rather, are ‘Who am I to judge?’ ‘If that’s what they choose, whatever,’ ‘Each person decides for himself,’ and ‘If it works for them, fine’” (p. 144).

“What we heard from most teens,” Smith and Denton say, “is essentially that religion makes them feel good, that it helps them make good choices, that it helps resolve problems and troubles, that it serves their felt needs. What we hardly ever heard from teens was that religion is about significantly transforming people into, not what they feel like being, but what they are supposed to be, what God, or their ethical tradition wants them to be” (pp. 148-149). The youngsters interviewed rarely expressed interest in a religion that “summons people to embrace an obedience to truth regardless of the personal consequences or rewards. Hardly any teens spoke directly about more difficult religious subjects like repentance, love of neighbor, social justice, unmerited grace, self-discipline, humility, the costs of discipleship, dying to self, the sovereignty of God, personal holiness, the struggles of sanctification” (p. 149), or any of the classical themes of Christian discipleship.

For those of us working with young people, this book is both encouraging and chastening. Kids are hungry for God and the churches are bringing them into religious fellowships. Unfortunately, they learn little about the great doctrines of the Church and rarely are challenged to live out the sterner stuff of the scriptures.

* * * * * * * * * * * * * * *

Barbara Curtis, in Dirty Dancing at the Prom: And Other Challenges Christian Teens Face (Kansas City: Beacon Hill Press of Kansas City, c. 2005), provides a deeply personal insight into the lives of today’s adolescents. Prodded by one of her son’s remarks regarding the school prom—where “freak dancing” rather resembled sexual foreplay—she launched an investigation, primarily through interviews, into teen culture, hoping to help parents struggling with the issues she faces. What she found is (to her) alarming. Neither today’s dances, nor today’s teenagers, are quite the same as they were 40 years ago. Indeed, perhaps “it’s time proms carried warning labels” (p. 8). And not only proms but many aspects of teen culture merit them as well!

Curtis has twelve children (three of them, Down Syndrome children, adopted) and became a Christian only after she was well into the parenting process. In fact, her oldest daughter went to her high school prom and spent the night with her boyfriend. Having almost no religious roots, living in northern California, they took a laissez-faire approach to most everything, lacking any “moral compass to guide us, just following the crowd” (p. 10). She and her first husband were “hippies” who named their first two daughters Samantha Sunshine and Jasmine Moonbeam! Her second husband, a “spiritual seeker,” was similarly rooted in the ’60s ethos. “Drugs, promiscuity, and radical politics” were part of the air they breathed in Marin County!

They became Christians, however, as a result of attending a conference where they were presented with Campus Crusade for Christ’s “Four Spiritual Laws.” Everything changed! They suddenly saw the world differently, bathed in the Light of Christ. “Though Tripp [her husband] and I had known about Jesus, we had thought of Him simply as a great spiritual teacher. . . . This was the first time we had heard the truth about who He was. We did receive Jesus, then and there, on March 21, 1987. Tears were streaming down our faces, and we knew something profound had happened” (p. 108). And they wanted to rear their children differently. So, after home-schooling some of their children in California, they moved to Virginia, hoping to find a more solid, family-friendly society. But teen culture respects no state boundaries, and she found herself facing the great challenge of helping her kids deal with its harmful currents.

In the process she discovered the importance of seven items that constitute the chapters of this book: 1) Being Grounded in God’s Love: Self-Esteem; 2) Setting Limits: Self-Assurance; 3) Avoiding Temptation: Self-Control; 4) Developing Compassion: Self-Sacrifice; 5) Standing Up for What’s Right: Self-Respect; 6) Making the Most of Mistakes: Self-Help; 7) Living with Integrity: Self-Satisfaction.

Curtis discovered, first, how important it is to anchor teens in the reality of God’s love. Kids’ self-esteem is so frequently savaged by their peers in the desensitized atmosphere of the schools that they need to know they are precious in God’s sight. Those who grow up in homes where they know that both God and their parents love them are far more likely to be self-confident and resolute in resisting temptation. “Self-esteem is tied to knowing God’s love for us,” Curtis says (p. 21). Loving children requires parents to stick together. “So perhaps the most loving thing parents can do for their children is to honor their own wedding vows—for better, for worse, for richer, for poorer, in sickness, and in health, until death” (p. 25). Statistical studies demonstrate the significant suffering kids endure when their parents divorce. Curtis herself grew up “fatherless” and feels “the hole in the souls of fatherless girls” (p. 25). Girls also need godly dads who protect them! “There’s a part of every woman that still longs to be Daddy’s little girl, to feel completely safe and protected” (p. 26).

Protecting kids means setting limits. Curtis confesses she “was once a permissive parent. Having grown up with no spiritual foundation or moral guidelines myself, I didn’t have anything really to pass on. And since my background wasn’t undergirded with love, I had no understanding of what parental love looked like” (p. 31). She had no rules for bedtimes or much of anything else. She thought loving meant letting others do whatever they felt like. Then her oldest daughter, as a high school junior, began coming home at two in the morning. Mom awakened to the fact that youngsters lack wisdom and need guidance—and even clear rules. She also discovered that “kids don’t just need limits—they secretly want them” (p. 32). Love issues reasonable rules. Youngsters will always test them, but parents must uphold them for the good of their kids. This means that a mom or dad can’t be a child’s “best friend”— something 43 percent of the nation’s parents aspire to! Best friend parents, of course, never make rules or require homework or do anything to displease their “friend.” Truth to tell, however, kids both need and want parents! As one of the girls Curtis interviewed said: “‘I want my mom to be my mom'” (p. 46).

Many of the rules, in our world, necessarily focus on protecting kids from illicit sexual activities—evident in the fact “that more than one third of babies born in the United States were born to unwed mothers” (p. 48). Youngsters obviously need to develop the invaluable trait of self-control, though they find little encouragement to do so in the movies, songs, and TV programs that powerfully shape them. “The switch from romance to eroticism in entertainment has put enormous pressure on today’s teens” (p. 50). Thus parents have a great task: to both require obedience and encourage self-discipline. Curtis lists helpful ways to do so: encourage group dates; open your home to your kids’ friends; give them cell phones and keep them accountable; “eliminate latchkey hours;” and supervise entertainment.

Kids also need to learn compassion. By nature, they’re not necessarily compassionate! They learn to recognize, as Rick Warren says, “It’s not about you.” Others matter. And they should matter to teens. Being part of a big family certainly helps cultivate this, as Curtis makes clear. But kids still need to be taught to care for others—often by serving siblings at home. They need to know the difference between loving sinners and hating sins. They need to become aware of a world full of needs and hurts—something easily acquired through an acquaintance with world missions. Parents praying for missionaries and supporting World Vision or Compassion International clearly teach children elementary compassion lessons.

Standing up for what’s right, even when it’s unpopular, elicits a profound sense of self-respect. So parents need to both illustrate and encourage it, because our kids are on the “frontlines” of the culture war. Persecution—albeit often subtle—is a fact of life for Christians in the public schools. The kids she interviewed all testified to the challenges they face at school. Getting involved in athletics or drama frequently forces a teen to make choices regarding his values and convictions. In the Curtis family, the author’s husband has consistently insisted: “It’s not who’s right but what’s right.” Films, such as High Noon, To Kill a Mockingbird, and Bonhoeffer, afford opportunities to emphasize the need for courage in living righteously. Kids thus nurtured generally find the courage to stand up for what’s right and discover, in the process, a great sense of personal dignity.

Growing up is marked by successes and failures. Learning from one’s mistakes, growing through disappointments, prepares one for adulthood. Curtis, of all people, knows the truth that “All have sinned and fall short of the glory of God.” The doctrine of original sin was validated both by her own transgressions and by every baby she reared! Confessing her own failures to her kids, as well as to God, showed them the value of openness and honesty. Failures aren’t fatal. With God’s help, the slips and sins of youth can be both confessed and transformed into wisdom and strength. And that’s what’s needed for the integrity that makes one satisfied with life.

For parents seeking to understand and rightly rear their teenagers, Dirty Dancing at the Prom provides welcome assistance.

* * * * * * * * * * * * * * *

Two Canadian philosophers, Joseph Heath and Andrew Potter, give us an analysis of the impact of an earlier generation’s youth culture in Nation of Rebels: Why Counterculture Became Consumer Culture (New York: HarperBusiness, c. 2004). The rebels of the ’60s, the baby boomers, talked much about changing the world and making it a non-materialistic utopia of peace and beauty, but as adults they have tacitly repudiated their early idealism. The authors lament this loss, rather like socialists forever insisting on the purity of a system that never quite works as it should, but they insist we understand what happened through an analysis of the false ideas that have flourished since the ’60s.

Failing to think deeply enough and implement their convictions, counter-cultural radicals simply celebrated the wrong things—hippie attire, mindless music (today’s rap merely the latest manifestation), mind-altering drugs. They generally imagined that reality could be shaped in accord with one’s nostalgia or hopes for anarchical utopias. Radicals imagined they would save the world by “subverting” the dominant culture through “alternative” art, clothing, “appropriate technology,” organic food, “free range chicken,” fair trade coffee, voluntary simplicity, and protest songs. In fact, as the baby boomers moved into positions of power in various institutions, they brought “their hippie value system with them” (p. 197).

“When the Beatles sang ‘All you need is love,’ many people took it quite literally” (p. 71). Rather than deal with the nitty-gritty problems of poverty and illiteracy and injustice, rather than understand the importance of productivity and personal discipline, counter-cultural rebels followed the lead of folks like Theodore Roszak and fixated on what he called “the psychic liberation of the oppressed.” They swallowed aphorisms coined by the likes of Herbert Marcuse, with his curious admixture of Marx and Freud, who lamented “repressive tolerance,” a phrase that, Heath and Potter say, “makes about as much sense now as it did then” (p. 35). Which is to say it’s nonsense.

In short: critiquing mass society has failed to change it. The counterculture has majored in critiques for 40 years, but little resulted from their efforts. Sanctimoniously denouncing various kinds of “commodification,” radicals have settled into comfortable echelons of privilege (working at “cool jobs,” especially in universities, in “cool cities” such as Seattle and San Francisco) appropriate for themselves as the new “creative” class, earning twice as much as the working class. Indeed, “Cool is one of the major factors driving the modern economy. Cool has become the central ideology of consumer capitalism” (p. 188). Consequently, “the modern no-collar workplace, with its casual dress codes and flexible work hours” looks for all the world “like a hippie commune under professional management” (p. 202).

Nation of Rebels takes seriously the counter-culture of the ’60s, and it merits thoughtful reading. There seems to be much truth in the book’s thesis that the impact of the boomers was secondary rather than primary, and the changes they wrought were harmful rather than helpful.

172 American Autobiographies: Buckley; Hillerman; Medved

In 1951, when William F. Buckley published God and Man at Yale, there were millions of ordinary “conservatives” who lacked an intellectually vigorous forum for their ideas. When Buckley soon thereafter launched The National Review, they found both a forum and a spokesman who greatly shaped what is now arguably the dominant political position in the country. To understand the man who, for 50 years, has written and inspired an amazing array of writers and politicians, young and old, Buckley’s Miles Gone By: A Literary Autobiography (Washington, DC: Regnery Publishing, Inc., c. 2004) proves indispensable. Like the man himself, the autobiography is a bit unconventional, for Buckley simply strings together various previously written items to tell his story. “The design of this book,” he says, “is to bring together material I have written over fifty years, with an autobiography in mind” (p. xiii). Thus it is episodic rather than linear, refulgent with remembrances rather than chronological specifics. But the book is strangely effective, for one sees, through the passages presented, the world as Buckley saw it at very specific times. And one learns, while reading, who he is and how his ideas have shaped his life.

Buckley was blessed with virtuous parents. His father, he says, “was the most admirable man I ever knew” (p. 12). He prospered greatly, sired a large family, and presided over both business and family affairs with dignity and discernment. Importantly, his son remembers, he “was wonderful with children (up until they were adolescents; at which point . . . he took to addressing us primarily by mail, until we were safe again at eighteen)” (p. 35). He demonstrated “a constant, inexplicit tenderness to his wife and children, of which the many who witnessed it have not, they say, often seen the like” (p. 49). High praise for an archetypical “patriarchal” father! His mother, a vivacious and attentive woman, “never lost a southern innocence” (p. 51) and was ever determined to do “the will of God” (p. 52). “There were rules she lived by, chief among them those she understood God to have specified. And although Father was the unchallenged source of authority at home, Mother was unchallengeably in charge of arrangements in a house crowded with ten children and as many tutors, servants, and assistants” (p. 52). Amidst all the stresses and strains of caring for such a brood, she remained resolutely cheerful. Indeed, she refused to ever “complain; because, she explained, she could never repay God the favors He had done her, no matter what tribulations she might be made to suffer” (p. 54). His remarkable parents revered education, culture, and the Catholic faith, and they effectively reared their children accordingly.

Buckley’s collegiate education took place at Yale University, an experience recorded in God and Man at Yale, the book that brought him national attention (which I reviewed in issue #158 of my “Reedings.”) Twenty-five years later he was asked to write an introduction for an “anniversary edition” of the book, and now (looking back 50 years) “To young inquisitive friends, I say: Don’t bother to read the book, but do read the introduction” (p. 58), which is reprinted here. Trends evident at Yale, shortly after WWII, soon swept the country. Caving in to the fashionable notion that “all sides” of every issue deserve a hearing, insisting that “tolerance” and “diversity” are crucial components for academic respectability, most universities had lost their “mission.” A commitment to “academic freedom” had replaced their original raison d’etre. But to Buckley, only a focused “mission” justifies the existence of any university!

A surprising amount of Miles Gone By is devoted to sailing and skiing. While I cannot share Buckley’s fascination with the former, I fully identify with the latter! The first day he skied (aged 29) he “thought seriously of abandoning journalism, my vendetta with the Soviet Union, my music, and my sailing, and settling down in Vermont, working five years to qualify as a ski instructor, and spending the balance of my life on the slopes” (p. 192). Fortunately he thought better of the idea. But thereafter he routinely took vacations in Switzerland and Utah, finding delight in both the beauty of the scenery and the challenge of the sport, skiing into his eighth decade. “I know of no sport, no hobby, no avocation, as indulgent as skiing in giving you exactly the combination you wish of challenge, relaxation, thrill, exhilaration” (p. 195). Amen!

In a fascinating section, entitled “People,” Buckley celebrates “Ten Friends”–David Niven, the superb actor; Ronald Reagan, the president; Henry Kissinger, the diplomat; Clare Boothe Luce, the congresswoman; Tom Wolfe, the novelist; Vladimir Horowitz, the pianist; Roger Moore, the actor; Alistair Cooke, the historian; Princess Grace, the movie star turned Princess of Monaco; and John Kenneth Galbraith, the liberal Harvard economist. What’s amazing about this list is the prominence and diversity of its members. Like Will Rogers, Buckley seems to genuinely “like” people and successfully established lasting friendships with various sorts of them.

Ever readable, ever enlightening, this “literary autobiography” is a fitting testament to its author.


In Nearer, My God: An Autobiography of Faith (New York: Harcourt Brace & Company, c. 1997), William F. Buckley gives readers insight into his soul. Almost blissfully, he reports: “I was baptized a Catholic and reared as one by devoted parents whose emotional and intellectual energies never cloyed” (p. xx). His “mother was a daily communicant. Father’s faith was not extrovert, but if you happened on him just before he left his bedroom in the morning, you would find him on his knees, praying” (p. 4). Consequently, he declares: “My faith has not wavered, though I permit myself to wonder whether, if it had, I’d advertise it . . . . I wish I could here give to my readers a sense of my own personal struggle, but there is no sufficient story there to tell” (p. xx). Righteous examples, particularly parental, surely matter–eternally!

As a Catholic attending Yale, he found little to trouble his faith but much to dissipate his hope for higher education! Colleges such as Yale had, before WWII, abandoned any commitment to Christian doctrine, assuming that a decent percentage of pious professors would maintain a suitable “religious atmosphere” of some nebulous sort! “When I left Yale in 1950,” says Buckley, “I had become convinced that it, and presumably, other colleges like it were engaged in discouraging intellectual and spiritual ties to Christianity” (p. 36). Half a century later, this trend is distressingly evident even in the formerly Christian prep schools of New England, where “there is today another God, and it is multiculturalism” (p. 37). More broadly, and ominously, he thinks: “What has happened, in two generations, is the substantial alienation of the secular culture from the biblical culture” (p. 233). That process now gains speed and threatens the very foundations of our society.

Buckley’s own theological convictions are rooted in thinkers such as John Henry Newman and were invigorated by challenging, far-ranging conversations with the likes of Sir Arnold Lunn (a skiing companion), Whittaker Chambers, Russell Kirk, Richard John Neuhaus, Jeffrey Hart, Malcolm Muggeridge, Chuck Colson, and Eugene Genovese. Ever eclectic in his friendships, he seems able to draw and distill insights from some of the world’s finest thinkers. And it’s clear that “faith” to Buckley is primarily an intellectual conviction regarding the truth of Christian doctrine. Nearer, My God contains none of the “personal experiences” so central to evangelical memoirs, little of the “strangely warmed” heart moments pietists prize. But it does make clear the author’s conviction that “anyone who is looking for God, Pascal said, will find him” (p. 85). That Buckley has found God is most evident in this treatise.


A writer of a different sort, Tony Hillerman, tells his life story in Seldom Disappointed: A Memoir (New York: HarperCollins Publishers, c. 2001). Hillerman’s mysteries–The Blessing Way; Listening Woman; Skinwalkers; Coyote Waits, to name a few–are set in Navajo country and provide an effortless way to understand something of Navajo culture.

Born to an impoverished farm family in Oklahoma, Hillerman profited from the example of hard-working, devout parents. His father, he believes, literally worked himself to death and died young. His mother gave him an enduring example of courage and resolve. In the midst of depression and poverty, she refused to be daunted. To her, children “had nothing to worry about except maintaining our purity, being kind to others, saving our souls, and making good grades. With Papa’s help, she persuaded us that we were something special. We weren’t just white trash. Great things awaited us. Much was expected of us. . . . whining and self-pity were not allowed” (p. 46). Whatever happened, Mama would say: “Offer it up.” Give it to God and keep on keeping on! “We were born, we’d live a little while, and we’d die. Then would come joy, the great reward, the Great Adventure, eternal life” (p. 46).

Hillerman managed to graduate from high school and gain entrance to Oklahoma A&M, just as WWII was erupting. He soon joined the Army, went to Europe, and fought with his buddies through France and into Germany. Seriously wounded, losing an eye and walking with a limp thereafter, he received multiple decorations. All of this he describes with a wry, self-deprecating sense of humor, making light of his “heroism” and military life in general. He tested and confirmed the fact that there’s much “truth behind the axiom: ‘There are two ways of doing things. The right way and the Army way'” (p. 151). After months in various hospitals, he finally returned to Oklahoma, entered the University of Oklahoma, and studied journalism. Content to maintain a “Gentlemanly C” grade point average, he had a notably undistinguished academic career, though he profited from at least one journalism professor’s instruction regarding “tight” writing. Use the right words! Eliminate adverbs and adjectives!

More important than professors, however, was a woman he met at OU in his senior year! Marie Unzner instantly enchanted him, and he persuaded her to become his wife. She proved a great blessing, for she “had more confidence in my writing than I did” (p. 260). Ever cheerful and optimistic, she continually encouraged him to pursue his dreams. Whereas his parents had nurtured him early in life, setting him on the right track, heading toward “that Last Great Adventure, and understanding that the Gospels Jesus used to teach us were the road map to make getting there a happy trip,” the final half-century of his life was “filled with love, joy, and laughter by a wonderful wife, partner, and helpmate named Marie” (p. 320).

Degree in hand and a wife to care for, Hillerman sought work. The position he found was in Borger, Texas, located “sixty miles north of Amarillo on rolling, almost treeless tundra of the high end of the Texas Panhandle” (p. 179). A more inauspicious beginning for a fledgling writer could not be imagined. But he started working, covering local stories and (importantly) observing people in all sorts of situations. Decades later some of the characters in his novels are based upon some remarkably admirable people he knew in Borger. Soon he found a better job in Lawton, Oklahoma, then moved to Oklahoma City to work for the United Press. That led, in 1952, to an assignment as UP Bureau Manager in Santa Fe, New Mexico, where he would work for more than a decade.

While recording the news, Hillerman sensed a deeper longing to write more creatively, to be a novelist. Despite a growing family of six children (all but one adopted), with his wife’s encouragement he decided to change careers and moved to Albuquerque to pursue a degree in English at the University of New Mexico. Once there, the opportunity to teach in the journalism department opened up, and he settled into the academic life for 15 years. Ever discerning, he discovered that the faculty was divided into two groups. Pragmatic “Organization” folks, with whom Hillerman sided, taught hard sciences and history; they mainly wanted to help the university survive and secure their salaries. Their antagonists, the “Crazy Bus” crowd–mainly representing such departments as Education, Sociology, and Anthropology–”was a mix of 1930 Marxism, Nihilism, Hedonism, and disgruntlement” who greatly troubled the state’s taxpayers (p. 243). Infused with the vapors of the ’60s, they were out to change the world.

Hillerman found satisfaction teaching in the ’70s. “Students were interested, grade mania and the resulting grade inflation had barely emerged, the curse of political correctness had not yet paralyzed deans and department chairmen and corrupted the faculty” (pp. 262-263). He actually had “fun.” But the ’80s changed things. “The numbing dogma of PC hung over the campus, tolerating no opinions except the anointed ones. With free speech and free thought ruled out by inquisitors running Women’s Studies and the various minorities studies, the joy of learning had seeped out of students. With it went the joy of teaching. Time to quit” (p. 263). So he did! “One day after delivering a lecture so bad even I knew it was boring, I decided to quit academia and return to the real world” (p. 250). That meant writing and publishing novels!

Fortuitously, he found his métier–the mystery novel set in Navajo country. He also found agents and editors who enabled him to sell books. In time he flourished as his fans spread the news and his peers honored his craftsmanship with awards.


Michael Medved, known to many through his popular talk show, looks back on his life as a series of Right Turns: Unconventional Lessons from a Controversial Life (New York: Crown Forum, c. 2004). He structures the book with a series of 35 “lessons,” generally chronological but essentially thematic, to show how he has developed as a pundit, a very public intellectual, moving from a thoroughly radical leftist–opposing the Vietnam War and working for the ’72 McGovern campaign at its “Jewish desk”–to a deeply conservative Orthodox Jew, father, and media figure. Importantly, he says: “This book isn’t about ‘my truth’; it’s about The Truth, to the extent I can apprehend and explain it” (p. 5). This puts him “counter to all trendy notions of moral relativism, which suggest that someone with different life experiences will inevitably reach different conclusions, and that these conclusions deserve no less respect than mine” (p. 5).

Medved was born in Philadelphia, the grandson of industrious Jewish immigrants. One of his grandmothers “grasped, and passed along, one of the greatest truths of life: it doesn’t matter how much you earn, so long as you spend less than you bring in” (p. 35). His parents soon moved to the Point Loma area of San Diego, where he went through the city’s public schools and imbibed liberal Democratic values from his parents. Such values were, however, early challenged by one of his uncles, Moish, an unusually erudite, self-educated and successful electrician, who was born in Ukraine in 1905. Taking him aside for a “man-to-man” talk, Uncle Moish warned young Michael against Communism, the “Scarlet Plague” that was ruining millions of people around the world, and “‘the people who are most likely to get sick, and who are going to suffer the most, are the brightest minds, the biggest idealists, the natural leaders of this world. They are people just like you'” (p. 47).

But young Michael hardly heeded (though he remembered) his uncle’s admonition for many years. While attending Yale, awash with radical students in the ’60s, he observed youngsters in the Students for a Democratic Society who vividly illustrated the “Scarlet Plague.” He also witnessed, as a sophomore, the impact of the drug-addled counterculture that swept through the university in 1966. Medved seemed temperamentally hostile to the “dopester dementia” and listened to a different melody, finding a healthy alternative by hitchhiking, almost every weekend, through sections of the “flyover world” disdained by the academic elitists. He also began, at Yale, a slow return to the faith of his fathers, Judaism, discovering, as he titles one chapter, “You Can Go Home Again.”

Following his graduation from Yale, he entered Yale Law School (getting acquainted with Hillary Clinton) but quickly decided he was not really interested in being a lawyer. So he returned to California, married, and enrolled in a writing program at the University of California, Berkeley, in 1972. Here he confirmed the truth that “Liberal Heroes Aren’t All Heroes” in the person of Ron Dellums, a “Castroite” congressman representing Berkeley and Oakland. Medved accepted a staff position in the Dellums campaign and grew quickly disillusioned with his candidate, who “reminded me of another tall, lanky, hugely ambitious, humorless pol I had known (and disliked) years before: John Kerry” (p. 170).

Barely arrived in Berkeley, Medved experienced another wake-up call–his home was burgled. The police caught the thief, who was a career criminal routinely released to practice his craft at public expense. The typical Berkeley intellectual’s sympathy for criminals, evaluated from the perspective of a victim, was simply “mad.” The cops, Medved decided, not the UC professors, see things as they really are. Consequently, Medved abandoned, in the ’70s, the vacuous ideologies and “utopian promises of the youth counterculture, while embracing traditional Judaism, entrepreneurial adventure, cops, and even Christians” (p. 204). He also read “Alexander Solzhenitsyn’s harrowing masterpiece, The Gulag Archipelago” (p. 209), a timely gift from his uncle Moish. Realizing that the “Scarlet Plague” explained both the USSR’s gulag and the counterculture’s fanaticism, he felt “guilty and heartsick for my country and for the so-called peace movement in which I had played such an active part” (p. 213).

Relocating to Los Angeles, where his folks now lived, he wrote, with a friend, a successful book, What Really Happened to the Class of ’65? and gained entrée to the media world. He wrote more books and became a noted film critic, interacting on a regular basis with the Hollywood elite. He also moved steadily toward Orthodox Judaism, getting involved in synagogue activities and taking seriously the precepts of his faith. His childless first marriage had collapsed, and he now shared, with his second wife, Diane, the conviction that “children represented an explicit focus of our relationship, giving us a sense of purpose, of destination” (p. 297). They came to strongly oppose divorce and abortion, enlisting as partisans in the “culture war” that divides America.

Addressing this “war,” he said, in an off-the-cuff 1990 speech: “This is the very nature of the cultural battle before us. It is, at its very core, a war against standards. It is a war against judgment. Its proponents insist that the worst insult you can offer someone today is to suggest that he or she is judgmental” (p. 344). This is dramatically evident in the realm of art, where “ugliness has been enshrined as a new standard,” where “the ability to shock” is as admired as “the old ability to inspire” (p. 345).

Given the opportunity to do talk radio, Medved moved to Seattle determined to “inspire” listeners to embrace the “right” way he has found. This book certainly clarifies the message he wants to impart and enables one to understand the messenger.

171 Capitalism & Christians

Returning recently from a conference featuring some influential contemporary thinkers, I noticed a book in my library with essays by a number of them–The Capitalist Spirit: Toward a Religious Ethic of Wealth Creation (San Francisco: Institute for Contemporary Studies, c. 1990), edited by Peter Berger. One of the 20th century’s most influential sociologists, Berger was driven by the data to shift from an anti-capitalist to a pro-capitalist stance, and in the book’s foreword he explains that the most important thing about wealth is that it must be created. This has proven to be a difficult concept for many religious thinkers to either understand or embrace, for most pre-modern religious thinkers, living in relatively static societies, could only envision justice through distributing existing wealth. Like the 18th century mercantilists, today’s “zero sum” economists envision a world with finite resources that need to be properly shared.

We all know that many pre-modern 16th century thinkers like Martin Luther resisted any re-casting of Medieval cosmology. But in time most everyone recognized that Copernicus and Kepler had rightly read the skies and set forth an accurate account of the ways the world functions. Equally important economic insights came to light in the 18th century as the Industrial Revolution opened up new avenues for productivity and the creation of wealth. But many ethicists, rooted in a pre-modern philosophy, failed to craft their moral convictions to fit the new economy. “It is no wonder, then,” says Berger, “that so many religious thinkers have been anticapitalist and prosocialist in their instinctive inclinations” (p. 2). Having shared such inclinations for years, Berger sympathizes with them. But he finally realized how they misled him. They simply are not true.

Nor have they ever been! Nostalgic visions of the Early Church as a precursor of socialism–sharing “all things in common”–are unfounded. Socialistic assertions regarding the Early Church resemble Rousseau’s portrayal of the “noble savage” in North America. The distinguished ancient historian Robert M. Grant, in “Early Christianity and Capital,” concludes that, contrary to the assertions of Christian socialists: “The church both ancient and medieval respected private property. The 38th Article of Religion of the Church of England . . . simply follows the central tradition when it insists that ‘the Riches and Goods of Christians are not common, as touching the right, title and possession of the same; as certain Anabaptists do falsely boast.’ In an equally traditional manner, the article balanced this statement with the exhortation that ‘notwithstanding, every man ought, of such things as he possesseth, liberally to give alms to the poor, according to his ability'” (p. 28).

Contrary to those who winnow the Old Testament for their socialistic economics–often taking things such as the Jubilee Year out of context–David Novak (an eminent Jewish scholar) insists that “equality” in the Old Testament has meaning only “in the sphere of rectification, that is, the restoration of private property misappropriated in one way or another” (p. 32). Even “charity” was not emphasized, for it too often renders recipients passive and dependent. In fact, economic justice, “in accordance with the principles of the Covenant, is thus best accomplished by loans” (p. 38). And commercial loans, Jewish teachers decided, must be understood differently from agricultural loans. Thus loaning money for investment justified collecting interest, whereas agricultural loans (generally of a brief duration) did not. “In other words, the loan is not given because the borrower has nothing but the shirt on his back, so to speak. Rather, the loan is now more probably for the sake of investment, a risk taken by both lender and borrower in the hope that the future will yield a better income than the present. In this case, the need for the sabbatical year release from indebtedness, which in the agricultural context would make a loan into a charitable gift, would no longer be required” (p. 47).

Michael Novak’s essay, “Wealth and Virtue: The Development of Christian Economic Teaching,” shows how a select circle of 18th century Scottish “moralists” (David Hume and Adam Smith) understood the essence and importance of free enterprise capitalism, defending the proper pursuit of wealth as admirable and socially beneficial. They “sought to construct a new ethos for Western civilization and, indeed, the world” (p. 70). Both sought to alleviate the plight of the poor, and envisioned “the surge of spiritual independence and the extension of humane sympathies that would flow from the sway of a more free and beneficent regime” (p. 74). They particularly sought to replace the elitist, anti-capitalist position generally championed by intellectuals and artists with one favoring the bourgeoisie, which empowered ordinary people. Indeed, though often portrayed as advocating a ruthless “dog eat dog” economy, Adam “Smith’s discussion reminds one of Saint Thomas’s definition of love: to will the good of another” (p. 68).

George Weigel, renowned for his definitive biography of John Paul II, Witness to Hope, recounts the uneasy history of “American Catholicism and the Capitalist Ethic.” Whereas many Protestants have supported free enterprise capitalism, Catholics tended to critique it. Like Southern slave owners, enamored with the novels of Sir Walter Scott, they idealized the agrarian socioeconomic structures of the Middle Ages. Uneasy with the individualism evident in Protestant America, 19th century Catholics like Orestes Brownson condemned capitalism and proposed an ideal “Christian society.” Eminent Catholic clerics early sided with the Knights of Labor in the 1880s, embracing its socialist prescriptions, and strongly supported FDR’s New Deal 50 years later. Half-a-century later, in the 1980s, despite mounting evidence to the contrary, Catholic bishops and academics generally denounced “Reaganomics” and free enterprise capitalism. However, in the aftermath of Vatican II, and a fresh Catholic openness to the modern world, came the “creation-centered social thought of John Paul II” (p. 96). From the highest authority came the endorsement of creative entrepreneurship. “Wealth creation,” to John Paul II, “is a specifically economic form of human participation in God’s abiding creativity, God’s sustaining care for his creation” (p. 96). It’s time, Weigel argues, for Catholics to embrace the free enterprise economy that has so uplifted the world and join John Paul II in making it Christian.

* * * * * * * * * * * * * * * * * * * *

In The Church and the Market: A Catholic Defense of the Free Economy (New York: Lexington Books, c. 2005), a Catholic historian, Thomas E. Woods, endeavors to counteract the anti-capitalist views of Christians who fail to see its worth. The Industrial Revolution, often deplored by socialists because of its reliance on child labor and exploitative practices, was in fact a great boon for the working classes. Bad as it was, it was an improvement on what went before! “To say that the free market led to the destruction of some previously existing, harmonious community life is simply to defy historical testimony” (p. 165). Child labor, for example, was no new thing in 1800! Farm kids worked long and hard from time immemorial. To work hard in factories was not a major change. What changed, as economic conditions improved during the 19th century, was the ultimate abolition of “child labor” and the radical improvement of children’s living conditions–life expectancy, nutrition, education, etc.

Woods especially urges readers to seriously study economics and to discover truths discerned by 15th and 16th century Spanish Scholastics such as Juan de Mariana, as well as 20th century Austrians such as Ludwig von Mises. Whereas modern socialists, enthralled with Karl Marx, have embraced an illusion, the truth-seeking economists have carefully studied man’s nature and prescribed the best ways for his flourishing. The Scholastics and Austrians, Woods says, both “sought to ground economic principles on the basis of absolute truth, apprehensible by means of reflection on the nature of reality” (p. 216). Prices, for example, rightly reflect market demand. Consumers–not the labor expended in production–should determine the value of various goods. Whenever the state intervenes, artificially setting “just prices,” dire if unintended consequences follow. Just wages are also best set by the marketplace. To Domingo de Soto, writing in the 16th century, workers who agree to a given salary are fairly paid when their employer pays as promised. Wages rise when wealth is created, and the perennial socialist impulse to dictate “fair wages” generally militates against the very creative process that justifies higher salaries.

Money and banking, of course, are major economic concerns. We Americans live under the rule of the Federal Reserve, which, by issuing “fiat currency” basically “creates money out of thin air” (p. 93). Since it was founded in 1913, “the dollar has lost about 95 percent of its value” (p. 93). While claiming to control inflation, the “Fed” has, in fact, caused it! There are major moral problems with fiat currency, Woods argues, for it is, in fundamental ways, “not conceptually distinct from simple counterfeiting” (p. 97). The Spanish Scholastics knew this centuries ago. They also knew that some of the traditional teaching regarding usury could not address the dynamic, commercial economy of the world emergent in the 16th century. Indeed, “Catholic theologians had overturned virtually all of the older arguments against usury–at the very time that Martin Luther was busily attempting to rehabilitate them” (p. 114).

Regarding the welfare state, Woods invokes a recent warning by Pope John Paul II: “By intervening directly and depriving society of its responsibility, the Social Assistance State leads to a loss of human energies and an inordinate increase of public agencies, which are dominated more by bureaucratic ways of thinking than by concern for serving their clients, and which are accompanied by an enormous increase in spending. In fact, it would appear that needs are best understood and satisfied by people who are closest to them and who act as neighbors to those in need” (p. 147). But the welfare state directly harms neighborhoods and families. And it undermines private property rights–rights that Pope Leo XIII branded “sacred and inviolable” (p. 195).

* * * * * * * * * * * * * * * * * * * * *

Woods anchors his position regarding “the church and the market” in the scholarly work of Alejandro A. Chafuen, Faith and Liberty: The Economic Thought of the Late Scholastics (New York: Lexington Books, c. 2003). The popularity of Max Weber’s thesis, yoking capitalism and Calvinism, has obscured the numbers of Catholic philosophers who carefully situated free enterprise capitalism within the natural law teachings of the Church. “Our analysis of the Schoolmen’s writings,” says Chafuen in his conclusion, indicates “that modern free-market authors owe the Scholastics more than they realize. The same can be said for Western civilization” (p. 159).

This meant they stressed the sanctity of private property. Noting that many of Jesus’s associates “were quite wealthy for their times” (p. 32), they “declared it was heresy to say that those who have property cannot enter the kingdom of heaven” (p. 33). “According to [Juan de] Molina, private property may have existed even before original sin, since in that state, men could agree by common consent to divide the goods of the earth. The commandment ‘Thou shalt not steal’ implies that the division of goods does not pervert natural law” (p. 36). One scholar says that for these Scholastics “‘the right to property was an absolute right that no circumstances could ever invalidate'” (p. 42). This, Chafuen says, is because: “Private property is rooted in human freedom, which is founded in human nature, which, like any other nature, is created by God. Private property is the essential prerequisite for economic freedom” (p. 160).

When they considered “public finance,” the Scholastics cautioned against government involvement in economics. “To believe in private property means to believe in limited government” (p. 132). Taxes should be minimal. The budget should be balanced. The currency should never be debased as a means of redistributing wealth. Administrative officials should not be allowed to grow rich at public expense. More than anything else, high taxes produce poverty. “‘Taxes are commonly a calamity for the people and a nightmare for the government'” said Juan de Mariana. “‘For the former, they are always excessive; for the latter, they are never enough, never too much'” (p. 57).

Contrary to socialists, for whom the “labor theory of value” of commodities is an article of faith, Scholastics trusted the marketplace to establish fair prices. Commerce and trade are necessary for a healthy society. Surviving through subsistence farming and barter economics condemns folks to perpetual indigence. To Molina, one should not scoff at different prices for the same goods in different areas, for “‘the just price of goods depends principally on the common estimation of the men of each region. When a good is sold in a certain region or place at a certain price (without fraud or monopoly or any foul play), that price should be held as a rule and measure to judge the just price of said good'” (p. 75). We value goods insofar as they are useful to us. It’s their usefulness–not the effort invested into making them–that determines what we’re willing to pay. Efforts to fix prices, through monopolistic controls established by either entrepreneurs or workers, are harmful and wrong. Wages, the Late Scholastics taught, should be set by the marketplace, where a “just” wage is whatever a worker freely accepts. A doctor’s wages, it follows, will be higher than a garbage collector’s, for we are willing to pay more for medical care than manual labor.

In all their works, the Scholastics sought to clarify the nature of justice for all–and especially for ordinary folks. “The protection of private property, the promotion of trade, the encouragement of commerce, the reduction of superfluous government spending and taxes, and a policy of sound money were all destined to improve the condition of the workers. They recommended private charity as a way to alleviate the sufferings of those who could not work. According to the Late Scholastics, and in agreement with the Holy Scriptures, the rich are under obligation to help the poor. Money could be better used if the rich would reduce their superfluous spending and increase their alms” (p. 110).


Still worth reading, to understand Evangelical economic thought, is Craig M. Gay’s With Liberty and Justice for Whom? The Recent Evangelical Debate over Capitalism (Grand Rapids: William B. Eerdmans Publishing Company, c. 1991), originally written as a Ph.D. dissertation under Peter Berger’s guidance. As one would expect, this work is detailed, carefully documented, and quite helpful for anyone wanting to hear different voices from within Evangelicalism. Gay first noted the growing influence of the “New Class” intellectuals within Evangelicalism who profit from and thus endorse the leftist planks of the welfare state. This “New Class,” says Peter Berger, “‘rhetorically identifies its own class interests with the general welfare of society and especially with the downtrodden. . . . This is especially so because the knowledge class has such an interest in the welfare state, which is ostensibly set up on behalf of the poor and of other disadvantaged groups'” (p. 189). Led by “radicals” such as John Howard Yoder, Jim Wallis and Ron Sider, leftist Evangelicals denounce capitalism and America’s “oppressive” society. “Jim Wallis has stated, for example, that ‘overconsumption is theft,’ and Ronald Sider has insisted that ‘an affluent minority devours most of the earth’s non-renewable resources'” (p. 31). Anabaptist thought undergirds much of their protest, and they clearly long to establish their vision of the “kingdom of God” in this land. This should come through redistribution–taxes on the rich funding programs for the poor, and legislation securing entitlements that establish various kinds of economic, racial, and sexual “equality” everywhere.

Clark Pinnock, closely associated with Wallis in the ’70s, later renounced his radical views, stating: “I remember being asked if I realized the Marxist content of what we were saying . . . and being puzzled by the question. . . . It seemed reasonable to think of the rich as oppressors, and the poor as their victims. The Bible often seemed to do the same thing. It was obvious to me that the welfare state needed to be extended, that wealth ought to be forcibly redistributed through taxation, that the third world deserved reparations from us, that our defense spending was in order to protect our privilege, and the like” (p. 36). What’s now clear to Pinnock and other scholars is the Marxist influence on the Evangelical Left.

Rejecting Marxism and defending capitalism is the Evangelical Right, represented by Harold Lindsell, for years the editor of Christianity Today, and Ronald Nash, an influential Reformed philosopher, who taught at Westminster Theological Seminary. “In a sense,” Gay says, “those on the right have become traitors to the New Class” (p. 193). Thus they rarely if ever get invited (as does Jim Wallis) to high profile meetings of the inner circle of opinion shapers in New York and Washington, D.C. They may be intelligent, but they’re not accredited members of the reigning intelligentsia that controls the media and universities.

Those on the Right are, Stephen Brint says, “‘blue-collar workers, small-business people, and farmers’” (p. 191). They tend to be older, less educated, and live in what’s now called “red” states. Free market capitalism, they insist, provides the best system yet developed to produce and distribute goods. The world is far better off as a result of modern capitalism. Wealth is created and spread abroad through free trade, and “in such an economy, no man becomes rich by oppressing another but rather by helping others” (p. 70). Thinkers on the Right anchor their defense of capitalism in the natural law. Given our nature, it’s the best economic system. To Lindsell, the free enterprise approach is approved by God’s Word and “is binding on all men everywhere.” Divinely ordained, “it is normative, it will work, and it will prove itself to be superior to socialism, which can only be validated by denying what God has revealed and can only function by destroying the foundations on which Western culture has been built” (p. 100).

Neither Left nor Right is the “Evangelical Center,” which regards “capitalism as a cause for concern.” Folks like Carl F.H. Henry (in The Uneasy Conscience of Modern Fundamentalism) and Bob Goudzwaard (in Capitalism and Progress) represent, for Gay, the evangelical mainstream. Henry clearly rejected Marxism, but his concern for social justice led him to criticize aspects of modern American capitalism, and “Goudzwaard has argued that the crisis of Western civilization . . . has been precipitated by the idolization of progress in the modern period, a problem linked to the institutionalization of modern capitalism” (p. 136). Such “mainstream” thinkers want to preserve valuable aspects of free enterprise capitalism while encouraging governmental intervention to mitigate its excesses and provide basic welfare needs for all peoples.

Gay concludes his book with two chapters evaluating what he has described, providing the reader a helpful perspective. “This is the best survey of evangelical thought about capitalism that I know of,” says Goudzwaard, and I concur. It still merits attention, nearly 20 years after it was written.

170 Disconsolate Brits

Recent laments from some eminent British writers provide a somber appraisal of their nation’s current conditions.  As ever, one must put such complaints in perspective, but their concerns certainly merit reflection.  Peter Hitchens (not to be confused with his brother Christopher) is a provocative journalist who compiled a collection of essays entitled The Abolition of Britain:  From Winston Churchill to Princess Diana (San Francisco:  Encounter Books, 2000).  As the subtitle indicates, Hitchens repeatedly contrasts Winston Churchill and Princess Diana (and their markedly different funerals) to compare the Britain of 1997 with that of 1965.  “The dead warrior was almost ninety, full of years and ready to die.   He represented the virtues of courage, fortitude and endurance; he was picturesque rather than glamorous,” whereas Diana, dying young in an accident, “was snatched from life in the midst of youth, beauty and glamour.  Her disputed virtues were founded on suffering (real or imagined) and appealed more to the outcasts and the wounded than to the dutiful plain heart of England” (p. 17).   More broadly, in society, the independence and tenacity of Churchill gave way, during the last third of the 20th century, to a celebrity culture curdling in an ethos of sentimentalism and victimization evident in England’s response to Princess Diana.

This cultural change has been aided by the erosion of historical knowledge.  In an essay entitled “Born Yesterday,” Hitchens laments the demise of  historical perspective in a land where “all kinds of rubbish are blown by the wayward winds of modern education and popular ‘culture'” (p. 46).  In the schools, the study of history has shifted from knowledge to “skills.”  What is studied or learned, say the educationists, is less important than asking questions and (especially) empathizing with those mistreated, for various reasons, in either the past or present.  Consequently, many traditional “heroes,” particularly of the military sort, are portrayed as villains, because they fought rather than appeased their enemies.  There is, in fact, a “belittling of the Second World War” in the current curriculum (p. 60).  For example, a 1995 videotape distributed to the schools to commemorate VE-day “mentioned Churchill only for a few seconds, and then to say he lost the 1945 election” (p. 60).

What’s encouraged in the schools is emotivism, especially self-righteous wrath regarding racists, sexists, or capitalists–all pilloried as oppressors of the weak and marginalized.  “The sort of topics recommended” by the educationists “have a weary familiarity for anyone acquainted with the Marxist interpretation of the twentieth century:  ‘the working classes’, ‘women in society’, ‘imperialism’ and so on” (p. 56).  Re-phrased, Marxist thought fuels Britain’s “class war.”  But Marxism is only an updated version of the radicalism unleashed in 1789 by the French Revolution.  For nearly two centuries the British resisted the radical, totalizing, Jacobin ideology embraced by many Europeans in the 19th and 20th centuries that now “seeks to extinguish Britain, not by revolution, but by stealth” (p. 300).  Today’s leftists, intent on cultural rather than economic revolution, believe “education should be used to eradicate privilege and elitism, to spread the gospel of the new society in which everyone (and everything) is equal, a sort of concrete embodiment of that hideous song ‘Imagine’, which has become the hymn of sixties boomers” (p. 64).

The triumph of this trend was encapsulated by Prime Minister Tony Blair who, in 1997, the year of Princess Diana’s funeral, said:  “I am a modern man.  I am part of the rock and roll generation–the Beatles, colour TV, that’s the generation I come from” (p. 1).  Indeed, much about Blair and the leftists who currently control the nation elicits Hitchens’ scorn.  He notes, for example, how prophetically Aldous Huxley, in Brave New World, envisioned “the cynical, puerile, bubblegum election campaigns fought by Bill Clinton in 1992 and 1996, and by Tony Blair in 1997” (p. 139).  That the Beatles and TV–not Shakespeare and Handel–have shaped a “modern man” like Blair cannot but dismay cultural conservatives.

In “Hell Freezes Over” Hitchens lampoons recent developments in the Church of England.  “Hell was abolished,” he writes, “around the same time that abortion was legalized and the death penalty was done away with” (p. 105).  Eminent ecclesiastics, such as Bishop John Robinson (of Honest to God fame), led the way on every front in the war against traditional, orthodox Christianity.  “The Ten Commandments, once blazoned behind every altar in the kingdom, were frequently left out of the Church of England’s Communion service . . . .  The King James version of the Bible, with its majestic but sometimes frightening language, was rejected by modernizers who sought to make it more ‘accessible’, replaced by new versions which nonetheless somehow lacked the old scriptures’ force” (p. 106).  The ancient majestic liturgy of the Book of Common Prayer was subtly subverted by “alternative” services.  Hymns disappeared.  “At the funerals of the young, entirely secular pop songs are often played as substitutes for hymns.  In the last few years, mourners have taken to telling jokes during funeral eulogies, as if they were at a wedding” (p. 126).  And the ancient Gospel of personal redemption from sin through the work of Jesus Christ was replaced by a Social Gospel urging folks to support political activism of a leveling, leftist sort.  Consequently, the Church of England is hardly more than an empty shell–emptied of theology, worship, beauty, and (much to the dismay of the “reformers”) people.

* * * * * * * * * * * * * * * * * * * *

Theodore Dalrymple shares Hitchens’ evaluation of his native land:  “In the past few decades, a peculiar and distinctive psychology has emerged in England.  Gone are the civility, sturdy independence, and admirable stoicism that carried the English through the war years.  It has been replaced by a constant whine of excuses, complaints, and special pleading.  The collapse of the British character has been as swift and complete as the collapse of British power” (p. 5).  Dalrymple is a medical doctor who has worked for the past two decades in an inner-city hospital and prison in London.  Though without religious faith, he seems to sense a humanitarian “call” to work among what he calls the “underclass.”  Along with his medical work, he has also flourished as an essayist, and he is, Peggy Noonan says, “the best doctor-writer since William Carlos Williams.”  Some of his essays appeared in Life at the Bottom:  The Worldview That Makes the Underclass (Chicago:  Ivan R. Dee, c. 2001).  “A specter is haunting the Western World,” he says:  “the underclass” (p. vii).  His first sentence, of course, replicates Marx’s opening line in The Communist Manifesto, substituting “underclass” for “communism.”

This “underclass” Dalrymple deals with on a daily basis demonstrates the power of pernicious ideas, for “the social pathology exhibited by the underclass” has been promulgated by an intelligentsia intent on denying free will and personal responsibility, promoting a fashionable moral relativism.  Educators discount correct grammar or spelling.  Artists claim there is no higher or lower culture.  Highly educated folks dress and talk like the less educated “workers” they feign to understand and emulate.  “Differences” between cultures and behaviors there may be, but nothing is qualitatively better than anything else, nothing is absolutely right or wrong.  Consequently, “the aim of untold millions is to be free to do exactly as they choose and for someone else to pay when things go wrong” (p. 5).  Sadly enough, the elite intellectuals responsible for such behavior “were about as sincere as Marie Antoinette when she played the shepherdess” (p. xi).  Their play-acting “is a crude and simple one, a hangover from Marxism:  that the upper and middle classes are bad; that what has traditionally been regarded as high culture is but a fig leaf for middle- and upper-class oppression of the working class; and that the working class is the only class whose diction, culture, manners, and tastes are genuine and authentic” (p. 81).

Thus he cites Shakespeare’s King Lear to clarify the book’s theme:  “This is the excellent foppery of the world, that when we are sick in fortune, often the surfeits of our own behavior, we make guilty of our disasters the sun, the moon, and stars; as if we were villains on necessity, fools by heavenly compulsion, knaves, thieves, and treachers by spherical predominance, drunkards, liars, and adulterers by an enforced obedience of planetary influence; and all that we are evil in, by a divine thrusting on.  An admirable evasion of whoremaster man, to lay his goatish disposition on the charge of a star!” (I, ii).  Dalrymple beholds Shakespeare’s truth on a daily basis.  Prisoners he treats routinely use the passive voice when describing their crimes.  Thus three men who stabbed others “used precisely the same expression when describing to me what happened.  ‘The knife went in,’ they said” (p. 6).  They weren’t responsible!  The knife simply went in, killing a person–acting on its own, one assumes!  Another prisoner, a car thief, claimed he could not stop stealing and blamed the doctor for not stopping him!  These ill-educated criminals are, without knowing their source, voicing ideas spawned by some of the 20th century’s most powerful ideologies–“Freudianism, Marxism, and more recently sociobiology–in denying consciousness any importance in human conduct” (pp. 22-23).

And the criminals know how to use criminologists’ rhetoric to legitimate their crimes!  “The great majority of the theories criminologists propound lead to the exculpation of criminals,” Dalrymple says, “and criminals eagerly take up these theories in their desire to present themselves as victims rather than victimizers” (p. 218).  So too they latch on to the ideas of social reformers, leftist philosophers and politicians who call for economic egalitarianism and denounce the wealthy.  The thieves he deals with generally “believe that anyone who possesses something can, ipso facto, afford to lose it, while someone who does not possess it is, ipso facto, justified in taking it.  Crime is but a form of redistributive taxation from below” (p. 219).  He astutely connects the fashionable theories of the professors and journalists with the lawlessness on the streets, noting that “those who propagate the idea that we live in a fundamentally unjust society also propagate crime” (p. 220).

In an essay entitled “What Is Poverty?” Dalrymple insists the real poverty in England is moral rather than economic.  Sadly enough, the Welfare State, designed to eliminate material “poverty,” has incubated a more devastating spiritual poverty.  He notes that young medical doctors (many of them the children of immigrants) who join his hospital staff initially think that their patients are oppressed by society and in need of various kinds of assistance.  “By the end of three months,” he says, “my doctors have, without exception, reversed their original opinion that the welfare state, as exemplified by England, represents the acme of civilization” (p. 142).  After working with London’s tax subsidized underclass, a Filipino doctor said:  “‘life is preferable in the slums of Manila'” (p. 142).  Dalrymple himself, having worked for a time in Tanzania and Nigeria, declares that  “nothing I saw–neither the poverty nor the overt oppression–ever had the same devastating effect on the human personality as the undiscriminating welfare state.  I never saw the loss of dignity, the self-centeredness, the spiritual and emotional vacuity, or the sheer ignorance of how to live that I see daily in England.  In a kind of pincer movement, therefore, I and the doctors from India and the Philippines have come to the same terrible conclusion:  that the worst poverty is in England–and it is not material poverty but poverty of soul” (p. 143).

* * * * * * * * * * * * * * * * * * * *

Theodore Dalrymple revisits many of the same issues in a more recent collection of essays:  Our Culture, What’s Left of It:  The Mandarins and the Masses (Chicago:  Ivan R. Dee, c. 2005), and his jaded pessimism grows apace.  Civilization, he notes, is a terribly fragile thing, as the horrors of the 20th century demonstrate.  And in London, as the 21st century begins, he’s witnessing its collapse–a collapse caused by nihilistic intellectuals who, to this point, have not yet suffered the dire consequences evident in the inner-city.  “Having spent a considerable portion of my professional career in Third World countries in which the implementation of abstract ideas and ideals has made bad situations incomparably worse, and the rest of my career among the very extensive British underclass, whose disastrous notions about how to live derive ultimately from the unrealistic, self-indulgent, and often fatuous ideas of social critics, I have come to regard the intellectual and artistic life as being of incalculable practical importance and effect” (p. xi).  Sadly enough, economists, novelists, film directors, journalists, and rock stars are waging a relentless war on the very innards of civilization, for barbarism begins, as Ortega y Gasset said, with the collapse of standards.

In a variety of ways, modern intellectuals have dismantled the barriers that restrain evil behaviors.  “In the psychotherapeutic worldview to which all good liberals subscribe, there is no evil, only victimhood” (p. 260).  Justify, as does George Soros, the legalization of drugs, and drug abuse soon shatters the delicate social bonds of family and neighborhood.  Encourage folks to “do your own thing,” and financially subsidize them with the welfare state, and all kinds of destructive things transpire!  Consequently, the nation that in 1921 recorded only one crime for every 370 inhabitants suffered one for every 10 in 2001.  England has become, especially since WWII, a distressingly crime-ridden land.  Fathers, who once accepted the responsibilities of caring for children, now knowingly abandon them “to lives of brutality, poverty, abuse and hopelessness” (p. 13).  Social workers have replaced fathers, freeing men to live as perpetual adolescents, forever seeking adventures and entertainments, “petulant, demanding, querulous, self-centered, and violent” when frustrated (p. 14).  They’ve simply embodied the fashionable theories of the intelligentsia, whose notions have mounted “a long march not only through the institutions but through the minds of the young.  When young people want to praise themselves, they describe themselves as ‘nonjudgmental.’  For them, the highest form of morality is amorality” (p. 14).

Interestingly, Dalrymple recurrently stresses the importance of dress!  How one looks seems mysteriously linked to how one acts and who one is.  With tongue (slightly) in cheek, he even suggests that tattoos cause crime!  He says this because virtually all the prisoners he treats sport a bewildering variety of tattoos.  In his younger days, he resisted any notion that appearances matter.  He “had assumed, along with most of my generation unacquainted with real hardship, that scruffy appearance was a sign of spiritual election, representing a rejection of the superficiality and materialism of bourgeois life.”  Wealthy artists and slovenly professors once seemed avant-garde and stylish.  Older and wiser now, he says, “I have not been able to witness the voluntary adoption of torn, worn-out, and tattered clothes–at least in public–by those in a position to dress otherwise without a feeling of deep disgust.  Far from being a sign of solidarity with the poor, it is a perverse mockery of them; it is spitting on the graves of our ancestors, who struggled so hard, so long, and so bitterly that we might be warm, clean, well fed, and leisured enough to enjoy the better things in life” (p. 26).

Those feigning to reject bourgeois values think themselves (in accord with Marx) champions of the proletariat.  Virtually all modern intellectuals claim to identify with and support the poor, the marginalized, the disadvantaged.  Focusing on this, in an essay entitled “How–and How Not–to Love Mankind,” Dalrymple compares Karl Marx with Ivan Turgenev.  Both men were born in 1818 and died in 1883.  “Turgenev saw human beings as individuals always endowed with consciousness, character, feelings, and moral strengths and weaknesses; Marx saw them always as snowflakes in an avalanche, as instances of general forces, as not yet fully human because utterly conditioned by their circumstances.  Where Turgenev saw men, Marx saw classes of men; where Turgenev saw people, Marx saw the People.  These two ways of looking at the world persist into our own time and profoundly affect, for better or worse, the solutions we propose to our social problems” (p. 77).  Consequently, in Marx’s writings “we enter a world of infinite bile–of rancor, hatred, and contempt–rather than of sorrow and compassion” (p. 83).  Latent in his Communist Manifesto is the carnage wrought by his followers in the last century.  Millions died in the gulags.  And millions more today languish, dying spiritually, in the darkening eddies of the Marxist-inspired modern welfare state.

Dalrymple’s essays touch on many themes I’ve not mentioned.  His essays on Shakespeare and Virginia Woolf, for example, indicate his concern for literary culture.  His observations on the differences between Hindu and Muslim immigrants are well worth pondering.  He is, Roger Kimball says, “the Edmund Burke of our age, eloquently anatomizing the moral depredations of that pseudo-enlightenment which has left large tracts of Western Society the province of thugs, social workers, liberal bureaucrats, and other enemies of civilization and the ordered liberty upon which it depends.”

* * * * * * * * * * * * * * * * * * * * * * *

Alice Thomas Ellis is one of England’s finest contemporary novelists.  She is also a devout, thoroughly traditional Roman Catholic.  Consequently she wrote, for the periodical The Oldie, some short, trenchant columns packed with her distaste for things happening in her Church, recently published as God Has Not Changed:  The Assembled Thoughts of Alice Thomas Ellis (London:  Burns & Oates, 2004).  The churches that have emptied, during the past 40 years, did so for a reason–the ancient Faith has been jettisoned by a clergy more intent on being well-liked and respected than on teaching the truth.  Many of them “are too nervous to mention their beliefs–if they’ve even got them any more–and subject us instead to anodyne twaddle about their own experience” (p. 65).  Allegedly trying to reach the young, they have failed and in the process alienated loyal, older folks like herself.  No longer real believers, they’re much like butchers “inclined to vegetarianism” (p. 17) who lack the decency to change vocations!

She’s particularly distressed with the allegedly “Christian” feminists agitating for power and preeminence in the Church.  “A group recently carted round a church crucifix with a female on it–happily not a real one–referring to the curious thing as Jesa Crista.”  Says Ellis:  “Sheer, pure nuttiness can go no further.  Never mind it is blasphemous, it is silly to suggest that historical figures can change sex” (p. 2).  She’s equally critical of those who reduce the Gospel to the most fashionable “social justice” movement.  What she calls “the Red Guard of the Church, in the wake of Vatican II” has effectively “completed the work of destruction begun in the Reformation” (p. 33).  Indeed, she warns, the “humanist protestantism to which the liberals incline is a first dip in the sea of atheism” (p. 50), and the Church is sinking rapidly into its depths.  Ellis is most probably too pessimistic, but her verbal darts deftly call into question certain postures and pronouncements of contemporary churchmen.  And one cannot but smile as she skewers some of the more outrageous fads and heresies afflicting the Church.

169 Original Intentions – Founding Fathers

As a “new nation” America was uniquely shaped during the first half-century of its existence. To Daniel Webster, the forging of the Constitution was absolutely central to that process. “Hold on to the Constitution of the United States of America and the Republic for which it stands–what has happened once in six thousand years may never happen again,” said Webster. “Hold on to your Constitution, for if the American Constitution shall fail there will be anarchy throughout the world.” The genius of this Constitution, M.E. Bradford argued, lies in the Founding Fathers’ Original Intentions: On the Making and Ratification of the United States Constitution (Athens: The University of Georgia Press, c. 1993).  Bradford, who died the year this book was published, was a professor of English at the University of Dallas, a very traditional Catholic university. Like Richard Weaver, with whom he has much in common, he belonged to the “Southern agrarian” school and considered himself primarily a “rhetorician.” He devoted his scholarly life to understanding the American way. As is evident in his important earlier work, A Worthy Company: Brief Lives of the Framers of the United States Constitution, Bradford especially sought to show the deeply Christian (rather than secular Enlightenment) commitments of the men who birthed this nation.

Original Intentions is a collection of lectures Bradford delivered at various law schools (e.g. the University of South Carolina) and universities (e.g. Dartmouth College) during 1976, the “bicentennial” year. They rest upon a thorough investigation of the primary sources–especially the records of persons largely unknown today but influential in that era. Several of the lectures deal with the debates that took place in states (Massachusetts, North Carolina, South Carolina) considering ratification following the convention. In a most helpful foreword, the distinguished historian Forrest McDonald identifies two themes that weave their way throughout the lectures. First, Bradford argued that the Constitution established a clearly, indeed severely, limited government. Second, he repeatedly employs the English philosopher Michael Oakeshott’s distinction between “nomocratic” and “teleocratic” readings of the document.  According to the nomocratic reading, McDonald says, the Constitution “is primarily a structural and procedural document, specifying who is to exercise what powers and how. It is a body of law, designed to govern, not the people, but government itself; and it is written in language intelligible to all, that all might know whether it is being obeyed” (p. xii). For fully 150 years this nomocratic understanding generally prevailed. Since WWII, however, a teleocratic view has captivated the nation’s law schools and courts “and has all but destroyed the original Constitution” (p. xii).

In 1787, there were men like Alexander Hamilton and James Madison who wanted to establish a strong, centralized government. Madison, Bradford shows, was hardly the “father of the Constitution,” for his “Virginia Plan” was quickly rejected as the majority of delegates insisted on preserving significant roles for the 13 states. In fact, they almost disbanded the convention until a series of compromises brought into being a much more modest compact than Madison envisioned. The committee that finally drafted the document was chaired not by Madison but by John Rutledge of South Carolina and closely followed the proposals of Connecticut’s Roger Sherman, who believed the “objects of union . . . were few: (1) defense against foreign danger; (2) control of internal disputes and disorders; (3) treaties; (4) foreign commerce, and a revenue to be derived from it” (p. 11).

Philadelphia in 1787 was quite unlike Paris following the 1789 Revolution. The Americans drafted a document designed to establish “a more perfect union,” but not an absolutely perfect nation.  While abstractions like “Liberty, Equality, Fraternity” may arouse emotions, catchy slogans no more establish a sound republic than New Year’s resolutions establish a good character. The American Constitution “is more concerned with what government will not do for each of us than with the positive description of acceptable conduct, which is left to local and idiosyncratic definition–to society, local customs, and tested ways. Most important, it is not about enforcing the abstract ‘rights of man’ or some theory of perfect justice and aboriginal equality, not even with the Bill of Rights added to it” (p. 13).

The French approach, on the other hand, stressing the “rights of man,” Sir Herbert Butterfield wisely noted, illustrates the modern endeavor to “make gods now, not out of wood and stone, which though a waste of time is a fairly innocent proceeding, but out of their abstract nouns, which are the most treacherous and explosive things in the world” (The Englishman and His History [Archon Books, 1970], p. 128). America’s Constitution, conversely, contained few “abstract nouns,” concentrating instead (following the British example) on the “old liberties” familiar to English-speaking peoples. The Common Law jurists, such as Coke and Blackstone, not the radical philosophes, such as Diderot and Rousseau, were their authorities. “John Adams, especially, admired the fundamental law of Great Britain, describing it as ‘the most stupendous fabric of human invention’ and a greater source of ‘honor to human understanding’ than any other artifact in the ‘history of civilization’” (p. 28). And Virginia’s Patrick Henry agreed, touting the British system as “‘the fairest fabric that ever human nature reared’” (p. 31).

“It is,” Bradford concludes, “impossible to understand what the Framers attempted with the Constitution of the United States without first recognizing why most of them dreaded pure democracy, judicial tyranny, or absolute legislative supremacy and sought instead to secure for themselves and their posterity the sort of specific, negative, and procedural guarantees that have grown up within the context of that (until recently) most stable and continuous version of the rule of law known to the civilized world: the premise that every free citizen should be protected by the law of the land” (p. 32).

Bradford’s lecture on “Religion and the Framers: The Biographical Evidence” reveals how profoundly wrong-headed is the modern judiciary’s “separation of church and state.” Anyone deeply rooted in the primary sources, he insists, cannot but recognize and revere the deeply Christian beliefs of some 95 percent of “the 150 to 200 principal Founders of the Republic” (p. 88). In their private papers, wills and ars moriendi, they routinely referred “to Jesus Christ as Redeemer and Son of God” (p. 89). Many of them, including Patrick Henry and George Washington, opposed Jefferson’s moves to disestablish the church in Virginia. Central figures in the making of the new nation–including Elias Boudinot, Roger Sherman, Charles Cotesworth Pinckney, Luther Martin, and John Dickinson–were deeply devout and zealous Christians. To portray the Framers as deists, a la Jefferson and Franklin, is, Bradford declares, egregiously wrong. It is, however, the typical textbook story foisted upon the public these days.

Turning to the post-Civil War Reconstruction Amendments, Bradford argues they did not significantly change the “nomocratic” essence of the 1787 Constitution. But since 1945 these amendments (and especially the 14th), through the doctrine of “incorporation,” have been increasingly used to make the Constitution “a teleocratic instrument: a law with endlessly unfolding implications in the area of personal rights” (p. 104). This has been done through “the shoddy scholarship of the Warren Court,” amply evident in an opinion of Justice Potter Stewart, who selectively cited “bits of speeches that appear to support his views and especially radical language contained in clauses rejected by Congress as a whole” (p. 118). Consequently, “in the end we get Chief Justice Warren saying that ‘the provisions of the Constitution are not time worn adages or hollow shibboleths . . . [but] vital living principles.’ And we also get Warren’s apologists coming after him, arguing that the court had always from the Founders the ‘implied power’ to revise and rewrite the Constitution according to its recognition of a ‘higher’ or ‘natural law.’ Taken together, their words describe according to its essence just what a teleocratic constitution might be, or describe no constitution at all” (p. 125).

Bradford’s burden in these lectures is obviously to limit the powers of the federal government, making it truly a “federal” government of limited authority. And the evidence he cites certainly validates his conviction that the “original intentions” of the Founding Fathers were largely forgotten during the 20th century.

* * * * * * * * * * * * * * * * * *

Bradford’s recent concerns were previsioned, at the beginning of the Republic, by Anti-Federalists like Patrick Henry and John Taylor of Caroline (the intellectual leader of the strict constructionist Jeffersonian Republicans). Born in 1753, Taylor was admitted to Virginia’s Caroline County bar in 1774, just as the American Revolution began. He joined a Virginia regiment and ultimately became a major in the Continental army. Thereafter he served in the Virginia General Assembly and was thrice appointed to serve out senatorial terms in the United States Senate. But his great vocation, he believed, was to farm well and write wisely. His plantation, “Hazelwood,” became a model of “scientific” farming–reclaiming exhausted soil and illustrating the goodness of the agrarian life. In 1813 he published Arator: Being a Series of Agricultural Essays, Practical and Political. Writing an introduction to it, M.E. Bradford said: “Taylor is like Cato . . . in treating advice on farming as a species of moral instruction . . . [for] Arator is about the social order of an agricultural republic, and not just about farming” (in the Liberty Fund edition, 1977, p. 37). Like Jefferson, Taylor believed that agriculture should be the basis of any healthy society.

As a political thinker, Taylor is best known for helping craft the Virginia Resolutions in 1798 and for three lengthy works published during the last decade (1814-1824) of his life: (1) An Inquiry into the Principles and Policy of the Government of the United States, (2) Tyranny Unmasked, and (3) New Views of the Constitution of the United States. He wrote to decry the manifest concentration of power in the federal government that was utterly unwarranted by the Constitution. The financial policies of Secretary of the Treasury Alexander Hamilton (such as funding state debts, internal improvements, and the National Bank) in the 1790s contravened the Constitution. Subsequent protective tariffs were designed to help northern industries (and wealthy industrialists) rather than the people. And the nationalistic decisions of the Supreme Court under the guidance of John Marshall were not envisioned by the architects of the United States. All such centralizing developments elicited Taylor’s strong condemnations.

In Tyranny Unmasked (Indianapolis: Liberty Fund, 1992), Taylor primarily attacked the protective tariff that so harmed the agrarian South. There is no difference, he insisted, between taking property through violence and taking it through taxes and fiscal policies designed to reward a privileged minority. “A tax may be imposed for two objects; one to sustain a government, the other to enrich individuals” (p. 116). There is no difference between a tyranny with one man on top and a tyranny with a thousand men on top. Elected tyrants are still tyrants. Fifty years after the Revolution, Taylor warned, Americans “must once more decide whether we will be a free nation. Freedom is not constituted solely by having a government of our own. Under this idea most nations would be free. We fought a revolutionary war against exclusive privileges and oppressive monopolies” (p. 84). To grant similar privileges and monopolies under the auspices of the “national” government would betray the fundamental nature of the United States.

A free people, Taylor insisted, require a limited government. “All reflecting individuals, except those bribed by self-interest, believe that liberty can only be preserved by a frugal government, and by excluding frauds for transferring property from one man to another. In no definition of it has even its enemies asserted, that liberty consisted of monopolies, extensive privileges, legal transfers of private property, and heavy taxation. In defining a tyrant, it is not necessary to prove that he is a cannibal. How then is tyranny to be ascertained? In no other perfect way that I can discern, except as something which takes away our money, transfers our property and comforts to those who did not earn them, and eats the food belonging to others” (p. 226).

Ambition and avarice ever haunt the corridors of power. Thus freedom flourishes only when power is restrained by the checks and balances set forth in the Constitution, and most especially in the 10th Amendment that specified: “The powers, not delegated to the United States by the constitution, nor prohibited by it to the States, are reserved to the states respectively, or to the people.”

* * * * * * * * * * * * * * * * * * * *

In 1823, a year after publishing Tyranny Unmasked, John Taylor of Caroline published New Views of the Constitution of the United States (Washington, D.C.: Regnery Publishing, Inc., c. 2000). Whereas the protective tariff served as the focus for the earlier work, the original intentions of the Framers of the Constitution served as the subject for the latter, and it is, James McClellan says in his Introduction, “the locus classicus of states’ rights jurisprudence” (p. xiii). In 1818 Congress had permitted the publication of Robert Yates’ notes of the Constitutional Convention. (Before this, by Congressional order, nothing was known of the behind-the-scenes debates of the delegates, and James Madison’s journal was not published until the 1840s. Thus the strong states’ rights concerns of the Convention’s Framers were largely unknown for 25 years.) Comparing their actual intent, as recorded in Yates’ Journal, with the widely known interpretations set forth by Madison and Hamilton in The Federalist Papers, Taylor discovered pervasive “distortions of the original meaning and a nationalistic bias” (p. liii) in the latter.

“Had the journal of the convention which framed the constitution of the United States, though obscure and incomplete,” Taylor said, “been published immediately after its ratification, it would have furnished lights towards a true construction, sufficiently clear to have prevented several trespasses upon its principles, and tendencies towards its subversion” (p. 13). The Framers clearly envisioned a limited federal government, not the national regime evident by 1820. Indeed, as the several states appointed delegates to the Convention they insisted on using the right word, unanimously rejecting “the recommendation of a national government, and by excluding the word national from all their credentials, demonstrated that they well understood the wide difference between a federal and a national union” (p. 18).

Taylor devoted many pages to carefully examining the materials in Yates’ journal, dismayed that its contents had been buried for 30 years. “Thus the vindicators of a federal construction of the constitution are deprived of a great mass of light, and the consolidating school have gotten rid of a great mass of detection. Secrecy is intended for delusion, and delusion is fraud. If it was dictated by an apprehension, that a knowledge of the propositions and debates, would have alarmed the settled preference of the states and of the publick, for a federal form of government, it amounts to an acknowledgement that these propositions and debates were hostile to that form and to the publick opinion” (p. 47). Deprived of the truth, many naively believed the positions espoused in The Federalist Papers. So Taylor devoted much of the book to an examination and refutation of the interpretations set forth therein by Hamilton and Madison, as well as clarifying his own understanding of the Constitution. Their differences are demonstrable: “These gentlemen believed that a supreme national government was best for the United States, and I believe that a genuine federal system is more likely to secure their liberty, prosperity, and happiness” (p. 75). The question is: which interpretation best represents the “original intentions” of the Framers?

Given the evidence from the original sources, Taylor defended the “federal” rather than the “national” system. “The delegations, reservations, and prohibitions of the constitution, combined with the rejection of powers proposed in the convention, constitute a mass of evidence, more coherent and irrefragable for ascertaining the principles of our political system, than can be exhibited by any other country; and if it cannot resist the arts of construction, constitutions are feeble obstacles to ambition, and ineffectual barriers against tyranny. … This mass of evidence stands opposed to those constructions which are labouring to invest the federal government with powers to abridge the state right of taxation; … to expend the money belonging to the United States without control; to enrich a local capitalist interest at the expense of the people; to create corporations for abridging state rights; to make roads and canals; and finally to empower the supreme court to exercise a complete negative power over state laws and judgments, and an affirmative power as to federal laws” (p. 189).

Looking at Taylor’s 1823 list in 2006, it is evident that his fears have materialized. Uncontrolled spending, even by Republicans elected to restrain it, continues unabated as we enter the 21st century. “Local capitalists” routinely gain advantages through the hordes of lobbyists (many of them former senators and congressmen) who wine and dine “public servants” such as Congressman Randy “Duke” Cunningham. Federal bureaucracies, such as the Environmental Protection Agency or Department of Education, have slowly increased their coercive roles in realms formerly reserved to state and local governments. Internal improvements–”roads and canals” in Taylor’s day–have been widely nationalized, as is most evident in “disaster relief” in Louisiana and federal influence in minor matters like speed limits. And the Supreme Court, greatly feared by Taylor, has become a major player in making laws and shaping society. Court decisions, whether mandating abortion rights or racial preferences in university admissions, reveal the enormous political power now resident in the hands of nine unelected jurists.

Hamilton and Madison certainly exerted influence in the 1787 Constitutional Convention, but ultimately their position, calling for a strongly centralized government, was soundly rejected by that body.  This was because the Framers prized an ordered liberty. “Society, well constructed, must be compounded of restraint and freedom, and this was carefully attended to in framing our union. The states are restrained from doing some things, and left free to do others; and the federal government was made free to do some things, but restrained from doing others. This arrangement cannot be violated, without making one department a slave or an usurper. A division of political rights between the people and a government, can only preserve individual liberty” (p. 301). In sum: “Freedom without restraint, or restraint without freedom, is either anarchy or despotism” (p. 301).

Taylor’s position, of course, was embraced by John C. Calhoun and in time by the architects of the Confederate States of America. Thus his states’ rights argument cannot escape the stigma of slavery and segregation in the South. But the essence of what Taylor (and Bradford) argue–that the best government is a limited government–still has currency. One need do no more than note the latest “pork barrel” legislation, or the Supreme Court’s meddling in local decisions regarding placement of the Ten Commandments, or recent presidents’ decisions to help hurricane victims or pay for drug prescriptions or dispatch troops around the world, to realize how centralized and powerful the government created by the Constitution has become. The current regime may be necessary; it may even be better. But it is clearly not what the 1787 Convention envisioned.