288 Debunking Utopia

 In Dostoevsky’s The Brothers Karamazov, Father Zossima tried to counsel a distraught woman by encouraging her to embrace an “active love” by helping others.  She unfortunately failed to follow his advice, settling into an abstract “love for humanity.”  Father Zossima called hers a “love in dreams” and noted that “love in action is a harsh and dreadful thing compared with love in dreams,” for when daydreaming, or imagining how we can help “humanity,” we slip into a non-existent future that helps no one.  Consequently we label “utopian” those societies designed for beings quite unlike our species.  Nima Sanandaji, in Debunking Utopia:  Exposing the Myth of Nordic Socialism (Washington, D.C.:  WND Books, c. 2016), reminds us, with ample facts, that socialism forever fails simply because it cannot succeed in the real (as opposed to the imaginary) world.  He makes three important points:  1) post-WWII Scandinavia’s economic success results from the region’s cultural roots rather than socio-political structures; 2) trying to duplicate Nordic economic structures elsewhere cannot but fail; and 3) “democratic socialism” is now collapsing even in Scandinavia due to its intrinsic flaws—indeed, “the days when the Nordic countries could actually be socialist are long gone” (#2333).  

Sanandaji’s family emigrated from Iran to Sweden in 1989, and he personally enjoyed all the benefits of that nation’s generous (socialistic) welfare state, ultimately earning a Ph.D. in economics and publishing 20 books and scores of scholarly papers.  After years of carefully examining the “democratic socialism” of Nordic countries (Sweden, Norway, Finland, Iceland, and Denmark), he understands why it’s seriously flawed and now imploding.  Indeed, though American socialists (e.g. Bernie Sanders) and progressives (e.g. President Obama) naively laud them, today “only one of the five Nordic countries has a social democratic government” (#107).  In various ways Scandinavians seem to be moving away from a failed model.  

Without question the Nordics enjoy many good things—longevity, education, health care, women’s rights, generous vacations, etc.  But the good life evident there, Sanandaji insists, results mainly from Nordic culture rather than socialist structures.  They all “have homogenous populations with nongovernmental social institutions that are uniquely adapted to the modern world.  High levels of trust, a strong work ethic, civic participation, social cohesion, individual responsibility, and family values are long-standing features of Nordic society that predate the welfare state.  These deeper social institutions explain much of the success of the Nordics” (#337).  Imagining that the United States—or African or South American countries—could duplicate the Nordic model without the Nordic culture is simply wishful (and extraordinarily frivolous) thinking.  Still more:  it’s important to acknowledge that many of the world’s finest places, enjoying the highest level of well-being, are places like Switzerland and Australia, which differ markedly from the Nordics.  

For instance, using one of the main criteria for national well-being—longevity—we find Japan, Switzerland, Singapore, and Australia doing at least as well as the Nordics.  “Instead of politics, the common feature seems to be that these are countries where people eat healthily and exercise” (#423).  Rather than thinking welfare states make everyone healthy through universal health care, we should understand the life-style ingredients that truly matter.  Then consider the celebrated economic “equality” praised by the likes of Bernie Sanders.  Sanandaji’s brother Tino (also an economist) notes:  “‘American scholars who write about the success of the Scandinavian welfare states in the postwar period tend to be remarkably uninterested in Scandinavia’s history prior to that period.  Scandinavia was likely the most egalitarian part of Europe even before the modern era’” (#521).  

In part this grew out of the region’s agrarian roots.  For centuries hard-working farmers had survived in an unusually difficult environment.  Necessarily they forged a culture “with great emphasis on individual responsibility and hard work” (#630).  They also secured property rights and embraced a market system that enabled them to thrive as independent yeomen committed to the “norms of cooperation, punctuality, honesty, and hard work that largely underpin Nordic success” (#659).  These norms were then brought to the United States by Scandinavian immigrants in the 19th century, and we find transplanted Swedish-American and Norwegian-American communities distinguished by conscientious, law-abiding, hard-working people.  Consequently they thrived and easily entered the mainstream of their new nation.  Today the eleven million Americans who identify themselves as Nordic are doing even better than their kinsmen still living in Scandinavia and “have less than half the average American poverty rate” (#830).   Culture, not economics, explains the difference!   

Rather than helping improve Scandinavia, Sanandaji says, socialism has actually harmed the region!  As he notes:  “The Economist explains:  ‘In the period from 1870 to 1970 the Nordic countries were among the world’s fastest-growing countries, thanks to a series of pro-business reforms such as the establishment of banks and the privatization of forests.  But in the 1970s and 1980s the undisciplined growth of government caused the reforms to run into the sands.’  Around 1968 the Left radicalized around the world . . . .  The social democrats in Sweden and other Nordic countries grew bold, and decided to go after the goose that lay the golden eggs:  entrepreneurship” (#1104).  Implementing “democratic socialism,” they targeted and taxed the “rich”—the businessmen, the wealth-creators, the very folks responsible for their nations’ prosperity.  Though Scandinavian countries enjoyed remarkable prosperity immediately following WWII, by becoming welfare states they struggled for the next half-century to preserve it.  “Third Way socialist policies are often upheld as the normal state Swedish policies.  In reality, one can better understand them as a failed social experiment, which resulted in stagnating growth and which with time have been abandoned” (#1127).  

Rather than celebrating the glories of socialism, the Nordics have learned a sad lesson and recently turned toward a freer market economy.  They grew “rich during periods with free-market policies and low taxes, and they have stagnated under socialist policies.  Job growth follows the same logic” (#1250).  Small government and low taxes spell prosperity; intrusive government and high taxes make for slow (or no) growth.  Recognizing this—and retreating from state-run monopolies—reformers have “opened up” educational and health care facilities “in Sweden as voucher systems, allowing for-profit schools, hospitals, and elderly care centers to operate with tax funding” (#1458).  Such moves “drove up wages, evident by the fact that these individuals gained 5 percentage points’ higher wages than similar employees whose workplaces had not been privatized” (#1485). 

Sanandaji has written this book to warn Americans who look favorably on “democratic socialism” in a nation “only very marginally more economically free than Denmark” (#2348).  Noting that Franklin D. Roosevelt was the “architect of the American welfare state,” he reminds us that FDR also warned:  “‘The lessons of history, confirmed by the evidence immediately before me, show conclusively that continued dependence upon relief induces a spiritual and moral disintegration fundamentally destructive to the national fibre.  To dole out relief in this way is to administer a narcotic, a subtle destroyer of the human spirit.  It is inimical to the dictates of sound policy.  It is in violation of the traditions of America’” (#1512).  Recently a German scholar, Friedrich Heinemann, has validated FDR’s concern regarding “moral disintegration” in welfare states.  Data from the World Values Survey indicate that “a self-destructive mechanism exists” in them, one which dissolves norms.  This is sadly evident, Sanandaji says, in Nordic lands, whose celebrated “work ethic” has dissipated.  Many young Scandinavians work less diligently than their parents, fail to form solid families, and falsely claim to be “sick” to avoid work when it suits them.  

Debunking Utopia should give us Americans additional pause as we endeavor to shape our nation’s immigration policies as well as its economy, for Sanandaji dolefully describes the mounting problems Sweden faces as a result of opening the border to refugees from Muslim nations.  “Sweden is in a ditch because many politicians, intellectuals and journalists—on both the left and the right—have claimed that refugee immigration is a boon to the country’s economy and that large-scale immigration is the only way of sustaining the welfare state. . . .  But of course, serious research has never shown that refugee immigrants boost the Swedish economy.  The truth is quite the opposite” (#2216).  In truth, poverty, crime, and educational problems have mounted as waves of immigrants have washed over the country.  

In fact, Americans should look seriously at their own traditions and seek to revive them rather than fantasizing about any form of “democratic socialism.”  As an old country song declares, “there ain’t no living in a perfect world.”  And, as Sanandaji concludes, “the true lesson from the Nordics is this:  culture, at least as much as politics, matters.  If the goal is to create a better society, we should strive to create a society that fosters a better culture.  This can be done by setting up a system wherein people are urged to take responsibility for themselves and their families, trust their neighbors and work together.  The Nordic countries did evolve such a culture—during a period when the state was small, when self-reliance was favored.  For a time these societies prospered while combining strong norms with a limited welfare state, which was focused on providing services such as education rather than generous handouts.  Then came the temptation to increase the size of the welfare state.  Slowly a culture of welfare dependency grew, eroding the good norms” (#2395).  Only by resurrecting those good norms—and abandoning the failed welfare state—can Scandinavians or Americans truly prosper.  

                     * * * * * * * * * * * * * * * * * * * * * * * * * * 

“Two decades ago, New Zealand went through a dramatic transformation, from a basket case welfare state saddled with crushing public debt, rampant inflation, and a closed and moribund economy, to what is today widely regarded as one of the freest and most prosperous countries in the world.  This is the story of how that happened.”  So Bill Frezza sums up his New Zealand’s Far-Reaching Reforms:  A Case Study on How to Save Democracy from Itself (Guatemala, Guatemala:  The Antigua Forum, c. 2015).  Following WWII—and before such reforms—as more and more money was printed to fuel more and more welfare programs, New Zealand graphically illustrated H. L. Mencken’s quip:  “Every election is a sort of advance auction sale of stolen goods.”  Indeed, Robert Muldoon won election as prime minister in 1975 by running on a platform described by critics as a “denial of economic reality accompanied by bribery of the voters.”  

But all this changed through reforms orchestrated by two politicians (Roger Douglas and Ruth Richardson) representing opposing parties, who made “common cause in a fight against their own party leaders” to save their nation from corrosive inflation and ultimate bankruptcy in the late 1980s and early 1990s.  They both demanded “honest and open accounting” wherewith to tell the truth regarding the nation’s condition.  “If private businesses kept their books the way most governments keep their books, our jails would be full of CEOs” (#361).  But the reformers determined:  “The country was going to be run more like a successful business than a public piggy bank” (#554).  Richardson especially focused on reforming the educational system, turning it over to charter schools answerable to parents.  Douglas worked to reduce corporate and personal income taxes, eliminate inheritance taxes, and establish a consumption tax.  Since the reformers eliminated government-sponsored enterprises (the New Zealand equivalents of America’s Fannie Mae and Freddie Mac), the country didn’t suffer the housing bubble that burst and devastated America in 2008.  

Consequently, “New Zealand enacted its most lasting reforms when advocates for efficient government, free markets, free trade, sound money, and prudent fiscal policy came together” and legislative acts were passed that “forever changed the way New Zealand’s government did business” (#248).  Frezza explains the important acts and shows how they changed the country.  Government agencies were privatized and compulsory union membership eliminated.  “All civil service employees were moved from job-for-life union contracts and a seniority based advancement regime to individual employment contracts and a merit based regime” (#331).  Opposition to such changes was inevitably intense.  “As a poster child for the bitter medicine being administered, Ruth became the most hated politician in New Zealand.  Effigies were burned in the streets, protesters poured a pot of blue paint on her (she saved the ruined dress for a charity auction), and police had to protect her on her jog to work every morning” (#549). 

But the reforms worked and made a lasting difference.  The 2014 Index of Economic Freedom ranks “New Zealand fifth in the world, behind Hong Kong, Singapore, Australia, and Switzerland with ratings of 90 percent or higher for rule of law, freedom from corruption, business freedom, and labor freedom” (#623).  The nation’s GDP increased fourfold while the national debt shrank from $25 billion in 1993 to $15 billion in 2007.  Trade, especially with China, has flourished.  In retrospect, Frezza says (wondering if the New Zealand story can be duplicated):  “The list of success factors required for a democracy to flourish economically is not long:  honesty, integrity, transparency, accountability, efficiency, thrift, prudence, flexibility, freedom, leadership, and courage.  Does anyone care to stand up and deny that these are virtues not just of good government but of a good life?  Although universally acclaimed by economists, philosophers, and theologians, why are these virtues so hard to find in governments and politicians?” (#668).  Unfortunately, politicians such as Roger Douglas and Ruth Richardson, willing to risk losing elections and incurring criticism, rarely appear.  But without them, it seems, majoritarian democracies will sadly follow a destructive path.

* * * * * * * * * * * * * * * * * * * * * * * *

In The Problem with Socialism (Washington, D.C.:  Regnery Publishing, c. 2016), Thomas DiLorenzo notes that a recent poll showed “43 percent of Americans between the ages of eighteen and twenty-nine had a ‘favorable’ opinion of socialism” and preferred it to capitalism (#85).  Another poll indicated 69 percent of voters under 30 would support a socialist for President—as Bernie Sanders’ near victory in the Democrat Party primaries certainly illustrated.  Misled by a multitude of educators, these young folks fail to grasp G.K. Chesterton’s insight:  “the weakness of all Utopias is this, that they take the greatest difficulty of man [i.e. original sin] and assume it to be overcome, and then give an elaborate account [i.e. redistribution of wealth] of the overcoming of the smaller ones.  They first assume that no man will want more than his share, and then are very ingenious in explaining whether his share will be delivered by motorcar or balloon” (Heretics).  Or, as Lady Margaret Thatcher famously quipped:  the ultimate problem with socialists is they finally run out of other people’s money to spend.

Though socialism in the 19th century meant the “government ownership of the means of production,” in the 20th century it morphed into redistributive measures designed to eliminate all sorts of inequalities through progressive taxes and regulatory edicts.  Inevitably socialists want government to control as many industries as possible (e.g. health care), to confiscate as much land as possible (e.g. national forests), and to destroy capitalism “with onerous taxes, regulations, the welfare state, inflation, or whatever they thought could get the job done” (#127).  Also inevitably, nations embracing socialism impoverish themselves.  Africa bears witness to the fact that 40 years of socialistic experiments made its nations “poorer than they had been as colonies” (#183).  Indeed, one of DiLorenzo’s chapters is entitled “Socialism Is Always and Everywhere an Economic Disaster.”  A glance at American history shows how socialistic endeavors in colonial Jamestown utterly failed.  But, Matthew Andrews says:  “‘As soon as the settlers were thrown upon their own resources, and each freeman had acquired the right of owning property, the colonists quickly developed what became the distinguishing characteristic of Americans—an aptitude for all kinds of craftsmanship coupled with an innate genius for experimentation and invention’” (#255).  Socialism, whether of the dictatorial or majority-rule democratic variety, is all about planning.  It’s preeminently “the forceful substitution of governmental plans for individual plans” (#588).  Planned economies always look wonderful to the planners.  But the plans inevitably founder when implemented because they run counter to human nature.

Socialists further violate human nature by seeking to dictate economic equality (e.g. “free” education, health care, housing, food, etc.), which “is not just a revolt against reality; it is nothing less than a recipe for the destruction of normal human society,” as became brutally evident in Russia and China (#374).  By eliminating capitalism’s division of labor and freeing each person to cultivate his own talent as well as his own garden, socialism (Leon Trotsky believed) would enable the perfection of our species so that the “human average will rise to the level of an Aristotle, a Goethe, a Marx.”  That such never happens—indeed could never happen—effectively refutes such utopianism.  “How remarkable it is that to this day, self-proclaimed socialists in academe claim to occupy the moral high ground.  The ideology that is associated with the worst crimes, the greatest mass slaughters, the most totalitarian regimes ever, is allegedly more compassionate than the free market capitalism that has lifted more people from poverty, created more wealth, provided more opportunities for human development, and supported human freedom more than any other economic system in the history of the world” (#697).  

The intrinsic deficiencies of socialism are also on display in those “islands of socialism in a sea of capitalism—government-run enterprises like the U.S. Postal Service, state and local government public works departments, police, firefighters, garbage collection, schools, electric, natural gas, and water utilities, transportation services, financial institutions like Fannie Mae, and dozens more” (#500).  Though there may very well be practical reasons for their existence, they are “vastly more inefficient, and offer products or services of far worse quality than private businesses” (#508).   Economists generally hold “that the per-unit cost of a government service will be on average twice as high as a comparable service offered in the competitive private sector” (#508).  That privately owned and operated firms like UPS and FedEx prosper, while the USPS needs abiding subsidies, surprises no economist.  Nor does it surprise anyone that USPS employees “earn 35 percent higher wages and 69 percent greater benefits than private industry employees” (#558).  

This problem ultimately led to recent changes in Scandinavia, where free-market reforms are currently reversing decades of “democratic socialism.”  The Swedish Economic Association recently reported “that the Swedish economy had failed to create any new jobs on net from 1950 to 2005.”  Consequently, Sweden is actually “poorer than Mississippi, the lowest-income state in the United States” (#880).  Within a half-century, the nation slipped “from fourth to twentieth place in international income comparisons.”  It has simply proved “impossible to maintain a thriving economy with a regime of high taxes, a wasteful welfare state that pays people not to work and massive government spending and borrowing” (#855).  Of Denmark’s 5.5 million people, 1.5 million “live full-time on taxpayer-funded welfare handouts” (#890).  One Swedish economist, Per Bylund, observes that by giving out “benefits” and “‘taking away the individual’s responsibility for his or her own life, a new kind of individual is created—the immature, irresponsible, and dependent.’”  Thus the celebrated, carefully planned Swedish “welfare state” has unintentionally created multitudes of “psychological and moral children” (#872).  

Sadly enough, DiLorenzo concludes, socialism ultimately harms the very folks it’s designed to help—the poor.  It’s a “false philanthropy.”  And it should be resisted wherever possible. 

287 The Kingdom of Speech; Undeniable; Evolution 2.0

 For five decades Tom Wolfe has remained a fixture atop the nation’s literary world—helping establish the “new journalism,” publishing essays and novels, credibly claiming to discern the pulse and diagnose the condition of America.  His most recent work, The Kingdom of Speech (New York:  Little, Brown and Company, c. 2016), finds him entering (with his customary wit) the somewhat arcane worlds of biological evolution and linguistics, where he discovers much to question and pillory while educating us in the process.  He was prompted to research the subject when he read of a recent scholarly conference where “eight heavyweight Evolutionists—linguists, biologists, anthropologists, and computer scientists” had given up trying to answer “the question of where speech—language—comes from and how it works” (#18).  It’s “as mysterious as ever,” they declared!  Amazingly, one of the eight luminaries was Noam Chomsky, for 50 years the brightest star in the linguistics firmament!  Now for academics such as Chomsky this is no small admission, for:  “Speech is not one of man’s several unique attributes—speech is the attribute of all attributes” (#36).  When the regnant Neo-Darwinian theory of evolution fails to explain language, it fails to explain virtually all that matters!   

To put everything in historical context, Wolfe guides us through some fascinating developments in evolutionary theory, including deft portraits of Alfred Wallace and Charles Darwin (who maneuvered to co-opt Wallace’s work and pose as the singular architect of the theory of biological evolution of species through natural selection).  While styling himself an empirical scientist, Darwin subtly propounded a cosmogony that closely resembles the creation stories of many American Indians.  In fact, Darwin’s story, with its “four or five cells floating in a warm pool somewhere” developing into a world teeming with remarkable creatures, was, rightly understood, a “dead ringer” for that of the Navajos!  “All cosmologies, whether the Apaches’ or Charles Darwin’s, faced the same problem.  They were histories, or, better said, stories of things that had occurred in a primordial past, long before there existed anyone capable of recording them.  The Apaches’ scorpion and Darwin’s cells in that warm pool somewhere were by definition educated guesses” (#281).  They were all “sincere, but sheer, literature” (#293).  

  While telling his story, however, Darwin recognized that speech “set humans far apart from any animal ancestors.”  Other traits he might passably explain, but he utterly failed to show how “human speech evolved from animals” (#205).  “Proving that speech evolved from sounds uttered by lower animals became Darwin’s obsession.  After all, his was a Theory of Everything” (#215).  Critiquing this theory was England’s most prestigious linguist, Max Müller, who insisted there is a radical difference in kind between man and beast—and that difference is language.  “Language was the crux of it all.  If language sealed off man from animal, then the Theory of Evolution applied only to animal studies and reached no higher than the hairy apes.  Müller was eminent and arrogant—and made fun of him [Darwin]” (#860).  And then, just when Darwin mustered up the nerve to publish The Descent of Man, and Selection in Relation to Sex, declaring apes and monkeys evolved into human beings, the pesky Alfred Wallace (who had been busily writing trenchant biological treatises) wrote an article, “The Limits of Natural Selection as Applied to Man,” pointing out certain uniquely human traits, including language, impossible to explain through natural selection.  “No, said Wallace, ‘the agency of some other power’ was required.  He calls it ‘a superior intelligence,’ ‘a controlling intelligence.’  Only such a power, ‘a new power of definite character,’ can account for ‘ever-advancing’ man” (#694).  But this Darwin could not allow!  All must be the result of purely material, natural processes!  “He had no evidence,” Wolfe says, but he told a good “just so” story that captured much of the public mind.  Yet his followers, for 70 years, gave up trying to explain the origin of language and turned to simpler evolutionary matters, upholding the Darwinian standard and insisting, with Theodosius Dobzhansky:  “Nothing in Biology Makes Sense Except in the Light of Evolution.”  But not even Dobzhansky ventured to suggest precisely how speech evolved!  

Then came Noam Chomsky, who (as a graduate student at the University of Pennsylvania) set forth a revolutionary theory of linguistics, a “radically new theory of language.  Language was not something you learned.  You were born with a built-in ‘language organ’” (#1000).  Along with your heart and liver, you’re given it—a biological “language acquisition device” (routinely referred to as the LAD in the “science” of linguistics).  Chomsky summed it all up in his 1957 Syntactic Structures and thereby became “the biggest name in the 150-year history of linguistics” (#1012).  But what, precisely, was this LAD?  Was it a free-standing organ or an organ within the brain?  Like all else in the evolutionary scheme, it had to be something material.  But where could it be found?  Take it by faith, Chomsky said—in time empirical scientists would find it!  

After 50 years of absolute preeminence in the field of linguistics, however, Chomsky suddenly faced an antagonist!  Daniel L. Everett, having spent 30 years living with a small tribe in the Amazon jungle—the Piraha, arguably the most primitive tribe on earth—dared to challenge the Master!  He declared Chomsky’s theory falsified by the Indians he studied.  They “had preserved a civilization virtually unchanged for thousands . . . many thousands of years” (#1313), and no “language organ” or “universal grammar” could explain how they spoke.  When Everett presented his findings to the public a decade ago—declaring they provided “the final nail in the coffin for Noam Chomsky’s hugely influential theory of universal grammar” (#1393)—a “raging debate” ensued.  In fact, it was total war, with Chomsky and his epigones determined to destroy Everett!  They questioned his integrity, discounted his credentials, and schemed to ostracize him from the academic community.  

Fighting back, in 2008 Everett published Don’t Sleep, There Are Snakes, summarizing his 30 years among the Piraha.  Amazingly, for a linguistics treatise, it became something of a best-seller!  “National Public Radio read great swaths of the book aloud over their national network and named it one of the best books of the year” (#1637).  Dismissing Chomsky’s celebrated theory, Everett argued:  “Language had not evolved from . . . anything.  It was an artifact” (#1631).  He followed this up with Language:  The Cultural Tool, insisting “that speech, language is not something that had evolved in Homo sapiens, the way the breed’s unique small-motor-skilled hands and . . . or its next-to-hairless body.  Speech is man-made.  It is an artifact . . . and it explains man’s power over all other creatures in a way Evolution all by itself can’t begin to” (#1675).  Soon he found some distinguished defenders, including Michael Tomasello—co-director of the Max Planck Institute for Evolutionary Anthropology.  In an article entitled “Universal Grammar Is Dead,” Tomasello opined:  “‘The idea of a biologically evolved, universal grammar with linguistic content is a myth’” (#1663).  Then Vyvyan Evans published The Language Myth and simply dismissed the innate “language instinct” notion.  Still others soon joined the growing condemnation of the Chomsky thesis!  

Chomsky of course responded, defending himself—but subtly retracting some of his earlier hypotheses.  Then, in a long, convoluted article, we find him confessing:  “‘The evolution of the faculty of language largely remains an enigma’” (#1734).  An enigma, no less!  Fifty years of feigning The Answer!  (It seems Chomsky knows less than Aristotle, who concluded that humans have a “rational soul” enabling them to function in uniquely human ways.)  And to Tom Wolfe, this at least became crystal clear:  “There is a cardinal distinction between man and animal, a sheerly dividing line as abrupt and immovable as a cliff:  namely, speech” (#1890).  He thinks:  “Soon speech will be recognized as the Fourth Kingdom of Earth.”  In addition to the mineral, vegetable, and animal worlds, there is “regnum loquax, the kingdom of speech, inhabited solely by Homo Loquax” (#1938).  How interesting to find Wolfe affirming what an earlier (and deeply Christian) literary giant, Walker Percy, identified (in Lost in the Cosmos) as the “delta factor”—the symbolic language unique to our species.  There’s an immaterial dimension to language, rendering it impossible to reduce to (or explain by) mere matter.  

* * * * * * * * * * * * * * * * * * * * * * * * * 

While practicing their craft, scientists cannot but ask philosophical questions.  The empirical details of their discipline may very well prove interesting to certain scholars, but the deeper philosophical implications of their findings constantly press for examination and explanation.  Thus, in Undeniable:  How Biology Confirms Our Intuition that Life Is Designed (New York:  HarperOne, c. 2016), Douglas Axe, a highly credentialed biologist (degrees from Caltech and Cambridge University; research articles published in peer-reviewed journals), notes:  “The biggest question on everyone’s mind has never been the question of survival but rather the question of origin—our origin in particular.  How did we get here?” (#195).  We cannot but wonder:  “What is the source from which everything else came?  Or, to bring it closer to home:  To what or to whom do we owe our existence?  This has to be the starting point for people who take life seriously—scientists and nonscientists alike.  We cannot rest without the answer, because absolutely everything of importance is riding on it” (#275).  

Axe mixes many enlightening personal anecdotes—struggling to survive within an antagonistic academic establishment while entertaining serious questions concerning the dogmas espoused therein—with an expertise honed in laboratories (most notably at Cambridge University) and through interactions with both eminent biologists and cutting-edge publications.  But he urges us not to rely upon prestigious authorities.  We should trust our common sense, believing what we see and intuitively know rather than what we’re told to see and believe.  He shares St. Paul’s probing conviction that “the wrath of God is revealed from heaven against all ungodliness and unrighteousness of men, who suppress the truth in unrighteousness, because what may be known of God is manifest in them, for God has shown it to them.  For since the creation of the world His invisible attributes are clearly seen, being understood by the things that are made, even His eternal power and Godhead, so that they are without excuse, because, although they knew God, they did not glorify Him as God, nor were thankful, but became futile in their thoughts, and their foolish hearts were darkened.  Professing to be wise, they became fools” (Ro 1:18-22).  

At an early age children (even if reared in atheist homes) prove St. Paul’s point, sensing there’s an ultimate God-like source responsible for a world that seems to function in accord with certain regularities and principles.  This Axe labels the universal design intuition, which recognizes an intelligent dimension to all that is.  Thus children “innately know themselves to be the handiwork of a ‘God-like designer,’” only to suffer schools wherein they’re generally “indoctrinated with the message that they are instead cosmic accidents—the transient by-products of natural selection” (#843).  To refute that materialistic dogma, philosophical rather than scientific, Axe presents in-depth scientific information pointing to intelligent design as the answer to our deepest questions.  He’s particularly adept at showing how the latest findings in molecular biology (in particular the tiny and incredibly complex proteins he examines in the laboratory) and cosmology make purely naturalistic explanations truly improbable.  Fortunately, for the general reader, Axe explains things in simple, intelligible ways while demonstrating his mastery of the materials.  And he insists:  “What is needed isn’t a simplified version of a technical argument but a demonstration that the basic argument in its purest form really is simple, not technical” (#898).  We don’t need a Ph.D. in science to understand the common sense science basic to the question of origins.  

Axe’s argument actually takes us back to the Aristotelian metaphysical tradition (though he doesn’t overtly align himself with it), for the world we observe contains real beings (what he calls “busy wholes”) innately oriented to discernible ends.  There’s more to Reality than mindless matter—there’s information, reason, a Logos giving shape to that matter.  “When we see working things that came about only by bringing many parts together in the right way, we find it impossible not to ascribe these inventions to purposeful action, and this pits our intuition against the evolutionary account” (#1264).  Consider such amazingly complex creatures as spiders, salmon, and whales, each of which “is strikingly compelling and complete, utterly committed to being what it is” (#1117).  The utter inescapability of the material, formal, efficient, and final causes necessary to understand and explain them cannot be denied!  Thus life doesn’t just happen as a result of atoms randomly bouncing through space.  And to imagine life originated in a primordial pond of inorganic compounds violates both the empirical evidence of science and the laws of thought.  To anyone with eyes to see, “life is a mystery and masterpiece—an overflowing abundance of perfect compositions” that cannot be explained in accord with Darwin’s natural selection (#1129).  

Having presented his case, Axe says:  “The truth we’ve arrived at is important enough that we have a responsibility to stand up for it.  Think of this as a movement, not a battle.  When a good movement prevails, everyone wins” (#2835).  He further believes that Darwinian devotees are now on the defensive, retreating on many fronts.  They know Darwin himself understood his theory’s vulnerability, admitting:  “If it could be demonstrated that any complex organ existed, which could not possibly have been formed by numerous successive, slight modifications, my theory would absolutely break down.”  But though he credibly described the survival of species, he simply failed—as have his successors—to explain their arrival!  A century of intensive research leaves unanswered the truly fundamental questions:  How did organic life (e.g. the first cell, with its genetic instructions for making proteins) arise from inorganic materials?  Why are humans uniquely conscious and marked by distinctively non-utilitarian traits such as altruism?  

Unlike many advocates of Intelligent Design, who insist they are not making an argument for the existence and power of God, Axe forthrightly moves from his scientific data and philosophical arguments to “naming God as the knower who made us.  I see no other way to make sense of everything we’ve encountered on our journey” (#3096).  The material world can only be—and be understood—because of an immaterial world, the spiritual and supernatural world.  “In the end,” he says, “it seems the children are right again.  The inside world is every bit as real as the outside one.  Consciousness and free will are not illusions but foundational aspects of reality, categorically distinct from the stuff of the outside world.  Following the children, if we allow ourselves to see the outside world as an expression of God’s creative thought, everything begins to make sense” (#3190).  

* * * * * * * * * * * * * * * * * * * * * * * * * * * * 

An electrical engineer by training and profession, fully immersed in cutting-edge computer developments, Perry Marshall has a consuming interest in evolutionary theory that prompted him to publish Evolution 2.0:  Breaking the Deadlock Between Darwin and Design (Dallas:  BenBella Books, Inc., c. 2015) in hopes of bringing about a rapprochement between folks primarily committed to Scripture and those strongly rooted in evolutionary Science.  Neither a “God of the Gaps” young-earth creationism nor a mindless matter-in-motion Old School Darwinism will suffice, given the best available evidence.  Reared in a young-earth creationist milieu but educated as a scientist, Marshall long struggled with seemingly unanswerable questions.  But:  “One day I had a huge epiphany:  I suddenly saw the striking similarity between DNA and computer software.  This started a 10-year journey that led me down a long and winding road of research, controversy, and personal distress before I discovered a radical re-invention of evolution” (#333).  This combination of a divinely-engendered creation and evolutionary process he calls Evolution 2.0, and he urges it upon his readers.  

Codes provide patterns for both computers and biological organisms.  Though we struggle to understand mysterious powers such as gravity and thermodynamics, we fully understand how to create computer codes.  And we absolutely know that codes do not write themselves!  Codes exist because minds devise them!  Without intelligent coders there could be no codes!  He supports both creationists (believing God encodes creation) and Darwinists (believing much of the evolutionary account).  Independently wealthy, Marshall even offers a multi-million dollar prize to the “first person who can discover a process by which nonliving things can create code” (#416).  But:  “Nobody knows how you get life from nonlife.  Nobody knows how you get codes from chemicals.  So nobody gets to assume life just happens because you have some warm soup and a few billion years; we want to understand how and why” (p. 178).  He stresses, in bold print:  “Codes are not matter and they’re not energy.  Codes don’t come from matter, nor do they come from energy.  Codes are information, and information is in a category by itself” (p. 187).  

Marshall begins this very personal book by confessing that his childhood cosmological beliefs were seriously challenged by the data he faced as a scientist.  He sincerely wanted to discover the truth and resolved to find it, whatever the cost.  “At the core of my being, I knew I could not live apart from integrity; I could not somehow make myself believe something that was demonstrably untrue” (p. 6).  Trusting his engineering training, he resolved to let it guide him, fully aware it might lock him into atheism.  The electrical engineering he’d mastered is highly mathematical—everything works precisely.  And as he delved into biological science he found tiny living organisms working just as precisely, following sophisticated instructions.  He also found that evolutionary theory nicely explained much that is evident in living creatures, confirming Darwin’s insights.  But one of the Darwinists’ core beliefs—that random mutations fully explain the information necessary for living beings—he found untenable.  There must be some intelligent source for the information markedly present in all that lives.  

It soon dawned on him that computer codes and biological DNA are remarkably alike.  It’s information that enables them both to work in such wonderful ways.  Prior to any evolving organisms there must be information, precisely coded in the DNA, that enables them to function.  “The pattern in DNA is a complex, intricate, multilayered language.  An incredibly efficient and capable language at that” (p. 165).  It’s not “natural selection” but “natural genetic engineering” that best explains the living world.  Marshall carefully discusses important things such as “transposition,” “horizontal gene transfer,” “epigenetics,” “symbiogenesis,” and “genome duplication” to illustrate the wonderful ways cells function.  “One cell can hold a gigabyte of data; plant and animal tissues have a billion cells per cubic centimeter” (p. 102).  A simple cell has information equivalent “to about a hundred million pages of the Encyclopedia Britannica” (p. 106).  Indeed:  “As amazing as Google is, a single cell possesses more intelligence than a multibillion-dollar search engine” (p. 235).  And within each cell there are the true building blocks of all organisms—tiny, information-laden proteins that enable the cell to thrive.  “No human programmer has ever written software that does anything even close to this.  Could randomness do it?  No chance” (p. 142).

Only God could have written the software evident in our world.  That many of the greatest scientists of the past—Newton, Kepler, Kelvin, Mendel, Planck—believed in such a God should encourage us to follow their example.  For himself, Marshall says:  “after years of research, expense, scrutiny, and debates, my conclusion is:  Not only is Evolution 2.0 the most powerful argument for a Designer that I’ve ever seen (!), but people of faith were on the cutting edge of science for 900 of the last 1,000 years.  The rift between faith and science might heal if everyone could see how evolution actually works” (p. 270).  Marshall has read widely and provides helpful bibliographic materials.  Though not a trained philosopher, he clearly understands sophisticated arguments and logic, and his scientific preparation enables him to both grasp and explain current data.  While not as conclusively persuasive as he might like, he does provide a valuable treatise on this absolutely crucial issue.

286 Clinton Cash; Armageddon; Stealing America

 Peter Schweizer is an investigative journalist with a muckraker’s penchant for pursuing the darker dimensions of American politics, looking for scoundrels whose behavior needs exposing.  So in Architects of Ruin he detailed governmental corruption underlying the 2008 financial collapse; in Makers and Takers he highlighted the many faults of the welfare state; and in Throw Them All Out he brought to light the many suspicious stock trades enriching members of Congress.  Just recently, in Clinton Cash:  The Untold Story of How and Why Foreign Governments and Businesses Helped Make Bill and Hillary Rich (New York:  Harper Collins, c. 2015), he documents the extraordinary number of questionable ties linking the Clintons and their foundation to wealthy foreign governments and businessmen.  Almost all of his critical findings present only circumstantial evidence.  Demonstrable quid pro quo transactions are by their very nature enshrouded in secrecy and rarely leave overt proof.  But Schweizer’s evidence leads the reader to suspect the Clintons of massive corruption and malfeasance in office.  Legally, there’s a Latin phrase—res ipsa loquitur (the thing speaks for itself)—that fully applies to Schweizer’s evidence.  When first published, the book was attacked and dismissed by the Clinton-supporting mainline media.  Thus ABC’s George Stephanopoulos (without disclosing the fact that he personally had contributed $75,000 to the Clinton Foundation) glibly assured viewers that nowhere did Schweizer establish any “direct action” taken by Hillary “on behalf of the donors.”  Thus, he declared, there were no quid pro quo deals.  However, subsequent Congressional and FBI investigations make Schweizer’s case increasingly credible.  Res ipsa loquitur!  

Admittedly there has always been considerable dishonesty in American politics.  But the Clintons have been unusually close to wealthy foreigners, raking in millions of dollars for speeches and garnering contributions for the Clinton Foundation.  Indeed, “the scope and extent of these payments are without precedent in American politics.  As a result, the Clintons have become exceedingly wealthy” (#167 in Kindle).  In fact:  “No one has even come close in recent years to enriching themselves on the scale of the Clintons while they or a spouse continued to serve in public office” (#201).  “Dead broke” in 2001, Hillary claimed, they quickly prospered (accumulating $136 million within a decade) by circumventing the law which prohibits foreign interests from contributing to political campaigns.  Lavish speaking fees and gifts to the Clinton Foundation (which employed friends and covered lush “expense” accounts for the inner circle) were the “legal” (in fact the only discernible) ways whereby the Clintons became inordinately wealthy.  “The issues seemingly connected to these large transfers are arresting in their sweep and seriousness:  the Russian government’s acquisition of American uranium assets; access to vital US nuclear technology; matters related to Middle East policy; the approval of controversial energy projects; the overseas allocation of billions in taxpayer funds; and US human rights policy, to name a few” (#236).  

Symptomatic of things to come was President Bill Clinton’s pardon (just before leaving the White House in 2001) of billionaire fugitive Marc Rich, who was living abroad to avoid facing a variety of charges.  One of the world’s richest men, he’d been indicted for illegal trading practices and tax evasion.  His “business ties included a ‘who’s who’ of unsavory despots, including Fidel Castro, Muammar Qaddafi, and the Ayatollah Khomeini.”  Rich “owed $48 million in back taxes that he unlawfully tried to avoid and faced the possibility of 325 years in prison,” earning him a place on the FBI’s Most Wanted List.  A federal prosecutor, Morris Weinberg, said:  “The evidence was absolutely overwhelming that Marc Rich, in fact, committed the largest tax fraud in the history of the United States.”  Rather than risk imprisonment, Rich fled the country in 1991.  Fortunately for him, he had a charming former wife, Denise, who ingratiated herself with the Clintons, making lavish contributions—moving $1.5 million into their coffers.  President Clinton then pardoned Marc Rich soon after Denise donated “$100,000 to Hillary’s 2000 Senate campaign, $450,000 to the Clinton Library, and $1 million to the Democrat Party” (#270).  These transactions were helped along by Rich’s lawyer (and former White House counsel) Jack Quinn, who pled his case with Bill and Hillary.  Informed of the pardon, Mayor Rudolph Giuliani, the U.S. attorney who had spearheaded the Rich investigations, refused to believe it.  Surely it was “impossible” that a president would pardon him.  But Clinton did.  Ever mindful of the letter of the law, he evaded clear quid pro quo connections, but what rational person could deny it!  Res ipsa loquitur!  Rich’s was merely one of Clinton’s many last-minute pardons, which went to crooks, con men, relatives, ex-girlfriends, former cabinet members, and congressmen.   

Such suspicious Clintonian behavior persisted—indeed escalated—during the following years as Bill and Hillary established the Clinton Foundation and erected the Clinton Library, soliciting funds from various donors and negotiating huge fees (often amounting to $500,000 or more) for speaking engagements around the world—especially in developing nations such as Uzbekistan and Kazakhstan where despots flush with cash sought to multiply it.  Skeptical journalists such as Christopher Hitchens wondered:  “why didn’t these third world oligarchs ‘just donate the money directly [to charities in their own countries] rather than distributing it through the offices of an outfit run by a seasoned ex-presidential influence-peddler?’” (#300).  Their activities caused the Obama team to voice significant concern regarding Hillary’s financial ties when she was appointed Secretary of State, so she promised to fully disclose their financial activities and secure State Department approval before accepting gifts to the foundation from foreign sources.  But she quickly broke these promises:   “Huge donations then flowed into the Clinton Foundation while Bill received enormous speaking fees underwritten by the very businessmen who benefited from these apparent interventions” (#395).  Interestingly enough, while ex-presidents’ speaking fees gradually decline once they’re out of office, Bill Clinton’s dramatically escalated when his wife became Secretary of State.  

One of the businessmen most frequently involved in the Clintons’ financial endeavors was a Canadian mining tycoon, Frank Giustra, who first connected with them in the 1990s and frequently provided a luxurious private jet for Bill to fly around the world (or to campaign for Hillary) after he left the White House.  It was Giustra who arranged a meeting between Bill and the dictator of Kazakhstan that led to an involved uranium deal, helped along by then-Senator Hillary Clinton’s activities in Congress.  This “deal stunned longtime mining observers,” and soon thereafter “Giustra gave the Clinton Foundation $31.3 million” (#593).  Yet another uranium deal involved a Canadian company and Russian investors who sought to gain control of “up to half of US uranium output by 2015” (#751).  This move was set in motion by Vladimir Putin, who personally discussed various issues, including trade agreements, with Secretary of State Clinton in 2010.  Monies then flowed into the Clinton Foundation, thanks to significant gifts from folks invested in the uranium industry.  “Because uranium is a strategic industry, the Russian purchase of a Canadian company holding massive US assets required US government approval.  Playing a central role in whether approval was granted was none other than Hillary Clinton” (#821).  Though a number of congressmen protested the deal, it was duly authorized by the Committee on Foreign Investment in the United States—a select committee that included the Secretary of State and other Obama Cabinet members.  Coincidentally, Salida Capital, one of the Canadian companies involved in the transaction and possibly a “wholly owned subsidiary of the Russian state nuclear agency,” would give “more than $2.6 million to the Clintons between 2010 and 2012” (#875).  Ultimately, “Pravda hailed the move with an over-the-top headline:  ‘RUSSIAN NUCLEAR ENERGY CONQUERS THE WORLD’” (#969).  

Since most of the millions flowing through the Clintons’ hands goes to (or through) the Clinton Foundation, Schweizer devotes many pages to probing its activities as well as providing fascinating portraits of its denizens.  Though its “window-display causes” portray the foundation as admirably charitable, helping victims of AIDS, poverty, obesity, etc., it’s more probably both “a storehouse of private profit and promotion” (#1326) and a generous employer for a number of Clinton associates, advisers, and political operatives.  (A recent review of the foundation’s 2014 IRS report reveals that of the $91 million expended only $5 million actually went to needy causes; the rest was devoted to employees, fundraising, and internal expenses.)  In fact, the foundation has virtually no infrastructure and does very little to actually help those in need.  Rather, it seeks to broker deals between “government, business, and NGOs” (#1349).  That some good is ultimately done cannot be denied, but it’s not actually done by the foundation itself.  “While there are plenty of photos of Bill, Hillary, or Chelsea holding sick children in Africa, the foundation that bears their name actually does very little hands-on humanitarian work” (#1356).  When Hillary became Secretary of State, she utilized a special government employee (SGE) rule that allowed her to appoint aides, including Huma Abedin, to her department while they simultaneously garnered a salary from the Clinton Foundation.  “Abedin played a central role in everything Hillary did” (#1589), and according to the New York Times “‘the lines were blurred between Ms. Abedin’s work in the high echelons of one of the government’s most sensitive executive departments and her role as a Clinton family insider’” (#1595).  

The Clintons’ approach to “charitable” work was manifest following the devastating 2010 earthquake in Haiti, which killed some 230,000 people and left 1.5 million more homeless.  Days after the earthquake, Secretary of State Hillary Clinton flew to the island, as did husband Bill.  “With a cluster of cameras around him, Bill teared up as he described what he saw” (#2497).  “The Clintons’ close friend and confidante, Cheryl Mills, who was Hillary’s chief of staff and counselor at the State Department [recently granted immunity for telling the FBI what she knew about the thousands of Hillary’s deleted emails] was assigned responsibility for how the taxpayer money, directed through USAID, would be spent” (#2497).  A special committee, with Bill as cochair, was appointed to distribute these funds, and he made speeches describing how Haiti would marvelously recover under his guidance.  But little construction actually took place!  For example, in “December 2010 Bill and Hillary approved a ‘new settlements program’ that called for fifteen thousand homes to be built in and around Port-au-Prince.  But by June 2013, more than two and a half years later, the GAO audit revealed that only nine hundred houses had been built” (#2712).  

Rather than going to rebuild the nation’s infrastructure, the money was spent on “worthless projects,” and “in several cases Clinton friends, allies, and even family members have benefited from the reconstruction circumstances” (#2521).  Consider the story of Denis O’Brien, an Irish billionaire who studiously curried the Clintons’ favor (often making available his Gulfstream 550) while successfully promoting his mobile phone company, Digicel.  The firm profited enormously from its Haitian programs, and O’Brien himself collected $300 million in dividends in 2012.  O’Brien invited Bill to speak three times in three years in Ireland, and almost simultaneously his company was granted profitable positions in Haiti.  Then there’s Hillary’s brother, Tony Rodham, who had absolutely no background in the mining industry but became a member of the board of advisors for a mining company that secured a “gold mining exploitation permit”—a “sweetheart deal” that outraged the Haitian senate.  Meanwhile, Bill’s brother Roger collected $100,000 for promising builders he’d arrange a sweet deal with the Clinton Foundation.  “In sum, little of the money that has poured into Haiti since the 2010 earthquake has ended up helping Haitians.  And how that money was spent was largely up to Hillary and Bill” (#2770).  

In conclusion:  “The Clintons themselves have a history of questionable financial transactions” (#2806).  They neither follow the same rules nor receive the same treatment as most Americans, yet they have famously flourished within modern American politics.  That they have succeeded, despite the record of questionable activities detailed in Clinton Cash, should give us pause!  

* * * * * * * * * * * * * * * * * * * * * *

Few political insiders know Bill and Hillary Clinton better than Dick Morris, the architect of Bill’s “triangulation” strategy which enabled him to coast to re-election in 1996.  Morris’s Armageddon:  How Trump Can Beat Hillary (West Palm Beach, FL:  Humanix Books, c. 2016), co-written with his wife, Eileen McGann, offers a unique perspective on this year’s election.  Given Morris’s checkered history, his pronouncements must always be considered with significant reservations!  Much of his life he’s worked as a “hired gun” and shown little ethical concern when involved in the rough and tumble world of partisan politics.  But inasmuch as he was one of Bill Clinton’s most trusted consultants in the 1990s he certainly provides information worth pondering as we consider Hillary’s presidential aspirations.  Morris also discusses Donald Trump’s prospects, but it’s his knowledge of the Clintons that most interests me.  

As the book’s title indicates, Morris writes as an alarmist:  “The ultimate battle to save America lies straight ahead of us:  It’s an American Armageddon, the final crusade to defeat Hillary Clinton” (#138).  Her election, he says, listing a litany of fears, will consign us to “four long years of another bizarre Clinton administration, featuring the Clintons’ signature style of endless drama, interminable scandals, constant lies, blatant cronyism and corruption, incessant conflicts of interest, nepotism, pathological secrecy, hatred of the press, his and her enemies lists, misuse of government power, inherent paranoia, macho stubbornness, arrogant contempt for the rule of law, nutty gurus, and thirst for war.  Those will be the disastrous and unavoidable hallmarks of a Hillary regime” (#246).  With a cast of characters including Bill and Chelsea Clinton, Sidney Blumenthal, David Brock, Terry McAuliffe, et al.—“unqualified and greedy cronies and her money-grubbing family members” roaming Hillary’s White House—the nation will suffer gravely.  When we think of the Clinton scandals, we usually focus on Bill’s sexual escapades, but Morris declares “that almost every single scandal in the Bill Clinton White House was caused by Hillary:  Travelgate, Whitewater, Filegate, her amazing windfall in the commodities futures market, the Health Care Task Force’s illegal secrecy, the household furniture and gifts taken from the White House to Chappaqua, Vince Foster’s suicide, Webb Hubbell’s disgrace—all Hillary scandals” (#412).  

In his first chapter Morris lists “A Dozen Reasons Hillary Clinton Should Not Be President.”  These include:  1) her dismal failure to respond well to the terrorist attack in Benghazi; 2) her compulsive, life-long lying about almost everything; 3) her penchant for hawkish, pro-war pronouncements; 4) her ties with the Muslim Brotherhood, as evident in her closeness to Huma Abedin, whose parents (and Abedin herself) fervently supported the organization; 5) her easily documented record of flip-flops on a variety of issues (e.g. gay marriage, free trade) during the course of her life; 6) her manifest corruption—a “way of life” most evident in her multifaceted financial deals, e-mails, and Clinton Foundation; 7) her obsessive concern for secrecy; 8) her queen-like ignorance regarding ordinary Americans; 9) her economic vacuity; 10) her reliance on disreputable “gurus” such as Sidney Blumenthal; 11) her stubbornness; and 12) her notorious nepotism.  

Clearly Dick Morris dislikes and distrusts Hillary Clinton.  How seriously you take his warnings naturally depends upon how much you trust him.  But when placed in proper context, and compared with other accounts corroborating his data, he’s persuasive.  

* * * * * * * * * * * * * * * * * 

With the election of 2016 approaching, Dinesh D’Souza published two clearly polemical treatises designed to warn America about Hillary Clinton and the Democratic Party:  Stealing America:  What My Experience with Criminal Gangs Taught Me About Obama, Hillary, and the Democratic Party (New York:  Broadside Books, c. 2016), and Hillary’s America:  The Secret History of the Democratic Party (Washington, D.C.:  Regnery Publishing, c. 2016).   For many years D’Souza has espoused conservative principles, shaped in part by his unique story as an immigrant (from India) feeling deeply blessed to thrive in his adopted country.  For me his treatises serve to elicit thought, not to chart a course!  

In 2012 D’Souza gave a friend running for a state office in New York $10,000 and persuaded two others to donate the same amount, for which he reimbursed them.  He knew he was skirting the campaign finance limit but didn’t think he was breaking the law.  Soon thereafter, however, he was pursued by the Justice Department and (unlike virtually all other violators) found himself paying half-a-million dollars in legal fees and serving eight months of nights in a confinement center in San Diego.  That he’d just produced an anti-Obama film (2016) was, he believed, anything but coincidental!  Commenting on his case, Harvard law professor Alan Dershowitz said:  “‘What you did is very commonly done in politics, and on a much bigger scale.  Have no doubt about it, they are targeting you for your views’” (p. 14).  In confinement D’Souza “understood, for the first time, the psychology of crookedness.  Suddenly I had an epiphany:  this system of larceny, corruption, and terror that I encountered firsthand in the confinement center is exactly the same system that has been adopted and perfected by modern progressivism and the Democratic Party” (p. 26).  He came to see the party of Obama and the Clintons not simply as “a defective movement of ideas, but as a crime syndicate” (p. 26). 

Pursuing this thesis—however preposterous it might seem—makes for interesting reading.  In particular, one learns much about the criminal underclass populating America’s prisons and its utter cynicism regarding the political system.  The murderers and thieves with whom D’Souza lived noted that most politicians enter “office with nothing and leave as multimillionaires.  So how did this happen?  It just happened?” (p. 47).  If nothing else they understood crime—and they knew criminality undergirds this process!  D’Souza soon grasped the truth of St. Augustine’s famous observation in The City of God:  “What are kingdoms but gangs of criminals on a large scale?  What are criminal gangs but petty kingdoms?”  Translating that truth into contemporary America, D’Souza concludes that “the ideological convictions of Obama, Hillary, and the progressives largely spring out of those base motives and that irrepressible will to power.  The progressives have unleashed a massive scheme for looting the national treasury and transferring wealth and power to themselves, and their ideology of fairness and equality is to a large degree a justification—a sales pitch—to facilitate that larceny.  Previously I didn’t see this very clearly; now I do” (p. 50).  

The same basic message characterizes D’Souza’s Hillary’s America, the book basic to the widely-viewed documentary bearing the same title.  He clearly believes Hillary is a threat to the republic, but more basically he argues the Democratic Party has (since its inception under Andrew Jackson) supported a variety of evil endeavors running from stealing Indians’ lands to enslaving Africans to endorsing Jim Crow laws and racist eugenics.  To D’Souza the “progressive narrative” of American history is “a lie” and the Democratic Party must be held accountable for its sordid past.  Hillary Clinton is merely the current representative of a movement that has “brutalized, segregated, exploited, and murdered the most vulnerable members of our society.”  As such Hillary and the Democrats must, he insists, be defeated!  

# # #

285 Deadly Notions

Given our rationality, ideas inevitably have consequences and deeply shape human history.  In The Death of Humanity:  And the Case for Life (Washington:  Regnery Faith, c. 2016), California State University historian Richard Weikart helps explain the “culture of death” so pervasive throughout the past century—during which both belief in the dignity of man and the actual lives of millions of men demonstrably perished.  Consider the case of the serial killer and cannibal Jeffrey Dahmer:  following his arrest in 1991, he said that he believed “‘the theory of evolution is truth, that we all just came from the slime, and when we died . . . that was it, there was nothing—so the whole theory cheapens life.’  With this vision, he saw no reason not to kill and eat other men.  As he confessed, ‘If a person doesn’t think there is a God to be accountable to, then what’s the point in trying to modify your behavior to keep it in acceptable ranges?’” (#224).  Similarly, Eric Harris, one of the killers at Columbine High School in 1999, confessed (in his journal) to loving Thomas Hobbes and Friedrich Nietzsche; furthermore, he wore a T-shirt declaring “Natural Selection” when he launched his killing spree.  

Having survived Auschwitz, the great Austrian psychologist Viktor Frankl analyzed the intellectual currents he held responsible for the Holocaust:  “If we present a man with a concept of man which is not true, we may well corrupt him.  When we present man as an automaton of reflexes, as a mind-machine, as a bundle of instincts, as a pawn of drives and reactions, as a mere product of instinct, heredity, and environment, we feed the nihilism to which modern man is, in any case, prone.  I became acquainted with the last stage of that corruption in my second concentration camp, Auschwitz.  The gas chambers of Auschwitz were the ultimate consequences of the theory that man is nothing but the product of heredity and environment—as the Nazi liked to say, of ‘Blood and Soil.’  I am absolutely convinced that the gas chambers . . . were ultimately prepared not in some Ministry or other in Berlin, but rather at the desks and in the lecture halls of nihilistic scientists and philosophers” (The Doctor and the Soul,  xxvii).  

In many ways, Weikart’s work is an extended commentary on the anthropological ideas Frankl held responsible for genocide—those presenting man “as an automaton of reflexes, as a mind-machine, as a bundle of instincts, as a pawn of drives and reactions, as a mere product of instinct, heredity, and environment.”  During the past 200 years a multitude of thinkers have embraced varieties of philosophical materialism and rejected the traditional Christian “sanctity-of-life” ethic.  Some of them, beginning with Julien Offray de La Mettrie in 1747, imagined humans in terms of Man the Machine.  If man is but a machine running within a mechanistic universe utterly devoid of meaning or purpose, it follows that he is as irresponsible for his behavior as is the moon circling the earth.  Picking up on this notion, Ludwig Feuerbach famously said “Man is what he eats” (Der Mensch ist, was er isst!) and Karl Marx drank deeply from this fountain of atheism as he began his revolutionary career.  In our day, Francis Crick, celebrated for his DNA discoveries, “has probably done as much as anyone to promote the idea that humans can be reduced to their material basis” (#807) and speaks for many eminent academics.  Since we’re nothing but genes and molecules, i.e. matter-in-motion, Crick could propose:  “‘No newborn infant should be declared human until it has passed certain tests regarding its genetic endowment and that if it fails these tests it forfeits the right to life’” (#826).   

Other philosophical materialists reduce man to a highly evolved animal, following Darwin’s dictum that he was “created from animals” and has no soul.  That view, as adumbrated by People for the Ethical Treatment of Animals’ Ingrid Newkirk, holds:  “A rat is a pig is a dog is a boy.  They are all mammals.”  PETA enthusiasts, of course, elevate animals to human status, demanding they be treated tenderly.  But others lower humans to animals, treating them as disposable if worthless.  Following Darwin, human life must be devalued, since all animals share a common ancestry and natural selection requires the denial of any purpose to life.  Necessarily there can be no moral standards—might makes right as the fittest survive and individuals struggle for supremacy in life-and-death competition.  

Weikart traces the powerful trajectory of Darwinism (a theme he earlier documented in From Darwin to Hitler:  Evolutionary Ethics, Eugenics, and Racism in Germany and Hitler’s Ethic:  The Nazi Pursuit of Evolutionary Progress), culminating in some of the pronouncements of Peter Singer, Princeton University’s professor of bioethics.  Studying Singer—famed for his promotion of “animal liberation,” infanticide, bestiality, and “unsanctifying human life”—one realizes how much of what’s wrong with our world can be traced to Charles Darwin, whose moral relativism justified any advantageous behavior which increased an individual’s survival potential—“even killing one’s own offspring.”  “Singer admits that Darwinism informs his own position that humans are not special or uniquely valuable.  He claims that Darwin ‘undermined the foundations of the entire Western way of thinking on the place of our species in the universe’” (#1054).  Without those Christian foundations, there is no reason to condemn the might-makes-right victors in the struggle for existence.  

Differing somewhat from Darwin (who stressed environmental factors and understood little of what we label genetics), there are biological determinists such as Harvard University’s Steven Pinker.  In The Blank Slate:  The Modern Denial of Human Nature, he justifies infanticide on the grounds that one’s “genes made me do it.”  Pinker labels just-born humans “neonates,” whose killing he calls “neonaticide” rather than murder.  Since the neonate lacks “morally significant traits” and is not demonstrably a full person, he has no more right to life than a mouse.  Hard-wired by our genes, we have no free will and simply follow what’s prescribed for us.  Criminals are thus not to be held responsible for their crimes—a position increasingly held by criminologists and judges who attack this nation’s incarceration policies.  

Biological determinism was strongly asserted, more than a century ago, by Charles Darwin’s cousin, Francis Galton, who eagerly embraced the theory of natural selection and applied it to eugenics—a “‘new religion’ that would humanely improve humans biologically” (#1792).   In Galton we encounter “Social Darwinism” in its purest form.  Enthusiasts for this endeavor promoted a blatantly racist agenda in Europe and America, passing laws and influencing a variety of academic disciplines.  Tarnished by its association with the Nazis, eugenics fell from favor following WWII, but it has recently revived under the rubric of “sociobiology” and “evolutionary psychology.”  Sociobiology’s architect, Harvard University’s E. O. Wilson, restricted reality to “chance and necessity” and insisted “that everything about humans—behavior, morality, and even religion—is ultimately explicable as the result of completely material processes” (#1955).  Given this assumption, virtually any behavior may be “good” as long as it contributes to the evolutionary process.  If one finds animal species engaging in suicide or infanticide or incest, a number of evolutionary devotees declare such behavior may very well be appropriate for humans as well.  

Environmental determinists glibly declare “my upbringing made me do it”—as did Clarence Darrow (the famed defense attorney of the Scopes monkey trial) when he defended Nathan Leopold and Richard Loeb in a celebrated Chicago case a century ago.  The two young men, both brilliant and wealthy, had murdered a 14-year-old boy simply to carry out the “perfect” crime.  Darrow (working out the implications of the Darwinism he would soon defend in the Scopes trial) declared they were simply acting out what had been programmed into them and were thus guiltless of any crime!  Weikart traces the genealogy of this position across 200 years, running from Helvetius through Robert Owen and his socialist supporters to Marx and his 20th-century revolutionaries such as Stalin and Mao.  Prominent American psychologists, led by John B. Watson and B.F. Skinner, declared that one can produce any kind of behavior by applying the right stimulus.  Thus criminals ought not be held responsible for their acts—society shapes them and they do what they cannot but do.  

Yet another powerful component in the “culture of death” is the “love of pleasure” most sharply evident in the works and influence of the Marquis de Sade, who embraced any kind of behavior (including sadism) that “feels good.”  He and other Enlightenment thinkers recovered and promoted Epicurus and Lucretius—ancient writers clearly at odds with the Christian tradition.  Subsequently, Jeremy Bentham and John Stuart Mill constructed an ethical system—Utilitarianism—reducing all moral questions to a pleasure/pain calculus.  There are no “natural” rights, only more or less pleasurable experiences.  Maximizing pleasure becomes the sole “good,” whether one considers an individual or a society.  Later, Sigmund Freud set forth his highly influential psychology, reducing almost every question to its sexual implications and satisfactions.  His case for sexual liberation had enormous influence, particularly as the counterculture of the ‘60s worked out its hedonistic ethos.  

Some of the 20th century’s most toxic deadly notions flow from existential and postmodern philosophers.  One of the chief sources for both movements is Friedrich Nietzsche, who declared, in Also sprach Zarathustra:  “Die at the right time; thus teaches Zarathustra . . . .  Far too many [people] live and hang much too long on their branches.  May a storm come to shake all these rotten and worm-eaten ones from the tree.”  When Clarence Darrow mounted his defense of Leopold and Loeb he invoked Nietzsche as well as Darwin to explain away their responsibility for murder.  Nietzsche certainly shared Darwin’s view of human origins, writing:  “‘You have made your way from worm to man, and much of you is still worm’” (#3307).  When one carefully studies the careers of Mussolini and Hitler it becomes evident that many of the most murderous regimes were influenced by the atheistic existentialism of Nietzsche, including his contempt for the less fit in life’s struggle.  We have a picture of Hitler looking at a bust of Nietzsche in 1938.  A caption for the picture proudly claims “Nietzsche was a forerunner of Nazism” (#3300), and Hitler certainly wanted to move “beyond good and evil” in his will-to-power ambitions.  Traditional ethical notions, such as opposing suicide and infanticide, were to be discarded in an endeavor to purify and elevate the race.  

Having looked at the many thinkers responsible for our culture of death, Weikart assesses how suicide, euthanasia, infanticide, and abortion have become increasingly acceptable in much of our world.  Thus we find two medical ethicists, in 2012, proposing we re-conceptualize infanticide as “after-birth abortion” to ensure its social acceptability.   “Death-With-Dignity” initiatives have succeeded in Washington, Oregon, and California and promise to succeed elsewhere as secularism replaces Christianity as the nation’s moral foundation.  Many secularists (including the famous “situation ethicist” Joseph Fletcher) insist that mere human beings are not fully “persons” and thus have no right to life.  Persons, Fletcher asserted, “must have certain qualities, such as the ability to make moral decisions, self-awareness, self-consciousness, and self-determination” (#4078).  Similarly, Peter Singer says, neither an unborn “fetus” nor a newly born baby can be considered a “person.”  Nor do severely handicapped individuals or terminally ill comatose patients qualify as “persons.”  

In his “Conclusion,” Weikart says:  “Humans on display in zoos.  Comparing farm animals in captivity to Holocaust victims.  ‘After-birth abortion.’  Physicians killing patients, sometimes even when they are not sick or in pain.  Accusing fetuses of assaulting their mothers, just because they are living peaceably in utero.  Promoting ‘love drugs’ to make us more moral.  Granting computer programs moral status.  These are just a few examples that powerfully illustrate how sick our society is.  As many intellectuals have abandoned the Judeo-Christian sanctity-of-life ethic in favor of secular philosophies, we have descended into a quagmire of inhumanity.  Some today view humans as nothing more than sophisticated machines or just another type of animal.  For them, humans are nothing special—just another random arrangement of particles in an impersonal cosmos” (#4936).  

The Death of Humanity deserves careful study and reflection.  J. Budziszewski, one of today’s finest Christian philosophers, says:  “So often I have heard the question, ‘How did we ever become so muddled in this twenty-first century?  What happened?’  This is a question for a historian, who can weave a single coherent story about a great many sources of confusion.  Richard Weikart is that historian, and I will be recommending his sane and lucid book often.”  As will I—and am so doing with this review!   

* * * * * * * * * * * * * * * * * * * * * * 

In Architects of the Culture of Death (San Francisco:  Ignatius Press, c. 2004), Donald De Marco and Benjamin Wiker provide brief vignettes of 23 thinkers, grouped together in seven sections, who bear responsibility for the dehumanizing “culture of death” facilitating the killing of innocent persons.   De Marco is a philosopher; Wiker is a biologist; both are committed Catholics who write to promote the “Personalism” associated with Pope John Paul II and deeply embedded in two millennia of Christian thought.  “It is precisely because of the infinite value of each human person, as revealed especially in the great drama of Jesus Christ, that truly Christian culture must be a Culture of Life, a culture that sees the protection of persons and their moral, intellectual, and spiritual development as the defining goals of society.  Whatever contradicts these goals can have no place in the Culture of Life” (p. 14).  

Clearly at odds with the Culture of Life is Friedrich Nietzsche, one of the “will worshippers” who celebrated a “Will to War, a Will to Power, a Will to Overpower” (p. 41).   His heroes were “Supermen” like Julius Caesar who imposed their will on others, using whatever (frequently violent) means necessary.  In 1940 an American historian, Crane Brinton, diagnosed the impact of his literary works:  “Point for point he preached . . . most of the cardinal articles of the professed Nazi creed—a transvaluation of all values, the sanctity of the will to power, the right and duty of the strong to dominate, the sole right of great states to exist, a renewing, a rebirth, of German and hence European society. . . .  The unrelieved tension, the feverish aspiration, the driving madness, the great noise Nietzsche made for himself, the Nazi elite is making for an uncomfortably large part of the world” (p. 52).   

Though not connected with them in any formal way, Nietzsche certainly shared much with eminent eugenicists of his era, who all embraced Charles Darwin’s notions of evolution through “natural selection” and the “survival of the fittest.”  Though Darwin himself evaded the implications of his theory for human beings for much of his life, it became clear in 1871, with the publication of his Descent of Man, that he was a eugenicist.  And he was also “a racist and a moral relativist” (p. 76).  Thus his cousin, Francis Galton, enthusiastically worked out the social implications of Darwinism by promoting eugenic measures designed to improve the race.  Just as we can breed better dogs, we can breed better babies.  Inferior members of the species are best left to die off or forced to embrace celibacy.  Private correspondence between cousins Galton and Darwin proves how totally the latter endorsed the work of the former, so the two share responsibility for what we term “Social Darwinism.”  Embracing some of the deadlier aspects of this movement, the German zoologist Ernst Haeckel championed a rather ruthless form of evolutionary philosophy he called Monism, “drawing out the full implications of Darwinism” (p. 107).  He fervently espoused “eugenics and racial extermination” and “abortion, infanticide, and euthanasia as well” (p. 107).  Haeckel’s books were widely read at the turn of the 20th century and demonstrably influenced many of the policies crafted by Adolf Hitler.   

“Secular utopianists,” preeminently Karl Marx, prepared the way for mass-murderers such as Stalin and Mao.  Though his devotees religiously absolve Marx from any responsibility for the behavior of Communist regimes—asserting all efforts to implement his teachings strayed from the founder’s intent—there is clearly a deadly dimension to all efforts to establish a perfectly egalitarian world.  In fact, “Marx could not be more limpid in his call for violence.  He advocated hanging capitalists from the nearest lampposts” (p. 125).  Aligned with Marx (sharing both his atheism and his communism) was the French existentialist Jean-Paul Sartre, whose “philosophy leads logically and directly to despair and suicide. . . .  His world of atheism is a kingdom of nothingness plunged into intellectual darkness, convulsed with spiritual hate and peopled by inhabitants who curse God and destroy each other in their vain attempt to seize his vacant throne” (p. 175).  (There is thus some warrant for Paul Johnson to suggest, in his biography of Darwin, that Pol Pot, the genocidal Cambodian Communist, derived some of his murderous ideas both from Sartre, who introduced him to Darwin, and from Darwin himself!)

While the “pleasure seekers” might not seem to promote the culture of death, at least indirectly they do!  Thus Helen Gurley Brown, who made Cosmopolitan magazine a stellar success (especially on college campuses), singularly promoted “feel-good sex.”  Her “Sex and the Single Girl [was] a ‘shameless, unblushing, runaway, unmitigated’ manual advising and instructing women on how to seduce men and enjoy their inalienable right to have as much sex as humanly possible” (p. 237).  Her message helped shape the enormously successful television show “Sex and the City,” mainstreaming her ideas.   Inevitably she approved adultery, contraception, and abortion—anything that gave you pleasure was fine.  

So too “sex planners” added their notions to the anti-life brew.  Margaret Mead, named “Mother of the World” by Time magazine in 1969, was certainly one of the most influential anthropologists of the 20th century and reached a broad women’s audience through her regular columns (1961-1978) for Redbook magazine, helping “bring the twentieth-century sexual revolution to its culmination” (p. 250).  As a young woman she published Coming of Age in Samoa (1928) and instantly became an academic superstar.  Though her misleading portrayal of the sexually libertine Samoans was “autobiography disguised as anthropology,” the book would be required reading in hundreds of university classes and help undermine the Christian tradition’s commitment to chastity and opposition to abortion.  Joining Mead as a spurious “scholar” was Alfred Kinsey, who sought to justify his own covert homosexuality and pedophilia with allegedly statistical studies on the sexual behavior of the American male and female.  The Kinsey Reports lent an aura of respectability to deviant behaviors simply by falsely stating large numbers of Americans actually practiced them.  

Finally, there are the “death peddlers”—Derek Humphry, who in Final Exit championed suicide; Jack Kevorkian, the pathologist who bragged about his “mercy-killing” activities and “personifies the Culture of Death” as vividly as anyone; and Peter Singer, the Princeton philosopher who seeks to discard the “traditional Western ethic” which for 2,000 years has promoted the “sanctity of life.”  “Taking Darwinism to its ultimate conclusions” (p. 363), Singer denies significant differences between humans and other animals.  He also believes a “person” is a human being with certain capacities, and thus not all humans are “persons” worthy of life.  His books—and his international prestige as one of the preeminent ethicists in the world—bear witness to the triumph, in many sectors, of a noxious ideology.  

284 Hillary’s History

In a court of law, eyewitness testimony is highly privileged, considered “first-hand” evidence most worthy of consideration.  So too historians relentlessly seek out “primary” sources—eyewitness accounts showing “how it actually was.”  Eyewitnesses may, of course, render skewed accounts—shaped by personal biases or faulty memories or delimited vision.  They may very well be a bit inarticulate and disjointed in telling their stories.  So juries and historians take such things into account and try to put everything in its proper context.  But in the final analysis eyewitness testimony and primary sources provide us our surest route to historical truth.

One recent eye-witness account meriting attention is Gary J. Byrne’s Crisis of Character:  A White House Secret Service Officer Discloses His Firsthand Experience with Hillary, Bill, and How They Operate (New York:  Center Street, c. 2016).  After serving in the Air Force, Byrne realized his vocational aspirations and became “an elite White House Secret Service officer, a member of its Uniformed Division,” entrusted with guarding the President, his family and staff.  He began his assignment when George H.W. Bush (affectionately referred to as “Papa Bush”) was still President.  “I assumed every president would follow Papa Bush’s example,” Byrne says.  “The work ethic, love of country, work environment, and respect for the people serving would be constant, and politics would never matter” (p. 36). 

But his high regard for the Bush family turned to anguish as he watched the Clintons occupy the White House and witnessed firsthand—among other things—the Monica Lewinsky affair.  In addition:  he “saw a lot more.  I saw Hillary, too.  I witnessed her obscenity-laced tirades, her shifting of blame” (p. ix) and other traits disqualifying her from most any high office, much less the presidency.  He and his fellow officers “were measured by the highest of ethical requirements” while “[t]hose at the very pinnacles of power held themselves to the very lowest standards—or to none whatsoever” (p. x).  “The Clintons are crass.  Papa Bush is class” (p. 277).  To Byrne, Hillary “simply lacks the integrity and temperament to serve in the office.  From the bottom of my soul I know this to be true.  So I must speak out” (p. xi).   

Byrne’s critical comments are confirmed and underscored by other agents, who provided Ron Kessler the information recorded in In the President’s Secret Service:  Behind the Scenes with Agents in the Line of Fire and the Presidents They Protect—an historical narrative of the agency.  In the chapter devoted to the Clintons, Kessler says that Bill was charming, if utterly undisciplined, but “Hillary Clinton could make Richard Nixon look benign.  Everyone on the residence staff recalled what happened when Christopher B. Emery, a White House usher, committed the sin of returning Barbara Bush’s call after she had left the White House.  Emery had helped Barbara learn to use her laptop.  Now she was having computer trouble.  Twice Emery helped her out.  For that Hillary Clinton fired him” (p. 146).  He would then be unemployed for a year, thanks to the vindictive First Lady!  One agent said:  “‘When she’s in front of the lights, she turns it on, and when the lights are off and she’s away from the lights, she’s a totally different person.’”  Off stage she was “‘very angry and sarcastic and is very hard on her staff.  She yells at them and complains.’”  Though publicly she pretended to adore the agents assigned to protect her, she “‘did not speak to us.  We spent years with her.  She never said thank you’” (p. 147).  That other agents share Byrne’s disdain for Hillary lends his account considerable credibility!  

Agent Byrne first encountered the Clintons in 1992 when he worked at some of the candidate’s campaign rallies.  Chatting with a sheriff from Arkansas, he mentioned the many rumors then revolving around the Clintons.  The sheriff “gave me a thousand-yard stare.  ‘Let me tell you something, Gary.  Everything—everything they say about them is true.  The Clintons are ruthless.  And [the media-led public] don’t even know the half of it’” (p. 39).  The next six years amply proved to Byrne the truth of that sheriff’s assertion.  The polite, orderly White House deteriorated into “helter-skelter” chaos as the Clinton crew failed to “focus, pace themselves, or even delegate.  Staff wore jeans and T-shirts and faced each problem with grand ideological bull sessions” (p. 50).  Hillary Clinton’s “doting, barely post-adolescent staffers resembled enabling, weak-willed parents.  She threw massive tantrums” (p. 56) which only intensified as the years passed.  Her friendly, empathetic public facade belied the private fury evident in “antics [that] made my job interesting.  She’d explode in my face without reservation or decorum, then confide in some visiting VIP, ‘This is one of my favorite officers, Gary Byrne’” (p. 60).  

Byrne provides important details regarding various scandals and insights into personalities in the Clinton White House, but he is best known for his testimony regarding the Monica Lewinsky affair that figured largely in the impeachment of the president.  She was what the Secret Service called a “straphanger” or “loiterer”—a young volunteer intern with political connections, wandering about the White House seeking access to powerful persons.  Lewinsky clearly stalked President Clinton, doing everything possible to frustrate the agents who tried to shield him from her advances.  But rather quickly it became an open secret that she and Clinton were having an affair—one of many such trysts the president engaged in while living in the White House, including sessions with Eleanor Mondale, the daughter of the former vice president.  Still more, a fellow agent told Byrne:  “‘You have no idea what it’s like on the road’” (p. 107), where women regularly traipsed in and out of Clinton’s quarters.  He “had difficulty managing where he saw his many mistresses, whether it was at the White House or on the road.  It baffled the Uniformed Division as to how he could manage all these women without any of them realizing there were so many others.  We wondered how he got any work done and joked that he would have been better at running a brothel in a red-light district than the White House” (p. 127).  

After encountering Lewinsky, President Clinton put her on the White House payroll and gave her his top-secret phone number so they could have intimate talks.  To Byrne:  “paying a mistress with taxpayer funds and giving her security clearance?  These were new lows” (p. 111).  Ultimately the semen-stained blue dress would prove the president guilty of perjury and lead to his impeachment.  Then when special prosecutor Ken Starr, investigating the Paula Jones case against Clinton, learned of the Lewinsky affair, he brought the weight of the Justice Department to bear on Byrne, seeking information helpful to his investigation.  So very much against his will he was subpoenaed and forced to tell what he had observed in the White House.  Testifying via videotape before a grand jury, he would soon be seen by the nation on C-SPAN—though he had been promised his testimony would remain sealed.  As a Secret Service agent he had vowed to protect the president—committed to never revealing “information that might jeopardize [his] safety and security”—so he refused to discuss certain things.  But as a citizen he had to reveal details relevant to the Starr inquiry.  Consequently, he became one of the most important under-oath witnesses regarding the Clintons’ behavior in the White House.

Now safely removed from that crisis-ridden epoch, Byrne can look back and assess it.  While testifying, he remembered that Arkansas sheriff’s words regarding the Clintons’ ruthlessness, and he confesses to fearing them and what might happen to him and his family because of his testimony.  Still more, he’s outraged:  “I was compelled to tell the truth, but why the hell was neither the president nor Mrs. Clinton ever really compelled to tell the damn truth?” (p. 165).  Bill Clinton misbehaved and lied and easily moved on virtually unscathed while many “little people” had their lives ruined by his behavior and his wife’s machinations.  “This is the man I was protecting?  That’s what I tolerated?  I had tried and tried to prevent harm to this president, but he failed us all!” (p. 177).  

Two decades later, Byrne says:  “Our collective amnesia about the Clinton White House is dangerous because it could happen again—maybe with a different Clinton dealing the cards, but with the same stacked deck” (p. 273).  So he has written this book to dissuade us from electing Hillary, particularly in light of her careless handling of classified materials and suspicious work with the Clinton Foundation.  He “was there with the Clintons.  I could not keep silent then, and I can’t keep silent now” (p. 274).  

* * * * * * * * * * * * * * * * * * * * *

In the early ‘60s David Schippers led the Justice Department’s Organized Crime and Racketeering Unit, successfully prosecuting mobsters such as Sam Giancana.   A lifelong Democrat who twice voted for Bill Clinton, he was renowned for his skills as a prosecutor and trial attorney.  More importantly:  he was known as a man of integrity.  As the House of Representatives began the inquiries which led to the impeachment of President Clinton, Schippers was brought to Washington to lead an oversight investigation of the Justice Department and ultimately became Chief Counsel of the House Managers entrusted with pursuing evidence for the president’s impeachment.  In Sellout:  The Inside Story of President Clinton’s Impeachment, Schippers provided an “insider’s account” of what happened nearly 20 years ago.  

In the light of evidence he probably knew better than anyone else, Schippers believed Clinton should have been removed from office for his “high crimes and misdemeanors.”  Though the president claimed to be “proud of what we did” during the impeachment process—declaring he “saved the Constitution”—Schippers thought him demonstrably guilty of “some of the most outrageous conduct ever engaged in by a president of the United States” (p. 3).  He quickly learned to detect and deeply abhor the Clintons’ guiding modus operandi:  do anything to avoid the truth.  White House spin-masters manipulated the media (portraying the president as a victim) and glossed over his incessant lies, which were obvious to skilled lawyers who saw through his legalistic obfuscations.  To Schippers, Clinton’s real “high crimes and misdemeanors” were perjury and obstruction of justice.  But he and his media accomplices successfully reduced the whole inquiry to nothing more than questions of lamentable sex with Monica Lewinsky.  “The White House never ceased to astound and dismay me in the extent to which it demonstrated its utter contempt for the Judicial Branch, the Legislative Branch, and the American people” (p. 171).  

As much as anyone, then, David Schippers understands the Clintons’ duplicitous behavior.  So when he commends a recent book by Dolly Kyle we may assume he validates much of her account in Hillary:  The Other Woman:  A Political Memoir (Washington, D.C.:  WND Books, c. 2016).  Schippers says the book “is as timely as tomorrow’s newspaper” inasmuch as it contains “Ms. Kyle’s firsthand knowledge obtained over many years” (#56).  Acutely aware of the investigations he conducted 20 years ago, he affirms the truth of Kyle’s memoir since she’s known the Clintons for half-a-century and occupies “a unique position to reveal the truth about Billy and Hillary that no one else can tell” (#178).  She wrote this book because “Hillary Rodham Clinton is running for president.  She is morally and ethically bankrupt” (#144).  From Kyle’s perspective:  “The average person cannot comprehend that two politicians could have managed to get where they are with so many crimes in their wake, and so little reporting about it” (#1028).  The Clintons are, to be candid:  “lying, cheating, manipulative, scratching, clawing, ruthlessly aggressive, insatiably ambitious politicians . . . and nothing about them has changed in the past forty-plus years, except that they have deluded more and more people” (#1034).  

Dolly Kyle met Bill Clinton in 1959 in Hot Springs, Arkansas, when she was eleven years old.  They both graduated from Hot Springs High School in 1964, and she provides many details and insights into the community and families that help us better understand “Billy” Clinton.  She was immediately attracted to him and “a liaison . . . evolved from puppy love to dating to friendship to a passionate love affair” (#209) that lasted, off-and-on, for 40 years.  Their affair was pretty much an “open secret” in Arkansas, though it attracted little media attention.  “I’m not proud (and have repented) of having that decades-long affair with Billy Clinton, but it is a fact” (#448).  They became lawyers, married other persons, had children, and repeatedly interacted with each other.  Sadly enough, for too many years she simply thought of him as a lovable rascal, indulging his appetites with a series of willing women.  “I didn’t realize until many years later, that Billy was a serial sexual predator and a rapist” (#1886).  Nor did she then understand Hillary’s role in suppressing any evidence of his philandering.  

When Hillary Rodham moved to Arkansas and married Bill Clinton, she necessarily had contact with her husband’s Arkansas friends, including Dolly Kyle.  Though Dolly retains a lingering affection for “Billy” (despite his wayward ways he’s “a charming rogue who was sexually addicted”), she clearly dislikes Hillary.  In her opinion, Bill moved in with Hillary while they were students at Yale in order to share her wealth and the two have simply used each other to advance their respective careers ever since.  During their early years, it was “generally Hillary’s job to make the money and provide the financial base from which she and Billy could maneuver their way to the White House” (#2424).  “Even their decision to have a child was a calculated political maneuver to make them appear to be a normal couple” (#1533).  

Meeting Hillary for the first time in Little Rock in 1974, Kyle was shocked at the “dowdy-looking woman who appeared [at a distance] to be middle-aged” (#590), wearing thick glasses, a shapeless dress, and sandals; she clearly cared little for style or personal appearance.  When Bill introduced them, Dolly “smiled and extended my right hand in friendship,” but Hillary “responded only with a glare at me.  Finally, seeing my hand still extended, she managed a grudging nod.  She did not condescend to shake my hand” (#608).  Obviously there would be little love lost between these two women in Bill Clinton’s world!  But their encounters were minimized as Bill usually attended events (such as high school reunions) without Hillary and could easily engage in various liaisons to his liking.  At the 30th reunion there occurred “the infamous scene between the two of us that was immortalized under oath in the impeachment investigation” (#935).  

Ultimately, when he was president, Bill Clinton’s sexual affairs came under increased judicial scrutiny, and Kyle (under oath in a deposition in the Paula Jones v. Clinton lawsuit) disclosed the nature of their relationship.  She had earlier discovered first-hand the malice and vindictiveness with which Hillary pursued any woman who might endanger her aspirations.  In fact, when an English journalist was about to disclose her affair with Bill during his 1992 presidential campaign, her own brother, speaking for Billy, had warned her:  “‘If you cooperate with the media, we will destroy you’” (#3291).  So in time she concluded:  “While proclaiming himself to be the champion of women’s rights, Billy Clinton has continually betrayed the woman he married, the girl he fathered, and the untold numbers of women he used for his sexual gratification.  Meanwhile, proclaiming herself to be the champion of women’s rights, Hillary Clinton has been behind the threats and intimidation of the women her own husband abused and molested” (#1579).  

In addition to providing details regarding Billy’s sexual misconduct, Kyle shares what she knows about the Clintons’ multifaceted adventures in Arkansas and the White House.  She discusses important  personalities such as Webb Hubbell and Vince Foster (one of Bill’s childhood friends and a partner with Hillary at the Rose Law Firm).  She cynically notes that Hillary was first hired and later became a partner of the Rose Law Firm at precisely the same moments her husband became attorney-general and then governor of Arkansas!  Vince Foster “knew the facts about Hillary’s double-billing practices that had enabled her to receive questionable foreign money with strings attached” as well as the “FBI files that had been taken illegally for illegal purposes and would later be found with Hillary’s fingerprints on them” (#3525).  He knew all the details regarding the Clintons’ financial adventures.  In time, Kyle thinks, he committed suicide simply because he could not handle all the stress he experienced as a result of his work with the Clintons, dying under the weight of being betrayed by his friends.  

Dolly Kyle also conveys—as she documents the evils done by the Clintons—a deep sense of betrayal.  She feels personally betrayed, but in a larger sense she’s persuaded they have betrayed an enormous number of others and this nation itself.  Though distressingly disorganized and colored by its author’s personal animosities, Hillary:  The Other Woman certainly gives us first-hand insights into the character (or lack of it) of two of the most prominent politicians of our era.  

* * * * * * * * * * * * * * * * * * * * * 

Perhaps the best-known victim of the terrorist attacks of September 11, 2001, was Barbara Olson, the wife of the nation’s Solicitor-General, Ted Olson.  Like her husband, she was a lawyer, and had served as both a prosecutor for the Department of Justice and as counsel to a House committee that investigated some of the Clintons’ scandals.  She died aboard the hijacked airplane that smashed into the Pentagon two days before her long-awaited book—ironically titled The Final Days—was to be published.  She concluded that book with a solemn reminder and a warning regarding the deeply radical views of Bill and Hillary Clinton which she had earlier catalogued in Hell to Pay:  The Unfolding Story of Hillary Rodham Clinton (Washington:  Regnery Publishing, Inc., c. 1999).

Olson’s eyes were opened while investigating allegations regarding missing FBI files and the firing of White House Travel Office employees in order to give the jobs to some of the Clintons’ Arkansas friends.  Immersing herself in the witnesses’ evidence, Olson came “to know Hillary as she is—a woman who can sway millions, yet deceive herself; a woman who has persuaded herself and many others that she is ‘spiritual,’ but who has gone to the brink of criminality to amass wealth and power” (p. 2).  Olson had “never experienced a cooler or more hardened operator,” a more singularly calculating public figure, whose “ambition is to make the world accept the ideas she embraced in the sanctuaries of liberation theology, radical feminism, and the hard left” (p. 3).  Machiavellian to the core, Hillary proved herself to be “a master manipulator of the press, the public, her staff, and—likely—even the president” (p. 3).

Intellectually gifted, Hillary attended Wellesley College in the late ‘60s.  Awash in the currents of the counterculture, she gradually embraced its radical agenda, participating in antiwar marches, defending a Black Panther murderer, and enlisting fellow students to change the world.  She was selected to speak at her commencement following an address by Massachusetts’ Republican Senator Edward Brooke.  Rather than give her prepared speech, however, Hillary “‘gave an extemporaneous critique of Brooke’s remarks’” (p. 41), rudely reproving him.  “We’re not interested in social reconstruction,” she shouted; “it’s human reconstruction” (p. 42).  Nothing less than the Marxist “new man” would satisfy her.  

That youthful obsession, Olson argues, persisted.  Hillary found Western Civilization bankrupt, needing more than reform.  Only “remolding,” only radical new structures, can bring about the “social justice” she pursues.  Such can come only “from the top—by planners, reformers, experts, and the intelligentsia.  Reconstruction of society by those smart enough and altruistic enough to make our decisions for us.  People like Bill and Hillary Clinton.  Hillary, throughout her intellectual life, has been taken by this idea, which is the totalitarian temptation that throughout history has led to the guillotine, the gulag, and the terror and reeducation camps of the Red Guard” (p. 311).  Overstated?  Well, Olson knew Hillary well!  

283 Feminist Fallout

That the unintended consequences of revolutionary political and social movements frequently surpass their original intent may be easily discerned in the study of history.  This truth poignantly surfaces in Sue Ellen Browder’s Subverted:  How I Helped the Sexual Revolution Hijack the Women’s Movement (San Francisco:  Ignatius Press, c. 2015).  She begins with this confession:  “I can give you no justification for what I did in my former life.  I will only say this in my weak defense:  I was a young woman searching for truth, freedom, and meaning in the world, but I had no clue where to find them” (#37 in Kindle).  

In part Subverted is an autobiography, the account of a modern journalist.  As a youngster growing up in Iowa, Browder longed to escape her small-town environs and join the more exciting, opportunity-laden cosmopolitan world she saw in magazines and television.  Determined to become a writer, she entered and then graduated from the University of Missouri’s School of Journalism.  She then worked briefly for a publication in Los Angeles before going to New York and landing a job as a free-lance writer with Helen Gurley Brown’s Cosmopolitan, which in the 1970s was “the undisputed reigning queen of women’s magazines—the hottest women’s magazine in the nation” (#45).  She basked in the glow of early success, seeing her talents displayed in the pages of a publication renowned for promoting the causes she most ardently supported—including the ‘60s sexual revolution.  

She’d all too quickly realized her adolescent dream!  “Only later would I realize how dark the dream had become.  Eventually, it would lead to a cacophony of mixed, confused messages in our culture about women, work, sex, marriage, and relationships—errors that have divided our nation and continue to haunt us to this day.  It would lead me to make disastrous decisions” (#63).  But as she and her husband and two children moved about the country, finding a variety of positions and surviving as writers, she continued, for 24 years, publishing articles in Cosmopolitan, telling “lie upon lie to sell the casual-sex lifestyle to millions of single, working women” (#69).  

So Browder’s purpose in writing the book is more than autobiographical—she wants to clarify where and why she went so wrong for so long.  It all began with her naive enlistment in the women’s movement.  Though she’d been reared by parents clearly committed to her personal development, reading Betty Friedan’s The Feminine Mystique when she was 17 powerfully affected her.  “‘The only way,” Friedan declared, “for a woman, as for a man, to find herself, to know herself as a person, is by creative work of her own.  There is no other way’” (#265).   That goal Browder successfully pursued.  But she also met and married another writer, Walter Browder, launching a relationship which would put her at odds with the liberationist feminism Friedan promoted.  She naturally “took the Pill without a qualm,” imagining she could “enjoy sterile sex and control my own sex life” (#334), not knowing how it would “put me on a hormone-powered emotional roller-coaster, which regularly plunged me into black pits of depression” (#334).  

Despite the Pill she became pregnant and had a baby shortly before moving to New York—another complicating relationship!  In her initial interview with the Cosmopolitan staff (knowing Helen Gurley Brown “saw the single girl as ‘the newest glamour girl of our times’ and viewed children as ‘more of a nuisance than a blessing’”) she carefully avoided mentioning the fact she was a mother.  “At Cosmo, I was a dedicated follower of Planned Parenthood founder Margaret Sanger, the foremost proponent of birth control as a panacea to the world’s problems.  Sanger idolized sex without kids.  ‘Through sex,’ Sanger sang joyously in The Pivot of Civilization, ‘mankind may attain the great spiritual illumination which will transform the world, which will light up the only path to an earthly paradise’” (#556-557).  Writing for “Cosmo,” the author laments, “I danced in Sanger’s procession” (#557).

Hired to write articles for Brown’s magazine, Browder quickly learned that lots “of the alleged ‘real people’ we wrote about in the magazine were entirely fictitious” (#527).  While working in California, she’d seen journalists blithely make up “sources” and write articles without doing the hard work of actually investigating events, so constructing stories about a Cosmo Girl who would “sleep with any man she pleased” and enjoy an upwardly mobile career became quite easy for her.  She simply invented imaginary stories, writing about an unreal world.  She remained “a loyal foot soldier in the women’s movement’s media army.  Even as I rejected the sexual revolution lifestyle as a sham, I scrambled to climb aboard NOW’s freedom train” (#694), promoting “a false path to freedom that was not just reprehensible but evil” (#717).  

Blatant evil triumphed when Betty Friedan led the National Organization for Women to join forces with Larry Lader’s NARAL, an abortion-rights group determined to secure abortion-on-demand.  “At Cosmo,” Browder confesses, “the one assumption I never thought to question in my confusion was whether or not abortion and contraception were good for women” (#930).  On a personal level, Browder herself would abort a baby when family finances seemed to dictate.  But she found that having an abortion was hardly the trivial affair Cosmopolitan readers assumed!  Part of herself, as well as her tiny baby, died on that gurney.  As she would later learn when she researched the subject, Lader’s spurious book, Abortion, was cited repeatedly by Justice Harry Blackmun in his Roe and Doe decisions.  In time, Browder would carefully read and reflect on Blackmun’s role in prescribing abortion-on-demand for the country, finding the man and his judicial work seriously flawed.  

Even while writing her Cosmo articles, at home Browder found in her husband and son a different world, a “better way,” a life “filled with light, laughter and love” (#595).  Her success as a writer only temporarily satisfied her, whereas her work as a mother was “sheer delight” (#1831).  She finally realized “that by focusing almost exclusively on money, power, and career, while denying women’s deeper longings for love and a family, the modern women’s movement got its priorities upside down and backward” (#2475).  So she began asking deeper, more philosophical questions.  Initially, she embraced the “self-actualization” psychology of Abraham Maslow—in reality a “self-as-god” way of thinking that cannot but fail.  “Detached from God,” she laments, “I was ready to listen to any blowhard who came my way” (#1436).  Ultimately, she and her husband “went back to church” and found, much too late in many ways, the truth she’d always sought.  “After we returned to church, everything in our lives seemed fresh and new.  Never had we been so happy” (#2192).   

The Browders initially entered an Episcopal church in Connecticut.  Later, while living in California and ever-more deeply hungering for God’s Reality, they entered the Catholic Church in 2003.  To her amazement, “This wasn’t the ‘stuffy, old, patriarchal church’ I’d heard about.  The Church’s teachings were all about love, joy, and forgiveness.”  Still more:  “This was a complete system of philosophical thought and mystical faith with answers the entire world needed to hear” (#3031).  Subverted is an engrossing story, packed with important insights, that tells us much about what’s gone wrong in our country during the past half-century.  

* * * * * * * * * * * * * * * * * * * * *

In Tied Up in Knots:  How Getting What We Wanted Made Women Miserable (New York:  Broadside Books, c. 2016), Andrea Tantaros sets forth a secular critique of modern feminism that blends praise and protest for what’s happened for and to women during the past half-century.  She grew up taking to heart Betty Friedan’s message in The Feminine Mystique.  Empowered thereby she pursued a media career and ultimately landed a position with Fox News, where she regularly airs her views before a national audience.  What more could a young woman want?  And she likes what she’s got and still supports the feminist agenda—“If I have to choose between feminism and the pre-feminist days, I will choose feminism without hesitation” (#3320).  Yet, it turns out, amidst all her success there has come a gnawing suspicion that there’s more to life than the feminist mantra of “making it in a man’s world.”  Acting like men, feminists insisted they “pay our bills, open our own doors, and carry our own bags” (#757).  But as they stopped acting like women, real men steadily lost interest in them.  Ah, there’s the rub!

Certainly “women should be equal with men, but, at the same time, women aren’t men.  Equal does not mean the same” (#199).  Yet that’s what many feminists demanded.  Consequently, “feminism doesn’t feel very feminine” (#204).  What Tantaros calls “the Power Trade” negotiated by feminists was in fact “a deal with the devil,” for by imitating men women “abandoned our potent and precious female power” and ceased to act like ladies (#210).  Indeed, many of the movement’s leaders have waged war against men and done lethal harm to healthy heterosexual romance and marriage.  Speaking autobiographically, Tantaros says:  “I have been a one-woman focus group on the tenets of feminism for three decades.  But it wasn’t until I found myself single after two back-to-back long-term relationships that I realized how different the dynamic between the sexes had become” (#233).  In short:  she’d become a highly successful woman with neither husband nor children—and that’s not really how it’s supposed to be!  Sadly:  “Postponing marriage and motherhood comes with huge costs—and no one is telling young girls this” (#2753).  

Given her own predicament, she’s written this book to try to understand it.  But her analysis, alas, is too often as superficial as the life she’s lived!  She makes interesting observations, tells vivid anecdotes and cites various studies, but she lacks the philosophical, much less theological, resources to address the real issues that so obviously trouble her.  She knows she wants something but cannot actually understand what it is.  So she daydreams about the “superrelationship” she and her “generation of women” await:  “We want a soulful, sexy, and inspired union that can help us realize our full potential in life.  We want a deep connection with a best friend, an emotional and spiritual confidant, an intellectual counterpart who gets our inside jokes, matches us financially, and who loves us with a passion that rivals Romeo’s.  Women have gained power and are refusing to settle—and that is a good thing.  Women can find that kind of love, but we just have to be patient enough to wait for it and refuse to settle for anything less than what we want:  love, fidelity, kindness, respect” (#1142-44).  Such soaring aspirations rarely find fulfillment simply because they’re basically unreal—so lonely women like Tantaros will forever be “tied up in knots,” I fear.  

* * * * * * * * * * * * * * * * * * * *

One of the 20th century’s most remarkable women was Edith Stein, a Jewess who studied philosophy with Edmund Husserl, taught philosophy in German universities, and then converted to the Catholic Church.  After her conversion she devoted herself to teaching (clearly her great vocation) in Catholic schools, and in time she entered the Carmelite order.  When the Nazis gained control of Germany, Stein fled to Holland but was eventually arrested and sent to a concentration camp, where she perished.  In 1998 Pope John Paul II elevated her to sainthood, and she stands as a wonderful witness to both intellectual brilliance and spiritual sanctity.  During the 1930s she wrote and delivered as lectures a series of papers now collected in volume two of her collected works and titled Essays on Woman, Second Edition, Revised (Washington:  ICS Publications, c. 1996).  That few if any leading feminists (e.g. Betty Friedan) are first-rate thinkers becomes clear when one reads how a truly great philosopher addresses the topic!  Given the nature of a collection of papers, many of Stein’s positions are routinely repeated, and a careful perusal of a select few reveals the essence of her thought.  

Like a litany of other ideologies, feminism inevitably fails inasmuch as it misrepresents and endeavors to evade Reality.  But as a serious philosopher, Stein understood her task:  to see clearly and better understand whatever is.  Thus she continually sought to probe the essence of womanhood—“what we are and what we should be”—discerning therein direction for evaluating the feminist movement and describing the proper life—and particularly the redemptive form of life—best for females.  She applied St. Thomas Aquinas’ understanding of the analogia entis to her work, seeing God’s image in human beings who need (like a planted seed) both human assistance and divine grace to attain their true end.  Though feminists generally insisted there were no significant differences between men and women, thus calling for identical educational curricula and vocational opportunities, Stein upheld what she considered an indubitable truth:  sexual differences matter greatly.  

Thus, in “The Ethos of Women’s Professions,” she sought to discuss work in light of “an inner form, a constant spiritual attitude which the scholastics term habitus” (#718), which necessitates that we recognize “specifically feminine” vocations.  To Stein, there are “natural feminine” traits that “only the person blinded by the passion of controversy could deny” (#747).  As both Scripture and common sense make clear, “woman is destined to be wife and mother.  Both physically and spiritually she is endowed for this purpose.”  Giving structure to her bodily being is that spiritual reality—the anima forma corporis—which differentiates her from men of the same species.  Thus she “naturally seeks to embrace that which is living, personal and whole.  To cherish, guard, protect, nourish and advance growth is her natural, maternal yearning” (#755).  Unlike men, with their penchant for abstractions and devotion to tasks, women relish more concrete, living things.  Their “maternal gift is joined to that of companion.  It is her gift and happiness to share the life of another human being and, indeed, to take part in all things which come his way, in the greatest and smallest things, in joy as well as in suffering, in work, and in problems” (#762).  Works of charity, in particular, come quite naturally to her.  

Understanding this God-given reality, women rightly enter various professions, and “there is no profession which cannot be practiced by a woman” (#815).  Yet some professions—nursing, teaching, social work—more easily accommodate the “sympathetic rapport” that comes naturally to them.  Indeed, “the participation of women in the most diverse professional disciplines could be a blessing for the entire society, private or public, precisely if the specifically feminine ethos would be preserved” (#844).  Still more, in light of the Thomistic position that “Grace perfects nature—it does not destroy it,” women should always seek to flourish in accord with their unique nature, their femininity, serving God through “quiet immersion in divine truth, solemn praises of God, propagation of the faith, works of mercy, intercession, and vicarious reparation” (#858).  Surrendering to God, seeking to do His will, opens the door to human flourishing.  “God created humanity as man and woman,” she concludes, “and He created both according to His own image.  Only the purely developed masculine and feminine nature can yield the highest attainable likeness to God.  Only in this fashion can there be brought about the strongest interpenetration of all earthly and divine life” (#955).  

Stein consistently contends for “The Separate Vocations of Man and Woman According to Nature and Grace.”  If we carefully attend to what’s real, “the person’s nature and his life’s course are no gift or trick of chance, but—seen with the eyes of faith—the work of God.  And thus, finally, it is God Himself who calls.  It is He who calls each human being to that to which all humanity is called, it is He who calls each individual to that to which he or she is called personally, and, over and above this, He calls man and woman to something specific, as the title of this address indicates” (#974).  In the biblical creation account, Adam and Eve “are given the threefold vocation:  they are to be the image of God, bring forth posterity, and be masters over the earth” (#990).  Given that assignment, Eve is called to be Adam’s “helpmate”—an “Eser kenegdo—which literally means ‘a helper as if vis-à-vis to him’” (#1000).  Both sexes are equal and equally important, sharing responsibility to “fill the earth and subdue it,” though their roles in carrying out the assignment rightly differ.  Theirs is a complementary relationship:  “man’s primary vocation appears to be that of ruler, his paternal vocation secondary (not subordinate to his vocation as ruler but an integral part of it); woman’s primary vocation is maternal:  her role as ruler is secondary and included in a certain way in her maternal vocation” (#1228). 

Rather than point to an evil “patriarchy” or unjust polity, Stein locates the source of the problems women experience in man’s Fall:  “Everywhere about us, we see in the interaction of the sexes the direct fruits of original sin in the most terrifying forms:  an unleashed sexual life in which every trace of their high calling seems to be lost; a struggle between the sexes, one pitted against the other, as they fight for their rights and, in doing so, no longer appear to hear the voices of nature and of God.  But we can see also how it can be different whenever the power of grace is operative” (#1264).  So there is, in God’s Grace, hope for us all:  “The redemption will restore the original order.  The preeminence of man is disclosed by the Savior’s coming to earth in the form of man.  The feminine sex is ennobled by virtue of the Savior’s being born of a human mother; a woman was the gateway through which God found entrance to humankind.  Adam as the human prototype indicates the future divine-human king of creation; just so, every man in the kingdom of God should imitate Christ, and in the marital partnership, he is to imitate the loving care of Christ for His Church.  A woman should honor the image of Christ in her husband by free and loving subordination; she herself is to be the image of God’s mother; but that also means that she is to be Christ’s image” (#1160).  

As a committed Catholic, Stein defends the Church’s tradition regarding the priesthood.  “If we consider the attitude of the Lord Himself, we understand that He accepted the free loving services of women for Himself and His Apostles and that women were among His disciples and most intimate confidants.  Yet He did not grant them the priesthood, not even to His mother, Queen of the Apostles, who was exalted above all humanity in human perfection and fullness of grace” (#1375).  Why?  Because in the natural order designed by God, “Christ came to earth as the Son of Man.  The first creature on earth fashioned in an unrivaled sense as God’s image was therefore a man; that seems to indicate to me that He wished to institute only men as His official representatives on earth” (#1391).  Men and women are equally called to enter into communion with their Lord, but they are called to follow different paths in doing so.  “It is the vocation of every Christian, not only of a few elect, to belong to God in love’s free surrender and to serve him” (#1391).  This is, above all, everyone’s vocation, and therein there is “neither male nor female.”  

“God has given each human being a threefold destiny,” Stein says:  “to grow into the likeness of God through the development of his faculties, to procreate descendants, and to hold dominion over the earth.  In addition, it is promised that a life of faith and personal union with the Redeemer will be rewarded by eternal contemplation of God.  These destinies, natural and supernatural, are identical for both man and woman.  But in the realm of duties, differences determined by sex exist” (#1627).  Especially in the process of procreating and rearing children, women must carefully sense and assent to God’s plan for man.  Though single women like Stein herself have an important calling, for most women marriage and children should be fundamental—for these are, in truth, most vital to their being and ultimate happiness.  

Had feminists in the 20th century thought as deeply as Stein—and followed the truth wherever it leads—much of the negative fallout felt by today’s young women could have been avoided!  Were influential academics as committed to truth-telling as Stein, we’d not be burdened with strident declarations that there are absolutely no differences between the sexes!  Were Christians more concerned with God’s will than politically correct posturing, there would be greater focus and effectiveness to the Church’s mission.

# # #  

282 A “Republican” Constitution?

  Following the work of the Constitutional Convention of 1787, a Philadelphian asked Benjamin Franklin:  “Well, Doctor, what have we got, a republic or a monarchy?”  Franklin promptly responded, “A republic, if you can keep it.”   He and his colleagues obviously sought to establish a constitutional republic, subject to laws rather than men, but they also (as was evident in many of their debates) wanted to preserve this “republic” from a host of “democratic” abuses that might threaten it.  This differentiation sets the stage for Randy E. Barnett’s insightful treatise, Our Republican Constitution:  Securing the Liberty and Sovereignty of We the People (New York:  HarperCollins, c. 2016), wherein he argues that we must interpret the Constitution in light of the Declaration of Independence’s memorable assertion:  “We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness.  That to secure these rights, Governments are instituted among Men, deriving their just powers from the consent of the governed.”  

That pre-existing, natural Rights are given by the Creator and possessed by individual persons, not groups of people, marks a true Republic!  Such rights were then secured by a written document, the Constitution, affording coming generations protection from those who would infringe upon them.  “A Republican Constitution views the natural and inalienable rights of these joint and equal sovereign individuals as preceding the formation of governments, so first come rights and then comes government” (#621).  Contrariwise, when one thinks rights reside in collectives—and are therefore posited or granted by certain groups, e.g. a majoritarian government—he champions Democracy.  “A Democratic Constitution is a ‘living’ constitution whose meaning evolves to align with contemporary popular desires, so that today’s majority is not bound by what is called ‘the dead hand of the past.’  The will of yesterday’s majority cannot override the will of the majority today” (#592).  It logically follows that in a Republic there are elected “representatives” who serve the people; in a Democracy there are “leaders” who court and implement the will of their supporters. 

To oversimplify, Americans lived under a Republican Constitution for the first century of this nation’s existence.  During the next century, however, an increasingly Democratic Constitution became normative.  At issue today is this:  can we—will we—find ways to restore the Republic established by Franklin and his colleagues?  To do so requires us, firstly, to rightly understand the Constitution as crafted in 1787, beginning with the Declaration of Independence and its reliance on the “Laws of Nature.”  Here Barnett, a distinguished professor of law at Georgetown University, exemplifies his pedagogical profession, carefully describing and explaining what that phrase meant.  To that end he cites an illuminating passage from a sermon delivered by the Reverend Elizur Goodrich in 1787:  “‘the principles of society are the law, which Almighty God has established in the moral world, and made necessary to be observed by mankind; in order to promote their true happiness, in their transactions and intercourse.’  The laws, Goodrich observed, ‘may be considered as principles, in respect of their fixedness and operation,’ and by knowing them, ‘we discover the rules of conduct, which direct mankind to the highest perfection, and supreme happiness of their nature.’  These rules of conduct ‘are as fixed and unchangeable as the laws which operate in the natural world.  Human art in order to produce certain effects, must conform to the principles and laws, which the Almighty Creator has established in the natural world’” (#812).  This succinctly summarizes the “Natural Law” tradition.

The Constitution composed in Philadelphia sought to establish a tightly limited government rooted in these natural laws, securing “we the people’s” inalienable rights from the pervasive excesses of democracy under the Articles of Confederation—on display to James Madison wherever “measures are too often decided, not according to the rules of justice and the rights of the minor party, but by the superior force of an interested and overbearing majority” (#1120).  The people are indeed sovereign, the source of the republic’s authority.  But such sovereignty, as clearly recognized by John Jay and James Wilson, the nation’s preeminent judicial thinkers, resided in individuals, not the collectivist “general will” of Rousseau.   

Yet Rousseau’s position helped shape the Democratic Party, established by Andrew Jackson and Martin Van Buren in 1832.  “The concept of the will of the people was central to Van Buren’s ‘true democracy.’”  He believed in the great principle, first formally avowed by Rousseau, “that the right to exercise sovereignty belongs inalienably to the people,” who should rule through popular majorities (#1585).  In the 1850s Stephen A. Douglas would pick up on this idea and promote his vision of “popular sovereignty” in defense of allowing the diffusion of slavery wherever the people supported it.  Abraham Lincoln, of course, took a different view, and the Republican Party first waged a war and later passed the 13th, 14th, and 15th Amendments to secure the individual rights of all persons, thus eliminating slavery in this nation.   

Following the Reconstruction era, however, Barnett says we began “losing our Republican Constitution” when the Supreme Court effectively gutted the three Amendments that freed the slaves and recognized their status as citizens, thereby acceding to the will of racist Democrats in the South.  Simultaneously the Court (as personified by Oliver Wendell Holmes) gradually endorsed legislation passed by Progressives (both Democrat and Republican) who wanted to change the nation by implementing a variety of political, economic and social reforms—often through administrative agencies and courts, staffed with the “experts” so beloved by Progressives.  They insisted the Constitution is a “living” compact—a “living and organic thing” said Woodrow Wilson—constantly subject to change in whatever direction a majority of the people desire.  With the triumph of FDR and the New Deal, this “living” Constitution—a will-of-the-people Democratic agreement—became the “law of the land.”  

Though this “Democratic” understanding of the Constitution still prevails in this nation’s corridors of power, Barnett thinks it possible to restore the original, “Republican,” understanding to its rightful place.  The federalism and limited government intended by the Founders in 1787 still matter if we are concerned with our God-given rights and personal liberties.  And since 1986, when William Rehnquist became Chief Justice of the Supreme Court, hopeful signs of a renewed federalism (apart from economic policies) have been on the horizon, though President Barack Obama has done everything possible to frustrate this possibility.  Thus Barnett thinks we need to add ten new amendments (initiated by the states) to the Constitution, so as to preserve its Republican nature.  

Though some of Barnett’s presentation will appeal only to readers with suitable backgrounds in legal history and political philosophy, he has set forth a meaningful way to understand the basic issues in this nation’s 200-year history.  Restoring a Republican Constitution would require heroic work in many ways, but it is certainly a goal worth pursuing for citizens concerned for the real welfare of this Republic.

* * * * * * * * * * * * * * * * * 

In Living Constitution, Dying Faith:  Progressivism and the New Science of Jurisprudence (Wilmington, DE:  ISI Books, c. 2009) Bradley C. S. Watson aims “to elucidate the connection that American progressivism as a philosophical movement and political ideology has with American legal theory and practice” (p. xvi).  Progressivism combined Social Darwinism and Pragmatism (twin ingredients evident in William James and John Dewey, Oliver Wendell Holmes and Louis Brandeis, Theodore Roosevelt and Woodrow Wilson and Barack Obama), leaving us subject to “historicist jurisprudence”—taking what is at the moment as good and true simply because it is the current cusp of historical processes, being “on the right side of history.”  We have a judicial system that “is not only hostile to the liberal constitutionalism of the American Founders, but to any moral-political philosophy that allows for the possibility of a truth that is not time-bound” (p. xvi).  These Progressives consciously rejected the Natural Law tradition running from Plato to the American architects of the Constitution—the conviction that good law must be anchored in abiding truths authored by God.  

The “living” or “organic” Constitution promoted by Progressives was on display when the Supreme Court, in Planned Parenthood v. Casey (1992), justified abortion (as a constitutional right) inasmuch as every person has “the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life” (p. 3).  Within a decade the Court (in Lawrence v. Texas) further affirmed an “emerging recognition” of homosexual behavior that would lead, within another decade, to the legalization of same-sex marriage.  Only an ever-evolving “constitution,” utterly unhinged from the written document of 1787, could rationalize such judicial edicts!  But this was clearly the Progressive vision set forth by Herbert Croly a century ago when he urged jurists to discard “Lady Justice,” blindfolded and holding a scale in her hands.  To replace her he suggested a studious woman wearing spectacles, committed to “social justice,” with suitable tools at hand with which to accomplish her goals.  Judges were to decide how to make the world better, not to give “what is due” to all persons.

To show how Progressivism has changed the nation, Watson revisits the “Constitution of the Fathers” which set forth the American Creed, beginning with the Declaration of Independence’s great affirmation that we “hold these truths to be self-evident, that all men are created equal.”  As Abraham Lincoln—one of the greatest of the Constitution’s interpreters—believed, “there are such things as natural rights that do not change with time, that the American Constitution is dedicated to preserving them, and that the role of great political actors, while responding to urgent necessities, is to look backward rather than forward” (p. 38).  When Lincoln famously declared (in 1863) that this nation was “conceived in liberty, and dedicated to the proposition that all men are created equal,” he clearly appealed to “the laws of nature and nature’s God,” undergirding America’s constitutional republic.  

Yet even as Lincoln was invoking God’s laws, Charles Darwin was unleashing an intellectual revolution, reducing everything to evolution through natural selection.  Consequently, Social Darwinists, enamored with evolutionary “progress,” declared themselves freed from all allegedly eternal principles and embraced the historical developments that improve both the human animal and society.  Change is constant—and under the guidance of natural selection (which is helped along by scientifically-trained experts in the social world) it is always for the better!  In America, enthusiastic Darwinists, most notably John Dewey, provided a philosophy (Pragmatism) for committed Progressives from FDR to Barack Obama, who sought to improve things through “progressive education, the welfare state, and the redistribution of capital” (p. 83).  “Long before ‘the courage to change’ became an effective presidential campaign slogan, Dewey helped ensure that ‘change’ would have a central position in American political rhetoric” (p. 84).   

After retelling the story of Progressivism’s political triumphs, running from Woodrow Wilson’s “New Freedom” through FDR’s “New Deal” to LBJ’s “Great Society,” Watson explains how it shaped “the new science of jurisprudence” whereby the “moral realism” of Madison and Lincoln was replaced by skepticism and sociological jurisprudence.  Thus Progressive jurists, Richard Epstein says, “‘attacked the twin doctrines that most limited government power—federalism, on the one hand, and the protection of individual liberty and private property, on the other. . . .  However grandly their rhetoric spoke about the need for sensible government intervention in response to changed conditions, the bottom line, sadly, was always the same:  replace competitive processes, by hook or by crook, with state-run cartels’” (p. 117).  

To influential jurists such as Oliver Wendell Holmes, the Constitution means whatever the Supreme Court decrees.  He and his disciples openly disdained any objective moral standards—right and wrong simply changed over the course of time as the stronger rightly dominated the weaker!  Thus “Holmes is a candidate for many labels—pragmatist, utilitarian, Nietzschean, social Darwinist, nihilist” (p. 132).  Rather like Thrasymachus in Plato’s Republic, Holmes considered “justice” to be whatever the dominant person or system determined.  Whatever the established government wants, it rightly gets.  In a democracy, whatever the majority of the people want, they should get.  In time, their wants will change, so laws (or constitutions) must change to implement their desires.  By rejecting the Natural Law, Holmes and his followers clearly repudiated Lincoln and Madison, but they also rejected “the very notion that human beings are creatures of a certain type, with transcendent purposes and ends that do not change with time.  The new jurisprudence was suspicious of the very idea of justice itself” (p. 145).  

Obviously dismayed by the impact of this “new science of jurisprudence,” Watson concludes his work by noting “the future is now.”  For the good of our nation, for the good of coming generations, it’s imperative to return to the wisdom of the Founders as endorsed by Abraham Lincoln.  To do so requires us first of all to recover our language.  Progressives, as Orwell’s 1984 makes clear, manipulate language, massaging it to attain their ends.  Thus advocates of same-sex marriage effectively change the meaning of marriage, a noun which by definition requires an opposite-sex union, something affirmed through centuries of “common law and American constitutional law” (p. 186).  Such advocates dramatically illustrate the power of philosophical Nominalism—saying so makes it so!  More radically, Watson predicted, “courts will routinely declare men to be women and vice versa, according to the political pressures of the age” (p. 191).   

* * * * * * * * * * * * * * * * * *

  Living in a “constitutional republic,” we Americans should (one would think) seriously seek to understand the document that sets forth its principles and precepts.  To do so, it’s helpful to consult The Constitution:  An Introduction (New York:  Basic Books, 2015) by Michael Stokes Paulsen and Luke Paulsen.  This is a father (Michael) and son (Luke) collaboration, written during nine summer vacations while Luke was in high school and college and while Michael was teaching law at the University of Minnesota.  Their partnership initially involved Michael writing a chapter and allowing Luke to edit it with an eye on readability for students and non-lawyers, hoping “to provide a reasonably short, reader-friendly, intelligent introduction to the United States Constitution in all respects—its formation, its content, and the history of its interpretation” (#87).  

Successfully separating from Great Britain, this nation’s founders inscribed their convictions in two pivotal documents:  The Declaration of Independence and The Constitution of the United States, both declaring “the ultimate right of the people to freely chosen self-government, protective of their natural rights” (p. 4).  When the Articles of Confederation failed to function effectively, a distinguished company of men—the “Framers”—gathered in Philadelphia in 1787 to compose “something entirely new:  a written constitution for a confederate republic, covering a vast territory and embracing thirteen separate states.  . . . .  There was literally nothing in the world like what the framers were trying to achieve” (p. 23).  That it was to be written was hugely important, establishing a government of laws, not men, clearly setting limits to what it could do and not do.  Thus “the meaning of the Constitution is fixed by the original meaning of its words.  The people can change their written Constitution by amendment, but they should not be able to evade or supplant the ‘supreme Law of the Land’ simply by inventing their own meanings for words or altering meanings to suit their purpose” (p. 27).  

As a result of considerable debate and compromise, the Constitution prescribed a federalism balancing powers within the national government (two legislative bodies, an independent executive, an unelected judiciary) and reserving important rights to the states.  When working rightly, this checks-and-balances system guards personal freedom within the legitimate controls of good government.  Though each branch of government has extensive powers, they are limited to those “enumerated” or “granted” and further curtailed by the first ten amendments.  Thus James Madison “worried aloud, when introducing his proposed Bill of Rights in the House of Representatives, that liberties like religious freedom not be set forth in language too narrow, as if to suggest that they were granted by the Constitution rather than recognized in the Constitution” (p. 99).  The Paulsens effectively describe the work of the Founders, providing helpful biographical vignettes of the leading Framers and celebrating their genius.  But one of their compromises—the three-fifths provision regarding slavery—sullied their work and scarred the new nation’s face for 70 years until a bloody war and three constitutional amendments abolished it.  

Having detailed the important components of the written Constitution, the authors address arguments set forth by proponents of a “living Constitution.”  Obviously the Founders crafted a permanent document which would not change over time, except as properly amended.  But various actions (by all three branches of the government) beginning in the first administration of George Washington and advanced by John Marshall’s Supreme Court slowly expanded its powers.  With the Union’s victory in the Civil War and Reconstruction the powers of the national government grew quickly, as was evident in Lincoln’s Emancipation Proclamation, and it is clear “that the Civil War was fought over the meaning of the Constitution—and over who would have the ultimate power to decide that meaning” (p. 155).  Then the 14th Amendment abruptly “transferred vast areas of what formerly had been exclusive state responsibility to federal government control” (p. 181).  

With the demise of Reconstruction, however, the authors lament the epoch of “betrayal”—the years from 1876 to 1936 when the Supreme Court “abandoned the Constitution,” denying equal rights to women, upholding racial segregation, nullifying social welfare legislation, etc.  Here it seems to me they think that whenever the Court failed to endorse progressive legislation and ideas it “betrayed” the Constitution.  Other scholars, more libertarian or conservative in their orientation, definitely see these years quite differently!  Fortunately, say the Paulsens, FDR rode to the rescue, and the New Deal Court rightly restored the Constitution by correcting earlier abuses.  FDR’s appointees upheld the constitutionality of his commitment to extend “national government power over the economy” (p. 220), though they curtailed the executive branch’s authority by annulling one of President Truman’s orders in the pivotal Youngstown case.  Especially important was the Warren Court’s Brown v. Board of Education, ending racially segregated schools and launching “the process of dismantling America’s history of legal racial apartheid” (p. 220).  

From 1960 to the present, the national government has expanded dramatically, leaving little of the Constitution’s original “federal” structure standing.  As judicial activists in the courts have sustained this process, we increasingly have an unwritten constitution, meaning whatever the current Supreme Court desires it to be, the Court even claiming for itself the “supreme authority to interpret the Constitution—provocatively elevating its own decisions to the same level as the Constitution itself.  However questionable that claim, nobody successfully challenged it” (p. 262).  Such arrogance was fully on display in the Roe v. Wade decision that imposed abortion-on-demand throughout the land.  “Not even Dred Scott, warped as it was in its distortion of constitutional text, so completely seemed to disregard the text as Roe did” (p. 270).  Other critical decisions—ranging from affirmative action to same-sex marriage—further illustrate the withering of the “written Constitution” which once preserved this nation as one under laws rather than men.

# # #  

281 Something, Not Nothing

    If I stumble over something in the dark, I know something’s there.   It’s not something I’m dreaming about, something solely in my mind.  What it is I know not, though when carefully inspected it’s obviously a stool.  That it’s there I’m certain—such sensory information can be painfully indubitable.  It’s something!  What it is I may later determine, finding it’s clearly a four-legged steel stool, useful for reaching things on high shelves but injurious to the bare foot!  Why it’s there, however, involves an altogether different kind of reasoning, as Aristotle famously demonstrated in his Metaphysics.   When asking why the stool was there—or why it was made of steel rather than wood—I unconsciously assume the truth of an ancient philosophical proposition:  Ex nihilo nihil fit—nothing comes from nothing.  The same reasoning process ensues when I venture into the world around me.  That there’s material stuff I encounter is indubitable.  What it is I can ascertain through certain tests.  But why it exists requires a philosophical, not a scientific way of thinking.  

Empirical questions we rightly investigate using scientific means.  But there are deeper questions which cannot be similarly pursued since they address non-empirical realities such as goodness, beauty, and God.  Thus Einstein allegedly said “scientists make lousy philosophers.”  In ancient Greece most pre-Socratic thinkers were empirical, monistic materialists, though some did think a mysterious kind of infinite, non-material Being existed.  “The decision of this question,” Aristotle said, “is not unimportant, but rather all-important, to our search for truth.  It is this problem which has practically always been the source of the differences of those who have written about nature as a whole.  So it has been and so it must be; since the least initial deviation from the truth is multiplied later a thousandfold” (On the Heavens, I, 5; 271 [5-10]).  

Aristotle’s insight is nicely illustrated in Lawrence M. Krauss’s A Universe from Nothing:  Why There is Something Rather Than Nothing (New York:  Atria, c. 2013).  A noted physicist-turned-cosmologist, Krauss tries to show, as the book’s title says, how the universe literally came from nothing.  Realizing the linguistic pit he’s digging, however, he tries to re-define the word “nothing” to mean, it seems to me:  “well, almost nothing,” since there’s a mysterious but necessarily material realm that magically gives birth to the material world.  Krauss also realizes the word “why” brings with it all sorts of philosophical baggage—especially denoting a rational direction and purpose to the cosmos—which he resolutely refuses to consider.  So he declares that scientists such as himself deal only with “how” questions—the only ones worth pondering.  And “the question” he cares about, “the one that science can actually address, is the question of how all the ‘stuff’ in the universe could have come from no ‘stuff,’ and how, if you wish, formlessness led to form” (#130 in Kindle).  Dismissive of both philosophy and theology, he insists that he and his guild alone can provide the answers to life’s important questions.  But he slides, incessantly, from “how” to “why” questions, showing how “scientists make lousy philosophers.”  

On one level, A Universe from Nothing offers the general reader a fine summary of what scientists have discovered during the past century.  It is indeed a fascinating “cosmic mystery story”—filled with black holes and quarks and dark matter—told with zest and skill.  We have before us an amazing amount of data regarding the age and shape of the material world, though the conclusions reached regarding the data have certainly changed with time.  “String” theories have given way to “multi-universe” hypotheses.  The “steady-state” position once championed by distinguished physicists has been replaced by the “big-bang” view now accepted by most “authorities.”  To theists who for centuries have believed God created (ex nihilo) all that is, the big-bang notion fits easily into their cosmology—the universe simply came into being, in an instant, as God spoke it into being.  An eternal, purely spiritual Being could easily bring into being all that is.  But to materialists such as Krauss there must be a purely material Source—and he devotes this treatise to showing how it might in fact conceivably exist.  And as he chooses to use the word, “‘nothing’ is every bit as physical as ‘something,’ especially if it is to be defined as the ‘absence of something’” (#241).  

There is thus an Alice-in-Wonderland quality to Krauss—words simply mean whatever he chooses them to mean.  “‘When I use a word,’ Humpty Dumpty said, in rather a scornful tone, ‘it means just what I choose it to mean—neither more nor less.’ ‘The question is,’ said Alice, ‘whether you can make words mean so many different things.’ ‘The question is,’ said Humpty Dumpty, ‘which is to be master—that’s all.’”  So too Krauss insists words such as “nothing” mean what he wants them to mean, not what they really mean!  (And, to confuse matters even further, important word meanings shift as the book’s argument develops!)  The book thus offers an enormous amount of data accompanied by only a passing awareness of logic—a vital part of the philosophical thinking he disdains!  That he first asked the late Christopher Hitchens to pen an introduction to this treatise—and then turned to Richard Dawkins, who assented to do so—indicates the “new atheist” agenda undergirding this book!  That Dawkins could have seriously referred to the “selfish genes” and “memes” so memorably lampooned by the Australian philosopher David Stove shows how “scientific” superstars can lack basic reasoning skills!  And a similar deficiency blemishes Krauss’s presentation.  

                                             * * * * * * * * * * * * * * * * *

In Why Does the World Exist?  An Existential Detective Story (New York:  Liveright Publishing Corporation, c. 2012), Jim Holt employs his journalistic expertise to explore what Martin Heidegger labeled the greatest of all philosophical questions:  Why is there something rather than nothing at all?  That is the “super-ultimate why” question!  For many years Holt has pondered this and voraciously read first-rate tomes regarding it—as is evident in his “philosophical tour d’horizon” and “brief history of nothing.”  For this book, however, he primarily conducted interviews around the world with the foremost thinkers who are trying to fathom the mystery.  Unlike Lawrence Krauss, Holt understands that the ultimate origin question requires a “meta-scientific” approach, for as the great Harvard astronomer Owen Gingerich said, this is essentially a teleological, not a strictly scientific, question.  

Holt interviewed thinkers as diverse as Adolf Grünbaum, a distinguished philosopher of science and a dogmatic atheist who simply dismissed the question as meaningless, and Richard Swinburne, a devout Eastern Orthodox theist who has devoted his life to demonstrating the validity of the traditional belief in “God the Father, maker of heaven and earth, and of all things visible and invisible.”  He talked with David Deutsch, who thinks quantum physics justifies a “many worlds” or “multiverse” hypothesis—if there are an infinite number of universes, then it is quite probable that our universe would have just popped into existence.  Then he sought out Steven Weinberg, who wrote The First Three Minutes, is widely regarded as one of the greatest 20th-century cosmologists, and famously said:  “The more the universe seems comprehensible, the more it also seems pointless.”  Yet in his Dreams of a Final Theory, published in 1993, Weinberg admitted there was simply too much physicists don’t know for any of them to pontificate on ultimate issues, illustrating an “epistemic modesty” that “was refreshing after all the wild speculation I’d been hearing over the past year” (p. 155).    

Since Plato postulated the eternal existence of intellectual forms, many mathematicians have been Platonists of some sort, believing, as Alain Connes says, “‘there exists, independently of the human mind, a raw and immutable mathematical reality’” (p. 172).  Connes is a distinguished French mathematician who shares Kurt Gödel’s confidence in the reality of this non-material numeric realm.  “How else can we account for what the physicist Eugene Wigner famously called the ‘unreasonable effectiveness of mathematics in the natural sciences’?” (p. 172).   Another world-class mathematician, Oxford’s Roger Penrose, is an “unabashed Platonist” who takes “mathematical entities to be as real and mind-independent as Mount Everest” (p. 174).  When interviewed, Penrose said there are really three worlds, “‘all separate from one another.  There’s the Platonic world, there’s the physical world, and there’s also the mental world, the world of our conscious perceptions’” (p. 177).      

John Leslie, considered by many “the world’s foremost authority on why there is Something rather than Nothing,” confesses he thought when he was young that he’d found the answer to the question.  But then he learned, “‘to my horror and disgust,’” that “‘Plato had got the same answer twenty-five hundred years ago!’” (p. 197).  Subsequently he developed “extreme axiarchism,” positing that “reality is ruled by abstract value—axia being the Greek word for ‘value’ and archein for ‘to rule’” (p. 198).  “‘For those who believe in God,’ he thinks, ‘it has even provided an explanation for God’s own existence:  he exists because of the ethical need for a perfect being.  The idea that goodness can be responsible for existence has had quite a long history—which, as I’ve said, was a great disappointment for me to discover, because I’d have liked it to have been all my own’” (p. 199).  

Holt ends the book rather as he began it—intrigued by all sorts of theories but persuaded by none!  Though the question he’s asking is fundamentally serious, there’s a certain intellectual detachment, almost a levity, to the book.  But it does provide an interesting survey of the cosmological scene, leaving the reader to sort out what’s important or irrelevant to him.

* * * * * * * * * * * * * *

When the erudite Boston College philosopher Peter Kreeft says “This is, quite simply, the single best book I have ever read on what most of us would regard as the single most important question of philosophy:  Does God exist?  It will inevitably become a classic,” one is wise to read carefully Michael Augros’ Who Designed the Designer?  A Rediscovered Path to God’s Existence (San Francisco:  Ignatius Press, c. 2015).  Unlike the many works of apologetics that rely on cosmology, with its heavy load of scientific theory and evidence, this treatise simply asks us to reason carefully.  Rather than think inductively, collecting facts, we must think deductively, following reason.  Simple, self-evident assumptions—absolute, universal propositions such as the Pythagorean theorem—carefully developed into arguments, lead necessarily to certain indubitable conclusions.  “As the argument advances,” he promises, “I will never ask you to believe in someone else’s findings or observations.  Instead, all the reasoning will begin from things you yourself can immediately verify” (p. 12).  That “equals added to equals make equals” or “every number is either even or odd” cannot be denied, simply because they are self-evident.  

So Augros begins with the simple truth that children incessantly ask why.  “This endearing (if sometimes trying) property of children is human intellectual life in embryo.   In its most mature forms of science and philosophy, the life of the human mind still consists mainly in asking why and in persisting in that question as long as there remains a further why to be found.  Ultimately we wonder:  Is there a first cause of all things?  Or must we ask why and why again, forever, reaching back and back toward no beginning at all?  Does every cause rely on a prior cause?  Or is there something that stands in need of no cause, but just is?” (p. 9).  In response, Augros unambiguously intends “to show, by purely rational means, that there is indeed a first cause of all things and that this cause must be a mind” (p. 10).  In many ways he simply seeks to fully demonstrate the elegant simplicity and persuasiveness of the ancient Kalam argument so successfully defended in our day by William Lane Craig:  

Premise 1:  Everything that begins to exist has a cause.

Premise 2:  The universe began to exist.

Conclusion:  Therefore, the universe must have a cause.
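
Put schematically (a minimal sketch in standard first-order notation—the symbols B, C, and u, for “begins to exist,” “has a cause,” and “the universe,” are mine, added only for illustration), the syllogism is a simple instance of universal instantiation followed by modus ponens:

\[
\forall x\,\bigl(B(x) \rightarrow C(x)\bigr), \qquad B(u) \;\vdash\; C(u)
\]

Whatever one makes of the two premises, the inference itself is formally valid; the philosophical labor lies entirely in defending the premises—which is precisely what Augros undertakes.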

Then let’s begin!  Whenever we reason we seek to find the causes of things.  To Aristotle:  “Evidently there is a first principle, and the causes of things are neither an infinite series nor infinitely various in kind” (Metaphysics).  On this point, “Twenty-five centuries’ worth of great philosophers and scientists nearly all are agreed” (p. 30).  But this cause is not necessarily temporal!  The universe might well be eternal and still stand in need of a First Cause!  An acting cause, such as a potter making a vase, is simultaneous with, not prior to, the product he’s producing.  “Recognizing causal priority as distinct from temporal priority opens the door to a first cause of an eternal effect” (p. 32).  Thus “the great thinkers who all insist there is a first cause used the expression first cause not to mean (necessarily) a cause before all other causes in time, but a cause before all others in causal power.  It meant a cause of other causes that does not itself depend on any other cause.  It meant, in other words, something that exists and acts all by itself, without deriving its existence or causal action from anything else.  And it meant not a thing stuck in the past, but a thing existing in the present” (pp. 32-33).  Ultimately, “it is impossible for things caused by something else to be self-explanatory.  There must also be something by which things are caused and which is not itself caused by anything” (p. 37). 

Granting the certain existence of a first cause, however, is only the first step in demonstrating the existence of God, Who Is the First Cause and whose Mind sketched the blueprint for the universe—a word whose Latin root means “turned into one.”  Unlike the Greek polytheists, who assigned events to various gods, monotheists following Moses think there is only One true Cause of all that is.  Carefully considered, the material world—matter-in-motion—could not have caused itself and is quite evidently “the first thing from the first cause” (p. 66).  “Matter is not the first cause.  It is impossible for it to be so.  Matter is subject to motion.  The first cause, on the other hand, is not” (p. 60).  Only a non-material Being could be a self-mover, moving everything else.  The ancient Chinese thinker Lao-Tzu noted that “‘to turn a wheel, although thirty spokes must revolve, the axle must remain motionless; so both the moving and the non-moving are needed to produce revolution.’  This reasoning sounds the death knell for the theory that matter is the first cause.  Matter, energy, and fundamental particles are all subject to motion.  The first cause [the axle] is not” (p. 62).   

Thus the first cause must be non-material, incorporeal, spiritual.  Given our immersion in material things it is, admittedly, difficult to conceive of purely non-material realities!  But just as a mathematical point (which has no parts) is not a visible dot on the paper but a necessary, indivisible reality-without-parts, so too there are metaphysical realities that utterly transcend the physical world.  And the first cause, though not material, is “the most intensely existing thing” of all!  There is a hierarchy to the universe, leading from fundamentally material to essentially non-material beings.  Plants are superior to rocks, and animals are better than plants, and human beings are higher than fish and pheasants.  “Mineral, vegetable, animal, human.  These kinds of beings form a ladder of sorts.  Ascending from one rung to another, we find something more capable of including beings within its own being” (p. 91).  Higher beings possess more fullness of being.  On the highest rung, possessing the most being, is the Supreme Being, giving being to all lesser beings.  And since it is axiomatic that “nothing gives what it does not have,” we conclude that everything that exists owes its existence to the One who most fully exists, who simply IS.  

Since we are thinking beings making sense of all sorts of things, it follows that the Supreme Being is the ultimate Thinker.  Even atheistic scientists cannot but acknowledge the seeming intellectual dimension to the cosmos.  Thus Richard Dawkins cautions his fans to beware of taking seriously the “apparent” design of things.  And Stephen Hawking confesses that the “apparent laws of physics” seem to be amazingly well-designed to make for a life-welcoming universe.  But atheists cannot open the door to such non-material realities as “purpose” without bringing into question their materialist dogma.  So the evolutionary biologist Richard Lewontin confessed:  “It is not that the methods and institutions of science somehow compel us to accept a material explanation of the phenomenal world, but, on the contrary, that we are forced by our a priori adherence to material causes to create an apparatus of investigation and a set of concepts that produce material explanations, no matter how counter-intuitive, no matter how mystifying to the uninitiated.  Moreover, that materialism is an absolute, for we cannot allow a Divine Foot in the door” (pp. 147-148).  But, Augros counters, even our limited minds can “understand all things, at least in a general way” and then conceptualize a universe.  Using our limited minds we legitimately envision an Omniscient Mind knowing all things—a First Cause responsible for their existence.  Indeed:  “The intelligence of the first cause of all things explains the look of design everywhere in the universe” (p. 113).  

Rightly discerned, this omnipresent design gives things their distinctive beauty and goodness.  Wonder, both Plato and Aristotle noted, is basic to the philosophic quest—pausing to note the sheer givenness of all that is, reflecting on its mysterious configurations, delving into the why-ness of what’s beheld.  Thus Whittaker Chambers, writing about the sheer beauty of his infant daughter’s ear in Witness, dated his break with Communism to that moment.  He was overwhelmed with wonder while gazing at “the delicate convolutions of her ear—those intricate, perfect ears.  The thought passed through my mind:  ‘No, these ears were not created by any chance coming together of atoms in nature (the Communist view).  They could have been created only by immense design’” (p. 100).  Then there’s a fascinating passage in Sir Arthur Conan Doyle’s Memoirs of Sherlock Holmes, where Watson recalls Holmes reflecting on “What a lovely thing a rose is!”  Gazing at the color and configuration of a moss-rose, the great detective declared:  “There is nothing in which deduction is so necessary as in religion.  It can be built up as an exact science by the reasoner.  Our highest assurance of the goodness of Providence seems to me to rest in the flowers.  All other things, our powers, our desires, our food, are really necessary for our existence in the first instance.  But this rose is an extra.  Its smell and its color are an embellishment of life, not a condition of it.  It is only goodness which gives extras, and so I say again that we have much to hope from the flowers.”  As we wonder (with Chambers and Holmes) at the beauty and goodness of beings, we cannot but think there must be a first cause, a Supreme Being, responsible for all this.  

In the book’s “Epilogue,” Augros notes he stands on “the shoulders of giants” such as Aristotle and Aquinas.  Though primarily relying on ancient and medieval thinkers and differing in his approach from Rene Descartes, he shares some of the “first modern” philosopher’s confidence that “the existence of God would pass with me as at least as certain as I have ever held the truths of mathematics.”  Thinkers such as Descartes have ever worked by “deducing the logical consequences of timelessly valid principles.  It is not by chance that those principles have arisen in the thoughts of great minds again and again down through the centuries.  They are the common heritage of the human mind.  ‘Nothing comes from nothing.’  ‘What is put into action depends on what acts by itself.’  ‘Nothing gives what it does not have.’  ‘Some things are nobler than others.’  And on and on.  Such are the laws of being, expressed in terms too universal for science to employ, let alone refute.  We are free to ignore them, since the explicit recognition of their truth is in no way necessary for our daily existence.  . . . .  They just quietly await our notice.  The conclusion that God exists, when deduced from principles like these, is true and hard-won knowledge, worthy of the name” (p. 208).  

That such laws of being point persuasively to the existence of God is the conclusion of this highly readable treatise.  Thus, with Thomas Hibbs, Honors College Dean at Baylor University, I say:  “I know of no other book about the existence and nature of God that is as readable and enjoyable as this one.” 

280 Losing Our Mind?

   In 1960 America’s schools were widely considered among the world’s best.  Then came the ’60s revolution, which significantly changed the culture and eroded educational standards.  Within two decades concerns about the schools’ quality became amplified, and in 1983 a presidential committee issued A Nation at Risk to alert the public to manifest deficiencies in our schools.  In 1987 Allan Bloom voiced his lament for the quality of university education in The Closing of the American Mind (which became a surprise best-seller).  Mounting concern in political circles led to federal initiatives such as President George W. Bush’s “No Child Left Behind,” seeking to arrest the decline by insisting that certain standards be met to ensure children were receiving a decent education.  But despite all the discussions—and the massive expenditure of funds—America’s K-12 students now score near the bottom in standardized tests administered in industrialized nations.    

To assess this issue, Mark Bauerlein and Adam Bellow edited a volume titled The State of the American Mind:  16 Leading Critics on the New Anti-Intellectualism (West Conshohocken, PA:  Templeton Press, c. 2015).  In their Foreword—“America:  Have We Lost Our Mind?”—the editors state that Americans had historically been characterized by “independent thought and action, thrift and industriousness, delayed gratification and equal opportunity” (#92 in Kindle).  Such traits had largely disappeared by the mid-80s as traditional content-focused courses, designed to transmit the core knowledge and wisdom of the past, were replaced by student-centered activities aiming to enhance self-esteem and critical thinking.  Consequently:  “Instead of acquiring a richer and fuller knowledge of U.S. History and civics, American students and grown-ups display astounding ignorance of them, and their blindness is matched by their indifference to the problem” (#157).  The “rugged individualism” of the past has dissolved into rampant self-absorption.  “Not only has self-reliance become a spurious boast (‘You didn’t build that’), but dependency itself has become a tactical claim” (#157).  Rather than celebrating their freedom to think and debate, large numbers of “Americans accept restrictions on speech, freedom of association, rights to privacy, and religious conscience” (#165).  The closing of the American mind seems to continue, especially in the nation’s educational institutions.

E. D. Hirsch Jr.’s “The Knowledge Requirement:  What Every American Needs to Know” updates his 1987 manifesto, Cultural Literacy, urging the schools to recover their commitment to transmitting knowledge of history, civics, mathematics, science, and literature.  Now armed with the fact that SAT scores have declined for 50 years, Hirsch restates his case and blames the decline on the fact that “general knowledge” is not emphasized in the schools.  As teachers stress “skills” rather than “mere facts,” many students learn very little and demonstrate it by performing poorly on international exams.  Ironically, Mark Bauerlein, in “The Troubling Trend of Cultural IQ,” notes that IQ scores have significantly increased during the past century—we’re actually getting smarter!  But higher IQs have not resulted in more knowledge.  Thus an alarming number of high school graduates (two-thirds of the students entering the Cal State University system) need remedial courses in math and writing, demonstrate a dwindling vocabulary, and have limited general knowledge. 

In “Biblical Literacy Matters,” Daniel Dreisbach draws a dismal portrait that should concern everyone, for there has been “an alarming decline in biblical literacy” that includes an “ignorance of key biblical texts, stories, characters, doctrines, themes, rituals, and symbols” (#749).  Compared with George Washington, who often brought biblical phrases into his writings, today’s politicians frequently prove inept when trying to appear biblically astute—e.g. Howard Dean citing Job as his favorite New Testament book!  “In his 1800 assessment of education in America, Pierre Samuel Du Pont de Nemours observed, ‘Most young Americans . . . can read, write and cipher.  Not more than four in a thousand are unable to write legibly—even neatly.’  He attributed America’s high literacy rate to frequent Bible reading, which, he also said, ‘tends to increase and formulate ideas of responsibility’” (#864).  Two hundred years later we can hardly say the same!  And this loss of biblical literacy bodes ill for a nation whose laws and political premises are so suffused with biblical precepts.

In “The Rise of the Self and the Decline of Intellectual and Civic Interest,” Jean Twenge identifies one of the most important problems plaguing modern education.  For decades teachers have stressed the ultimate importance of self-esteem.  All students, we’re told, must feel good about themselves—and any problems they have must be attributed to a lack of self-esteem.  Believing in yourself—not learning history or mastering calculus or becoming virtuous—is the pedagogical goal!  So today’s students routinely consider themselves masterful mathematicians or writers though their test scores demonstrate the converse.  Their inflated self-evaluation is bolstered by the rampant grade inflation everywhere evident.  In 2012, 37 percent of high school seniors had an A average.  Whereas in the 1960s the most common grade in college “was a C, by 2000 the most common grade was an A” (#2269).  It’s revealing that “the ethnic group with the lowest self-esteem is Asian Americans” who “also demonstrate the best academic performance, possibly because their culture emphasizes hard work rather than self-belief” (#2370).  

In “We Live in the Age of Feelings,” radio host Dennis Prager argues that the ’60s bequeathed us an age in which we fail to reason rightly.  Rather than wondering if something is “true” or “right,” today’s youngsters almost inevitably ask “how do I feel about it?”  They value their own feelings rather than the well-being of others, their own response to music and art rather than classical aesthetic criteria, and their own sexual satisfaction (e.g. cohabitation, abortion, sodomy) rather than the good of society (e.g. marriage and children).  Feelings fuel the ubiquitous concern of the young for “social justice”—meaning favoritism for the poor and disadvantaged, supporting the “poor man, even if he is in the wrong” (#3407).  Such convictions lead to educational “reforms” such as rewriting history books to exaggerate the role of women, homosexuals, and ethnic minorities, thereby erasing the truth of the American story.  

In a more foundational essay, R.R. Reno, the editor of First Things, identifies “The New Antinomian Attitude” as the “greatest threat” we face, for it has led to “an Empire of Desire” that flourishes in our postmodern world and corrupts our culture.  “Ministered to by a therapeutic vocabulary of empowerment, the pedagogy of multiculturalism, and our dominant, paradoxical moral code of nonjudgmentalism, this empire has come to dominate the American Mind” (#3741).  Whatever we want we will have!  Not even stubborn realities such as sexual differences will deter us.  In adopting this antinomian attitude “we’ve empowered the dictatorship of relativism, which is closely allied with the harrying mentality of political correctness” (#3881).  And with this we have effectively constricted the reasonableness needed for a healthy culture.  So we are, in fact, losing our mind!

* * * * * * * * * * * * * * * * * * * * * * 

Though I generally encourage reading the books I review, sometimes I think people should merely know about a book without laboring to digest it first-hand.  So though the information in Terry Moe’s Special Interest:  Teachers Unions and America’s Public Schools (Washington, D.C.:  Brookings Institution Press, c. 2011) is truly important, the treatise is clearly designed for scholars rather than the general public.  His thesis is clear and disturbing:  though individual teachers may very well be deeply committed to their students, the teachers unions have another, overriding objective—protecting and advancing the welfare of their members.  Thus we have such things as New York City’s “Rubber Rooms,” where 700 teachers daily do nothing (since they are incompetent) while continuing to draw their salaries and full benefits.  They’re teachers who don’t teach!  But they’re protected by their union, which makes them impossible to fire.   

While not bad enough to be sent to the Rubber Rooms, another 5-10 percent of our teachers are mediocre at best and clearly harming their students.  If we could merely replace the bottom 10 percent, “‘we could dramatically improve student achievement.  The U.S. could move from below average in international comparisons to near the top’” (#157).  Educational reformers know this, but every effort to change the system has failed for one simple reason:  unions oppose efforts to discipline ineffective teachers, to allow school choice, or to institute merit pay.  Before 1960 unions had little power and exerted little influence.  The National Education Association (now the largest union of any kind in the U.S.) was a professional organization largely controlled by school administrators.  In the ’60s, however, the unions began to win legislative and judicial victories that enabled them, by 1980, to establish “what was essentially a new system of public education” (#218).  “The rise of the teachers unions, then, is a story of triumph for employee interests and employee power.  But it is not a story of triumph for American education” (#1419).  

The unions mastered the art of financing politicians (almost exclusively Democrat) who would in turn generously appropriate money to the schools and require union membership.  Unions “were the nation’s top contributors to federal elections from 1989 through 2009” (#251).  They also effectively marshal their members as “volunteers” to work in important campaigns (especially school board elections and bond proposals).  And they have effectively aligned themselves with other powerful public sector unions to mutually enrich themselves from the public purse.  When confronted with the dismal record of student achievement (near the bottom compared with other developed countries), the unions loudly insist the problem is simply financial—given enough money, all would be well in our schools!  Yet the U.S. spends “more than twice as much on education—per student, adjusted for inflation—as it spent in 1970 (and more than three times as much in 1960)” (#296).  Unions insist that small classes ensure better learning and demand we hire more teachers to man small classrooms.  Yet whereas in 1955 there were 27 students per teacher, there are now 14, and the students are demonstrably less well-educated.  Smaller classes mainly mean more teachers—and more union dues—but less effective instruction.

Yet amidst the generally dismal story of America’s schools there are a few “small victories for sanity.”  New Orleans has witnessed some “path-breaking” improvements launched in the wake of the devastation wrought by Hurricane Katrina in 2005!  The city’s teachers were dismissed, and “the local union and its formidable power” was crushed (#4414).  Then state and local officials were free to establish “a full-blown choice system filled with autonomous charter schools” now enrolling 60 percent of the city’s children (#4420).  Short of a hurricane, however, constructive change rarely comes in the nation’s largest cities!  Consider Washington, D.C., dead last in test scores and “long known for having one of the worst, most incompetently run school systems in the country” despite its lavish funding (#4753).  When Adrian Fenty was elected mayor in 2007 he resolved to reform the system and brought Michelle Rhee on board as Superintendent of Schools to do so.  She sought to “build a new personnel system around performance:  rewarding good teachers for their success, creating strong incentives to promote student achievement, and—just as important—attracting a new breed of teachers” who would improve things (#4811).  But Rhee soon exited because the unions opposed her every move and helped orchestrate the defeat of Mayor Fenty at the next election.  

In Moe’s opinion, unless the teachers unions are radically curtailed, there is no hope for the children in public schools.  The symbiotic bond between the unions and the Democrat Party must be dissolved.  School vouchers, school choice, charter schools, and new technological options offer positive alternatives to the established order that may in time improve things.  But ultimately, for any meaningful reforms to take place, the teachers unions must somehow be sidelined.    

                                         * * * * * * * * * * * * * * * * * * * * * 

In John Dewey and the Decline of American Education:  How the Patron Saint of Schools Has Corrupted Teaching and Learning (Wilmington, DE:  ISI Books, c. 2006) Henry T. Edmondson III demonstrates the old adage that “ideas have consequences.”  After briefly noting the general consensus regarding the decline of the nation’s schools during the 20th century, he declares that John Dewey’s educational philosophy explains much of it.  Indeed, his dolorous “impact on American education is incalculable” (p. xiv).  Committed to the pragmatic proposition that truth is “what works” when solving problems, and holding “that belief in objective truth and authoritative notions of good and evil are harmful to students,” Dewey disdained metaphysics, ethics, history and theology.  (His anti-religious statements rival those of militant atheists such as Nietzsche and Marx, whose moral nihilism and socialist aspirations he sought to promote in the schools).  Deeply influenced by Rousseau’s Emile, he considered books—and especially the classics that constituted the core of traditional pedagogy—impediments to the experiential “learning-by-doing” he favored.  Students should work out their own moral standards through “values clarification” discussions rather than study Aristotle’s Ethics or McGuffey’s biblically-laced Readers.  

Surveying the academic scene in 1964, historian Richard Hofstadter said:  “‘The effect of Dewey’s philosophy on the design of curricular systems was devastating’” (p. 37).  Rather than studying the traditional “liberal arts,” the schools now seek to “liberate” students from the shackles of the past, encouraging “creativity” and engendering “self-esteem.”  Socialization—most notably progressive social change—increasingly replaced learning and scholarly proficiency as the central mission of the schools.  “Thanks in no small part to Dewey,” Edmondson says, “much of what characterizes contemporary education is a revolt against various expressions of authority:  a revolt against a canon of learning, a revolt against tradition, a revolt against religious values, a revolt against moral standards, a revolt against logic—even a revolt against grammar and spelling” (p. 56).  

To rightly respond to the educational problems we face, Edmondson invokes Flannery O’Connor, who simply advised parents:  anything that John Dewey says “do, don’t do.”  To make our schools good for our children, the ghost of Dewey must be exorcised!  Banishing such things as “whole language learning” (which leaves students unable to read and spell well), “fuzzy” math (which replaces memorizing with analysis) and “values clarification” would be a healthy place to begin!  Making the study of history central to the curriculum is essential—as is the discipline of memorizing facts about the past.  Learning logic—unlike indulging in “critical thinking”—would equip youngsters to actually think rather than emote.  In short:  we must rescue our children from the pernicious pragmatism of John Dewey.

                                       * * * * * * * * * * * * * * * * * * * * 

Years ago I read and favorably reviewed David Gelernter’s Drawing Life:  Surviving the Unabomber—a moving account written by one of the nation’s most prestigious computer experts, a professor at Yale and an Orthodox Jew, who was seriously injured by one of the notorious Unabomber’s mailed explosives.  Subsequently I’ve found various of Gelernter’s books (several of them historical monographs) worth perusing.  This is quite true of his America-Lite:  How Imperial Academia Dismantled Our Culture (and Ushered In the Obamacrats) (New York:  Encounter Books, c. 2012).  “Everyone knows,” Gelernter begins, “that American civilization changed in the 1960s and ’70s” (#20).  “In 1957, Americans were pleased with America and proud of it,” but within 20 years “that proud confidence was gone, crumbled like mud-bricks into flyblown clouds of dust” (#105).  Revolutionary cultural changes—evident in an erosion of public civility and “the etiquette that used to govern relations between men and women”—have fundamentally changed America.  In significant ways, America has been Europeanized.  Responsible for these changes are those Gelernter dubs PORGIs—“post-religious globalist intellectuals”—who disdain both patriotism (love of country) and patriarchy.  

The PORGIs have effectively orchestrated a cultural “coup” by occupying the universities and making their prestigious degrees the price of admission to social prominence and financial success.  They are energized by their leftist ideology, which is “a new religion” morphing into such things as “earth worship” and the “sacralization of homosexuality” clearly akin to “ancient paganisms” (#402).  Addicted to—and intoxicated by—various theories (e.g. social justice, global warming, affirmative action), the intellectuals portray themselves as champions of truth and righteousness.  But, strangely enough, they have little interest in any concrete facts that might refute their theories!  As Hannah Arendt said, evaluating the social revolutionaries of the ’60s:  “‘The trouble with the New Left and the old liberals is the old one—complete unwillingness to face facts, abstract talk, often snobbish and nearly always blind to anybody else’s interest’” (#356).  

Rather than dealing with the oft-inconvenient facts before them, today’s intellectuals “invent theories and teach them to Airheads.  Airheads learn them and believe them” (#286).  Airheads “never need to think at all” since they need only repeat the theories and dogmas fed them by their professors.  To Gelernter, “Barack Obama and his generation of airheads, the first ever to come of age after the cultural revolution, are unique in American history.  All former leftist movements were driven by ideology.  Obama’s is driven by ignorance” (#1479).  The president “himself is merely a mouth for garden-variety left-liberal ideas—a mouth of distinction, a mouth in a million, but a mere mouth just the same.  He is important not as statesman but as symptom, a dreadful warning.  He is important not because he is exceptional but because he is typical.  He is the new establishment; he represents the post-cultural revolution PORGI elite” (#1491).  Obama’s historical ignorance distresses Gelernter!  That a president could refer to “the bomb” (rather than the bombs) that “fell on Pearl Harbor,” or brag that his great-uncle helped liberate Auschwitz (a Polish camp liberated by the Russians), demonstrates his prestigious but vacuous “education” in the most elite schools of the country!  “What kind of mishmash inhabits this man’s brain?” (#1532).  

Unfortunately, “There is a pattern here.  This president is not an ideologue; he does not reach that level.  He is a PORGI Airhead:  smart, educated, ignorant.  And there is a deeper, underlying pattern.  Obama has learned theories about the police versus black men.  They are wrong.  He has learned theories about ‘real causes’ of terrorism and about ‘isolated extremists’ and ‘Islamophobia.’  They are wrong.  He applied his theories just the way he was taught.  But the theories, being wrong, gave him wrong answers.  That is the PORGI elite, the new establishment” (#1614).  Obama will soon be replaced.  But the ’60s revolution has succeeded inasmuch as “a new generation of Obamacrats enters America’s bloodstream every year, in late spring, when fresh college graduates scatter like little birds or puffs of dandelion seed to deliver a new crop of Airhead left-wingers to the nation and the world” (#1497).  Knowing little about history, having read little literature, and lacking any grounding in logic or philosophy, these graduates have been trained by their college experience primarily to be faithful leftists.  

Despite their impoverished education, their growing power has resulted in what Gelernter calls “Imperial Academia,” a confirmation of President Dwight D. Eisenhower’s 1961 warning concerning the pernicious power of an emergent “scientific-technological elite,” which (through government funding and political influence) posed a threat “gravely to be regarded.”  Massive amounts of federal money now flow into academic institutions, which in turn provide the government with highly trained experts who support big-government solutions to the nation’s problems. 

# # # 

279 Stalin’s Harvest of Sorrow

 Since WWII Jews around the world have routinely resolved to “never forget” Hitler’s brutal effort to destroy the Jewish people.  So too all of us should determine never to forget the far costlier devastation visited upon Russia by Joseph Stalin.  In concentration camps such as Belsen and Auschwitz the Nazis slaughtered some six million people, but a decade earlier, in the Ukraine and adjacent Cossack areas of southern Russia, the Bolsheviks killed nearly twice as many peasants—more than all the deaths in WWI.  The late English historian Robert Conquest devoted much of his life to finding, rigorously documenting, and publishing the truth regarding what transpired in the Soviet Union between WWI and WWII.  One of his most powerful treatises is Harvest of Sorrow:  Soviet Collectivization and the Terror-Famine (New York:  Oxford University Press, c. 1986).  The book’s title is taken from “The Armament of Igor,” a poem lamenting that:  “The black earth / Was sown with bones / And watered with blood / For a harvest of sorrow / On the land of Rus’.”  

For many centuries Russian peasants were serfs—working the land of aristocratic landowners who often exploited them.  Reform movements in the 19th century, much like anti-slavery movements in America, led to their liberation in the 1860s.  While their lot remained harsh by modern standards, it slowly improved, though like sharecroppers following the Civil War in America they were generally landless and impoverished in a nation firmly controlled by the Tsar and aristocracy.  Thus the Bolshevik revolution in 1917 was initially welcomed by peasants, who often seized and carved up the large estates they worked on, hoping for the better life promised by the upheaval.  Yet they “‘turned a completely deaf ear to ideas of Socialism’” (p. 44).  As Boris Pasternak made clear in a passage in Doctor Zhivago:  “‘The peasant knows very well what he wants, better than you or I do . . . .  When the revolution came and woke him up, he decided that this was the fulfillment of his dreams, his ancient dream of living anarchically on his own land by the work of his hands, in complete independence and without owing anything to anyone.  Instead of that, he found he had only exchanged the old oppression of the Czarist state for the new, much harsher yoke of the revolutionary super-state’” (p. 52).  

Realizing that farmers’ innate love of land ownership and free markets militated against his totalizing ideology, Lenin noted that he would ultimately “‘have to engage in the most decisive, ruthless struggle against them’” (p. 45).  He had found that Communists such as himself knew little about economics—as was evident when he tried to abolish money and banking—and quickly launched the New Economic Policy, effectively restoring important aspects of capitalism.  He also had to find effective ways to encourage agricultural productivity, so he delayed collectivizing agriculture in the 1920s.  By the end of that decade, however, Joseph Stalin had seized sufficient power to undertake the radical restructuring of Russian agriculture.  A 1928 grain crisis prompted Party bureaucrats to mandate production quotas, taxes, and distribution mechanisms.  They also needed scapegoats and singled out the best, hardest-working, and most prosperous farmers (the kulaks, who owned a few acres and a handful of animals and even hired laborers as needed), who seemed to qualify as closet capitalists and “wreckers.”  As Stalin declared:  “‘We have gone over from a policy of limiting the exploiting tendencies of the kulak to a policy of liquidating the kulak as a class’” (p. 115).   

Stalin and the Soviet Politburo established the All Union People’s Commissariat of Agriculture, staffed by alleged “experts,” which was authorized to push the peasants into collectives and set utterly utopian, ludicrous goals for yearly harvests.  Such policies (part of Stalin’s Five Year Plan) led to an “epoch of dekulakization, of collectivization, and of the terror-famine; of war against the Soviet peasantry, and later against the Ukrainian nation.  It may be seen as one of the most significant, as well as one of the most dreadful, periods of modern times” (p. 116).  Farmers who failed to meet their quotas or “hoarded” grain (even seed grain!) were arrested and resettled in remote regions if not shot or sent to camps.  Conquest documents, in mind-numbing, heart-rending detail, this deliberate destruction of those who stood in the way of Stalin’s grand socialistic agenda.  To the Party, in the words of a novelist, “‘Not one of them was guilty of anything; but they belonged to a class that was guilty of everything’” (p. 143).  And in the “class struggle” intrinsic to Marxist analysis, evil classes must be destroyed.  Sifting through all the documents available to him, Conquest estimates that at least fourteen million peasants perished.  “Comparable to the deaths in the major wars of our time,” Stalin’s “harvest of sorrow” may rightly be called genocide.  

Above all, Stalin targeted the peasants of the Ukraine, the Don, and the Kuban, where a massive famine transpired in the early ’30s.  Party activists (generally dispatched from the cities and lacking any knowledge of agriculture) presided over the process.  One of them recalled:  “‘With the rest of my generation I firmly believed that the ends justified the means.  Our great goal was the universal triumph of Communism, and for the sake of that goal everything was permissible—to lie, to steal, to destroy hundreds of thousands and even millions of people, all those who were hindering our work or could hinder it, everyone who stood in the way’” (p. 233).  One of the few Western journalists daring to discern and tell the truth, Malcolm Muggeridge, said:  “‘I saw something of the battle that is going on between the government and the peasants.  The battlefield is as desolate as any war and stretches wider; stretches over a large part of Russia.  On the one side, millions of starving peasants, their bodies often swollen from lack of food; on the other, soldier members of the GPU carrying out the instructions of the dictatorship of the proletariat.  They have gone over the country like a swarm of locusts and taken away everything edible; they had shot or exiled thousands of peasants, sometimes whole villages; they had reduced some of the most fertile land in the world to a melancholy desert’” (p. 260). 

Consequently, Soviet agriculture imploded.  In 1954 Nikita Khrushchev admitted that despite the more highly mechanized farming techniques in the collectives, “Soviet agriculture was producing less grain per capita and fewer cattle absolutely than had been achieved by the muzhik with his wooden plough under Tsarism forty years earlier” (p. 187).  And what was true of agriculture was true of the rest of the USSR under Communist rule—socialism inevitably destroys whatever it controls.  

* * * * * * * * * * * * * * * * * * * * 

One of the last of Robert Conquest’s books dealing with 20th century Russia is Stalin:  Breaker of Nations (New York:  Penguin Books, c. 1991).  Benefitting from “the great flow of new information” recently available from Russian archives, he endeavored to portray one of the few men in history who harmed both his own country and much of the world.  Immersed within a system rooted in “falsehood and delusion”—what Boris Pasternak described as the “reign of the lie”—Stalin was an arch-deceiver who misled and betrayed virtually all his associates and allies.  Indeed, he “invested his whole being in producing illusion or delusion.  It was above all this domination by falsehood which kept even the post-Stalin Soviet Union in a state of backwardness, moral corruption, economic falsification and general deterioration until in the past decade the truth became too pressing to be avoided” (p. 325).  He was, as Churchill labeled him, “an unnatural man”—the personification of the “moral nihilism” basic to both Nazism and Bolshevism.  Along with Hitler, Mao, Ho Chi Minh, Fidel Castro, and Pol Pot—brutal ideologues determined to shape the world in accord with their fantasies—he contributed much to one of the bloodiest centuries in history.

Stalin was born in 1879 in Georgia, an ancient nation in the Caucasus annexed to Russia by Tsar Alexander I in 1801.  Christened Iosif Vissarionovich Dzhugashvili, he assumed the name Stalin (“man of steel”) when he joined underground revolutionary activities designed to overthrow the Tsar.  Periodically arrested—and sent into exile in Siberia—he gained renown for his self-discipline, party loyalty, writing skills, and ability to get things done.  He was, however, a rather minor figure until after the 1917 Bolshevik Revolution.  Thereafter he proved useful to Lenin, who valued his loyalty as well as his willingness to manage unpleasant tasks.  In 1922, when Lenin suffered the first of several strokes, Stalin began to maneuver himself into powerful positions within the Politburo, jockeying with Trotsky for preeminence.  Lenin apparently distrusted him, however, and disapproved of him as his successor, confiding to his wife, in a document hidden from the public for 33 years:  “‘Stalin is too rude, and this defect, though quite tolerable in our midst and in dealings among us Communists, becomes intolerable in a General Secretary.  This is why I suggest that the comrades think about a way to remove Stalin from that post and appoint another man who in all respects differs from Comrade Stalin in his superiority, that is, more loyal, more courteous, and more considerate of comrades . . . .’” (p. 101).  According to one of his secretaries, Lenin had resolved “‘to crush Stalin politically’” but died in 1924 before doing so.

Unlike Trotsky, who advocated the primacy of world-wide revolutionary struggle, Stalin determined first to establish “Socialism in One Country,” and he rallied (through cajolery, intrigue, and slander) enough followers to impose his will on the USSR.  Once in power, he “planned to launch the party on an adventurist class-war policy of crash industrialization and collectivization, adventurist beyond even the most extreme of the plans hitherto rejected as beyond the pale for their Leftism” (p. 141).  “Stalinism was, in part at least, the result of a simple preconception—the nineteenth-century idea that all social and human actions can be calculated, considered and predicted” (p. 322).  Such policies, crafted by alleged “experts” who often knew very little about agriculture or industry or anything but Party ideology, were ruthlessly imposed and quickly impoverished virtually everyone but Party functionaries.  As definitively described in Conquest’s Harvest of Sorrow (Stalin’s liquidation of the Ukrainian peasantry, “the greatest known tragedy of the century,” which killed some fifteen million souls) and his The Great Terror (Stalin’s elimination of all rivals within the Communist Party), few monsters in all of history have ruled so barbarously.    

When WWII broke out, Stalin did whatever was necessary to further his own objectives.  Thus he cheerfully aligned himself with Hitler when it looked like the two dictators would help each other, expanding their power over vast sections of Europe.  Betrayed by Hitler when the Nazis invaded Russia, Stalin then turned to Churchill and Roosevelt—flattering and dissembling and manipulating these “allies” to secure invaluable materials with which to drive back the Germans and ultimately control Eastern Europe.  Quite capable of charming those he encountered, he favorably impressed visitors such as America’s presidential envoy Harry Hopkins and Britain’s Foreign Secretary Anthony Eden.  President Roosevelt, though warned to be careful in negotiating with “Uncle Joe,” followed his personal “hunch” and determined to “give him everything I possibly can and ask for nothing in return,” trusting him not to “annex” any territory and to “work with me for a world of democracy and peace” (p. 245).  In Conquest’s view, FDR’s naive judgment “must be among the crassest errors ever made by a political leader” (p. 245).   

Following WWII, Stalin resumed his ruthless policies—waging a “cold war” abroad and purging all possible enemies of his regime within Russia—before dying in 1953.  “In real terms, Milovan Djilas’s conclusion stands up:  ‘All in all, Stalin was a monster who, while adhering to abstract, absolute and fundamentally utopian ideas, in practice had no criterion but success—and this meant violence, and physical and spiritual extermination’” (p. 327).  To understand the man and his evil deeds, Conquest’s Stalin:  Breaker of Nations is a trustworthy source with which to begin.

                                           * * * * * * * * * * * * * * * * * 

When Stalin’s daughter, Svetlana Alliluyeva, fled the Soviet Union in 1967, she brought with her a manuscript—Twenty Letters to a Friend (New York:  Discus Books, c. 1967)—describing important aspects of her life, which became an instant best-seller published in many languages.  She wrote the book in 35 days in 1963 simply to put her thoughts on paper and did not envision publishing it while living in her own country.  She mainly recorded memories of her mother and father, bearing witness to the insatiable longing children have to be with and love their parents; but in the process she tried to make sense of what happened around her, and thus gives us insight into what took place in Russia during her lifetime, for:  “The twentieth century and the Revolution turned everything upside down” (p. 30).

Several years after her father died she took her mother’s family name, Alliluyeva—a word akin to “Hallelujah,” meaning “Praise ye the Lord.”  She fondly remembers both her mother and her maternal grandparents.  When Stalin married her mother, her grandparents became part of a nurturing extended family.  Grandfather Alliluyev was born a peasant in Georgia but became a skilled mechanic who joined the Russian Social Democratic Workers’ Party in 1898 and retained an old-fashioned revolutionary idealism and personal integrity until he died in 1945.  Her grandmother was also from Georgia, a descendant of German settlers who spoke Georgian with a German accent.  Reared in a Protestant church, she “was always religious,” stoutly resisting the atheistic propaganda surrounding her.  “By the time I was thirty-five,” Svetlana says, “I realized that grandmother was wiser than any of us” (p. 54).  That wisdom, she came to believe, was nurtured by a religious perspective she ultimately shared.   

Pondering her maternal grandparents’ influence, Svetlana credited their love for Georgia with instilling in her a love for the beauty of nature.  “O Lord,” she wrote, “how lovely is this earth of yours and how perfect, every blade of grass, every flower and leaf!  You go on supporting man and giving him strength in this fearful bedlam where Nature alone, invincible and eternal, gives solace and strength, harmony and tranquility of spirit” (p. 82).  Amidst all the destruction wrought by various “madmen” who ravage the earth, its “beauty and majesty” need to be revered.  Still more:  “It seems to me that in our time faith in God is the same thing as faith in good and the ultimate triumph of good over evil” (p. 83).  Consequently, “By the time I was thirty-five and had seen something of life, I, who’d been taught from earliest childhood by society and my family to be an atheist and materialist, was already one of those who cannot live without God” (p. 83).   

Svetlana’s mother, Nadya, was born in the Caucasus but grew up in St. Petersburg, immersed in revolutionary activities.  There she met and soon married Joseph Stalin, much older than she, whose first wife had died.  Nadya was sincerely devoted to the revolutionary cause, strictly followed Party rules, and was willing to sacrifice her all for the good of the people.  Thus she worked a great deal and spent limited time with her children, though when present she orchestrated lots of fun and games.  Stalin himself proved to be a poor husband, so:  “Because my mother was intelligent and endlessly honest, I believe her sensitivity and intuition made her realize finally that my father was not the New Man she had thought when she was young, and she suffered the most terrible, devastating disillusionment” (p. 117).  In 1932, following an argument with him regarding the genocidal famine taking place in the Ukraine pursuant to Stalin’s orders, she went to her room and killed herself with a pistol, though Svetlana was told she had died of appendicitis.  “Our carefree life, so full of gaiety and games and useful pastimes, fell apart the moment my mother died” (p. 133).  Svetlana was six years old.  

“For ten years after my mother died, my father was a good father to me” (p. 133).  He had always been the more affectionate parent, making sure Svetlana was well cared for in every way, including an excellent education.  But when she finished her schooling and became more independent, their relationship frayed.  Discovering the real reason for her mother’s death while reading an English magazine further depressed her.  By now she was also aware of the growing list of classmates, friends, and relatives who had been sent into exile or killed under her father’s rule.  When only seventeen she met and fell in love with Alexei Kapler, a noted filmmaker, who seemed to her to be “the cleverest, kindest, most wonderful person on earth” (p. 187).  Soon thereafter Kapler was arrested and sentenced to the Gulag for five years, apparently for daring to court Stalin’s daughter!  “After that my father and I were estranged for a long time.”  Indeed, “I was never again the beloved daughter I had once been” (p. 192).  In 1944 she married Grigory Morozov, a fellow university student.  Stalin didn’t approve of him either—both Kapler and Morozov were Jews, and he harbored a deep anti-Jewish prejudice.  He refused to meet him and welcomed the news that they divorced soon after she gave birth to a son.  She then married the son of a prominent Bolshevik, with whom she had a daughter.  This marriage garnered her father’s approval but quickly dissolved.   

Despite their estrangement, father and daughter occasionally spent time together following WWII.  She found him difficult to talk with and thought the obsequious men surrounding him (Beria, Malenkov, Bulganin) odious.  The Communist Party hardly resembled what was envisioned by sincere revolutionaries in 1917.  It “had nothing in common with the spirit of my grandfather and my grandmother, my mother, the Svanidzes and all the old Party people I knew.  It was all hypocritical, a caricature purely for show” (p. 207).  How superior were the simple people of the “old Russia,” such as Stalin’s own mother!  Her “grandmother had principles of her own.  They were the principles of one who was old and God-fearing, who’d lived a life that was upright and hard, full of dignity and honor.  Changing her life in any way whatever was the furthest thing from her mind.  She passed on all her stubbornness and firmness, her puritanical standards, her unbending masculine character and her high requirements for herself, to my father.”  Still more, when Svetlana visited her paternal grandmother’s grave, she wondered how she could think of her “without my thoughts turning to God, in whom she believed so devoutly” (p. 214).  In 1962, less than a decade after her father died, Svetlana was baptized in the Orthodox Church.  For her, she explained:  “The sacrament of baptism consists in rejecting evil, the lie.  I believed in ‘Thou shalt not kill,’ I believed in truth without violence and bloodshed.  I believed that the Supreme Mind, not vain man, governed the world.  I believed that the Spirit of Truth was stronger than material values.  And when all of this had entered the heart, the shreds of Marxism-Leninism taught me since childhood vanished like smoke.”  

In the godless world of Stalin’s USSR, however, there was little to celebrate.  For his daughter, nothing “turned out well” for those she knew.  “It was as though my father were at the center of a black circle and anyone who ventured inside vanished or perished or was destroyed in one way or another” (p. 231).  Yet despite it all Svetlana found reason for hope.  Much about the Russian character evident in her faithful nurse, Alexandra Andreevna, “Granny,” still survives.  “But what is good in Russia is traditional and unchanging” and ultimately “it is this eternal good which gives Russia strength and helps preserve her true self” (p. 232).  When Svetlana’s mother died, “Granny” became “the only stable, unchanging thing left.  She was the bulwark of home and family, of what, if it hadn’t been for her, would have gone out of my life forever” (p. 237).  Though not conventionally religious, she retained a deeply moral perspective and faith.  

No doubt influenced by both her maternal grandmother and “Granny,” Svetlana developed a deeply religious conviction.  “The Good always wins out,” she said.  “The Good triumphs over everything, though it frequently happens too late—not before the very best people have perished unjustly, senselessly, without rhyme or reason” (p. 242).  She had witnessed how her father and his revolutionary comrades “tried to do good by doing evil” and ruthlessly “sacrificed senselessly” thousands of talented human beings (p. 244).  Yet she also knew that:  “Everything on our tormented earth that is alive and breathes, that blossoms and bears fruit, lives only by virtue of and in the name of Truth and Good” (p. 245).