298 Forensic Faith

    J. Warner Wallace is a retired Los Angeles County detective, noted for appearing repeatedly on NBC’s Dateline, Fox News, and Court TV, explaining thereon how to conduct “cold case” investigations.  Until he was 35 years old, he was an atheist, a religious skeptic skilled in dissecting and mocking Christian beliefs.  But, as he says in a brief booklet entitled Alive:  A Cold-Case Approach to the Resurrection (Colorado Springs:  David C. Cook, c. 2014), he “heard a pastor preach a sermon that described the resurrection of Jesus,” apparently believing “Jesus rose from the dead and was still alive today” (#6).  His interest piqued, Wallace “decided to investigate the resurrection as I would any unsolved case from the distant past.  My journey led me out of atheism to the truth of Christianity” (#12).  Subsequently, he began reading the Gospels in light of the principles basic to his work as a homicide detective.  Looking back, he recalls:  “Somewhere on my journey from ‘belief that’ to ‘belief in,’ a friend told me about C.S. Lewis.  After reading Mere Christianity, I purchased everything Lewis had written.  One quote from God in the Dock stuck with me through the years.  Lewis correctly noted, ‘Christianity is a statement which, if false, is of no importance, and if true, is of infinite importance.  The one thing it cannot be is moderately important’” (#153).  

Though many great apologists (from Tertullian to Calvin to David Limbaugh) have had legal training, detective Wallace brings a unique background to his work, for:  “There are many similarities between investigating cold cases and investigating the claims of Christianity.  Cold-case homicides are events from the distant past for which there is often little or no forensic evidence” (#136).  In Cold-Case Christianity:  A Homicide Detective Investigates the Claims of the Gospels (Colorado Springs:  David C. Cook, c. 2013), Wallace effectively uses examples from his detective days to explain “ten important principles every aspiring detective needs to master” and then shows how they enable serious investigators to validate the New Testament’s claims.  

First of all, “don’t be a know-it-all”!  Good detectives approach the evidence with objective humility, refusing to follow either their own presuppositions or hasty, simplistic solutions.  Lawyers and judges constantly ask jurors to disregard their biases and examine the evidence before them.  In his atheist days, Wallace took for granted the philosophical naturalism pervading our intellectual milieu, automatically rejecting miraculous or spiritual realities.  But once he set aside his presuppositions, looking carefully at the evidence presented by the Gospels, he began to see the strength of their claims.  Secondly, detectives “learn how to ‘infer,’” to follow a chain of evidence—to reason abductively.  There’s a great difference between engaging in possible (i.e. imaginable) theories and reasonable (i.e. logical) thought.  The truth that convicts a felon or persuades a historian will be feasible, straightforward, exhaustive, logical, and superior to competing theories.  Weighing the central Christian claim, that Christ arose from the dead, while considering various explanations for the account, Wallace found himself compelled to infer that something supernatural must have occurred.  Nothing else makes sense.  Thus he also discovered that faith does not negate reason; in fact, “faith is actually the opposite of unbelief, not reason.  As I began to read through the Bible as a skeptic, I came to understand that the biblical definition of faith is a well-placed and reasonable inference based on evidence” (#678).  

Principle #3 is:  “think circumstantially.”  Evaluate events and witnesses’ accounts in their totality, realizing that bits and pieces of apparently unrelated materials often fall into a coherent picture when put together.  We’re tempted to disregard evidence that is “merely circumstantial” as somehow lacking significance.  But California judges, following the state’s Criminal Jury Instructions, routinely emphasize:  “Both direct and circumstantial evidence are acceptable types of evidence to prove or disprove the elements of a charge, including intent and mental state and acts necessary to a conviction and neither is necessarily more reliable than the other.  Neither is entitled to any greater weight than the other.”  Courtroom convictions, especially in cold-case trials, are frequently made purely on the basis of “circumstantial evidence.”  It’s not necessary to have an eye-witness to a crime to convict the perpetrator!  (In fact, circumstantial evidence is often better than an eyewitness, since it cannot lie!)  Similarly, when one considers various cosmological data, it seems reasonable to conclude there’s a Creator responsible for the universe.  “The cumulative circumstantial case for God’s existence is much like the circumstantial case we made in our murder investigation” (#946).  

Witnesses, of course, are vitally important to detectives, juries, and judges.  But it’s important to follow Wallace’s fourth principle:  “test your witnesses.”  Information gained from a witness is invaluable, but only if he is trustworthy!  Skilled detectives master the art of reading witnesses.  Subtle clues, both in a witness’s words and physical mannerisms, often lead the investigator to believe or disbelieve what’s said.  Four “critical areas should be examined . . . .  If we can establish that a witness was present, has been accurate and honest in the past, is verified by additional evidence, and has no motive to lie, we can trust what that witness has to say” (#1101).  Multiple witnesses are even more persuasive—especially if they differ on trivial matters while concurring on essential facts.  “I would far rather have three messy, apparently contradictory versions of the event than one harmonized version that has eliminated some important detail.  I know in the end I’ll be able to determine the truth of the matter by examining all three stories” (#1121).  Reading the Gospels, with their slightly different perspectives, confirmed for Wallace their truthfulness!  “All four accounts are written from a different perspective and contain unique details that are specific to the eyewitnesses.  There are, as a result, divergent (apparently contradictory) recollections that can be pieced together to get a complete picture of what occurred.  All four accounts are highly personal, utilizing the distinctive language of each witness” (#1239).  

Importantly, when interrogating witnesses, “hang on every word”—the fifth forensic principle.  Carefully recording and pondering a witness’s words often makes the difference between skilled and run-of-the-mill detectives.  When Wallace studied the New Testament, he invested much time investigating its words:  “Every little idiosyncrasy stood out for me.  Every word was important.  The small details interested me and forced me to dig deeper” (#1405).  It became obvious to him, for example, that Mark relied on Peter for his information.  Sixthly, good detectives “separate artifacts from evidence.”  Materials added to witnesses’ accounts must be considered “artifacts” and weighed less heavily when discerning what exactly happened.  Ancient documents, including the Gospels, include some artifacts, such as the account of the woman accused of adultery in John 7:53-8:11.  Careful scholarship enables one to disregard such artifacts as irrelevant to the investigation.  So too it’s important to note Principle #7—“resist conspiracy theories.”  Turning to the New Testament’s claim that Jesus arose from the grave, Wallace notes that various theories have been given to explain it, but a simple assent to the testimonies of men who died for this belief makes the most sense.

Principle #8 is:  “respect the ‘chain of custody.’”  Good detectives carefully document and preserve important evidence.  So too the Early Church took care to preserve the eyewitness accounts basic to the Christian faith.  Still more:  those ancient believers seemed to embrace Wallace’s ninth principle:  “know when ‘enough is enough.’”  Given the magnitude of its claims, the New Testament is remarkably brief!  In a courtroom, juries and judges look for sufficient, not overwhelming, evidence.  What ultimately matters is what’s called the “standard of proof.”  What we want to know is what’s reasonable, not theoretically possible.  Not everything that can be said needs to be said, and certainty in a trial necessarily comes without knowing every shred of evidence.  Perfection cannot be attained in a hall of justice!  Finally, cold-case detectives must always “prepare for an attack” (Principle #10).  Skilled defense attorneys will try to disprove or discount detectives’ work.  When he became a Christian, Wallace understood atheistic arguments because he had once propounded them.  Listening to the “New Atheists” who gained popularity a decade ago, he realized:  “It wasn’t as though these skeptics were offering anything new.  Instead they were presenting old arguments with new vigor, humor, cynicism, and urgency.  They were much like the defense attorneys I had faced over the years” (#2241).  Dealing with them as a Christian apologist, one must, above all, insist upon the objectivity of truth, in particular the historical reliability of the Gospels.  

Having introduced the reader to a detective’s guiding principles, Wallace proceeds to carefully consider various evidences pointing to the New Testament’s reliability, setting forth arguments familiar to students of apologetics.  Doing so, he shows how seriously he has studied both the Scripture and the Early Church.  Though neither a biblical scholar nor an ancient historian, he has clearly consulted the texts and sought to rightly understand them.  Thus it’s reasonable to infer “from the circumstantial evidence . . . that the Gospels were written very early in history, at a time when the original eyewitnesses and gospel writers were still alive and could testify to what they had seen” (#2744).  Considering corroborating evidence, found in both secular history and second-century Christian documents, Wallace concludes that “we can have confidence that the essential teachings of the Gospels have remained unchanged for over two thousand years” (#4144).  

Over the years I’ve read dozens of apologists’ treatises.  Few of them fascinated me as much as this one, primarily because of its unique, detective’s perspective.  Wallace writes clearly, understands the contemporary world, and sets forth his case persuasively.    

                                                  * * * * * * * * * * * * * * * * * * * * *

Though police officers occasionally apprehend and arrest culprits at the crime scene, detectives are called in to study the evidence “inside the crime scene” and identify the suspect who is “outside the crime scene.”  Thus detective J. Warner Wallace followed up his initial work of apologetics—Cold-Case Christianity—with a fine treatise entitled God’s Crime Scene:  A Cold Case Detective Examines the Evidence for a Divinely Created Universe (Colorado Springs:  David C. Cook, c. 2015).  To refer to creation as “God’s crime scene” might initially startle a reader, but the title makes sense when one sees a detective’s mind in action, trying to locate a person by looking at the evidence for his activity.  As was true in his earlier work, the materials presented are generally available in other works of apologetics, but Wallace provides the unique perspective of a skilled sleuth and effectively makes his case, pointing to someone “outside” the physical world responsible for its substance and structure.  

Detectives like Wallace “investigate causes.  Who caused this murder?  What motivated this suspect to commit this crime?  Criminal investigations are largely causal investigations.  Detectives learn to ask good questions about causation to determine the identity of a suspect” (p. 27).  What does the evidence suggest regarding whether or not a crime was committed?  So too, what cosmological evidence leads one to conclude that a Creator created the universe?  Wallace cites copious current data (e.g. the “Big Bang” Standard Cosmological Model, including amazing details regarding a “finely-tuned” universe) that cogently point to the fact “that our universe came into being from something beyond the space, time, matter and energy of our universe” (p. 37).  Assuming the principle of sufficient reason, we wonder “Why is there something rather than nothing?” and conclude Someone—“a purposeful Fine-Tuner”—made it.  “Inside evidence” cannot fully explain the existence of the universe.  “The evidence points to a cause outside of space, time, and matter” (p. 44).  Evidence at a murder scene invariably points toward a murderer, not an accidental confluence of random events.  So too the amazingly fine-tuned cosmos provides evidence (“signs of design”) pointing toward an intelligent Artist orchestrating it all. 

Origin-of-life questions provide yet more reasons to believe in God.  Having investigated murders, Wallace fully understands the radical difference between living persons and deceased corpses.  We can easily detect and describe the difference between the living and the dead, and as we explore the mysteries of living organisms we find that “the complexity required for cells to metabolize and reproduce is mind boggling.  Cells are packed with miniature biological machines resembling (and often exceeding) the best work of human engineers” (p. 72).  Though not a trained scientist, Wallace nicely explains what life scientists have found—amino acids, proteins, DNA, etc.  As a detective he seeks to answer the “where, what, why, when, and how” questions about life’s origin.  Purely naturalistic answers regarding the mystery of life’s origin prove ever elusive, especially when dealing with the vast amount of “information” basic to all that lives, for “the laws and forces of nature cannot produce information, but information is required for life to begin” (p. 321).    

As manifestly evident (and deeply mysterious) as the reality of life is the reality of human consciousness!  “Consciousness poses one of the most difficult conundrums for philosophers and scientists.  As philosopher David Chalmers lamented, ‘Conscious experience is at once the most familiar thing in the world and the most mysterious.  There is nothing we know about more directly than consciousness, but it is far from clear how to reconcile it with everything else we know.  Why does it exist?  What does it do?  How could it possibly arise from lumpy gray matter?’” (p. 122).  Though persons can see and touch physical things, their mental states, their minds, are known only to themselves.  In particular, we first intend to do things and then do them.  We think about things apart from ourselves, and those thoughts are purely mental, non-material, realities.  We’re also able to think logically, following a train of argumentation that cannot be reduced to the chemical reactions within the brain.  As the renowned philosopher Thomas Nagel recently wrote:  “‘So long as the mental is irreducible to the physical, the appearance of conscious physical organisms is left unexplained by a naturalistic account of the familiar type.  On a purely materialist understanding of biology, consciousness would have to be regarded as a tremendous and inexplicable extra brute fact about the world’” (p. 136).   

Naturalistic thinkers deny the existence of free will as well as consciousness.  Though Wallace didn’t think clearly about this in his atheist days, his deterministic philosophy actually undermined his legitimacy as a detective!  For without free will the criminal justice system has little justification.  If a killer couldn’t have avoided killing, it’s hard to see why he should be punished for his “crime.”  But in the criminal justice system:  “Personal responsibility is assigned to every person who chooses to commit a crime when he or she could have chosen otherwise” (p. 141).  Forty years ago the Supreme Court decreed that “‘a deterministic view of human conduct’ was ‘inconsistent with the underlying precepts of our criminal system.’  In fact, the Court described ‘belief in freedom of the human will and a consequent ability and duty of the normal individual to choose between good and evil’ as the ‘“universal and persistent” foundation stone in our system of law, and particularly in our approach to punishment, sentencing, and incarceration’” (p. 147).  

Accompanying a belief in free will, detectives almost necessarily believe in “law and order”!  The laws they seek to uphold are society’s way of declaring and enforcing morality.  Laws are necessary because there are really evil people in our world!  They reflect the fact that written on the human heart is a deep awareness of the “natural law,” the notion that good should be done and evil resisted.  Wallace has found that even “hardened criminals” who break the law hold one another accountable to certain moral standards!  The man who kills another man’s wife will inevitably condemn anyone who kills the killer’s mother!  There are right ways—and wrong ways—of treating others.  Such standards are more than personal perspectives or fleeting emotional reactions.  They point to a higher, objective standard, a “transcendental moral truth giver,” a Lawgiver, an “all-powerful, non-material, nonspatial, atemporal, purposeful, personal Creator” whose laws reflect “His nature” (p. 172).   

Having carefully examined all the relevant evidence, Wallace concluded there is Someone responsible for the world we live in, ourselves included.  “I believe God exists because the evidence leaves me no reasonable alternative” (p. 201).  In jury trials, judges explain that verdicts should be based on a “Standard of Proof.”  Jurors can never be 100% sure when they’re making decisions, but they can conclude with certainty—“beyond a reasonable doubt”—that a suspect is guilty of a crime.  As does a good detective, he also points us to “expert witnesses,” scholars who have shaped his presentation, providing us with a helpful, up-to-date reading list.  

Praise for God’s Crime Scene comes from distinguished writers such as Eric Metaxas, the bestselling author of Bonhoeffer, who says:  “What if a brilliant prosecutor tried to prove the existence of God using real evidence and crystal clear arguments?  Well, that’s precisely what J. Warner [Wallace] does in this magnificent book—and you get to be the jury.  Don’t blink.  Thrilling and amazing.”  Hank Hanegraaff concurs:  “Sherlock Holmes has nothing on J. Warner Wallace.  In God’s Crime Scene, Wallace uses the tools of a world-class homicide detective to discern whether or not clues point in the direction of a Divine Intruder.  The reader can almost hear the words ‘Elementary, my dear Watson’ as Wallace evaluates the evidence for cosmic design.  A highly readable resource by which seekers and skeptics can follow truth toward its origins.” 

                                            * * * * * * * * * * * * * * * * * * 

In Forensic Faith:  A Homicide Detective Makes the Case for a More Reasonable, Evidential Christian Faith (Colorado Springs:  David C. Cook, c. 2017), J. Warner Wallace urges us to take the apologetic materials presented in his two earlier works and effectively use them as we interact with our increasingly secular culture.  All too many Evangelicals, says John Stonestreet, President of the Colson Center for Christian Worldview, dismiss such endeavors.  In his judgment:  “It certainly sounds spiritual to say things like, ‘Arguments never saved anyone,’ or, ‘No one is ever argued into the kingdom.’  Such are, however, silly straw men” (#193).  In fact, most folks come to faith when they find good reasons to do so.  Truth matters!  “To be human is to reason, to reflect, and to ask questions about life and its meaning” (#198).  And, Stonestreet insists, “Christianity is really True.  With a capital T.  True for everyone, whether they believe it or not.  Christianity describes reality as it actually is” (#209).  

Policemen like Wallace are committed “to protect and to serve” the public.  It’s an honorable—indeed a sacred—calling.  To protect and serve our world, as C.S. Lewis did nearly a century ago, we Christians need to know what we believe and explain it clearly.  We need a “forensic faith,” a faith that can withstand public scrutiny and rigorous argumentation, for in addition to caring for the poor and homeless we need to provide Truth for hungry minds.  We need to become skilled “case makers,” committed to bearing witness to our Lord.  To do so we need solid biblical teaching—and Wallace provides a plethora of texts supporting his position—but we also need good training, learning how, as Origen said centuries ago, “to do battle for the truth.”  Parents and pastors, to protect and serve young believers, simply must engage them in activities designed to discipline them, to make strong disciples, able to withstand the challenges, the intellectual battles awaiting them.  Taking students on a “forensics” mission trip to UCLA may be more important than a “compassionate” mission trip to Mexico!  

To do so, using Wallace’s books—and downloading free materials from his website, ColdCaseChristianity.com—would be wise!  Inasmuch as far too many collegians forsake their religious views, youth pastors especially should ponder the case he builds for a Forensic Faith!      

297 Proof of Heaven

   Eben Alexander’s Proof of Heaven:  A Neurosurgeon’s Journey into the Afterlife (New York:  Simon & Schuster, c. 2012) is a fascinating, persuasive personal “life-after-life” account given credibility by the author’s medical training and cogent presentation.  After receiving his M.D. from Duke University Medical School, he pursued post-doctoral study and taught for 15 years at Harvard Medical School, operating on “countless patients” and becoming quite expert in dealing with brain injuries.  Though nominally religious (attending an Episcopal church at Christmas and Easter), he’d struggled with some personal issues and doubted the basics of the Christian faith, including the reality of “God and Heaven and an afterlife” (p. 34).  Believing, with Albert Einstein, that “a man should look for what is, and not for what he thinks should be,” he takes a scientific stance, determined to deal with the realities he encountered as a result of a “near-death” experience which forever “changed his mind” regarding heaven.  

In 2008, at the age of 54, Alexander fell ill with bacterial meningitis—“arguably the best disease one could find if one were seeking to mimic human death without actually bringing it about” (p. 133)—and lapsed into a deep coma for seven days.  While his brain shut down completely—“it wasn’t working at all” (p. 8)—he encountered “the reality of a world of consciousness that existed completely free of the limitations of my physical brain” (p. 9).  Consequently, he concluded:  “My experience showed me that the death of the body and the brain are not the end of consciousness, that human experience continues beyond the grave.  More important, it continues under the gaze of a God who loves and cares about each one of us and about where the universe itself and all the beings within it are ultimately going.”  He now knows:  “The place I went was real.  Real in a way that makes the life we’re living here and now completely dreamlike by comparison” (p. 9).  Having encountered Ultimate Reality, he asserts:  “What I have to tell you is as important as anything anyone will ever tell you, and it’s true” (p. 10).  

While Alexander was in the coma, doctors ran all the sophisticated tests modern science prescribes, preserving graphs and images of his damaged brain.  Though his brain showed no activity, he journeyed first into a dark “underworld filled with repulsive creatures and foul smells.”  Then a light descended into the darkness and he heard “a living sound, like the richest, most complex, most beautiful piece of music you’ve ever heard” (p. 38).  Suddenly he was ushered into a beautiful new world—“The strangest, most beautiful world I’d ever seen” (p. 38).  “Below me was countryside.  It was green, lush, and earthlike.  It was earth . . . but at the same time it wasn’t” (p. 38).  He’d entered a really Real world!  A beautiful young “Girl on the Butterfly Wing” joined him, giving him a “look that, if you saw it for a few moments, would make your whole life up to that point worth living, no matter what had happened in it so far” (p. 40).  (After he recovered, he received a picture of one of his deceased biological sisters—whom he’d never seen, even in a picture—and realized the “Girl” looked exactly like her!)  Without speaking she gave him a wonderful message:  “‘You are loved and cherished, dearly, forever.’  ‘You have nothing to fear.’  ‘There is nothing you can do wrong’” (p. 40).  At that moment, Alexander felt “a vast and crazy sensation of relief.  It was like being handed the rules to a game I’d been playing all my life without fully understanding it” (p. 40).  He found his deepest questions answered, but not with words.  “Thoughts entered me directly” (p. 46).  He also felt himself immersed in the Reality of God.  Indeed, “there seemed to be no distance at all between God and myself.  Yet at the same time I could sense the infinite vastness of the Creator, could see how completely minuscule I was by comparison” (p. 47).  

Still more, he understood:  “The world of time and space in which we move in this terrestrial realm is tightly and intricately meshed within these higher worlds.  In other words, these worlds aren’t totally apart from us, because all worlds are part of the same overarching divine Reality” (p. 48).  Because of his illness, he’d taken a remarkable out-of-body “tour—some kind of grand overview of the invisible spiritual side of existence” (p. 69).  And, above all, he’d learned a priceless truth:  he—and we—are loved.  Every one of us!  “Love is, without a doubt, the basis of everything” (p. 71).  This truth is as certain to Alexander as any of the scientific truths necessary for his vocation as a surgeon.  “The unconditional love and acceptance that I experienced on my journey is the single most important discovery I have ever made, or will ever make, and as hard as I know it’s going to be to unpack the other lessons I learned while there, I also know in my heart that sharing this very basic message—one so simple that most children readily accept it—is the most important task I have” (p. 73).  

Applying his scientific understanding of the human brain—and the mind/brain/consciousness questions that have forever fascinated philosophers—Alexander tries to explain how the physical brain serves as a “kind of reducing valve or filter, shifting the larger, nonphysical consciousness that we possess in the nonphysical worlds down into a more limited capacity for the duration of our mortal lives” (p. 80).  We are, spiritually, in touch with an Ultimate Reality that we rarely sense because our brains too easily restrict our consciousness to material realities.  But there is a vast, mysterious universe that is purposeful and spiritual.  Indeed:  “The physical side of the universe is as a speck of dust compared to the invisible and spiritual part” (p. 82).  We are primarily spiritual beings, designed and destined for eternal life with God.  “This other, vastly grander universe isn’t ‘far away’ at all.  In fact, it’s right here . . . .  It’s not far away physically, but simply exists on a different frequency.  It’s right here, right now, but we’re unaware of it because we are for the most part closed to those frequencies on which it manifests” (p. 156).  

When, after seven days, Alexander emerged from his coma, his family observed him smiling.  “‘All is well,’ I said, radiating that blissful message as much as speaking the words.  I looked at each of them, deeply, acknowledging the divine miracle of our very existence” (p. 113).  He was, miraculously, well!  “In fact—though at this point only I knew this—I was completely and truly ‘well’ for the first time in my entire life” (p. 123).  With each passing day his neuroscientist’s knowledge returned.  And so did his “memories of what had happened during that week out of my body . . . with astonishing boldness and clarity.  What had happened outside the earthly realm had everything to do with the wild happiness I’d awakened with, and the bliss that continued to stick with me” (p. 124).  Still more:  he was “also happy because—to state the matter as plainly as I can—I understood for the first time who I really was, and what kind of a world we inhabit” (p. 124).  

Above all, he’d encountered what’s really Real!  “What I’d experienced was more real than the house I sat in, more real than the logs burning in the fireplace.  Yet there was no room for that reality in the medically trained scientific worldview that I’d spent years acquiring” (p. 130).  His own experience led him to plunge “into the ocean of NDE [Near Death Experience] literature” (p. 131).  He found his experience amply confirmed by others!  Years earlier he’d heard about Raymond Moody’s Life After Life, but he’d neither read it nor considered its evidence.  Now he read it carefully and affirmed its contents.  But Alexander also realized that (compared with many other NDEs) his “was a technically near-impeccable near-death experience, perhaps one of the most convincing such cases in modern history.  What really mattered about my case was not what happened to me personally, but the sheer, flat-out impossibility of arguing, from a medical standpoint, that it was all fantasy” (p. 135).  

After a lengthy convalescence, Alexander made his way to church.  To his amazement, the music and architecture which had left him unmoved before his NDE now touched him deeply.  “At last, I understood what religion was really all about.  I didn’t just believe in God; I knew God.  As I hobbled to the altar to take Communion, tears streamed down my cheeks” (p. 149).  That heavenly realm he’d visited while in a coma was, in fact, the same realm celebrated in Christian worship.  Opening our minds to God in meditation and prayer ushers us into that eternal realm wherein we can directly communicate with God, knowing Him as He Is.

                                      * * * * * * * * * * * * * * * * * * *

Mark Twain once quipped:  “The two most important days in your life are the day you were born and the day you find out why.”  As he indicates in the subtitle of Life After Heaven:  How My Time in Heaven Can Transform Your Life on Earth (New York:  WaterBrook, c. 2017), Steven R. Musick is less concerned with his own Near Death Experience than with encouraging us to live in the light of truths he discerned therein.  Musick begins by detailing his early life—growing up in Denver, accepting Jesus as his Savior at the age of seven, devoutly attending an Episcopal church.  Financially unable to finish his studies at the University of Colorado, he enlisted in the Navy and was sent to the Great Lakes Naval Station in Chicago in 1975.  He fully embraced and enjoyed the military life and managed to qualify for both the Naval Academy and the SEAL training school.  The prospects of a military career seemed bright.  But then he was given a routine flu inoculation that adversely affected him; when he didn’t recover he was given a “lethal dose” of aminophylline, to which he was unknowingly allergic.  He fell into a five-week period of unresponsive unconsciousness.    

Losing consciousness, he suddenly was weightless, flying through a white tunnel, transported to another realm of reality—“That Place.”  He stood (mysteriously in his own “body”) in a “rolling green meadow,” immersed in indescribable light.  “I barely know how to describe the vibrancy of it all.  It’s like super high-definition television on steroids.  Everything is crystal clear” (p. 38).  It was a world of sheer beauty filled with wondrous music and pure joy.  “It is a perfect paradox of heaven:  I feel absolutely held and absolutely free.  I am physically feeling God’s security.  The safest place imaginable is in the arms of the Father.  Once you’ve felt that, it’s all you want.  Nothing else, from that day to this, satisfies.  It is the overwhelming, wonderful sensation of being held” (p. 39).  He felt utterly at home, being where he was designed to be.  He also saw Jesus.  “He’s a person.  Not a shadowy figure, no figment of my imagination, not translucent or some floating being.  A person.  Solid” (p. 41).  As they talked about the author’s life, Jesus’ “words reveal there is purpose behind it all, a plan woven through my life.  It gives meaning to every moment of it.  And it is okay” (p. 42).  Consequently, “I begin to see my life from the perspective of heaven.  And how different it looks” (p. 43).  

Though he didn’t want to leave heaven, he awakened from his coma and spent many weeks convalescing in the naval hospital.  He struggled to breathe, since his illness had reduced his lung capacity to one-third.  Thus disabled, he was discharged from the Navy, moved back to Denver, married his sweetheart, and went back to school.  Unable to find employers willing to hire an obviously unwell employee with a compromised immune system, he began his own business as a financial adviser, becoming modestly successful in time.  Though he constantly remembered his visit to “That Place,” he said nothing to anyone about it because he couldn’t make sense of it.  He did become deeply religious, however, spending much time in Bible study and prayer.  Various experiences reminded him of God’s abiding presence, but he was resigned to living with his infirmity, unable to do many of the daily things most of us take for granted.

He and his wife attended various churches for many years but never found a permanent home.  Then, in 1984, they discovered Denver Vineyard, a “classic Vineyard” congregation that prayed for and believed in miracles.  He thought, for a couple of years, that miracles surely happened—but not that he might experience one!  Then one night, struggling with his disability, he felt impressed to attend a service.  “The worship was so powerful that night.  I don’t remember the message at all, just a growing sense of God’s presence, the knowledge that we were in a holy place.”  At the end of the service, the pastor invited people who wanted to pray to come forward.  Musick remained seated, but the pastor said:  “‘Wait a minute.  Someone here has been dealing with a malady for years.  A decade.’”  He then added:  “‘You’ve been sick all week.  Sick sick.  I think you have a respiratory thing’” (p. 102).

Musick was astounded, as he’d told no one in the church about his illness.  He felt prompted to get up and walk to the front of the sanctuary.  He made it half-way.  An associate pastor met him there and put his hand on his chest.  “It felt like electricity went through my body.  I fell to the ground” (p. 103).  He found himself re-entering heaven— “That Place” he’d explored a decade earlier—seeing the “same sights, smell, and sounds” (p. 103).  Again he met and talked with Jesus.  Then he awakened “on the floor of the Denver Vineyard church.”  Getting to his feet, he took “a full breath of air” for the first time in ten years.  He’d been dramatically, miraculously healed in an instant!  Driving home, he talked with God, enjoying an intense intimacy with the Father.  For the first time he shared with his wife details concerning his earlier entrance to heaven, enabling her to better understand and rejoice with him.  His skeptical doctor took out his stethoscope and discovered that his lungs sounded “clear and healthy.”  

Subsequently, Musick intensified his life of prayer, study, and worship.  He and his wife joined a “team that prayed for people” in the church, and they witnessed wonderful healings.  Though he testified regarding his own healing, he didn’t feel inclined to share his heavenly visit resulting from his Near Death Experience.  Then, in 2011 he felt impelled to bear witness to what happened to him.  More importantly, however, he wanted to use his experience as a vehicle with which to tell us that “Heaven is a lot closer than you think.”  And if we pay attention to the little “bubbles of heaven” that frequently occur we can live more joyously and productively in Christ’s Kingdom.  “God intends for all his people to experience and to encourage heaven to come to earth.  He wants his presence and power to impact our everyday lives.  He wants his love to characterize our lives” (p. 121).  

                                        * * * * * * * * * * * * * * * * *

Chauncey Crandall, a skilled cardiologist who fervently believes the Christian message, graduated from the Yale School of Medicine.  He lives and works on Palm Beach Island in Florida, where “Business moguls, celebrities, major media personalities, music artists, bestselling authors, and athletes either get their daily mail . . . or have their second or third homes.”  At the age of 19, working as a hospital orderly, he encountered death for the first time and “decided I hated death and would devote myself to fighting it with everything I could muster” (p. 2).  Pondering the spiritual as well as physical aspects of dying, he became a Christian, though for a number of years his scientific training kept him from diligently practicing his faith.  “Little did I know, I needed a major dose of God (and more specifically, of His Son, Jesus, and the Holy Spirit) to be able to operate at full capacity in my faith” (p. 33).

In Touching Heaven:  A Cardiologist’s Encounters with Death and Living Proof of an Afterlife (New York:  FaithWords, c. 2015), Crandall urges us to live better lives by living attuned to heaven.  As a physician, he routinely sees “evidences of the next realm all the time, in my work and ministry; every day, this life gives us glimpses of the next.  These snapshots—from my patients’ bedsides and my personal experiences—are what I want to share with you” (p. 4).  The physical and spiritual realms interpenetrate.  “Both realms are real, just as surely as God is real.  And because of these realities, I now know that life doesn’t end here” (p. 5).  Still more:  he believes “the Lord can make available to us everything here on earth that is available in the kingdom of heaven” (p. 93).

As he began his medical practice, Crandall talked with people who’d had “out-of-body” experiences.  Some described being suspended above the surgical bed watching doctors work frantically trying to save them.  A classmate in medical school described Jesus sitting in his room when he was deathly ill.  Often he heard of angels appearing and effectively helping people.  But it was only when (in 2000) his own son Chad became sick with leukemia that he seriously began to attend to spiritual realities.  Until then he’d “balked at any intimations that I might need or want more of God” (p. 31).  He’d become an expert at “looking” at patients, noting their symptoms and seeking to heal them.  But by “looking” he could only see material realities.  Then he learned to “see”—to discern deeper and higher realms of reality wherein miracles occur.  “God surrounds every one of us with His kingdom at every turn—with messages and messengers, signs and gifts—and He has given it all to us so that we would turn our eyes and hearts toward Him.  Some people don’t notice because they doubt that He cares.  Many more, though, are missing daily hints of Him simply because they’re not paying attention” (p. 22).  Enabled to see clearly, he beheld “a universe crafted by an Artist who longs to express who He is and deeply connect with all He has created, but who is particularly focused on the ones He fashioned in His image” (p. 23).  

Once Crandall’s son became ill, he began “testing the universe,” fervently seeking to fully know God.  “Having a son diagnosed with leukemia activated my faith like nothing else had—making me vividly aware of the reality and presence of heaven.  It accelerated my spiritual growth” (p. 54).  He and his wife began visiting various churches, looking for revivals at home and abroad, thinking more deeply about the Bible and Christian theology.  From nominally attending a Presbyterian church he moved into Pentecostal circles.  He attended services where people were instantly healed, where the bread and wine for a communion service mysteriously multiplied to supply an unexpectedly large congregation, where 500 youngsters were fed with only 200 prepared meals.  He began to understand the true greatness of God who is very much with us and working miracles for us.  “As time went on, I thought, if He is this big in the world, then He can be even bigger in my medical practice, which opened me up to praying for every patient who would let me (nearly every one of them, as it turns out)” (p. 39).  Seeing some 150 patients a week, he has found his own “mission field.”  “The more I have invited heaven into the operating and exam room, the more healing power I have seen at work—and the more others have recognized the hand of God” (p. 73).  Having personally seen prayers answered for patients who were in comas (even for one man definitively pronounced dead) he confidently attests to the reality of Near Death Experiences validating the reality of heaven—the “really real world.”

While witnessing many miracles, however, Crandall had to watch his own son fail in his struggle with leukemia.  Even though his “miracle research” prompted him to believe “Chad could be healed, and that prayer was a means to it” (p. 55), his son died.  Trusting medical science as well as prayer, he and his wife secured the best care possible, including a bone marrow transplant from his twin brother.  They tried everything!  In the midst of many dark hours, they sensed God’s presence, though when Chad died Crandall “see-sawed between numbness and anger” (p. 151)—inevitable, human feelings.  In the end:  “Chad’s battle was over.  We as a family had fought our fight with everything we had, and while the enemy may have been rejoicing, thinking that cancer had won, we knew the truth:  Chad was now in heaven’s care—now fully healed” (p. 157).

# # # 

296 Rebuilding the Culture

    Anthony Esolen, a highly-regarded, scholarly translator of Dante’s Divine Comedy, has written a number of general interest works, including The Politically Incorrect Guide to Western Civilization—a stirring defense of Christian Culture as well as the civilization derived from Jerusalem, Athens, and Rome, which developed in Europe during the Middle Ages.  For 25 years Esolen taught courses in English and Western Civilization at Providence College, though he was just recently forced to leave after writing an article entitled “My College Succumbed to the Totalitarian Diversity Cult.”  (He’s learned that one dare not challenge the secular dogmas now reigning in academia, even in allegedly Catholic institutions!)   

Many of Esolen’s core convictions give structure to his just-published Out of the Ashes:  Rebuilding American Culture (Washington, D.C.:  Regnery Publishing, c. 2017).  He begins with a caveat, promising to “indulge myself in one of civilized man’s most cherished privileges.  I shall decry the decay of civilization” (#56 in Kindle).  Doing so, he identifies with the ancient historian Livy, writing at the time of Christ, who lamented Rome’s moral collapse, “with duty and severity giving way to ambition, avarice, and license, till his fellow Romans ‘sank lower and lower, and finally began the downward plunge which has brought us to the present time, when we can endure neither our vices nor their cure’” (#59).  Though both Livy and Esolen doubtlessly exaggerate the cultural decay of their eras, both merit careful reading and reflection regarding their concerns, for:  “Sometimes entire civilizations do decay and die, and the people who point that out are correct” (#107).  In fact:  “Winter comes and goes in the affairs of men and nations and cultures, and if they are to survive at all they must plant seeds:  they must remember.  What happens if they neglect the planting?” (#131).  So along with alerting us to the culture’s decadence, Esolen wants to challenge us to faithfully plant good seeds in well-tilled, healthy soil, patiently awaiting their flowering.

To do so, we must implement the title of his first chapter:  “Giving Things Their Proper Names:  The Restoration of Truth-Telling.”  Created in God’s image, Adam was tasked with seeing the essence of and accurately naming other creatures.  Confucius rightly noted “that the beginning of wisdom is to give things their proper names” (#259).  Nevertheless, the history of our race reveals a perennial proclivity for lying!  In our day, for example, “pro-choice” devotees routinely lie when describing the unborn babies they want to kill—they endlessly talk about “reproductive rights” and female freedom.  Then we’re told anyone can choose any “gender” he desires and that “a woman can make as good a soldier as a man” (#311).  Given such widespread deceits, we must become counter-cultural and honestly describe things as they are.  “Things, in their beautiful and imposing integrity, do not easily bend to lies,” says Esolen.  “A bull is a bull and not a cow.  Grass is food for cattle but not for man.  A warbler is alive but a rock is not.  The three-hundred-pound stone will not move for a little child or a boy or a feminist professor.  Water expands when it freezes and will break anything unless you allow for that.  Things are what they are.  They know no slogans, and they do not lie.  And they give witness to the glory of God” (#487).

Further witnessing to the glory of God, we must restore a “sense of beauty.”  In The Strange Death of Europe, Douglas Murray laments the state of modern art, which “nearly all has the aura of a destroyed city.”  Forsaking transcendent meaning or truth, today’s artists “stop aiming to connect to any enduring truths, to abandon any attempt to pursue beauty or truth and instead to simply say to the public, ‘I am down in the mud with you.’”  In particular, he says, modern art “has given up that desire to connect us to something like the spirit of religion or that thrill of recognition—what Aristotle termed anagnorisis—which grants you the sense of having just caught up with a truth that was always waiting for you” (#4783).

Though highly advanced in many ways, we are literally starved for beauty.  Esolen seriously ponders the example of Henry Adams—the grandson of President John Quincy Adams.  Visiting the Great Exposition in Paris in 1900 and surveying the panoply of technical marvels on display, he fled a few miles west to take refuge in Chartres’ glorious medieval cathedral.  The difference between the artistry evident in Chartres and the mechanical genius on display in Paris moved him to write his classic study—Mont St. Michel and Chartres.  Though very much a skeptic regarding things theological, Adams sensed the almost infinite distance between the beauty of an edifice devoted to God and the whirling machines devoted to human consumption.  “‘Four fifths of [man’s] greatest art,’ said Henry Adams, was created in those supposedly dark days, to the honor of Jesus and Mary.  The Enlightenment destroyed more great art than it produced, and what the harbingers of the novus ordo saeclorum did not get around to destroying they slandered” (#624).  Recognizing this, we must begin patiently planting the seeds of beauty, especially in our churches.  Truly beautiful poetry and music must be reinstated in our “worship” centers, where too often the tawdry, tasteless, and momentarily fashionable hold sway.

Persuaded that “a mind is a terrible thing to baste,” Esolen urges us to set about “restoring” schools and colleges to their rightful place in our culture.  This is not to say he favors funding the public schools, which are beyond reform!  Indeed, he argues the one-room schools of a century ago did a better job of educating youngsters than do today’s massive consolidated training centers.  “A monstrous thing has taken its place—not just a parasite or a cancer feeding off the host, but a disease that has slowly transformed the host into itself, like an all-eating and all-digesting alien.  The word school remains, but not the reality” (#861).  Distressed at the impoverished language skills of his university students, he concludes they have learned “no grammar in grammar school,” so it’s evident “there is not much school there, either” (#854).  Failing to teach grammar, our schools rob our students of the chance to master the English language.  Failing to emphasize the names and dates of history, our schools graduate youngsters without any knowledge of the past.  And most importantly, by eviscerating religion from the curriculum, the schools are trying “to win a temporary consensus by sacrificing what the education of a human being ultimately is for.  We avoid religious questions at the cost of avoiding the most human questions.  And thus education, which should be human, is reduced to the mechanical and the low” (#1178).  Similarly he finds the nation’s colleges little better than the schools.  Institutions once dedicated to the pursuit of truth through free inquiry now serve as censors, enforcers of political correctness.  In prestigious universities, such as Princeton, students who once studied Shakespeare now slouch through courses on Young Adult Fiction!  The motto of both Harvard University and the author’s own Providence College is Veritas:  Truth.  “The old mottoes assumed the existence of God, the moral law, and the beauty of pursuing truth” (#1266).  While still engraved in stone, such mottoes no longer describe the modern universities.  They no longer attain their ends.  Consequently, if we want to rightly educate our children we must build new schools and colleges, clearly committed to the treasures of Western Civilization that will nourish youngsters’ souls.

Much that Esolen desires can be attained through “repudiating the sexual revolution:  restoring manhood.”  Indeed, Esolen insists, “Christians must repudiate the whole sexual revolution.  All of it” (#1582).  Recovering the biblical distinction between men and women and restricting behaviors in accord with their natures will prove difficult in 21st century America, but it simply must be done.  “We have to recover a strong sense of the beauty of each, and of their essence as being-for the other; man is for woman, and woman is for man, and both are for God” (#1635).  Countering our gender-bending society, determined to sanction something without precedent in human history, we must create ways for boys to become men.  Without strong, masculine, patriarchal leaders, our culture cannot be revived.  Esolen urges us to “take an honest look at what happens when men retreat from the public square.  You do not get rule by women.  You get anarchy” (#1763).  To see this up close, simply visit sections of Chicago any day of the week!

Manly men, virtuous men who know that “truth is more important than feelings,” are in short supply these days!  In 1886, penning The Bostonians, Henry James envisioned the disastrous impact of  “the most damnable feminization” that would result if feminism prevailed.  Speaking through his protagonist, Basil Ransom, he said:  “‘The whole generation is womanized; the masculine tone is passing out of the world; it’s a feminine, a nervous, hysterical, chattering, canting age of hollow phrases and false delicacy and exaggerated solicitudes and coddled sensibilities, which, if we don’t look out, will usher in the reign of mediocrity, of the feeblest and flattest and the most pretentious that has ever been.  The masculine character, the ability to dare and to endure, to know and yet not fear reality, to look the world in the face and take it for what it is . . . that is what I want to preserve, or rather, as I may say, to recover’” (#1886).  

And good men will restore the womanhood needed to make homes for families.  Men build houses, but women transform them into homes, the most human of places.   Countering a culture that urges young women to ape men, Esolen urges them to be thoroughly feminine, but not feminists!  Their naturally compassionate, nurturing hearts thrive “best at the hearth, the bedside, the table.  It is the passionate self-giving that makes the home” (#2065).  Women’s home-work, especially rearing children, has a limited focus but unlimited value.  Always remember:  “The world hates the family.  The state is the family’s enemy.  The state grows by the family’s failure, and the state has an interest in persuading people that the family can do nothing on its own.  It hates fatherhood, and makes little pretense otherwise.  It hates motherhood, though it makes a show of championing the unwed mother as well as the mother who, as the ugly phrase puts it, ‘has it all,’ though a moment’s reflection should suffice to show that no one can give his or her all to a career and a family and the local community” (#2188).   

Following discussions of work, leisure, and politics, Esolen finishes his treatise by proposing we embrace the life of “pilgrims, returning home,” singularly intent on reaching heaven.  “The pilgrimage was the way of the Cross”—quite different from the current progressives’ endeavors to construct a heaven-on-earth; it requires “you to bend your knee in penitence for your sins” rather than blaming others (past and present) for whatever’s wrong with the world (#3050).  Christians must acknowledge that the world is not our home, and we can help it only by being truly Christian, marked with the character of Christ.  “He who would save a culture or a civilization must not seek first the culture or the civilization, but the Kingdom of God, and then all these other things, says Jesus, shall be given unto him as well” (#3163).

* * * * * * * * * * * * * * * * * * * * 

Currently the editor of First Things (my favorite journal), R. R. Reno has written extensively for both scholarly and popular audiences.  As this century dawned, he published In the Ruins of the Church:  Sustaining Faith in an Age of Diminished Christianity (Grand Rapids:  Brazos Press, c. 2002), seeking “to provide spiritual guidance to Christians seeking faithfulness within increasingly dysfunctional churches” (p. 13).  Like Nehemiah of old, he argued we must settle into the ruins of Jerusalem (or the Church) and rebuild her walls.  Reared an Episcopalian, he wrestled with the somber truths regarding his denomination’s disunity and decay.  What was needed was “ressourcement, a return to the sources” (p. 94), preeminently the Scripture, Richard Hooker and the ancient Fathers.  But despite his yeoman-like effort to propose reform within his denomination, there was a latent pessimism underlying his words, leaving the reader wondering if the faith could be sustained.  Thus it was not particularly surprising when Reno was received into the Roman Catholic Church in 2004, explaining:  “as an Episcopalian I needed a theory to stay put, and I came to realize that a theory is a thin thread easily broken.  The Catholic Church needs no theories.”  

Reflecting Reno’s recent position, Resurrecting the Idea of a Christian Society (Washington:  Regnery Faith, c. 2016) acknowledges a “dark side to our national character,” a poverty that is spiritual and ethical rather than economic. “Many now live without a Father in heaven.  Political correctness denies the patrimony of a workable cultural inheritance.  For an increasing number of young people, there’s not even a father at home.  A nation of orphans, literal or metaphorical, will not long endure” (#55).  Surfeited with “health, wealth, and pleasure,” many of us have little interest in either transcendental realities or the needs of our fellow men.  But we desperately need a society that “encourages human flourishing to the degree that the supernatural authority of God’s revelation is proclaimed and the natural authority of his creation sustained” (#97).  Without seeking to legally establish our convictions, Christians should “say, out loud and with confidence, that we’re best off when we live under the authority of the permanence of marriage, accept the duties of patriotism, and affirm the supernatural claims the church makes on our souls” (#97).  

Thus there is, as Reno titles his first chapter, “The Need for a Christian Society.”  To understand this need in America, we must first understand what distinguishes this nation.  To Reno, what most Americans value above all is the freedom long celebrated by frontiersmen—whether cowboys in Wyoming or the “New Frontiersmen” in John F. Kennedy’s White House.  “Live free or die!”  To “make something” of ourselves, to become “whatever we want to be,” rather defines the American way.  While restricted (as it was in the 18th and 19th centuries) to economic and political realms, such freedom incubated much good; it was primarily a positive “freedom for” human flourishing.  But it took unexpected turns—following a negative “freedom from” recipe in the 20th century—as increasing numbers of persons and groups declared their determination to secure various kinds of “rights.”  Increasingly, folks justified licentiousness (e.g. fornication) and violence (e.g. abortion) while mouthing relativistic and reality-defying slogans.  Yesterday’s “liberals” have become “progressives,” promoting same-sex marriage and “transgender” rights, thus seeking “freedom from human nature itself, a goal that fosters a Jacobin spirit determined to destroy all that stands in its way” (#325).  “The moral relativist is defending freedom, the freedom to define moral truth for oneself” (#308).

However alluring—however embedded in our national consciousness—such “freedom” is fundamentally false.  “Freedom properly understood is based in a pledge of loyalty, not a declaration of independence.  Our country’s freedoms arise from eternal verities affirmed, not ties severed.  As the Declaration of Independence says, ‘We hold these truths to be self-evident.’  The first and fundamental act is holding, not choosing, standing fast in truth, not making it up.  We are freest when we acknowledge the authority of the truth, not when we seek a god-like independence from all limits” (#408).  Rather than trusting ourselves, we need to listen to seasoned authorities who prescribe wholesome ways to live, for “Our American dream of freedom will become a nightmare if we do not put it in the loyal service of something greater than ourselves” (#479).  

That “nightmare” stands revealed in Charles Murray’s Coming Apart:  The State of White America, 1960-2010, wherein he describes ominous cultural trends, including the abolition of marriage, the scarcity of good jobs (especially for men), and the absence of religion in working class communities (deemed the “weak” by Reno, since they have been mistreated by the elites ruling the nation).  We’re witnessing a class-war between highly-educated elites who control the nation and ordinary folks who must suffer their policies.  Importantly:  “The weapon of mass destruction in our war on the weak has been moral relativism, heedlessly deployed by an elite culture in love with critical strategies for disenchanting old, inherited moral norms” (#629).  More desperately than food stamps and unemployment benefits, America’s working class needs “clear rules that direct them toward decisions that help them lead dignified lives” (#807).  But our academic and cultural elites, determined to impose their tony nonjudgmentalism on the nation, refuse to sanction such rules.

Confronting the plight of the nation’s poor, Christians must respond appropriately.  Above all this means:  “A Christian society judges nonjudgmentalism unjust” (#808).  A century ago, when the “poor” were economically deprived, many Christians embraced the “Social Gospel” and sought to improve their material conditions.  “Today’s social gospel movement must have the courage to be judgmental” (#1051).  With today’s “poor” suffering from nonjudgmental moral relativism, lacking social capital rather than financial capital, Christians face a more cerebral task, needing to craft a cogent rationale for spiritual renewal.  We “who care about the teaching of Jesus must reckon with a singular fact about American poverty:  its deepest and most destructive effects, its most serious deprivations, are not economic but moral” (#871).  If you’re so inclined, give money to rescue missions or volunteer with Jimmy Carter in building a house with Habitat for Humanity, but “you will do more for the poor by resisting nonjudgmentalism.  Exercising the preferential option for the poor means having the courage to use old-fashioned words such as ‘chaste’ and ‘honorable,’ putting on a tie, turning off trashy reality TV shows, and maintaining standards of deportment.  It means restoring a public culture of moral and social discipline” (#900), including pro-marriage legislation and back-to-basics curricula in the public schools.  

Most fundamentally, Christians rebuild the culture by supporting the Church.  Church-going folks promote a good society.  Religious people donate more time and money to community endeavors than their secular counterparts.  Their generosity flows from their theological convictions.  Loving God and their neighbors, they inevitably promote the commonweal, for they care for both the local congregation and the world-wide Christian community.  They are both appropriately patriotic and concerned for global needs (amply evident in missionary and humanitarian endeavors).  Christians following Jesus in our materialistic culture must commit to openly seeking more than health, wealth, and pleasure.  Confronting a worldview in which materialism “denies the existence of higher things, and relativism denies we could know about them even if they did exist” (#1849), we must defend those “higher things” basic to a truly good life.

Doing so, we discover “the possibility of a Christian society.”  Despite the many laments regarding the decline of religion in America, there remains a “committed core” of Christians—roughly one-fourth to one-third of the populace—who can provide the needed leaven for a resurgent, deeply Christian culture.  These church-going, Bible-believing folks repudiate the sexual revolution, reject same-sex marriage, oppose divorce, and condemn the notion that “as long as we don’t hurt others, we should be able to live however we want.”  In short:  “Because they have a culture, the Faithful can be countercultural” (#2220).  Consequently they generally must live “on the peripheries of cultural and institutional power” (#2250).  But throughout history creative minorities have been the salt and light of the world, and there’s a world of potential in these faithful followers of Jesus.

Concluding his treatise, Reno admits that:  “It’s easy to be demoralized.  Many powerful forces want to make us ‘dhimmis,’ the Muslim term for non-Muslims who are tolerated as long as they don’t evangelize or challenge the supremacy of Islam” (#2344).   But we must take heart—the Church of Jesus Christ has endured for 2,000 years, and there’s good reason to hope she will continue to thrive, in various ways and various parts of the world, until He comes again.  

# # # 

295 Vanishing Adults

 One of today’s most accomplished United States Senators, Nebraska’s Ben Sasse, has recently published The Vanishing American Adult:  Our Coming-of-Age Crisis—and How to Rebuild a Culture of Self-Reliance (New York:  St. Martin’s Press, c. 2017).  He felt impelled to write this treatise by the growing conviction that “our entire nation is in the midst of a collective coming-of-age crisis without parallel in our history.  We are living in an America of perpetual adolescence.  Our kids simply don’t know what an adult is anymore—or how to become one” (#47 in Kindle).  He became aware of this problem while serving as the president of Midland University (affiliated with the Evangelical Lutheran Church in America) in Fremont, Nebraska, where many students seemed adrift and unable to assume adult responsibilities.  He also awakened to the fact that his three “pampered daughters” seemed unprepared to flourish in the world awaiting them.

Sasse devotes the first section of his book to “the problem:  How do we know the situation with our kids has really gotten worse” (#146).  In essence, the problem is the passivity evident in J. M. Barrie’s story of Peter Pan, who wanted neither to attend “‘school and learn solemn things’” nor to “‘be a man’” (#206).  He wanted to be (in the words of Bob Dylan) “forever young,” without tasks or accountability.  This is something new in America, where children early worked with their parents and, Alexis de Tocqueville observed, “appeared not to need an adolescent stage at all” (#599).  Still more:  Peter Pan represents, for Sasse, the alarming fact that “No civilization has ever embraced endless adolescence” (#219).  Throughout human history children moved rather rapidly through adolescence to adulthood; they aspired to be adults and early imitated their ways.  (Today, strangely enough, many adults remain childish and seek to dress and behave in accord with their children’s fashions!)

It’s clear that American youngsters are getting “softer.”  Childhood obesity has skyrocketed from less than one in twenty 50 years ago to one in five today.  The toy industry, which hardly existed a century ago, now hauls in a billion dollars a year.  Historically undetected behavioral problems now require a bewildering mixture of medications, running from Ritalin to Prozac to Xanax.  Video games, for many, have replaced physical activity.  “Fully one-quarter of Americans between age 25 and 29 now live with a parent—compared to only 18 percent just over a decade ago” (#673), and only 23 percent of them are married.  Some scholars predict that fully one-fourth of the Millennials will never marry.  Whereas 50 years ago fully 90 percent of collegians attended religious services, 35 percent of today’s Millennials have no religious ties.  They seem to be psychologically vulnerable, seeking “safe spaces” and sensitive to a variety of “micro-aggressions” that hurt their feelings.  

John Dewey, considered by many “America’s foremost philosopher,” is for Sasse both representative of and largely responsible for this cultural upheaval.  He is, without question, the father of the “progressive” educational agenda now reigning in the nation’s schools, and ultimately “he is responsible for allowing schools to undermine how Americans once turned children into adults” (#425).  To Dewey, school was not “an instrument supporting parents” by teaching youngsters reading, writing, and arithmetic.  Neither was it a place to master classical or modern languages, nor to understand history and philosophy.  Rather, the school was to be an agency of the state, shaping evolving youngsters into effective workers and citizens.  Though Dewey may not have intended his child-centered program to prolong adolescence, that’s what took place as the 20th century ended.

Since America’s schools contribute to the problem, Sasse entitles one chapter “More School Isn’t Enough.”  We need to take seriously Mark Twain’s quip:  “I never let school interfere with my education.”  As the son of teachers, Sasse treats them respectfully, but all too often our public schools, as Paul Goodman said, engage in “compulsory mis-education.”  We expend lots of money and accomplish little!  During the past 30 years federal spending on education has quintupled without securing any measurable effect—the U.S. now ranks 20th in science and 27th in math on international tests.  Kids spend more time in classrooms than ever before, “yet they leave high school for college or the workforce less prepared and less able to cope with the next stage of their lives” (#1172).  As A Nation at Risk warned in 1983, “‘a rising tide of mediocrity . . . threatens our very future as a Nation and people.’  The authors cried out:  ‘If an unfriendly foreign power had attempted to impose on America the mediocre educational performance that exists today, we might well have viewed it as an act of war’” (#1206).  To counter Dewey’s “progressive” educational philosophy, Sasse turns to Dorothy Sayers, who penned “The Lost Tools of Learning” in 1948.  Sasse considers it “the most important essay on education written in the last century” and calls her the “patron saint of the educational philosophy underpinning this book” (#1363).  Sayers sensed that the “‘artificial prolongation of intellectual childhood and adolescence into the years of physical maturity’” would lead to irresponsibility and societal decline.  Only by recovering the “lost tools of learning” embodied in the trivium/quadrivium-based classical curricula can this trend be reversed.  

Having assessed the problem, Senator Sasse turns to exploring solutions.  Begin, he says, by minimizing, if not eliminating, the “age segregation” which seems closely correlated with antisocial behavior.  Until modern times, multigenerational families lived and worked together, providing healthy routes to adulthood for the young.  Living in a segregated “youth” culture, today’s youngsters know very little about adults and rely unwisely on their peers.  Attending “youth” worship services in church, for example, they fail to learn how wiser and more experienced persons approach God.  Significantly, a study by the Fuller Youth Institute found:  “‘involvement in all-church [intergenerational] worship during high school is more consistently linked with mature faith in both high school and college than any other form of church participation’” (#1532).  A truly good education should bring together the elderly and the youthful.  As Cicero said, in On Old Age:  “‘People who say there are no useful activities for old age don’t know what they’re talking about.  They are like those who say a pilot does nothing useful for sailing a ship because others climb the masts, run along the gangways, and work the pumps while he sits quietly in the stern holding the rudder’” (#1577).  Especially important, in the wake of the Sexual Revolution, older folks need to provide healthy perspectives on marriage and family sorely lacking in today’s youth culture.  

Then we must help youngsters “embrace work pain” in order to grow up.  Senator Sasse is close enough to Nebraska’s soil to appreciate the “hardness” of farm work that made the state!  At the tender age of seven he was sent by his parents to “walk beans” in a local soybean field, learning first-hand the meaning of hard work, and he’s come to believe, with Theodore Roosevelt, that “Nothing in this world is worth having or worth doing unless it means effort, pain, difficulty.”  Consequently, he and his wife agreed to send one of their daughters to work on a Nebraska ranch for a month, where she soon discovered the necessity of manual labor.  But when he became a college president Sasse found 21st-century students arriving at Midland University with no appreciation of the pain required to do good work.  As a 37-year-old he was taken aback to find Midland’s students markedly different from those of his generation.  These “Millennials” shunned responsibilities and mainly relished “sleeping in, skipping class, and partying” (#2072).  To a degree they are rightly branded “needy, undisciplined, coddled, presumptuous” (#2078) and unable to meet adult expectations.  

To successfully transition to adulthood, our youngsters also need to “consume less.”  To do so runs counter to one of the most marked traits of the Millennials, who generally think buying things, getting more stuff, will make them happier.  But despite this nation’s great wealth, surveys show Americans to be less happy than they were half-a-century ago.  We’ve not learned, with Socrates, that:  “He who is not contented with what he has would not be contented with what he would like to have.”  Somehow we, and our children, need to learn that life is truly satisfying to the extent we produce goods rather than consume them.  Along with consuming less, we need to travel more—to learn, by personal discovery, how cultures differ and why the world works as it does.  As Mark Twain wrote:  “Broad, wholesome, charitable views of men and things cannot be acquired by vegetating in one little corner of the earth all one’s lifetime.”  

Then we need to “build a bookshelf” suitable for living well.  “Critical, engaged reading skills are not a luxury, but rather a necessity for responsible adults and responsible citizens” (#3511).  We absolutely need a “rebirth of reading” if we’re to flourish as a people.  But younger folks are reading less and less.  (I recently heard a New York City commuter, who’d been riding the train into the city for 30 years, observe that in earlier years most every passenger would be reading a newspaper or magazine, but now they all seem to be playing games on hand-held electronic devices!)  Intimidated by the likes of Jesse Jackson, who led Stanford University students chanting “Hey, hey, ho, ho, Western culture’s got to go!,” American colleges and universities have dramatically reduced required courses in both foreign languages and the humanities, thus effectively eliminating expansive reading experiences and provoking scholarly protests in the 1980s from E. D. Hirsch (Cultural Literacy) and Allan Bloom (The Closing of the American Mind).  While various writers may propose somewhat different lists of books everyone should read, Senator Sasse cites 60 basic texts he finds truly worthy of consideration—ranging from Homer and Aristotle in the ancient world to Dietrich Bonhoeffer and Martin Luther King in the 20th century.  Each of us would, if we resolved to address the task, set forth a different set of essential books for ourselves and our families.  What would really help, however, is if we would simply do so!  If not, Sasse’s list is a helpful place to start!  

Obviously concerned with the state of the American union, Senator Sasse concludes his work by urging readers to join him in strengthening our culture by helping our children become responsible adults and thereby strengthening the republic.  This is a fine treatise, attuned to significant issues, deserving widespread reading, reflection, and discussion.  

* * * * * * * * * * * * * * * * * * * * * *

For many years Christian Smith, currently a professor at Notre Dame, has released scholarly sociological studies detailing evident characteristics in young Americans.  In Lost in Transition:  The Dark Side of Emerging Adulthood (New York:  Oxford University Press, c. 2011), he and a team of researchers provide information essential for understanding this nation’s 18-23-year-old “emerging adults.”  Though Smith commends much in this demographic group, Lost in Transition focuses only on the darker side of their portrait—their “mistakes and losses, trials and grief, confusions and misguided living” (p. 3).  The youngsters we encounter are the beneficiaries of both higher education and their parents’ willingness to care for them well into their 30s; they are delaying marriage, in part because of widely-available contraceptive technologies; and they struggle to adjust to the realities of a global economy that makes employment increasingly problematic.  They have, still more, to a large degree embraced many of the postmodern views (ethical relativism and multiculturalism) espoused by thinkers such as Nietzsche and Derrida and popularized by MTV and simplistic high school teachers.  

Given their postmodern views, many emerging adults are morally adrift, embracing varieties of ethical relativism, one of their more “unsettling” traits.  One third of them claim not to know why anything is right or wrong!  Sixty percent of them take “a highly individualistic approach to morality.  They said that morality is a personal choice, entirely a matter of individual decision.  Moral rights and wrongs are essentially matters of individual opinion” (p. 21).  One interviewed woman thinks stealing is wrong—at least for her!  But if others steal it’s not really wrong— just a “dumb thing to do.”  With moral decisions reduced to personal opinions, no one should “judge” another person’s behavior and there is no need to work for any social consensus on moral standards.  One woman even refused to condemn mass-murdering terrorists!  In her opinion:  “‘It’s not wrong to them.  They’re doing the ultimate good.  They’re just like, they’re doing the thing that they think is the best thing they could possibly do and so they’re doing good’” (p. 28).  Just live according to your notion of “good” and keep quiet regarding anyone else’s!  Many emerging adults know nothing of moral philosophy, traditional religion, or anything other than their inner feelings.  Nearly three-fourths of them simply follow their “instincts,” apparently thinking moral knowledge is innate and intuitively knowable.   Lacking objective moral standards, they think “anything could be morally right, then, as long as someone believes it” (p. 29).  Shocking though it may seem, that’s “the professed outlook of nearly one-third of emerging adults today” (p. 29).  Though a third of emerging adults want to reject such extreme relativistic thinking, they lack the “moral-reasoning skills” to do so.  A substantial minority of the respondents did refer to God or the Bible, but they frequently lacked the conceptual skills to draw upon their religious traditions when explaining their ethical views.  

Their failure to reason well results, Smith suggests, from the “multicultural” indoctrination they receive in virtually all American schools.  If different cultures have different moral standards, they must all be accepted and respected in accord with their own perspectives.  To avoid being labeled a “racist,” emerging adults are ready to approve virtually anything done by groups different from theirs.  This squares with their commitment to what philosophers call “positive law” rather than the “natural law” espoused by classical and Christian thinkers.  Positive law is whatever a regime (whether hereditary or democratic) decrees.  Thus what may be right in Stalin’s USSR could be wrong in Roosevelt’s USA; what was wrong in the 19th century (e.g. abortion) becomes right when the Supreme Court decrees it in the 20th.  

From the authors’ perspective, “the widespread moral individualism and solid minority presence of moral relativism among emerging adults today tells us that the adult world that has socialized these youth for 18 to 23 years has done an awful job when it comes to moral education and formation” (p. 60).  “They are morally at sea in boats that leak water badly” (p. 60).  Especially important is the lack of those “intellectual virtues” Aristotle mandated in his Ethics.  The Postmodern contempt for any form of realism has seriously truncated the Millennials’ reasoning skills.  “Central to many of the confusions in emerging adult moral reasoning is the inability to distinguish between objectively real moral truths or facts and people’s human perceptions or understandings of those moral truths or facts.  The error of not distinguishing these two things is this:  the realities themselves are confused with, and therefore dependent upon, people’s cognitive grasp of them.  What actually exists is conflated into what is believed to exist” (p. 61).  Consequently, as a society we face a huge task.  As the noted philosopher Charles Taylor observed:  “‘We have to fight uphill to rediscover the obvious, to counteract the layers of suppression of modern moral consciousness’” (p. 69).  

In addition to moral relativism, today’s emerging adults are generally devout consumers, especially of the high-tech and entertainment items that have emerged during their lifetimes.  Consequently,  “between one-half to two-thirds of emerging adults said that their well-being can be measured by what they own, that buying more things would make them happier, and that they get a lot of pleasure simply from shopping and buying things” (p. 71).  Intangible goods seem irrelevant to them, since less than ten percent “spoke of knowing God or making God proud, deepening their life of faith, or being more religious” (p. 105).  They want stuff, not spirit!  They give little thought to their acquisitiveness, considering it as normal and inescapable as breathing.  Though a few aspects of mass consumption may trouble them, they see no need to change anything, guided as they are by some of the “key assumptions of liberal individualism” (p. 80).  “All that society is, apparently, is a collection of autonomous individuals who are out to enjoy life.  The idea of people changing their own lifestyles or of mobilizing for collective social or economic change is nearly unimaginable” (p. 86).  

Illuminating this consumerist mentality, one respondent said:  “‘A good life for me would be to have more than enough money than I actually need, and live like a kid the rest of my life.  That would be my little heaven in today’s reality.  Yeah, it’s consuming a lot of stuff, but at the same time, if you can afford it, what is money anyway?  Money is meant to be spent, so why not?  You only live once, and if you have the chance to live in excess, why not?’” (p. 95).  This attitude explains their utilitarian approach to education.  “Not many emerging adults talk about the intrinsic enrichment of an education, of the personal broadening and deepening of one’s understanding and appreciation of life and the world that expansive learning affords.  Few emerging adults talk about the value of a broad education for shaping people into informed and responsible citizens in civic life, for producing members and leaders of society who can work together toward the common good” (p. 101).  They go to school for one reason:  to get a good (i.e. well-paying) job.  

When getting more stuff fails to make them happy, many emerging adults turn to the timeless  illusions of wine, women, and song!  They routinely seek to get “high, stoned, buzzed, and drunk” (p. 110).  Rather than drinking in moderation, significant numbers of them routinely engage in binge drinking and smoking pot when partying—ways to escape their “boring” lives.  They also seek satisfaction in sexual engagements.  “What were once daring and rebellious acts of ‘love’ outside of committed relationships have now for many emerging adults become routine, almost pedestrian” (p. 148).  Yet, though trumpeted as “liberating” and “fun,” the hook-up culture has proved deeply disturbing—especially for women.  “We were struck by the number of very traumatic breakups that we heard described in interviews, since we assumed that emerging adults generally want to hold off on seriously committed relationships.  But the truth is that, while most emerging adults do want to hold off on marriage, many of them—again, particularly women, it appears—also long for the kind of intimacy, loyalty, and security that only committed relationships can deliver” (p. 154).  The authors conclude that “the sexual revolution’s promise of easy, safe, uncomplicated, fulfilling, casual sex” has dramatically failed (p. 176).  It’s failed simply because it cannot alter one of the most basic aspects of human nature—the need for fidelity and permanence in sexual relations.  Sadly enough:  “not far beneath the surface appearance of a happy, liberated emerging adult sexual adventure and pleasure lies a world of hurt, insecurity, confusion, inequality, shame and regret” (p. 195).  Finally, today’s emerging adults seem unusually disengaged from the “civic and political” world.  “Citizenship is not a word in their vocabularies” (p. 223).  
Self-absorbed, uninformed and apathetic, they take little interest in community, church, or national affairs, volunteering little of their time and contributing none of their money.  

No one interested in solid data regarding today’s Millennials can ignore Lost in Transition.  As a nation, we’ve failed to provide the nurturing institutions and winsome mentors obviously needed by the younger generation.  And though Christian Smith admits to not knowing exactly what to do, speaking the truth is the first step in finding a cure to the “malady” crippling our emerging adults. 

294 “Strangers in a Strange Land”

  Church history records the incessant fluctuations—the triumphs and setbacks, the flourishing and decay—of the Body of Christ.  Throughout the past century, first in Europe and now in America, we have witnessed a cascade of alarming losses experienced by the Roman Catholic Church, the mainline Protestant denominations, and now many of the hitherto robust evangelical American churches.  A spate of recent treatises document and endeavor to explain what’s happened—primarily to the largest of these communions, the Catholic Church, but extending to others as well—and generally offer suggestions as to what’s to be done.   

Among the most notable is The Decline and Fall of the Catholic Church in America (Manchester, New Hampshire:  Sophia Institute Press, c. 2003) by David Carlin, a sociology and philosophy professor whose articles have appeared in publications as disparate as First Things and the New York Times.  As a committed Catholic, he’s dismayed by what’s happened but feels impelled to deal honestly with it.  Thus he argues:  “The root problem is that the Catholic Church in the United States has largely ceased to be Catholic,” turning itself into a culturally-acceptable and innocuous “generic Christianity or Christianity-in-general” (#34 in Kindle)—one of many declining “denominations.”  By discarding one “offensive” dogma after another, the Church now finds itself standing for nothing distinctively Catholic, softly proclaiming little more than “a gentle wish:  ‘Can’t we all just be nice to one another?’” (#44).  

Lacking a distinctive message, the Catholic Church has dramatically been imploding for 50 years.  Easily accessible data reveal the startling decline of weekly church attendance (from around 75 percent in 1965 to 25 percent today), parochial schools (4.5 million grade-school students in 1965, 1.9 million in 2002), monastic communities, and priestly vocations (slipping from 49,000 in 1965 to 4,700 in 2002).  Large numbers of professing Catholics no longer support traditional doctrines (e.g. the Trinity, Incarnation, Resurrection, Real Presence) or ethics (e.g. the condemnation of cohabitation, contraception, divorce, abortion, homosexuality).  Should this trajectory continue, Carlin fears, the Church will simply wither away, along with the mainline Protestant denominations she’s chosen to imitate.   

The Catholic collapse resulted, Carlin thinks, when three currents converged “to produce the ‘perfect storm’”—1) implementing “the spirit of Vatican II;” 2) escaping the Catholic “ghetto;” and 3) the ‘60s’ cultural (i.e. sexual) revolution.   Before the Second Vatican Council, Catholics moved within a Church unchanged for many centuries, but suddenly, in accord with its “spirit,” much in their “immutable” faith seemed up for grabs.  The largely ethnic ghettos formed by turn-of-the-century Catholic immigrants, providing nurture and comfort, dissolved in the ‘60s as Catholics (graduating from elite universities and working in successful corporations) shed their Irish or Italian identities and defined themselves as fundamentally American.  Their freedom to thrive as Americans was signaled by John F. Kennedy’s election in 1960, a testament to their acceptance in this country as well as an opportunity to blend in with their fellow citizens.  Then the cultural revolution of the ‘60s and ‘70s—a sustained rebellion against authority of any sort— “blindsided” the Church. 

Probing these phenomena for deeper philosophical perspectives, Carlin identifies the Cultural Relativism that “seduced a generation” as one of the primary reasons for the Catholic collapse.  University students exposed to the anthropological works of Ruth Benedict and Margaret Mead came to believe cultures shape persons and different cultures prescribe and approve dramatically different, purely man-made moralities.  What’s right within one culture might be considered wrong in another—and there’s no transcultural standard whereby behaviors can be condemned.  Along with Cultural Relativism, Ethical Emotivism was widely embraced.  Influential philosophers declared everyone should simply follow his feelings, consulting his heart when making choices.  More than a bumper sticker, “If it feels good, do it” became a prescription for morality!  Finally, fearing to be identified as an illustration of The Authoritarian Personality (written by members of the Frankfurt School who had emigrated to the United States), many Catholics spurned conservative traditions and mouthed the “Question Authority!” mantra.  

Such developments firmly established Secularism as “the dominant American paradigm” by 1970.  Its “antinomian moral theory entailed a rejection of a long list of traditional religion-based moral rules” regarding sexual behavior and targeted the “family ideal as downright oppressive”—especially to women, who needed to be freed from the shackles of patriarchy.  Celebrating tolerance as its singular ideal, Secularism powerfully impacted all segments of American society, which quickly cast loose from its religious anchors.  To Carlin, the Supreme Court’s 1962 Engel v. Vitale decision (banning prescribed prayers in the public schools) marks the triumph of a militantly secular movement in this nation.  Gaining momentum, secularists worked to dismantle the traditional Judeo-Christian moral consensus which had shaped the country.  Thus the prayer-ban Court decision was soon followed by judicial edicts legalizing contraception, abortion, and (just recently) same-sex marriages.  Morality to many Americans became mainly a matter of personal preferences—following what Carlin identifies as the “Personal Liberty Principle” (PLP).  

Conservative (Evangelical and Pentecostal Protestant, Traditional Catholic) Christians certainly rallied to oppose this anti-Christian secularist agenda, sparking the “culture war” that still divides America.  But numbers of Liberal Christians in both mainline Protestant and Roman Catholic circles easily embraced it and thereby precipitated the radical decline in both numbers and doctrinal integrity they have suffered.  Liberal churches, Carlin believes, will inevitably fade away.  And conservative churches, to survive, must awaken to the threats they face from today’s Secularism.  It’s an enemy which must be clearly identified and vigorously resisted.  Rather than adjust to the world, churches that survive must defy it, living in accord with Supernatural, rather than natural, standards.  

Carlin’s treatise is remarkably clear and cogent.  Though focused upon the Catholic Church, his analysis easily extends to all Christian churches.   And while basically pessimistic, his counsels and suggestions are worth heeding.  

* * * * * * * * * * * * * * * * * * * * * * * * *

One of the most widely discussed recent publications is Rod Dreher’s The Benedict Option:  A Strategy for Christians in a Post-Christian Nation (N.Y.:  Penguin Random House, c. 2017).  Over the decades, Dreher moved through Roman Catholicism and Evangelicalism to finally join the Russian Orthodox Church.  A respected journalist and unabashed believer, he writes sorrowfully, lamenting the catastrophic losses Christendom has recently experienced and believing that the “culture war that began with the Sexual Revolution in the 1960s has now ended in defeat for Christian conservatives” (p. 3).  Consequently, a nihilistic secularism prevails.  Not only have abortion, cohabitation, and same-sex marriage gained sanction, but today’s Millennials seem unusually uninterested in the Christian faith and have virtually no knowledge of its content.  Philip Rieff’s telling insight—“The death of a culture begins when its normative institutions fail to communicate ideals in ways that remain inwardly compelling”—seems sadly confirmed.  We face challenges comparable to those Christians such as St. Augustine faced as the Roman Empire collapsed during the fifth century.  

Whereas Augustine faced Vandals literally battering down the walls of his city as he died in 430 A.D., we confront home-grown, anti-Christian barbarians produced by important historical developments:  1) the 14th century’s emergence of philosophical nominalism; 2) the 16th century’s Protestant-driven fragmentation of Christendom; 3) the acidic impact of the 18th century’s Enlightenment; 4) the 19th century’s Industrial Revolution; and 5) the 20th century’s Sexual Revolution.  “Now we are on the far side of a Sexual Revolution that has been nothing short of catastrophic for Christianity.  It struck near the core of biblical teaching on sex and the human person and has demolished the fundamental Christian concept of society, of families, and of the nature of human beings.  There can be no peace between Christianity and the Sexual Revolution, because they are radically opposed.  As the Sexual Revolution advances, Christianity must retreat—and it has, faster than most people would have thought possible” (p. 202).  

More profoundly, the Faith that formed Western Civilization has been sidelined by a secular humanism that makes Man, not God, its ultimate concern.  Thus Supreme Court Justice Anthony Kennedy, determined to forever establish abortion as a constitutionally guaranteed right, declared in Planned Parenthood v. Casey:  “At the heart of liberty is the right to define one’s own concept of existence, of meaning, of the universe, and of the mystery of human life.”  A nation committed to such a precept will have little patience with orthodox Christians, and Dreher says:  “The church, a community that authoritatively teaches and disciples its members, cannot withstand a revolution in which each member becomes, in effect, his own pope.  Churches . . . that are nothing more than a loosely bound assembly of individuals committed to finding their own ‘truth,’ are no longer the church in any meaningful sense, because there is no shared belief” (p. 44).  

The time has come, Dreher thinks, to radically separate from this sinful world and singularly seek to be the church, challenging rather than cooperating with mainstream social structures.  “Rather than wasting energy and resources fighting unwinnable political battles, we should instead work on building communities, institutions, and networks of resistance that can outwit, outlast, and eventually overcome the occupation” (p. 12).  This involves embracing what he calls the “Benedict Option,” a proposal that grew out of his reading of Alasdair MacIntyre’s suggestion in After Virtue (his pivotal treatise on ethics); MacIntyre said cultural barbarians have again inundated Western Civilization and it’s time to await “a new—doubtless very different—St. Benedict,” leading us to build monastic preserves devoted to maintaining truly Christian faith and practice.  

To better understand St. Benedict (a sixth-century reformer), Dreher travelled to Norcia, Italy, and visited with a dozen (mainly young American) monks who recently reopened the ancient monastery, 200 years after it had been closed by Napoleon.  There he saw the ancient Benedictine Rule, blending prayer and manual labor, carefully followed.  Though the Rule was intended for monastics, its truth can easily be extended to any Christian community (family, school, church) committed to shaping its life in accord with love for God and man.  Politically, this means abandoning the effort to “take back America,” following the examples of dissidents within Communist countries (bearing witness to eternal truths—“living in truth,” as did Vaclav Havel), and fighting for religious liberty.  It also leads to homeschooling or establishing classical Christian schools for children, living prayerfully, and creating a robust Christian culture.  Above all, it means making family (a “domestic monastery”) and church the absolute foci of everything we do.      

For those interested in joining Dreher and embracing the Benedict Option, he provides examples and resources.  Clearly there are small communities around the world, such as Tipi Loschi in Italy and the Saint Constantine School in Houston, that are committed to living out their faith in radically countercultural ways.  And though the Benedict Option will never be embraced by large numbers of Christians, it remains a viable means whereby the Faith is preserved and transmitted to coming generations.  

* * * * * * * * * * * * * * * * * * * * *

Philadelphia’s Archbishop Charles J. Chaput has many concerns for the future of Christianity, but in Strangers in a Strange Land:  Living the Catholic Faith in a Post-Christian World (New York:  Henry Holt and Company, c. 2017), he balances those concerns with a robust confidence in the strength of both individual believers and the Church herself to overcome them.  He especially urges us to put things into perspective, noting how dramatically a truly global Christianity has emerged during the past century.  Thus:  “In Africa, 9 million converts enter the Catholic Church each year.  By 2030, if current trends hold, China may have the largest Christian population in the world” (p. 1).  In Europe and America the churches may be struggling, but around the world they may be enjoying their finest hour!  And despite much bad news, there’s much encouraging news in both Europe and America as believers creatively respond to our postmodern and increasingly post-Christian world.  

Rather than despair, Chaput urges us to remember that there have always been, as Augustine taught, two cities—the City of God and the city of man.  “We are born for the City of God.  The road home leads through the City of Man.  So we are strangers in a strange land, yes” (p. 246).  The Church has forever been attacked and has sometimes withered away in various geographic regions.  Yet for 2000 years the Church of Jesus Christ has endured—and surely she will do so until He returns.  Unlike Rod Dreher, who takes St. Benedict as his exemplar, Chaput celebrates St. Augustine, a bishop caring for his flock in the North African city of Hippo.  “For Augustine, the classic civic virtues named by Cicero—prudence, justice, fortitude, and temperance—can be renewed and elevated, to the benefit of all citizens, by the Christian virtues of faith, hope, and charity.  Therefore, political engagement is—or at least it can be—a worthy Christian task” (p. 14).  Despite its many flaws, this world is still a good world—what Augustine called a “smiling place.”  Despairing at the conditions of society can be as sinful as despairing of one’s own salvation.  “As Augustine said in his sermons, it’s no use complaining about the times, because we are the times.  How we live shapes them” (p. 17).

Turning to our country, the United States, Chaput urges us to remember our godly heritage, honoring what’s good before railing against what’s bad.  Though never perfect, this nation has embraced the Christian religion and encouraged its “free exercise.”  Protestants and Catholics alike have supported America’s guiding principles, routinely giving thanks for the freedoms they enjoyed in this great land.  As late as 1955, a leading Jesuit, John Courtney Murray, could still assert that the American commonwealth “‘is not intelligible and cannot be made to work except by men who possess the public philosophy’ that the founders first brought to building it.”  Insofar as possible, it remains our task to recover the Founders’ vision and make sure the constitutional republic they established will survive.

Yet times have changed, Chaput acknowledges, honestly documenting the many harmful cultural currents which have eroded much of the nation’s spiritual and ethical landscape.  As would be expected of a Catholic bishop, he devotes considerable attention to the baneful consequences of the Sexual Revolution and the dissolution of the family.  But he also looks more deeply, lamenting changes in our educational system, which shows little interest in any search for ultimate Truth.  As Josef Pieper said, what the modern world really wants “‘is flattery, and it does not matter how much of it is a lie’” (p. 226).  Indeed, we’re surrounded by what Scott Peck described as the People of the Lie.  But a healthy society requires the careful use of words, and Pieper noted that “‘the abuse of political power is fundamentally connected with the sophistic abuse of the word.’  And the degradation of man by man, and the systematic physical violence against human beings, have their beginnings ‘when the word loses its dignity,’ because ‘through the word is accomplished what no other means can accomplish, namely, communication based on reality’” (p. 122).

Amidst all the dreary details portraying a post-Christian world, it’s easy to despair and retreat to well-fortified cultural castles.  We must honestly assess and respond to the challenges we face, knowing that the “Church of tomorrow won’t look like the Church of today, much less of memory” (p. 187).  It may very well be smaller and poorer, but it can become more disciplined and effective.  Neither despair nor isolation is an option for Chaput.  Christians necessarily have hope because Jesus arose from the grave!  “This small moment, unseen by any human eye, turned the world upside-down and changed history forever” (p. 146).  As a supernatural virtue, hope enables us to see everything in the light of eternity, never despairing of what God may in fact bring to pass.  Thus it’s our duty, John Henry Newman said, to set forth on “‘ventures for eternal life without the absolute certainty of success’” (p. 152).  Despair results from trusting ourselves.  Hope springs eternal because we trust God.  Trusting God means following His precepts, summed up so powerfully by Jesus in the Beatitudes, to which Chaput devotes many pages, and embracing the call to holiness as have saints throughout the centuries.

For guidance in the 21st century Chaput finds fascinating clues in a second century document, The Letter to Diognetus—a wonderful manual for Christians marching as pilgrims through a hostile land.  In that ancient letter we’re reminded that the Christian Faith is not a man-made construct.  Rather it was given us by the “Creator of all, the invisible God himself, who from heaven established the truth and the holy incomprehensible word among men, and fixed it firmly in their hearts.”  So Christians live normally, following the daily customs (food, drink, clothing, work) of their countrymen.  “They marry, like everyone else, and they beget children, but they do not cast off their offspring.  They share their board with each other, but not their marriage bed.”

It is true, the Letter to Diognetus says, that Christians are “‘in the flesh,’ but they do not live ‘according to the flesh.’  They busy themselves on earth, but their citizenship is in heaven.”  In sum:  “What the soul is in the body, that Christians are in the world.  The soul is dispersed through all the members of the body, and Christians are scattered through all the cities of the world.  The soul dwells in the body, but does not belong to the body, and Christians dwell in the world, but do not belong to the world.  The soul, which is invisible, is kept under guard in the visible body; in the same way Christians are recognized when they are in the world, but their religion remains unseen.”  Yet even when unseen they animate and uplift the world.  Loving God and others, Christians are truly the leaven, the salt, and the light of the world.  Not always triumphant, they are called not to succeed but to remain faithful, bearing witness to the Gospel—that greatest of all truths, the Good News the world always needs to hear, especially the post-modern world that despairs of any truths at all, much less one overarching Truth.  Our task, as John Henry Newman said, is “‘not to turn the whole earth into a heaven, but to bring down a heaven upon earth’” (p. 218).

To do that, knowing that “beauty is the battlefield where God and Satan contend for the hearts of men” as Dostoyevsky said, part of our task is to preserve and cultivate beauty.  Wherever we encounter it, beauty points upward, symbolizing a transcendent Reality.  Discerning beauty in God’s creation ennobles us and should lead us to tend it wisely.  “Thus the spoiling of the earth with waste and the brutalizing of our human habitats with ugly art and buildings are not just clumsy mistakes of progress, but desecrations.”  Militant Muslims vandalize Buddhist and Roman monuments; iconoclastic Puritans eviscerated cathedrals; “transgressive” modern artists defile the “image of God” in their depictions of human beings.  Western Christian Culture was a God-centered culture, but “God has never been more cast out from the Western mind than he is today.  Additionally, we live in an age when almost every scientific advance seems to be matched by some new cruelty in our entertainment, cynicism in our politics, ignorance of the past, consumer greed, subtle genocides, posing as rights like the cult of abortion, and a basic confusion about what—if anything distinctive at all—it means to be human” (p. 229). 

We must remind the world of what it means to be human!  The world needs not more “love, sweet love,” but Christians who live out the Love of God, bearing witness to the eternal verities of the City of God while living responsibly and robustly in the city of man.  That, Chaput concludes, is the best counsel available in our trying times.

293 What’s Happened to the University?

In What’s Happened to the University:  A Sociological Exploration of Its Infantilisation (New York:  Routledge, c. 2017), Frank Furedi appraises developments during the past 50 years in institutions of higher learning.  He began his academic life as a student in 1965 and is now Emeritus Professor of Sociology at the University of Kent in the UK.  In his student days, universities were open to new ideas and touted the virtues of free speech and vigorous debate.  Subsequently, however, they became “far less hospitable to the ideals of freedom, tolerance and debate than in the world outside the university gate.  Reflecting on how this reversal of roles has come about is the principal objective of this book” (p. vi).  Furedi is distressed that students now seek to ban books that threaten their vulnerable psyches and protest speakers who might offend a variety of sexual and ethnic groups.  The free speech mantras of the ’60s have turned into speech codes; the former devotees of free speech have frequently become, as powerful professors, enforcers of censorship.  “Safe spaces,” “trigger warnings,” “microaggressions” and “chill out rooms” (replete with play dough and “comfort” animals to relieve anxieties) indicate how many universities have in fact become infantilized.  Thus:  “Harvard Medical School and Yale Law school both have resident therapy dogs in their libraries” (p. 27).

In some ways this culminates a project educators launched in the 1980s, making “self-esteem” their summum bonum.  Feelings, above all, must be massaged and potential hurts (e.g. poor grades or athletic defeats) eliminated.  Protecting children became a parental obligation easily transferred to the schools.  Parents now accompany and hover over children entering the university.  Administrators serve in loco parentis, not as they did a century ago, by regulating campus behavior, but by protecting students’ feelings, especially if they self-identify as members of certain “vulnerable groups.”  Wellness clinics, counseling services, ethnic and same-sex study centers all cater to psychological or emotional rather than intellectual needs.  Treating students as “biologically mature children, rather than young men and women, marks an important departure from the practices of the recent past” (p. 7).  As one might anticipate, “the more resources that universities have invested in the institutionalization of therapeutic practices, the more they have incited students to report symptoms of psychological distress” (p. 46).

An incident at Yale University in 2015 illustrates this.  A university committee issued guidelines regarding appropriate Halloween costumes.  One faculty member, Erika Christakis, posted an email suggesting “that ‘if you don’t like a costume someone is wearing, look away, or tell them you are offended’ and concluded that ‘free speech and the ability to tolerate offense are the hallmarks of a free and open society’” (p. 17).  Students then denounced Christakis and her husband (a professor who defended her) for racial insensitivity.  Yale’s President, Peter Salovey, promptly met with tearful undergraduates and shared their felt distress.  Though not dismissed from their positions, Erika and Nicholas Christakis soon left Yale, casualties of the raging intolerance now widespread in academia.  Another incident further illustrates campus conditions.  “Caroline Heldman, a professor in Occidental University’s politics department, recalled that some of her students began experiencing PTSD-related episodes in her classes:  ‘there were a few instances where students would break down crying and I’d have to suspend the class for the day so someone could get immediate mental health care.’  Her antidote to this problem was to introduce a trigger warning on her course” (p. 42).

  What really matters these days is one’s racial or sexual identity.   “Universities are singularly accommodating to the objectives of cultural crusaders” (p. 65).  To identify as an African-American or Native American or gay man or lesbian woman grants one status and authority quite apart from whatever one may think or say.  In addition, it’s especially important to stress the “victim” status of one’s group, even if the only obvious victims were ancestors who lived decades if not centuries ago.  Doing so enables one to invoke “social justice” and demand preferential treatment of some sort.  “Social justice” increasingly means protesting historic policies and personalities.  So students at the University of Missouri demanded a statue of Thomas Jefferson be removed from campus because he owned slaves.  Schools must be renamed if they memorialize anyone tainted with racist or sexist traits.   Selected cultures must be sacrosanct, making intolerable any “appropriation” of their dress, music, or food.  So many campus cafeterias dare not feature Mexican or Asian food lest students remonstrate!   And, importantly, only women can speak for women, only blacks for blacks, only Indians for Indians!  Authority comes purely from one’s ancestry, not from any scholarly expertise.  Consequently:  “The reverential and self-righteous tone of cultural crusaders echoes the voice of traditional religious moralists” (p. 64).  

To provide “safe space” for culture groups leads to self-segregated dormitories, and there are now dorms reserved for blacks and other minorities at elite schools such as UC Berkeley and MIT!  These “safe spaces” protect students from psychic and emotional hurts, shoring up their fragile self-esteem.  No debates are allowed, lest someone be judged wrong!  On many campuses, the notion that “criticism is violence” has gained traction, so teachers are warned to avoid even evaluating their students!  “It is an article of faith on campuses that speakers who espouse allegedly racist, misogynist or homophobic views should not be allowed to speak” (p. 103).  Challenging speakers, such as Heather Mac Donald and David Horowitz, are shouted down or prevented from appearing on campuses, for they might distress the feelings of some groups.  Advocates of safe spaces insist that “tolerance, affirmation and respect” therein provide a good environment for learning, though no empirical studies demonstrate as much.  In fact, from Socrates onward it’s been assumed that learning advances when one is forced to examine his beliefs and test his presuppositions with a commitment to embracing even uncomfortable truths.

Conjoined with “safe spaces” are the efforts to censor free speech, which have accelerated since 1980.  Certain words simply cannot be uttered!  Though profanity (as traditionally understood) flourishes in dormitories and classrooms, legions of taboo words are now forbidden.  Thus one may no longer refer to his “wife”—though “partner” is allowed.  In elite universities one may proudly be a “Native American” but never an “Indian.”  “Censorship, which was once perceived as an instrument of authoritarian attack on liberty, is today often represented as an exercise in sensitive behavior management” (p. 102).  Even threatening ideas must be policed, with professors issuing “trigger warnings” that exempt sensitive students from exposure to them!  Classic texts, ranging from Sophocles’ Oedipus the King to Mark Twain’s Huckleberry Finn to J. D. Salinger’s Catcher in the Rye, are now suspect!  Feminists especially object to reading classic texts they brand misogynist.

Thus “microaggressions,” even though unintentional and even unconscious, cannot be tolerated.  Lurking behind hurtful words there must be gravely immoral thoughts!  “You can’t think that” is now an acceptable policy on some campuses.  According to one influential theorist:  “‘Many racial microaggressions are so subtle that neither target nor perpetrator may entirely understand what is happening.’”  But this may well make them “more harmful to people of color than hate crimes or the overt and deliberate acts of White Supremacists” (p. 119).  Generally speaking, only the ones who suffer from these verbal assaults really understand their evil.  An offense is in the eyes of the beholder!  Students on many campuses are now demanding the right to anonymously inform on their professors’ microaggressions, and “Bias Response Teams” have been formed to enforce proper discipline on them.  To prevent hurt feelings, for example, UCLA now “prohibits people from asking Asian-Americans the question ‘Where are you from or where were you born?’” lest they feel non-American.  Nor can you say “America is a land of opportunity” lest someone feel that such is not true for him (p. 109).  Correcting a student’s grammar may lead to complaints of “white privilege” and racial bias.

The culmination of these developments, Furedi says, is “the quest for a new etiquette.”  Traditional ways, including chivalrous conduct, have generally dissolved.  To replace them we find what Jurgen Habermas “‘described as the juridification of everyday life’” (p. 125).  Yet exactly what kinds of behavior may now be condemned or approved and enacted into law remains undecided.  Administrative decrees, more psychological than philosophical in justification, seek to regulate activities but lack deeply moral (and especially religious) justification, so they quickly change and often defy common sense.  “The rhetoric of campus guidelines tends to avoid the language of right and wrong or good and evil, appealing instead to the therapeutic language of feelings” (p. 128).  To make sure feelings are protected, universities employ numerous sensitivity experts, trainers, and workshop “facilitators” to raise “awareness,” enforce speech codes, and punish microaggressions.  Millions of dollars are yearly expended to deal with “sexual harassment” complaints.  Students must be properly acculturated to the modern ethos, so Cambridge University now promotes “events ‘to celebrate Lesbian, Gay, Bisexual and Transgender (LGBT) History Month, Black and Ethnic Minority (BME) History Month, International Women’s Day (IWD), International Day of Persons with Disabilities (IPDP) and Holocaust Memorial Day (HMD)’” (p. 135).  No victim groups may be ignored lest someone’s self-esteem decay!

None of us really knows where this will all end.  But Umberto Eco was certainly prescient when he said that “‘even though all visible trees of 1968 are gone, it profoundly changed the way all of us, at least in Europe, behave and relate to one another.’”  He added that “‘relations between bosses and workers, students and teachers, even children and parents, have opened up,’ and that therefore, ‘they’ll never be the same again’” (p. 134).  If Furedi’s right, universities have wasted their patrimony and may never regain their rightful place in modern culture.

* * * * * * * * * * * * * * * * * * *

During spring break in 2006, the captains of Duke University’s lacrosse team hired two strippers, including twenty-seven-year-old Crystal Mangum, to perform at an off-campus party.  Such events were not particularly notable, since 20 or so had occurred at the university that year.  But Mangum subsequently claimed to have been raped, provoking a sensational series of events carefully recorded by Stuart Taylor Jr. and KC Johnson in Until Proven Innocent:  Political Correctness and the Shameful Injustices of the Duke Lacrosse Rape Case (New York:  St. Martin’s Press, c. 2007).  Beyond the incident itself, which illustrates the sexual tone of today’s universities, it’s a disturbing story laced with racial tensions, political aspirations, faculty prejudices, administrative cowardice, and media malpractice.  Though dense with details, the book fully engages the reader, alerting him to some of the troublesome aspects of 21st century culture.

Though best known for its basketball prowess, Duke’s lacrosse team was a perennial powerhouse, routinely competing for national championships.  Accordingly, the team featured many fine athletes, often graduates of elite prep schools where lacrosse was emphasized.  These athletes were, moreover, generally outstanding students, bound for the graduate and professional schools which train the doctors and lawyers their parents envisioned.  The stripper, on the other hand, had a checkered background, marked by a failed marriage, illegitimate children, prostitution, and mental problems.  But she was poor and black, born and reared in Durham.  And the lacrosse players, with one exception, were white, the scions of wealthy families.  The two strippers’ performance lasted all of four minutes, in part because Mangum was apparently too drunk to stand, much less dance.  She and her colleague, Kim Roberts, departed the house, though Mangum passed out on the back stoop.  She said nothing to Kim about being raped, nor did she say anything to a security guard who subsequently called 911 and tried to help her, nor to the police who responded.  Taken to the hospital, she was examined by doctors and nurses, who found no signs of rape.  Finally, however, a feminist nurse who considered herself an advocate for rape victims filed her own report, and it became the basis for later rape accusations.  Throughout the process Mangum’s story continually changed, so it was not clear exactly what had transpired with the lacrosse team.

When police received information regarding the incident, a Durham detective well-known for his antipathy to Duke students took charge of the investigation.  He had no interest in interviewing Kim Roberts, who best knew what actually happened.  When another policeman interviewed her, six days after the alleged rape, Roberts declared the sexual assault story “a crock,” and her handwritten statement “contradicted Mangum on all important points” (p. 57).  The lead detective also refused to consider any data regarding Mangum’s career as a prostitute, though in time one of her associates testified to “taking her to jobs in three hotels with three different men” on the nights preceding the lacrosse party.  The detective also failed to interview the doctor who had actually performed the pelvic exam when Mangum was admitted to the hospital.  When shown pictures of all the lacrosse players, Mangum could not identify any of them with certainty, and one of those she fingered was nowhere near the party that night.

Taking an even greater interest in her case was District Attorney Michael Nifong, who saw in it leverage for his political career in the coming election.  He needed the support of the black community in Durham as well as the liberal professors at Duke, so he quickly discerned how supporting Crystal Mangum’s rape accusations would ultimately enable him to win the upcoming election.  In a series of inflammatory press releases, Nifong branded the Duke athletes “rapists” fully deserving the vigorous prosecution he would pursue.  Many of his statements were demonstrably false, but newspapers and media outlets across the nation soon picked up on the case, almost unanimously assuming the guilt of the players accused.  Virtually everywhere there was a simple objective:  “Lynch the privileged white boys.  And due process be damned” (p. 121).  Writers for the New York Times and TV personalities such as Nancy Grace and Joe Scarborough cheered the mob of outraged folks determined to punish the “rapists.”  Few journalists cared to find the truth!  (Amazingly, the most balanced publication dealing with the case was Duke’s student newspaper!)  Inevitably, Jesse Jackson showed up, trumpeting his support for an abused black woman, and the local NAACP applauded Nifong’s every move!

So too the Duke administrators (most especially President Richard Brodhead) and professors (especially from the African-American and women’s studies programs) began to loudly denounce the lacrosse players, apparently committed to the notion that any woman claiming to have been raped must be telling the truth.  Here was an illustration of the “morality tale” of “virtuous black women brutalized by white men” (p. 66).  The Duke faculty launched hysterical attacks on the lacrosse team.  (Many professors simply resented the fact that many thought of Duke in terms of its athletes, while they wanted the institution to bask in an aura of academic excellence.)  Many had a deep commitment to the feminism on display in the yearly “Take Back the Night” rallies.  And virtually all of them wanted to publicly bear witness to their racial sensitivities and liberal proclivities.  To some teachers, the players should be punished for rape “‘whether it happened or not’” since it would help compensate “‘for things that happened in the past’” (p. 170).  Even as evidence proving the athletes’ innocence steadily mounted, Duke’s professors “served as enthusiastic cheerleaders for Nifong,” and “for many months not one of the more than five hundred members of the Duke arts and sciences faculty—the professors who teach Duke undergraduates—publicly criticized the district attorney or defended the lacrosse players’ rights to fair treatment” (p. 105).  The more radical the professor (e.g. Houston A. Baker, a past president of the Modern Language Association), the more the mainstream media loved to interview him!  Long before the trial, these professors simply assumed the men were guilty—and took their guilt, of course, as an illustration of how America is a racist, sexist society!  Only one lonely professor, a chemist, dared stand up and defend his friend, the lacrosse team’s coach!
In the judgment of Thomas Sowell:  “‘The haste and vehemence with which scores of Duke professors publicly took sides against the students in this case is but one sign of the depth of moral dry rot in even our prestigious institutions’” (p. 117).  

Fortunately for the lacrosse athletes, several had parents with the means and connections to assemble a strong legal defense team.  These lawyers early saw the flaws in Nifong’s accusations and found solid evidence (especially DNA) upholding the innocence of their clients.  All of the players cooperated with the police, submitting to lie detector exams and volunteering the blood samples requested for DNA tests, which proved to be the “biggest defense bombshell,” since the State Bureau of Investigation reported that “‘no DNA material from any young man tested was present on the body of this complaining witness’” (p. 162).  Then the athletes’ attorneys demonstrated “the staggeringly conclusive evidence of innocence, and of probable Nifong misconduct” (p. 302).  Violating an operating rule for prosecutors, Nifong had refused even to look at evidence collected by defense attorneys, something “unheard of” in legal circles, pushing his case through a grand jury and bringing it to trial.  But in time the evidence did become public, and the athletes were vindicated.

Cracks in the prosecution’s case began with blogs such as Liestoppers, which dissected the mainstream media’s presentations.  Articles in the New York Times were shown to be filled with egregious errors, deliberately omitting crucial evidence countering Nifong’s claims.  Then a few TV programs—most notably Sean Hannity’s—questioned the assumed guilt of the lacrosse athletes.  Students on the Duke campus—many resenting the malicious role their professors played in the process—increasingly sided with the team and believed that Crystal Mangum had lied.  Ultimately CBS’s 60 Minutes, after a lengthy investigation, declared “the rape claim was a fraud and Nifong was guilty of outrageous misconduct” (p. 282).  When Nifong faced the defense attorneys in a preliminary hearing, his case quickly unravelled.  It became clear that he and one of his expert witnesses had conspired to hide evidence, and he dropped the rape charge.  He “had engaged in grossly unethical—perhaps criminal—misconduct, and the case against the lacrosse players was a travesty” (p. 317).  He lost face, soon resigned his office, and would finally be disbarred.  His effort to punish the innocent “may well have been the most egregious abuse of prosecutorial power ever to unfold in plain view” (p. 356).  In sum, Nifong was guilty of “demonizing innocent suspects in the media as rapists, racists, and hooligans; whipping up racial hatred against them to win an election; rigging the lineup to implicate them in a crime that never occurred; lying to the public, to the defense, to the court, and the State Bar; hiding DNA test results that conclusively proved innocence; seeking (unsuccessfully) to bully and threaten defense lawyers into letting their clients be railroaded” (p. 356).

But even more shameful than the district attorney were the Duke faculty and administration!  Even when the evidence proved the lacrosse athletes innocent, activist professors remained belligerent and unrepentant!  Eighty-seven professors published a letter repudiating any efforts to make them retract or apologize for their slanders.  Instead, they attacked the bloggers, students, and journalists who defended the athletes.  So too the NAACP, The New York Times, and other powerful organizations refused to retract their slanders or seek to do justice to the maligned men.  Even if they did no wrong, it seems, they represent what’s wrong in this nation’s racist/sexist/classist society!  Anyone concerned with justice in America needs to know what happened at Duke—and is still happening in other sectors of the USA.

292 “Lo, the Poor Indian”

Reflecting a pervasive Enlightenment perspective—and presaging Jean-Jacques Rousseau’s Romantic admiration for America’s “Noble Savage”—Alexander Pope, in his Essay on Man, declaimed: 

Lo, the poor Indian!  whose untutored mind
Sees God in clouds, or hears him in the wind;
His soul proud Science never taught to stray
Far as the solar walk or milky way;
Yet simple nature to his hope has given,
Behind the cloud-topped hill, an humbler heav’n.

Neither Pope nor Rousseau knew much about the New World’s indigenous inhabitants, but that didn’t dissuade them from making authoritative pronouncements, and similar ignorance has infected much that’s been written or portrayed about Indians ever since.  Thus today many folks imagine they understand them as a result of watching a TV special on the Dakota Access Pipeline or listening to alleged “Native American” spokesmen leading protests in various locales.

Illustrating this ignorance is the widespread circulation (especially in environmentalist circles) of an alleged statement made by Chief Seattle in 1851.  The quotation declared:  “Every part of this earth is sacred to my people.  Every shining pine needle, every sandy shore, every mist in the dark woods, every clearing and humming insect is holy in the memory and experience of my people.”  Seattle’s words, duplicated in many books and displayed on schoolroom posters, effectively persuaded many Americans that the First Americans were the First Environmentalists, carefully husbanding the natural world, walking softly on Mother Earth.  In fact, the speech was written in 1972 by a Texas scriptwriter working on a film produced by the Southern Baptist Radio and Television Commission!  It fit the mood of the moment, whether or not it had any historical veracity, and became part of the nation’s folklore!  Certainly it helped establish one of the many misleading stereotypes that in the long run serve to harm Indian people.  

Endeavoring to better root us in reality, Naomi Schaefer Riley recently toured the United States and Canada gathering material for her insightful The New Trail of Tears:  How Washington Is Destroying American Indians (New York:  Encounter Books, c. 2016).  To understand anything we need first to describe it and then think clearly to explain it.  So Riley proffers careful descriptions accompanied by reasoned analysis.  Her descriptions remind us of similar accounts through the centuries—tribal peoples beset by a multitude of problems (including the highest poverty rate and lowest life expectancy of any racial group, shocking suicide numbers, alcohol and drug abuse, rape, sexual abuse, and widespread gang activity).  Her analysis, however, invites us to think hard about the glaring failures of the latest in a long list of “saviors”—the federal government.  Rapacious frontiersmen and ruthless armies harmed Indians in the past, but today the primary culprit responsible for their predicament is the government, the pretentiously benevolent Welfare State.

“As you’ll see in this book,” Riley says, “the problems American Indians face today—lack of economic opportunity, lack of education, and lack of equal protection under law—and the solutions to these problems require a different approach from the misguided paternalism of the past 150 years.  It’s not the history of forced assimilation, war, and murder that have left American Indians in a deplorable state; it’s the federal government’s policies today” (#149).  More troubling:  Indians provide us a “microcosm of everything that has gone wrong with liberalism,” caused by “decades of politicians and bureaucrats showering a victimized people with money and sensitivity instead of what they truly need—the autonomy, the education, and the legal protections to improve their own situations” (#149).  

Consider this:  the federal bureaucracies charged with responsibility for the nation’s one million reservation Indians, the Bureau of Indian Affairs (BIA) and the Bureau of Indian Education (BIE), employ 9,000 people—roughly one bureaucrat for every 100 Indians.  The feds’ funding “for education, economic development, tribal courts, road maintenance, agriculture, and social services—was almost $3 billion in 2015.”  Consequently:  “Tribal leaders only demand more money from Washington to fix their problems.  And the senators and congressmen who represent them are only too glad to oblige in return for the votes of these populations” (#2910).  Yet extraordinary unemployment rates, coupled with tribal ownership of land and reliable welfare payments, leave virtually all reservations poverty-stricken.  Lacking private property rights, reservation Indians (whose lands are tribally owned but held “in trust” by the federal government) almost inevitably suffer what economists call “the tragedy of the commons.”  Theoretically, everyone owns the land, but no one owns any actual parcel, so no one takes responsibility for any of it.  But everyone gets annuities (and in many areas, per capita dividends from tribal casinos) that provide subsistence without needing to work—and therein lies much that’s wrong with the reservations.  

Still more:  endless federal regulations dictate how reservation lands may be used—and make it virtually impossible to use them productively!  Entrepreneurs and venturesome economic projects inevitably run afoul of a nanny state determined to ensure that Indians will always be the “Indians” suitable to bureaucrats who often operate in accord with sentimental myths rather than observable realities.  Thus, for example, Michelle Obama could tell a gathering of Indian youngsters that “‘on issues like conservation and climate change, we are finally beginning to embrace the wisdom of your ancestors’” (#567).  Had she simply driven through most any reservation she could have seen how little ancestral wisdom regarding the “sacred land” may be found in Indian country!  Here the results of the Obamas’ antipathy to developing natural resources can be demonstrated.  Reservations sit on enormous coal, uranium, oil, and gas reserves, but “‘86% of Indian lands with energy or mineral potential remain undeveloped because of Federal control of reservations that keeps Indians from fully capitalizing on their natural resources if they desire’” (#450).  Even a superficial assessment of Indian affairs should persuade one that the money expended on behalf of the Indians hardly helps (and probably harms) them.

The greatest natural resource, of course, is people, and children must be well educated in order to develop their potential.  The Bureau of Indian Education (BIE), a notably inefficient bureaucracy, expends about $850 million providing for its “42,000 students (most children on reservations don’t attend BIE schools), which amounts to about $20,000 per pupil, compared with a national average of $12,400” (#87).  Only half of the students in high school graduate, and those who do frequently have less-than-adequate skills.  Providing details, Riley sets forth a sobering assessment of the schools under federal jurisdiction.  In one school on the Crow reservation in Montana, for example, $27,304 per pupil was expended—compared with $10,625 in non-Indian state schools.  Yet the graduation rate was 39%!  There, and everywhere you look, Indian schools are “among the worst in the nation” (#1562).  In stark contrast, the Saint Labre Catholic schools in southeastern Montana serve 800 Crow and Cheyenne children.  These Catholic schools take no federal monies and do nicely, enjoying a dropout rate of only one percent!  And large numbers of their graduates go on to study in college.  (There are some bright lights in Indian country, but they’re rare.)  

Aware of the educational failures of reservation schools, distraught parents and students usually blame the lack of discipline and qualified teachers, as well as nepotism-infected tribal administrations, though they also point to the breakdown of the family as the primary culprit.  Some youngsters who graduate high school then attend one of the 32 federally-funded tribal colleges, where they often study tribal traditions or arts and crafts.  Rarely do they graduate and go on to a university, and they learn little they can use apart from the reservation.  Sadly:  “Every school on the [Pine Ridge, Sioux] reservation is scrambling for teachers.  But the tribal school—Oglala Lakota College—doesn’t even offer a degree in secondary education” (#2258).  Rather than training youngsters to effectively help their people, most colleges cater to personal proclivities, often traditional arts and crafts.  Thus “‘The Tribal Institute of American Indian Arts in New Mexico,’ according to the Atlantic, ‘spends $504,000 for every degree it confers . . . more than Harvard or MIT’” (#1578).  

The Indians doing the best these days are the ones whose ancestors lost their lands in the 19th century—or individuals who leave the reservation and find their way in the broader culture.  Descendants of the Five Civilized Tribes in eastern Oklahoma are certainly prospering nicely when compared with the reservation-rooted Sioux and Navajo.  So too the Lumbees (in the Lumberton, North Carolina, area), lacking language, chiefs, and tribal land, blended into the area’s population.  Fully assimilated, they supported a decent school system and also embraced the “passionate Baptist faith that, to a person, they today profess” (#1292).  In one Lumbee’s opinion, their success resulted from the “tribe’s independence from the federal government.  ‘Indians had to pay for everything themselves here.  They had pride in the people who built it’” (#1300).  They could also own and develop, buy and sell land.  

The Lumbees weren’t wealthy, but they were doing okay.  Then politicians in Washington, D.C., decided to help them!  Overwhelmed with liberal guilt following WWII, the feds decided to allow landless tribes to “reconstitute” themselves.  In 1975 President Ford signed the Indian Self-Determination and Education Assistance Act, opening the coffers for grants to law enforcement, education, and environmental programs.  Increasingly, Indians could qualify for generous welfare programs.  As a result, increasing numbers of younger Lumbees ceased working and now waste their days doing drugs.  Today’s schoolchildren are notably less well-educated than their grandparents!  Whereas churches used to help the needy, the government now hands out money and enables them to idly self-destruct.  An older Lumbee, Ronald Hammonds, a successful cattle farmer, laments:  “‘Women are encouraged to have babies.  It’s economic development.  You get a check.  We’ve got more illegitimate kids than ever, and it’s getting worse.’  He calls the local housing project a ‘breeding ground’ and says that the children are mostly being raised by their grandmothers.  ‘They’ve got no responsibility.  They’re looking to the government as the solution to all our problems’” (#1419).  The only answer to the many problems the Lumbees now face, Hammonds thinks, is to get the government out of their lives.  

And that’s basically the solution Riley recommends:  eliminate the dependency engendered by the reservations!  That would, of course, mean much anguish in Indian communities—and in the non-Indian liberals who empathize with them.  But it may be the only “tough love” way to free the most impoverished peoples in America.  Indicating how little things have actually changed in 150 years, read carefully the final paragraph in Our Wild Indians:  Thirty-Three Years’ Personal Experience Among the Red Men of the Great West (Hartford, CT:  A. D. Worthington and Company, c. 1883).  Colonel Richard Irving Dodge, who knew the Indians as well as any 19th century writer and described them with a relentless honesty, harbored no romantic or humanitarian illusions regarding either them or their cultures.  “The only hope for the Indian,” he wrote, “is in the interest and compassion of a few men, who, like the handful of ‘Abolitionists’ of thirty years ago, have pluck and strength to fight, against any odds, the apparently ever losing battle.  These in turn must rely upon the great, brave, honest human heart of the American people.  To that I and they must appeal:  to the press; to the pulpit; to every voter in the land; to every lover of humanity.  Arouse to this grand work.  No slave now treads the soil of this noble land.  Force your representatives to release the Indian from an official bondage more remorseless, more hideous than slavery itself.  Deliver him from his pretended friends and lift him into fellowship with the citizens of our loved and glorious country” (#9377).  

* * * * * * * * * * * * * * * * * * * * * * * 

Inasmuch as the main focus of my graduate study at the University of Oklahoma was Western American History—writing my master’s thesis and doctoral dissertation on Cherokee history—I for many years taught a class entitled “The First Americans.”  One of the books I either required or recommended was Dee Brown’s Bury My Heart at Wounded Knee:  An Indian History of the American West, though I warned students to take it as more of a pro-Indian polemic than balanced history.  Despite its bias, it presented the post-Civil War Indian wars in a very readable way and alerted readers to the mistreatment of tribal peoples.  Were I still teaching today, however, I’d have a better source that covers the same terrain—and basically comes to the same conclusions—with more effort to understand both white and Indian perspectives, to see both good and evil in each group of people.  

It’s Peter Cozzens’ The Earth Is Weeping:  The Epic Story of the Indian Wars for the American West (New York:  Alfred A. Knopf, c. 2016).  Fortunately, says Cozzens, there are primary sources unavailable to Dee Brown, and he can “tell the story equally through the words of Indian and white participants and, through a deeper understanding of all parties to the conflict, better address the many myths, misconceptions, and falsehoods surrounding the Indian Wars” (#351).  He provides in-depth descriptions and interesting details regarding Indian warriors’ training and skills as well as those of the U.S. Army recruits who opposed them.  Still more:  he effectively shows how Indians themselves (through intra-tribal rivalries and conflicts as well as inter-tribal animosities) contributed to their defeat.  In many ways the book simply fills in the details contained in a succinct statement made by General George Crook, who fought many a battle with them:  “I do not wonder, and you will not either, that when Indians see their wives and children starving and their last source of supplies cut off, they go to war.  And then we are sent out there to kill them.  It is an outrage.  All tribes tell the same story.  They are surrounded on all sides, the game is destroyed or driven away, they are left to starve, and there remains but one thing for them to do—fight while they can.  Our treatment of the Indian is an outrage” (#318).  

After setting the stage with a discussion of United States developments and policies, as well as Indians’ tribal traits and migrations onto the Great Plains, Cozzens turns to Red Cloud’s War in 1866.  Determined to halt the movement of miners into Montana’s gold camps, Red Cloud (leading Oglala and Miniconjou Sioux warriors) prevailed, annihilating an army detachment in the Fetterman “massacre,” and subsequently signed the second treaty of Fort Laramie in 1868, closing the Bozeman Trail and securing for the Lakotas the “Great Sioux Reservation” (today’s South Dakota west of the Missouri River), to be maintained for their “absolute and undisturbed use and occupation.”  Red Cloud’s “victory” was a rare Indian triumph—and it hardly arrested the westward movement of American pioneers.  

The Lakota and Northern Cheyenne further enjoyed two brief victories in 1876—the battles at the Rosebud and the Little Big Horn in southeastern Montana.  At the Rosebud, General Crook was repulsed by warriors following Crazy Horse.  Days later, Lieutenant Colonel George Armstrong Custer led his Seventh Cavalry into the Little Bighorn region, where he encountered one of the largest encampments of Sioux and Cheyenne (7,000 Indians; 1,800 warriors) ever assembled.  He’d bragged that his Seventh Cavalry could “whip all the Indians in the Northwest,” but at the Little Big Horn he proved himself a poor prophet.  Following Crazy Horse, Sitting Bull, and Gall, the Indian warriors slew 258 troopers, losing only 31 of their own.  Following the battle, Sitting Bull said:  “‘I feel sorry that too many were killed on each side.  But when Indians must fight, they must’” (#5249).  Custer’s last stand, however, was the northern tribes’ last stand, for the army thereafter sent column after column (frequently in winter, burning their lodges and food supplies) after the hostiles and effectively broke their will within a few years.  With the surrender of Crazy Horse, the last renegade Lakotas came to terms with the United States and accepted their lot as reservation Indians.  After taking refuge in Canada for a few years, Sitting Bull too surrendered early in 1881.  “‘Nothing but nakedness and starvation has driven this man to submission,’ concluded a sympathetic army officer, ‘and not on his own account but for the sake of his children, of whom he is very fond’” (#6070).  

On the Southern Plains, at the same time, the Cheyennes and Arapahoes were defeated (in part by Lieutenant Colonel Custer’s massacre of Black Kettle’s peaceful village of Southern Cheyennes on the Washita River) and confined to a reservation in the western part of Indian Territory, to be joined soon thereafter by the Kiowa and Comanche (finally defeated in the Red River Wars in the 1870s).  Adding to the relentless might of the military, the Indians further faced the loss of the buffalo—the enormous herds that supplied their every need in 1865 were simply gone by 1875.  Buffalo hunters, killing the animals for their hides, nearly wiped out the species!  Hide hunters, Phil Sheridan said, did “more to settle the Indian Problem in two years than the army had done in thirty.  For the sake of lasting peace, let them kill and skin until the buffalo are exterminated” (#3100).  And without the buffalo, the Indians either starved or begged for rations from army forts.  

In the Far West, the Modocs were defeated in northern California.  The Nez Perces, led by Chief Joseph, were forced from their Oregon homeland and conducted an epochal struggle, coursing 1,700 miles through Idaho and Montana before surrendering near the Canadian border.  The Utes of the Rocky Mountains were defeated and relocated to reservations in Utah and southern Colorado.  In the Southwest, the Apaches under Cochise and Victorio waged some resourceful guerrilla wars, but with the defeat of Geronimo’s small band in 1886 that region was pacified.  At the end, some 5,000 troops were involved in corralling eighteen warriors led by Geronimo and Naiche!  Though there is a certain aura around Geronimo, those who knew him best generally disliked him.  One Apache leader said:  “‘I have known Geronimo all my life up to his death and have never known anything good about him.’”  The daughter of Naiche “agreed.  ‘Geronimo was not a great man at all.  I never heard any good of him’” (#7348).  Significantly, the troops who most effectively hunted down the Apache bands were other Apaches, equally skilled in tracking and surviving in harsh environs.  General George Crook, one of the officers engaged in Indian wars for three decades, said:  “‘In warfare with the Indians it has been my policy—and the only effective one—to use them against each other’” (#7566).  

The post-Civil War conflicts in the American West were consummated in a massacre at Wounded Knee, South Dakota, in 1890.  Hundreds of despairing Lakotas had been captivated by a new religious movement, the Ghost Dance.  A Paiute medicine man in Nevada, Wovoka, meshed native and Christian traditions and urged followers to dance incessantly to usher in a wonderful world devoid of white men and their oppression.  Though most Indians disdained the movement, fervent practitioners worried officials in the Indian Bureau, whose agents insisted the army suppress it.  In a convoluted chapter of the ferment, Sitting Bull was arrested and killed by Indian policemen.  Then a 65-year-old Miniconjou chief named Big Foot decided to lead his band to safety on the Oglalas’ Pine Ridge Reservation.  Confusion and misunderstanding led to a violent confrontation along Wounded Knee Creek, and at least 150 Sioux (mainly women, children, and old men) died.  

Thirty years of Indian wars had ended.  And Peter Cozzens provides the most readable, accurate account of them I’ve read.  

# # # 

291 The War on Humans

   When, during the last presidential debate, Hillary Clinton defended all forms of abortion (the deliberate taking of an unborn, innocent human being’s life at any time in a woman’s pregnancy), she graphically illustrated her party’s position in this nation’s decades-long cultural war.  Though the defenders of life have won some important battles, pro-abortion forces still occupy commanding positions on the battlefield.  That truth is powerfully illustrated in Ann McElhinney and Phelim McAleer’s investigative treatise—Gosnell:  The Untold Story of America’s Most Prolific Serial Killer (Washington:  Regnery Publishing, c. 2017).  Four things powerfully struck me while reading the book:  1) the sheer barbarity of the late-term abortions performed by Kermit Gosnell, M.D., in his Philadelphia Women’s Medical Society clinic, wherein an estimated “40 percent of the babies aborted . . . were over the gestational age limit for legal abortion in Pennsylvania” (#2318); 2) the utter indifference and dereliction of state officials required to inspect and regulate abortion clinics; 3) the lock-step commitment of the nation’s media to ignore, obscure, or at least minimize Gosnell’s crimes; and 4) the irony of some abortions (late-term) qualifying as murder whereas others (the million or so done yearly in Planned Parenthood facilities) have absolute legal protection.  As Kirsten Powers said, “‘whether Gosnell was killing the infants one second after they left the womb instead of partially inside or completely inside the womb—as in routine late-term abortion—is merely a matter of geography.  That one is murder and the other is a legal procedure is morally irreconcilable’” (#157).  

McElhinney and McAleer are Irish journalists who were drawn to the story by its intrinsic merit rather than because of any pro-life convictions.  Indeed, Ann McElhinney had “never trusted or liked pro-life activists” (#127).  Then, as she began covering Gosnell’s trial, she realized that “pro-abortion advocates tend to avoid any actual talk of how an abortion is done and what exactly it is that is being aborted.”  But now she knows!  And she also now knows that “what is aborted is a person, with little hands and a face that from the earliest times has expression.  The humanity in all the pictures is unmistakable, the pictures of the babies that were shown as evidence in the Gosnell trial—first, second, and third trimester babies, in all their innocence and perfection” (#140).  While researching and writing she “wept at my computer.  I have said the Our Father sitting at my desk.  I am no holy roller—I hadn’t prayed in years—but at times” she could do nothing else.  Even more profoundly, she sensed “the presence of evil,” the sheer lack of conscience, pervading the pro-abortion establishment.

The Gosnell case began with a drug investigation launched by a Philadelphia undercover narcotics investigator, Jim Wood, who was getting drug peddlers to reveal the sources of illegal prescriptions for drugs like OxyContin.  A tangled web of informants led Wood to Dr. Kermit Gosnell, who turned out to be “one of the biggest suppliers in the entire state of Pennsylvania,” operating out of his Women’s Medical Society clinic (#302).  Therein investigators discovered far more than a drug emporium!  They found “a filthy, flea-infested, excrement-covered” abortion clinic almost impossible to describe.  Urine and blood discolored the floors; trash, cat excrement, and hair littered the facility.  They found “semi-conscious women moaning in the waiting room.  The clinic’s two surgical procedure rooms were filthy and unsanitary,” featuring rusty equipment and non-sterile instruments (#452).  Unqualified, unlicensed staff members had administered sedatives and cared for the patients—one worker had an eighth-grade education and a phlebotomy certificate!  Another liked being paid in cash (and given free Xanax, OxyContin, Percocet, etc.) because it enabled her to continue drawing fraudulent disability benefits from the Veterans Administration.  “The basement was filled with bags of fetal remains that reached the ceiling” (#527).  In a cupboard there were jars filled with little baby feet—apparently something of a fetish for Gosnell.  Dead babies were found in various containers, stored in refrigerators and freezers.  “Investigators found the remains of forty-seven babies in all” (#632).  It was truly a house of horrors!

Evidence collected from the clinic and Gosnell’s house, as well as testimony from his staff and patients, was presented to a grand jury, which spent a year combing through it.  “The final report, published on January 14, 2011, is a complete page-turner, a chronicle of how America’s biggest serial killer got away with murder for more than thirty years.  In its gruesome 261 pages, the grand jury named and shamed—and in some cases recommended charging—the doctor, his wife, and most of his staff, along with officials in numerous state government agencies, all the way up to the governor” (#798).  Indeed, it became clear that multiple complaints had been filed against the clinic over 30 years, and “the incompetents in Harrisburg, Pennsylvania’s state capital, knew or should have known that, even by their own lax rules, Gosnell should not have been carrying out abortions—but they didn’t care” (#1201).  Their dereliction was facilitated by the 1995 election of a “pro-abortion Catholic Republican,” Tom Ridge, whose policies proved “catastrophic for the many women and the hundreds of live babies who were injured and killed in Gosnell’s clinic” (#1473).  To one Philadelphia-area reporter, Ridge was “‘Gosnell’s chief enabler’” (#1480).  

Given the laws in Pennsylvania, to be charged with murder it was necessary to prove that some of the babies had been born alive and subsequently killed by Gosnell.  The case before the grand jury also had to be made before an openly pro-abortion judge who “was keen to draw attention away from the abortion establishment closing ranks, protecting one of their own and protecting abortion, regardless of the harm done on the way” (#1685).  Even the “partial-birth” procedure, whereby the baby’s head remained in the birth canal while the torso and legs were outside the mother, could not be labeled “murder” since the law allowed it.  Gosnell’s staff, however, testified to seeing many babies born alive and then killed (by snipping their necks with scissors) by the doctor.  Importantly for the trial, they had also taken some pictures of the slain babies that would provide vital evidence for the prosecution.  Ultimately he would be “charged with seven counts of first-degree murder and two counts of infanticide, and conspiracy to commit murder.  But from the evidence, it’s fair to assume that he murdered hundreds—perhaps thousands—over the course of his career” (#2684).  

Refusing a plea deal that would have led to his incarceration but spared his wife, Gosnell stood trial confident that he would be found innocent of all charges.  “His desire to appear as the smartest guy in the room overpowered all reason and good sense” (#2952), and he even fantasized about serving as his own defense attorney!  His attorney, considered by many the best in Philadelphia, portrayed him “as a hardworking, selfless man—a pillar of the community with a virtually unblemished record who ran afoul of an overzealous prosecutor” (#3308).  Given a jury cleansed of pro-life persons and a pro-abortion judge who was a drinking companion of the defense attorney, Gosnell thought he could escape punishment by appealing to the pro-abortion ethos prevalent in progressive circles.  Nevertheless, as the evidence was presented and the expert testimony given, showing graphically what takes place in “late term” abortions as well as the killing of born-alive infants, the jury concluded Gosnell was in fact guilty as charged, and he was sentenced to life in prison.  

What the jury saw, however, went largely unreported by the nation’s media.  “If it hadn’t been for a committed group of bloggers, new media journalists, pro-life activists, and Twitter users, the Kermit Gosnell trial very likely would not have made national news” (#3983).  If a journalist mentioned the case, it was usually to stress how virtually all other abortions differed from those performed in the Philadelphia clinic.  But then Kirsten Powers wrote a piece for USA Today, harshly condemning the press for neglecting the trial.  She’d found that none of the major TV networks mentioned the case during its first three months.  Nor did President Obama, who had “worked against the Born-Alive Infants Protection Act” while in the Illinois Senate, make any comments or face any questions dealing with his position on Gosnell.  Only Fox News covered “the story from the beginning of the trial” (#4189).  The book’s reception further illustrates the media’s pro-abortion bias.  Within days of its release, it sold out on Amazon and Barnes and Noble, outselling all but three non-fiction titles.  But the New York Times refused to put it on its best-seller list, and no mainstream media reviewed it.  

Ann McElhinney and Phelim McAleer have written a fully documented, compelling treatise.  They obviously read everything relevant to the case, sat day after day witnessing the trial proceedings, and later interviewed Gosnell in prison.  Though the surfeit of details—minutely describing the dead babies found in the clinic, investigating the police and prosecutors responsible for bringing the case to trial—may put off readers wanting a short synopsis, Gosnell:  The Untold Story of America’s Most Prolific Serial Killer merits the attention of everyone committed to the sacred “right to life” guaranteed by the Constitution as well as proclaimed in the Scriptures.  

* * * * * * * * * * * * * * * * * * * * * * *  

In the name of Nature, human nature is being denied and degraded in many venues.  Under the guidance of secular humanism, anti-human forces have been unleashed and radical “trans-human” proposals entertained.  As the astute Mortimer Adler long ago predicted (in The Difference of Man and the Difference It Makes), once the clear distinction between human beings and the rest of creation is denied, no reason remains for granting man any special standing (i.e., “human exceptionalism”).  The Great Chain of Being has dissolved, leaving nothing but randomly scattered and essentially equal beings.  For several decades Wesley J. Smith has researched and written about precisely this development, and in The War on Humans (Seattle:  Discovery Institute Press, c. 2014), he challenges some of the growing anti-human (misanthropic) currents in contemporary culture—most notably within an environmentalism “that is becoming increasingly nihilistic, anti-modern, and anti-human” (#107).  This is clear when one confronts the philosophical aspects of the Deep Ecology Movement, which serves for many as a “neo-Earth religion” that considers human beings no more than technologically sophisticated, consumerist parasites destroying Mother Earth.  Consequently, reducing the human population and giving other species unrestricted opportunities to thrive and multiply becomes the goal.  

This “green misanthropy” denies any moral difference between flora and fauna and human beings, whose numbers need reducing in order to enable other species to flourish.  To Paul Watson, head of the Sea Shepherd Conservation Society, humans are the “AIDS of the Earth.”  Only radical surgery, reducing man’s presence and activity on earth, can save the planet.  Similarly, Eric R. Pianka, a biology professor at the University of Texas who was named the Distinguished Texas Scientist of 2006, suggested it would be good if an Ebola pandemic killed 90% of the human population.  To Pianka:  “Humans are no better than bacteria, in fact, we are just like them when it comes to using up resources . . . .  We are Homo the sap, not sapiens (stupid, not smart)” (#318).  Needless to say, such activists enthusiastically promote abortion, euthanasia, eugenics, and genetic engineering.  

Conveniently, today’s green misanthropists have found in the hysteria regarding global warming (or climate change) a useful tool with which to promote their agenda.  Smith claims no expertise in evaluating climate change claims, but he does clearly discern the anti-human tone permeating the discussion.  He also notes that atmospheric carbon levels have steadily increased while no significant increase in world temperatures has been detected in 20 years.  Fearsome predictions abound—as in former Vice President Al Gore’s feverish warnings—but only minor changes have actually occurred.  Polar bears still flourish, ice still forms in the Arctic, snow still falls, crops still grow, hurricanes and earthquakes continue as usual—life on earth continues much as before.  

Yet large numbers of schoolchildren fear they will not live into adulthood!  A U.S. senator introduced legislation to punish anyone daring to question the reality of climate change!  NASA’s James Hansen urged “the jailing of oil executives for committing ‘crimes against nature’ for being global warming ‘deniers’” (#811).  An editor at the Los Angeles Times says the paper will no longer print letters to the editor that doubt global warming!  Something has happened.  Vast numbers of folks have succumbed to green propaganda.  “Illustrating just how wacky global warming Malthusianism can become, the Mother Nature Network published an article lauding Genghis Khan—the killer of millions of people—for wonderfully cooling the planet during his years of conquest” (#690).  The author claimed that the Mongol invasions eliminated enough people (ca. 40 million) to keep 700 million tons of carbon from fouling the atmosphere!  So the planet cooled for a century and Khan’s genocide should be praised!  To Smith:  “Only when the new Earth religion reigns can a vicious barbarian like Khan be canonized a saint” (#697).  

In 1972, Christopher Stone, a University of Southern California law professor, published an article, “Should Trees Have Standing?—Toward Legal Rights for Natural Objects,” arguing that trees, as well as humans, should enjoy legal standing.  Subsequently, courts have increasingly granted environmentalists’ claims that legislation (most notably the Endangered Species Act) should be broadly construed so as to guarantee the preservation of all sorts of creatures and environments.  So now we face a new Earth religion that insists all of Nature has inalienable rights, including the right to exist—i.e., to be respected, to procreate, to have access to water.  Laws in nations such as Ecuador, Switzerland (with its “plant dignity” agenda), and New Zealand (declaring the Whanganui River to be a person) now protect such rights.  “In the 1970s,” Smith says, summarizing his presentation, “the values of Deep Ecology were anathema to most.  Ten years ago, granting ‘rights’ to nature would have been laughed off as a pipe dream.”  Yet, as we have witnessed in the rapid acceptance of such innovations as same-sex marriage, “in contemporary society very radical ideas often gain quick acceptance by a ruling elite growing ever more antithetical to human exceptionalism” (#1305).  “The triumph of anti-humanism within environmental advocacy threatens a green theocratic tyranny.  Like eugenics, the misanthropic agendas discussed in this book are all profoundly Utopian endeavors, meaning that the perceived all-important ends will come eventually to justify coercive means.  Indeed, the convergence of human loathing, concentrated Malthusianism, and renewed advocacy for radical wealth redistribution—all of which are now respected views within the environmental movement, and each of which is dangerous in its own right—threatens calamity.

“Don’t say you weren’t warned” (#1380).  

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * 

In A Rat Is a Pig Is a Dog Is a Boy:  The Human Cost of the Animal Rights Movement (New York:  Encounter Books, c. 2010), Wesley J. Smith carefully distinguishes between “animal welfare” (treating animals rightly) and “animal rights” (treating animals as man’s equal) and seeks to show the dangers posed by the latter.  He writes to alert readers to animal-rights radicals who plant bombs, destroy property, burn buildings, condemn all medical research involving animals, urge rigorous forms of vegetarianism, and even justify murder.  He also writes to inform us of the philosophical implications evident in statements such as Richard Dawkins’ declaration that we are not only like apes—“we are apes,” though differently evolved in minor ways!  

Foundational to the animal rights movement is Peter Singer’s 1975 book, Animal Liberation, wherein the first chapter is titled “All Animals Are Equal.”  Fifteen years later he could boast of having launched “a worldwide movement” that would continue to shape men’s minds, extending the rights of “personhood” to whales and dolphins, dogs and cats, cattle and sheep.  Millions now embrace his assumptions and promote his objectives—“rescuing” various animals, halting animal research, throwing paint on fur coats, etc.  “Meanwhile, tens of millions of human beings would be stripped of legal personhood, including newborn human infants, people with advanced Alzheimer’s disease, or other severe cognitive disabilities—since Singer claims they are not self-conscious or rational—along with animals that do not exhibit sufficient cognitive capacity to earn the highest value, such as fish and birds.”  Working out the implications of his position, Singer concluded:  “‘Since neither a newborn infant nor a fish is a person the wrongness of killing such beings is not as great as the wrongness of killing a person’” (p. 28).  Contending newborn infants are not yet persons, he notoriously justifies infanticide until the baby attains “personhood” as he defines it.  

Though Singer speaks more plainly, many other distinguished academics and activists share his opposition to “speciesism,” the notion that humans are intrinsically superior to other forms of creation.  Thus they suggest that “Animals Are People Too,” positing “a moral equality between humans and animals,” making it “immoral for humans to make any instrumental use of animals” (p. 35).  All creatures capable of feeling pain are declared full-fledged members of the moral community.  Thus the People for the Ethical Treatment of Animals (PETA) once orchestrated a campaign “called ‘Holocaust on your  Plate,’ which compared eating meat to the genocide perpetrated by the Nazis against Jews” (p. 36).  

Ever alert to the opportunity of pushing their agenda through the judicial system, animal-rights activists work relentlessly to establish the “personhood” of animals in the courts.  Steven M. Wise, a law professor who heads the Center for the Expansion of Fundamental Rights, contends “that all animals capable of exercising what he calls ‘practical autonomy’ are entitled to ‘personhood and basic liberty rights,’ based on mere ‘consciousness’ and ‘sentience’” (p. 62).  Cass Sunstein, one of the regulation “czars” appointed by President Obama, thinks animals should be granted legal standing, and Harvard Law School’s Professor Laurence Tribe (one of Obama’s instructors) “has spoken in support of enabling animals to bring lawsuits” (p. 67).  To this point, the main success enjoyed in the courts has been in cases restricting or halting medical research using monkeys or on behalf of “endangered species” such as the spotted owl in the Pacific Northwest.  But there is a powerful movement pushing our legal system to grant full equality to all creatures, great and small.

In addition to the courts, animal-rights advocates are working to proselytize children, primarily through the public schools.  Given their childish affection for bunnies and puppies, children easily respond to emotional appeals on behalf of mistreated animals.  PETA comics portray hunters and fishermen as evil people in publications such as “Your Daddy KILLS Animals!”  Young readers are then warned:  “‘Until your daddy learns that it’s not ‘fun’ to kill, keep your doggies and kitties away from him.  He’s so hooked on killing defenseless animals that they could be next!’” (p. 104).  PETA provides teachers with free curriculum materials and guest speakers espousing vegetarianism as well as condemning all forms of animal mistreatment.  High school students are promised legal assistance should they refuse to dissect frogs or dead animals in biology classes.  One organization, Farm Sanctuary, provides schools with materials promoting the “rescue” of animals imprisoned in “factory farms.” 

Smith makes his case by citing an impressive number of sources and presenting arresting illustrations, alerting us to the problems posed by the animal-rights movement.  He also rightly emphasizes “the importance of being human,” caring for animals without elevating them to a sacred status. 

290 Some Polish Perspectives–Legutko & Kolakowski

 Though Poland as a nation has frequently suffered occupation and exploitation, Polish artists (Chopin) and thinkers (Pope John Paul II) have blessed the world with their works.  Ryszard Legutko’s recent The Demon in Democracy:  Totalitarian Temptations in Free Societies (New York:  Encounter Books, c. 2016) adds another name to the list of writers dealing with the nature of the modern world.  When communists controlled his country, Legutko edited an underground philosophy journal espousing the principles of Solidarity—the movement that liberated the nation 30 years ago.  “Solidarity,” he says, “stood up in defense of human dignity (in its original and not the corrupted sense), access to culture, respect for the truth in science and for nobility in art, and a proper role given to Christian heritage and Christian religion.  It seemed that suddenly those great ideas at the root of Western civilization—which this civilization had slowly begun to forget—were again brought to life and ignited as a fire in the minds of the members of a trade union” (#819).  Sadly, following the collapse of the Iron Curtain these values quickly evaporated within the liberal-democracy that replaced communism in Poland and now dominates in much of Europe.  

As a practicing philosopher Legutko has, for many years, pondered developments within this “liberal democracy” and has concluded it contains some of the same flaws that made communism so pernicious.   Leftism, even under a “democratic” banner, is still collectivist and authoritarian.  He makes clear that the “liberal-democracy” he critiques is not the classic system espoused by Thomas Jefferson or Winston Churchill but the modern system evident in both the social democracies supporting the European Parliament and America’s Democrat Party.  After watching with amazement how easily former communists became champions of liberal-democracy, Legutko argues they both “proved to be all-unifying entities compelling their followers how to think, what to do, how to evaluate events, what to dream, and what language to use.  They both had their orthodoxies and their models of an ideal citizen” (#152).  The European Union (EU) increasingly dictates to rather than represents the people of the continent’s nations.  “Even a preliminary contact with the EU institutions allows one to feel a stifling atmosphere typical of a political monopoly, to see the destruction of language turning into a new form of Newspeak, to observe the creation of a surreality, mostly ideological, that obfuscates the real world, to witness an uncompromising hostility against all dissidents, and to perceive many other things only too familiar to anyone who remembers the world governed by the Communist Party” (#174).  

What the two systems share, most deeply, is a commitment to change the world through technology—to “modernize” everything, to bring into being both a new human being and a perfect world.  The past provides neither things worth preserving nor guidance for the future inasmuch as it followed superstitious, medieval, old-fashioned notions.  Neither cultural traditions nor churches nor traditional families nor written constitutions matter, for what’s imperative is the construction of a totally new, modern world.  Rather than accepting the givenness of things as created, both communists and liberal-democrats endeavor to transform them; rather than dealing with reality they propose to construct it.  “In both systems a cult of technology translates itself into acceptance of social engineering as a proper approach to reforming society, changing human behavior, and solving existing social problems” (#233).  “In one system [the U.S.S.R.] this meant reversing the current of Siberia’s rivers, in the other [the U.S.], a formation of alternative family models; invariably, however, it was the constant improvement of nature, which turns out to be barely a substrate to be molded into a desired form” (#240).  

Legutko devotes a chapter to the shared communistic and liberal-democratic perspective on history—what astute thinkers such as C.S. Lewis condemned as “historicism.”  This is the notion derived from Hegel that there is a predetermined, irresistible evolutionary force shaping human events.  To swim with its progressive current is to embrace and champion all things modern.  To be on the “right side of history” is to be altogether wise and righteous.  To oppose, to react against this course of events demonstrates stupidity and misanthropy.  For communists, forcefully establishing an egalitarian socialism is the goal; for liberal-democrats, the same end must be attained through peaceful, electoral means.  Both deeply believe they are the change-agents entrusted with perfecting both human nature and the world in general.  “A comparison between the liberal-democratic concept of history and that of communism shows a commonality of argument as well as of images of the historical process” (#412), generally drawn from Marxist sources:  1) the triumphant march of freedom, vanquishing tyrannies of various sorts (monarchies; churches); 2) the liberation of various victim (class; race; gender) groups; and 3) the ultimate, thoroughly scientific enlightenment of homo sapiens.  

To make his case persuasive, Legutko suggests we imagine the differences between an old man and a youngster.  By virtue of his experience, the old man fears change, knowing it often stems from immaturity and ignorance.  The old man knows much about what has happened, including the tragedies and misfortunes resulting from well-intended, imprudent decisions.  But the youngster thinks he and his companions rightly envision a better world and need only to act quickly to achieve it.  “The old man is balanced in his reactions and assessments, looking for the appropriate courses of action in the world which, according to him, was founded on human error, ignorance, poor recognition of reality, and premature ventures; the youngster has an excitable nature, moving from desperation to euphoria, eagerly identifying numerous enemies whose destruction he volubly advocates, and equally happy to engage in collaborative activities with others because—he believes—the world is full of rational people.  The old man says that, given the weaknesses of the human race, institutions and communities (families, schools, churches) should be protected because over the centuries they have proven themselves to be tools to tame humans’ evil inclinations; the young man will argue that such institutions and communities need to be radically exposed to light, aired out, and transformed because they are fossils of past injustices.  The old man is a loner who believes that only such an attitude as his can protect the integrity of the mind; the youngster eagerly joins the herd, enjoying the uproar, mobilization, and direct action” (#536).  

Obviously the modern mind is that of a youngster, full of technical information and lofty aspirations, optimistically envisioning “the promise of a great transformation” that has enraptured so many intellectuals since the Renaissance.  Such intellectuals envision themselves as leaders on the “cutting edge of history,” and they endlessly engage in “a favorite occupation of the youngster:  to criticize what is in the name of what will be, but what a large part of humanity, less perceptive and less intelligent than himself, fails to see” (#565).  A century ago, the U.S.S.R. served as a lodestar for “youngsters” such as John Reed, who sought therein the realization of their dreams.  More recently, the “youngsters” took to the barricades in Paris in 1968 or marched in America’s streets in support of Ho Chi Minh.  The ‘60s revolutionaries chanted “a medley of anarchist slogans, a Marxist rhetoric of class struggle and the overthrowing of capitalism, and a liberal language of rights, emancipation, and discrimination.  Capitalism and the state were the main targets, but universities, schools, family, law, and social mores were attacked with equal vehemence” (#1580).  One need only study carefully the rhetoric and policies of Bernie Sanders and Elizabeth Warren to note the empowerment of those ‘60s revolutionaries.  

One adolescent aspect of the modern mind is the importance it assigns to entertainment—a point persuasively made three decades ago by Neil Postman in Amusing Ourselves to Death.  In earlier, more religious times, entertainment was understood to be a non-consequential activity designed to provide a brief break from the serious work assigned us.  But, Legutko says:  “In today’s world entertainment is not just a pastime or a style, but a substance that permeates everything:  schools and universities, upbringing of children, intellectual life, art, morality and religion” (#753).  Modern entertainment resembles the divertissement so acutely diagnosed by Pascal at the beginning of the modern era:  it’s an activity “that separates us from the seriousness of existence and fills this existence with false content” (#753).  We don’t escape reality for a few hours—we immerse ourselves in an imaginary world.  “By escaping the questions of ultimate meaning of our own lives, or of human life in general, our minds slowly get used to that fictitious reality, which we take for the real one, and are lured by its attractions” (#760).  

Rivaling historicism in its importance for both communists and liberal-democrats is utopianism, generally flying the multicolored flag of social justice.  “Utopia is thus not a political fantasy but a bold project bolder than others because it aims at a solution to all the basic problems of collective life that humanity has faced since it began to organize itself politically.  Utopia is—I beg the reader’s pardon for such a vile-sounding phrase—the final solution” (#931).  Beginning in the Renaissance, various utopians proposed political solutions to man’s ancient ills and aspirations, insisting “man can achieve greatness and be equal to God, because he has unlimited creative potential” (#931).  The republic envisioned by America’s Founders was not utopian, but the egalitarian liberal-democracy promoted by 20th century progressives—from Richard Ely and Woodrow Wilson to John Rawls and Barack Obama—certainly is.  

Counterintuitively, the “classical liberalism” that began with Adam Smith and Thomas Jefferson celebrating individualism slowly became “a doctrine in which the primary agents were no longer individuals, but groups and the institutions of the democratic state.  Instead of individuals striving for the enrichment of social capital with new ideas and aspirations, there emerged people voicing demands called rights and acting within the scope of organized groups.”  Special-interest groups, working within a relentlessly expanding state, orchestrated legislative enactments and judicial decisions, “demanding legal acceptance of their position and acquired privileges.  In the final outcome the state in liberal democracy ceased to be an institution pursuing the common good, but became a hostage of groups that treated it solely as an instrument of change securing their interests” (#1205).  Ironically, today’s liberals (most notably homosexuals and feminists) are hardly liberal, inasmuch as they strive to regulate virtually every aspect of life, including “language, gestures, and thoughts” (#1284).  They’re just Leftists intent on imposing their agenda.  

The political system shaped by both communists’ and liberal-democrats’ historicist-utopianism becomes all-intrusive, ever intent on removing all vestiges of property or class distinctions.  Leftist ideologies of the ’60s now dominate the liberal-democratic academic and media complex.  And the Christian churches, sidelined by pernicious church-state separation decrees, have largely accommodated themselves to the deeply anti-Christian ways of modernity.  Consequently, many churches have tailored their teachings to fit “the requirements of the liberal-democratic state and, consequently, to revise their doctrines substantially, sometimes beyond recognition” (#2885).  Having successfully marched through our cultural institutions, triumphant liberals have “managed to silence and marginalize nearly all alternatives and all nonliberal views of political order” (#1536).  

Reading Legutko’s provocative and deeply informative analysis of these realms both clarifies and challenges our understanding of our world.  I share my good friend John Wright’s strong endorsement of this work.  It is, as John O’Sullivan says in his Introduction, a “culturally rich, philosophically sophisticated, and brilliantly argued book” that deserves our attention if we’re concerned about our civilization. 

* * * * * * * * * * * * * * * * * * * * * *

Fortunately for the general reader, first-rate philosophers often write accessible essays, addressing both current issues and perennial truths.  Thus Leszek Kolakowski, a Polish thinker rightly renowned for his magisterial, three-volume Main Currents of Marxism, published a score of short essays in Modernity on Endless Trial (Chicago:  The University of Chicago Press, c. 1990) that offer serious readers valuable insights into some main intellectual currents of the 20th century.  Whenever an erstwhile Marxist casts a favorable glance at Christianity it makes sense for believers to consider his reasons.  

One set of essays focuses “On Modernity, Barbarity, and Intellectuals.”  Strangely enough, a corps of intellectuals has orchestrated the barbarism that has emerged during the last three centuries—an era labeled “modernity.”  Since Kolakowski cannot see how “postmodern” differs from “modern,” he discerns the loss of religion (and loss of taboos) as the primary current in modern (and postmodern) times, leading to “the sad spectacle of a godless world.  It appears as if we suddenly woke up to perceive things which the humble, and not necessarily highly educated, priests have been seeing—and warning us about—for three centuries and which they have repeatedly denounced in their Sunday sermons.  They kept telling their flocks that a world that has forgotten God has forgotten the very distinction between good and evil and has made human life meaningless, sunk into nihilism” (pp. 7-8).  A series of influential, secularizing skeptics prepared the way for the destructiveness of “Nietzsche’s noisy philosophical hammer” crafted to re-order the world (p. 8).  The “intellectuals” responsible for this process were not the scholars—scientists or historians—who “attempt to remain true to the material found or discovered” (p. 36) apart from themselves.  A barbarizing “intellectual” is someone who wishes not “simply to transmit truth, but to create it.  He is not a guardian of the word, but a word manufacturer” (p. 36).  Invariably, such intellectuals are seductive, spinning wondrous tales of utopian vistas.  

To Nihilists such as Nietzsche, truth is illusory.  Consequently, various cultures’ “truths” are equally “true” even if they are obviously contradictory!  Such cultural relativism—declaring all cultures are equal, praising the Aztecs as well as the Benedictines—easily embraces an admiration for various forms of what was once judged barbarism.  The sophisticated, scholarly “tolerance” so mandatory in elite universities and journals ends by granting “to others their right to be barbarians” (p. 22).  What we are witnessing is the Enlightenment devouring itself!  In Kolakowski’s judgment:  “In its final form the Enlightenment turns against itself:  humanism becomes a moral nihilism, doubt leads to epistemological nihilism, and the affirmation of the person undergoes a metamorphosis that transforms it into a totalitarian idea.  The removal of the barriers erected by Christianity to protect itself against the Enlightenment, which was the fruit of its own development, brought the collapse of the barriers that protected the Enlightenment against its own degeneration, either into a deification of man and nature or into despair” (p. 30).  

Another set of essays deals with “the Dilemmas of the Christian Legacy,” for modernity’s secularizing process has significantly, if indirectly, shaped much of the Christian world “through a universalization of the sacred,” sanctifying worldly developments as “crystallizations of divine energy” (p. 68).  The “Christianity” rooted in process theology—as propounded by Teilhard de Chardin, for example—envisions universal salvation and unending evolutionary progress.  “In the hope of saving itself, it seems to be assuming the colors of its environment, but the result is that it loses its identity, which depends on just that distinction between the sacred and the profane, and on the conflict that can and often must exist between them” (p. 69).  Kolakowski detects and dislikes what he finds in these circles—“the love of the amorphous, the desire for homogeneity, the illusion that there are no limits to the perfectibility of which human society is capable, immanentist eschatologies, and the instrumental attitude toward life” (p. 69).  Losing their sense of the sacred, this-worldly philosophies and religions fail to provide any basis for culture.  Indeed:  “With the disappearance of the sacred, which imposed limits to the perfection that could be attained by the profane, arises one of the most dangerous illusions of our civilization—the illusion that there are no limits to the changes that human life can undergo, that society is ‘in principle’ an endlessly flexible thing and that to deny this flexibility and this perfectibility is to deny man’s total autonomy and thus to deny man himself” (p. 72).  A rejection of the sacred invites the denial of sin and evil.  

Though not overtly Christian, Kolakowski himself rejected the atheistic Marxism of his early years, found Christianity the best hope for the world, and became a cheerleader for, if not a devotee of, the Faith.  “There are reasons why we need Christianity,” he argues, “but not just any kind of Christianity.  We do not need a Christianity that makes political revolution, that rushes to cooperate with so-called sexual liberation, that approves our concupiscence or praises our violence.  There are enough forces in the world to do all these things without the aid of Christianity.  We need a Christianity that will help us move beyond the immediate pressures of life, that gives us insight into the basic limits of the human condition and the capacity to accept them, a Christianity that teaches us the simple truth that there is not only a tomorrow but a day after tomorrow as well, and that the difference between success and failure is rarely distinguishable” (p. 85).  

Given his critique of modernity, Kolakowski has little patience with the modernist (or liberal) Christianity that focuses on “social justice,” peace, and ephemeral earthly progress—the this-worldly political agenda so routinely proclaimed in some quarters.  “Christianity is about moral evil, malum culpae, and moral evil inheres only in individuals, because only the individual is responsible” (p. 93).  To even speak of “a ‘morally evil’ or ‘morally good’ social system makes no sense in the world of Christian belief” (p. 93).  The vacuous “demythologization” project of modernists such as Rudolf Bultmann elicits Kolakowski’s erudite disdain, for it was merely a fitful gasp of the irrational skepticism launched centuries ago by William of Occam and the nominalists, then subtly advanced by David Hume and the 18th century empiricists.  In truth, “there is no way for Christianity to ‘demythologize’ itself and save anything of its meaning.  It is either-or:  demythologized Christianity is not Christianity” (p. 105).  

Demythologized Christianity contradicts itself.  In this respect it’s simply another utopian political ideology.  Having early advocated the Marxist version of utopia, Kolakowski easily detects the many currents of such blissful imagining—memorably expressed in John Lennon’s popular song “Imagine.”  Consider the fantasies of folks who envision a world wherein fraternity is realized, where equality prevails in every realm.  They “keep promising us that they are going to educate the human race to fraternity, whereupon the unfortunate passions that tear societies asunder—greed, aggressiveness, lust for power—will vanish” (p. 139).  Inevitably they establish dictatorships designed to enforce the mirage of equality.  Allegedly admirable goals—caring for the impoverished and weak—require the abolition of private property and a state-controlled economy, the abolition of the free market.  However noble the intentions, “the abolition of the market means a gulag society” (p. 167).  

In the name of compassion, giving preferential treatment to various disadvantaged groups, societies easily “retreat into infantilism” (p. 173).  Citizens become dependent, childlike welfare recipients.  The State assumes more and more responsibility to care for everyone’s needs, and we “expect from the State ever more solutions not only to social questions but also to private problems and difficulties; it increasingly appears to us that if we are not perfectly happy, it is the State’s fault, as though it were the duty of the all-powerful State to make us happy” (p. 173).  The State, of course, cannot possibly do this.  Yet this blatantly utopian longing drove some of the most powerful mass movements of the 20th century, most of them Marxist to some degree.  Marx, of course, didn’t envision the gulags that would result from the implementation of his socialistic ideas!  But Lenin and Trotsky were, in fact, faithful to his precepts, installing a “dictatorship of the proletariat” that could not but violently pursue its agenda.  Reducing ethics to “fables” and doing whatever was necessary to advance his cause, Lenin simply implemented his Marxist principles. 

289 Notable Conversions–Andrew Klavan & Sally Read

 

When I review books I hope some readers find bits of valuable information and perhaps pick up a copy if it interests them.  But some books I not only read and relish but wish everyone could enjoy the enlightenment and beauty they afford.  Such is Andrew Klavan’s The Great Good Thing:  A Secular Jew Comes to Faith in Christ (Nashville:  Nelson Books, c. 2016), wherein a gifted writer speaks persuasively, reaffirming the perennial allure of the Incarnate Savior, our Lord Jesus Christ.  Klavan is well-known in the literary world, considered by Stephen King a “most original American novelist of crime and suspense.”  But rather than keeping us in suspense Klavan, in The Great Good Thing, tells us about his conversion, culminating with his Christian baptism at the age of 49.  “No one could have been more surprised than I was,” he says.  “I never thought I was the type.  I had been born and raised a Jew and lived most of my life as an agnostic.  I believed in the fullest freedom of thought into the widest reaches of fact and philosophy.  I believed in science and analysis and reasonable explanations.  I had no time for magical thinking of any kind.  I couldn’t bear solemn piety.  I despised even the ordinary varieties of willful blindness to the tragic shambles of life on earth.”  In short, for half-a-century he’d been a hard-boiled realist—“a worldling by nature” (p. xiii).  

Flourishing as a writer, Klavan “was one of the men of the coasts and cities, at home among the snarks and cynics of these postmodern times” (p. xvi).  Yet here he was, confessing “that Jesus Christ was Lord” and accepting “the uniquely salvific truth of his life and preaching, death and resurrection—this, it seemed to me even in the moment, was to renounce my natural place in the age, to turn against my upbringing and my kind.  It felt, so help me, as if I were flinging myself off the deck of a holiday cruise ship, falling away from its lighted ballrooms and casinos, from the parties and the music and sparkling wine of Fashionable Ideas, to go plunging down and down and did I mention down into a wave-tossed theological solitude” (p. xv).  In a sense it made no sense!  But in a deeper sense, it was a coming together of the central themes of his novels wherein his “heroes were always desperately on the run, desperately trying to get at a truth that baffled their assumptions and philosophies” (p. xvi).  They wanted to make sense of the world but couldn’t find the key.  

Slowly, through much reading and writing and personal experience, he discovered the key—the answer to Pilate’s question, “What is truth?”—could be found only in the message proclaimed by The Gospel According to St. John!  Jesus Is the Truth!  Klavan’s spiritual journey, rather like C.S. Lewis’s, took place over a number of years wherein he moved from agnosticism to belief in God.  He’d begun praying and found his life improved by the discipline.  He’d “become like a character in one of my own stories, desperately trying to unknit the fabric of fact and perception, to separate the warp of psychology from the weft of objective truth, before time ran out” (p. xix).  He fully understood the risks entailed—a successful Jewish writer, safely ensconced in an upscale Santa Barbara suburb, daring to declare himself a Christian.  What would that mean?  “‘Oh, God,’ I prayed fervently more than once, ‘whatever happens, don’t let me become a Christian novelist!’” (p. xx).  “Would I descend into that smiley-faced religious idiocy that mistakes the good health and prosperity of the moment for the supernatural favor of God?” (p. xx).  And in becoming a Christian he determined not to forsake his Jewish ancestry and culture.  Could it happen?  

Well, it did.  He found Christ—or, in that paradoxical mystery of redemption, Christ found him!  Consequently, he found himself “rejoicing.  I was convinced and fully convinced:  my mind was God’s, my soul was Christ’s, my faith was true.  How had that happened and why?  Given the spiritual distance I’d traveled, given the depths of my doubts, given the darkness of my most uncertain places, and given, most of all, the elation and wonder I felt at the journey’s end, it seems to be a story worth telling” (p. xxv).  

It’s a story worth telling—and for us it’s a story worth reading!  Klavan recounts  his early years in Great Neck, New York, “a wealthy town, a well-tailored suburban refuge from the swarming city,” where he was immersed in an upper-middle-class, secularized Jewish community, the son of a successful New York morning drive radio personality.   But as a child he was inwardly unhappy and spent much time daydreaming, constructing elaborate fantasies featuring himself as the invariably tough-guy hero.  Much of his school-time was devoted to fantasizing rather than studying.  He seemed to be a good student, “but it was all fraud.  I could read well and write well and talk glibly and even figure out math problems in my head.  So I could bluff my way through subjects I knew nothing about, and neither my teachers nor my parents, nor even my friends, were aware that I was hardly doing any schoolwork at all” (p. 28).   In fact he learned nothing—“no historical facts, no mathematical formulas, no passages from the books we were supposed to have read” (p. 28).  

Nor did he learn much about Judaism.  His family’s Jewishness was purely cultural, extending to only a few traditions.  “God was not a living presence in my home” (p. 45), and his required attendance (“suffocating torture”) at Hebrew school in the local synagogue left no impression on him.  “My father used to say, ‘You can’t flunk out of being Jewish.’  But man, I tried” (p. 48).  Forced to submit to his Bar Mitzvah, he ad-libbed his way through it and was startled by the “fortune in gifts” he received.   But inwardly he felt only “rage and shame” at participating in a ritual, mouthing words he disbelieved.  He knew he was a hypocrite and hated himself for it.  “With great pomp and sacred ceremony, they had made me declare what I did not believe was true—and then they had paid me for the lie with these trinkets!  I felt that I had sold my soul” (p. 55).  

Though his family’s Judaism hardly affected young Klavan, a brief exposure to a thoroughly Christian family did!  A woman, Mina, who worked for his family and became virtually a family member, gave him “a substantial portion of what mothering I had” (p. 61).  With neither husband nor children of her own, “she just took care of people, that’s all” (p. 62).  Though he didn’t really understand it, Mina “was a true Christian.  Religious, I mean, even devout” (p. 64).  She never mentioned “Jesus to me, but he was alive and real to her” (p. 64).  Allowed to go to her gaily decorated, music-filled house one Christmas, Klavan felt himself in a “wonderland” surrounded by cheerful, caring people.  Many years later, preparing for his baptism, he marveled that “Jesus had first entered my consciousness” at that first Christmas at Mina’s house (p. 68).  Thenceforth, even in his most agnostic, secular stages, he retained a deep fondness for Christmas, even celebrating the season with a sincerity lacking in some Christian circles!  But:  “It was Christmas we loved, the bright tradition, not Christ, never Christ” (p. 74).  

Nor did he love schooling of any kind!  Early on he’d determined to become a writer, and he thought only personal experience could teach him what he wanted to know.  So he entered a program designed to enable youngsters to finish high school early and launched out on his own, at the age of 17, to enter “the world of Experience behind the walls” (p. 119).  He worked at various jobs, traveled hither and yon, and certainly experienced many things.  In the midst of his wandering, for reasons he cannot recall, he applied for admission to the University of California at Berkeley and was accepted.  But he was late on the scene.  “The radical years were over.  The riots and mayhem I’d been hoping to see had passed like a storm” (p. 123).  Though nominally a student, he mainly drank and slept and tried to teach himself how to write.  

When he did go to class, he “went through all the usual razzle-dazzle shenanigans:  bluff and fakery.  I read none of the books.  I conned and wrote my way to passing grades” (p. 124).  But for some reason he always bought the books required for the courses and kept them.  Back then, when postmodernism was just beginning its onslaught, some university professors still assigned the “classics” and encouraged students to engage in the “Great Conversation, an interchange carried on across the centuries by the major thinkers and artists of the Western canon.  The idea was that by studying this conversation you could move closer to the Truth and so find a fuller wisdom about reality and what made for the Good life” (p. 134).  So Klavan’s growing library contained the works of the masters and one day, lying listlessly in bed, he picked up a William Faulkner novel.  Suddenly he discovered what literature was all about!  And he began to read, on his own, the classic literary works of Western Civilization.  “Without knowing it, I had joined the Great Conversation” (p. 138).  

Not only did he discover the classics at Berkeley—he found a wife, Ellen, the daughter of the chairman of the Berkeley English department.  Ellen’s parents embraced him, and he managed to graduate from the university as well as marry her and make a lasting marriage of invaluable worth.  Though he’d been “a fool in many ways,” by marrying her “somehow—and not for the last time in my life—I had managed to stumble into the great good thing” (p. 158).  The young couple then moved to New York and managed to survive, working at various jobs while he tried to become the writer of his dreams.  Amidst his manifest lack of success as a writer, he began struggling with mental and emotional issues that led him to enter a five-year stint of psychoanalysis, wherein a gifted therapist greatly helped him to get mentally healthy.  In the midst of his depression (both mental and financial), however, he decided to write a suspense novel—using a pseudonym, since such was not the genre of “serious” writers.  He and his brother quickly wrote—and sold—the book, which then won the Edgar Award for best paperback mystery.  Better yet, they also got a movie deal.  He not only made money but discovered “that telling such stories was my gift” (p. 170).  In time he would become a highly acclaimed and financially successful suspense novelist.  

Among the many events that opened his mind to God was the birth of his first child.  “Sex, birth, marriage, these bodies, this life, they were all just representations of the power that had created them, the power now surging through my wife in this flood of matter, the power that had made us one:  the power of love.  Love, I saw now, was an exterior spiritual force that swept through our bodies in the symbolic forms of eros, then bound us materially, skin and bone, in the symbolic moment of birth.”  Watching the baby emerge from the womb, Klavan experienced a truly mystical moment.  “I became not one flesh with my wife but one being beyond flesh with the love I felt for her.  My spirit washed into that love and became part of it, a splash in a rushing river.  In that river of love, I went raging down the plane of Ellen’s body until the love I was and the love that carried me melded with the love I felt for the new baby we had made together and I became part of the love as well,” and he saw he “was about to flow out into the infinite.  I saw that, beyond the painted scenery of mere existence, it was all love, love unbounded, mushrooming, vast, alive, and everlasting.  The love I felt, the love I was, was about to cascade into the very origin of itself, the origin of our three lives and of all creation” (p. 191).  

In time he realized:  “You cannot know the truth about the world until you know God loves you, because that is the truth about the world” (p. 236).  Tasting the reality of love, he sought Love!  He began slipping into churches and even attending services—and then met an engaging Episcopal rector.  He appreciated the music as well as the messages and began to “realize there was a spiritual side to life, a side I had been neglecting in my postmodern mind-set” (p. 195).  Intuitively, he knew morality itself requires a transcendent foundation.  “An ultimate Moral Good cannot just be an idea.  It must be, in effect, a personality with consciousness and free will” (p. 205).  “In the chain of reasoning that took me finally to Christ, accepting this one axiom—that some actions are morally better than others—is the only truly nonlogical leap of faith I ever made.  Hardly a leap really.  Barely even a step.  I know it’s so.  And those who declare they do not are, like Hamlet, only pretending” (p. 206).  

Coming to faith in Christ proved momentous:  “My personality was so transformed I hardly recognized myself” (p. 211).  Filled with joy, Klavan flourished as a writer and father.   His written works reveal his mind’s journey, refuting the postmodernism firmly entrenched in the nation’s intelligentsia and working through the anti-Semitism obvious in the Western Christian culture he’d come to love.   He saw the truth revealed in the words of one of his own characters in True Crime:  “‘You want to believe in God,’ the pastor says, ‘you’re gonna have to believe in a God of the sad world’” (p. 225).  Sin has shattered our world, and it’s full of evil—including the Holocaust.  But the Savior has saved us from sin!  “In this new mental freedom, I came to see that the dilemma I had been wrestling with—my love of a culture that had done so much evil and yet produced such lasting beauty—was only my personal portion of the greater human paradox.  We are never free of the things that happen.  Even evil weaves itself into the fabric of history, never to be undone.  Yet at the same time—at the very same time—each of us gets a new soul with which to start the world again.”  Jesus “offered a spiritual path out of the history created by Original Sin and into the newborn self remade in his image.  It is the impossible solution to the impossible problem of evil.  All reason says it can’t be so.  But it’s the truth that sets us free” (p. 229).  

This book is so good that (as is evident in the long quotations) it’s tempting just to duplicate the entire text!  So let me just share Eric Metaxas’ encomium:  “Andrew Klavan’s superb new book deserves to become a classic of its kind.  Klavan’s immense talents as a writer are on full view in what must certainly rank as his most important book to date.  Tolle lege [take up and read—the words Augustine heard prior to his conversion].”  

* * * * * * * * * * * * * * * * * * * * * * * * * * * * 

Inasmuch as all heaven rejoices when a single sinner repents, all converts are equally treasured.  But inasmuch as every person is unique, each conversion story adds depth and texture to the ways of God with man. Thus one of the most recent conversion accounts, Night’s Bright Darkness:  A Modern Conversion Story (San Francisco:  Ignatius Press, c. 2016), by Sally Read, a contemporary English poet, merits our attention.  

Reared in a militantly atheist home—her father a vociferous Marxist journalist—Sally Read represents much about today’s secularism.  “At ten I could tell you that religion was the opiate of the masses; it was dinned into me never to kneel before anyone or anything.  My father taught me that Christians, in particular, were tambourine-bashing intellectual weaklings.  As a young woman I could quote Christopher Hitchens and enough of the Bible to scoff at.  My father would happily scoff with me” (#83).  There is neither God nor soul.  Matter alone exists, she thought.  Yet as she began working as a psychiatric nurse, she met patients whose sufferings and dyings gave her pause.  And amidst the intemperate drinking and casual sex that punctuated her work-week routine, she occasionally felt strangely drawn to old churches in the neighborhood.  

Then her father’s death at the age of fifty-six distressed her.  “I felt as if a god had died.  The creator of my world and my protector had gone” (#231).  Feeling abandoned and inwardly empty, she felt as if she were in hell and wondered what, indeed, life is all about.  She lost weight and hair.  She “even considered, in a desperate and vague way, invoking God.”  Perhaps some kind of faith would make life “liveable.  But it seemed entirely unfeasible to believe in any God; I thought I could never lower myself to that degree of self-delusion” (#246).  Looking back, she now considers that desolate phase of her life a blessing, for God was mysteriously working therein to bring her to Himself.  “His absence was so painfully loud it seems, now, to prove his existence,” for He “reaches us wherever we are, even if we are so far from knowing him that we mistake him completely.  His infinity always contains our finitude” (#253).  

In the six years following her father’s death, Sally Read became a published poet, married an Italian and gave birth to a baby girl, Florenzia.  She had to “battle” for both a wedding and a child in a world which welcomed neither, but in her heart she just knew such things are right.  Then ultimate, metaphysical questions began to haunt her, and while pregnant she wondered, “What was it I was creating?” (#313).  Having successfully published two volumes of poetry, she envisioned writing a more journalistic work designed to help women nourish their emotional and reproductive health.  Personally, she “had suffered debilitating physical and psychological effects while taking oral contraceptives” and wanted to explore that issue as well as “abortion and its effects on women” (#364).  So she interviewed various women while researching the book and in Rome encountered some devout American Catholics whose husbands were studying at pontifical universities.  Though they lived by standards utterly unlike hers, she was strangely warmed by their zest for life and constant awareness of God in their daily activities.  

Consequently, she began visiting churches and got acquainted with a godly guide, Father Gregory, a Ukrainian studying for the priesthood, whose gentle counsel and literary references helped nudge her to belief in God.  He encouraged her to pray, though she had no idea how.  But she picked up T.S. Eliot’s Four Quartets and sensed almost immediately an inner peace, an acceptance of a Reality around her that unleashed a torrent of tears.  When she told Father Gregory of that moment, he sent her a poem by St. John of the Cross which further inspired her.  Entering a nearby church she felt:  “The strange calm that had come upon me that night the week before had settled into a new longing to know what to do.”  Looking up, she saw an icon of Christ’s face in a window and said:  “‘If you’re there, you have to help me’” (#650).  And He did!  “I felt almost physically lifted up.  My eyes stopped crying instantly, my face relaxed.  It was like being in the grip of panicked amnesia, when suddenly someone familiar walked into the room and gave myself back to me—a self restored to me more fully than before.  It was a presence entirely fixed on me as I was on it, and it both descended toward me and pulled me up.  I knew it was him.  This was the hinge of my life; this compassion and love and humility so great it buckled me as it came to meet me.  Later, I would read in Simone Weil’s writings what seemed a very similar experience—how, as she prayerfully recited George Herbert’s poem ‘Love,’ ‘Christ himself came down and took possession of me’” (#658).  

Thenceforth she freely prayed, reciting the Our Father, knowing she was safe in His arms and feeling “as if the Birth, the Crucifixion, the Resurrection were plunged into my being in one gorgeous blow—this is how it is to all of a sudden know the meaning of reality:  the heart kick-started to sense its intrinsic architecture of logic, love, and reason” (#658).  She intimately sensed the Presence of the Living Lord.  “There was a feeling of being known in every cell.  My aloneness was taken away from me; and though it has often since returned, I know that loneliness is the illusion and Christ beside me the reality.  This was my earliest prayer:  being attuned to Christ’s presence, which by grace I perceived in those early days as strongly as my daughter’s breathing or the sound of the blackbird singing at night in the garden.  Prayer became essential,” and she sensed “being touched—if so pale a word can describe the sensation of being broken and healed—touched that he had come to me when I had rejected him and spoken against him and published lies about him in my books” (#680). 

At that point, though Catholics had certainly guided her, she had no interest in Catholicism, with all its dogmas and rules.  But she began reading the Gospels and found the Jesus revealed therein quite unlike the “gentle Jesus, meek and mild” proclaimed in liberal churches.  She came to see that the Truth was something given to us, not something we fashion, something best established in the Church of the Apostles.  And the Truth she encountered led her, step by step, into the Catholic Church.  Night’s Bright Darkness reveals a poet determined to discern and beautifully describe Reality entering into its fullness.  

# # #