328 Post-WWII America

Few of us, having lived through the last half of the 20th century, would discount the massive cultural changes that have transpired during our lifetimes.  But understanding these phenomena, digging into the real causes of the transformation, proves rather daunting.  Given the nature of historiography, no one has the capacity fully to describe, much less fully to understand, the past.  Every thoughtful historical monograph, as Alfred North Whitehead said in his Adventures of Ideas, is “a sort of searchlight elucidating some of the facts, and retreating the remainder into an omitted background.”  A highly readable, descriptive narrative of important developments during one decade (from the mid-‘60s to the mid-‘70s) is provided by Amity Shlaes in Great Society:  A New History (New York:  Harper, Kindle ed., c. 2019).  The “great society” was a phrase appropriated by Lyndon Baines Johnson to represent his aspirations as president, and it became one of the most ambitious social engineering endeavors in American history.  

Shlaes begins with a telling vignette of Michael Harrington, the author of The Other America:  Poverty in the United States—a 1962 treatise widely discussed in the final year of the John F. Kennedy administration.  Semi-humorously, Martin Luther King quipped:  “‘You know, we didn’t know we were poor until we read your book’” (p. 73).  Harrington was a self-identified socialist who had been briefly involved in the formation of Students for a Democratic Society.  When Lyndon B. Johnson became president in 1963, he and many in his administration (most especially Sarge Shriver, JFK’s brother-in-law and LBJ’s poverty czar) were quite taken by Harrington’s ideas.  Given an office in the White House, Harrington noted:  “‘the abolition of poverty would require a basic change in how resources are allocated.’”  Shriver mentioned this to LBJ, who aspired to be another Franklin D. Roosevelt and “told him that if serious economic redistribution was necessary to realize the long-delayed completion of the New Deal, then redistribution might be worth it” (p. 3).  

Whether LBJ’s endeavors actually brought about a “great society”—great because it is good—is what Amity Shlaes seeks to show.  So she begins with JFK’s “New Frontier,” brought into being by the election of 1960.  The nation was then prospering, amply illustrating The Affluent Society described by Harvard economist John Kenneth Galbraith.  Businesses such as GE and GM were fiscally sound and most working men made good money.  The president himself was notably pro-business, sending “his progressive advisor” Galbraith off to India as an ambassador rather than embracing his socialist ideals.  But he also made clear overtures to labor unions, issuing an executive order enabling federal employees to unionize.  However, when he gave a speech indicating his admiration for Britain’s National Health Service the stock market plunged and he quickly retreated into the security of the status quo.  JFK was no FDR seeking to engineer societal change.  There were, to be sure, pockets of poverty, but by and large the Ozzie and Harriet world of the ‘50s gave considerable impetus to optimism for the coming years.  

Cynically discounting such optimism, however, a group of students met in 1962 near Port Huron, Michigan, in a camp developed by Walter Reuther, the president of the United Auto Workers, and named “Four Freedoms”—the items listed by FDR in his 1941 State of the Union address.  Styling themselves the “New Left” and led by the likes of Tom Hayden, they felt “it was like God was sending us a message.”  Many of the youngsters imagined they were taking part in something of a “participatory democracy,” but in fact their input was unimportant, for the real message had been carefully crafted months before by Hayden, Harrington, and operatives funded by the UAW.  Harrington and Hayden were “Catholic activists” and were also “drinking buddies” (p. 77).  One of Reuther’s union officials considered the students “our kind of youngsters,” and his brother Victor provided ample funding for the group’s endeavors by helping distribute the “Port Huron Statement,” the word “statement” substituted for “manifesto” in order to distance it from the Communist Manifesto!  Much of the “Statement” had been earlier incubated in Reuther’s UAW “propaganda mills” which constantly decried income inequality and the fact that “‘the wealthiest 1 percent of Americans own more than 80 percent of all personal shares of stock’” (p. 78).  Indeed, Walter Reuther was determined to distribute the wealth by nudging the nation toward a “social democracy.”  And for that he needed “an American president to lead his redistribution revolution” (p. 62). 

FDR, of course, had earlier moved “the country toward socialism while sustaining democracy” (p. 63).  So Walter Reuther needed another FDR.  But he knew JFK’s New Frontier would not update the New Deal.  When John Kennedy was killed, however, Lyndon Baines Johnson proved more amenable to the Reuther agenda.  Indeed, one of the first persons LBJ called was Walter Reuther.  “‘I’ll need your friendship more than I ever did in my life,’ Johnson said.  Reuther promised ‘every possible help I can offer’” (p. 87).  Within a few months Johnson declared “unconditional war on poverty” in his State of the Union address, and a new national tilt toward “social democracy” was underway.  This was evident in a 1964 speech at the University of Michigan, wherein LBJ set forth “a vision as fantastic as the vision of Port Huron, as transformative as that of Reuther” (p. 97).  Poverty must end, civil rights must be ensured, and a “Great Society” must be brought into being. 

Thenceforth came a cascade of legislation and federal programs, launched without concern for financial accountability, justified simply as what “ought” to be done by compassionate Americans!  The list is almost interminable—Medicare, Medicaid, civil rights injunctions, minimum wage edicts.  LBJ was on a roll, and his triumph in the election of 1964 apparently illustrated the people’s support for his programs; the “Great Society” was, in effect, an expansion of the New Deal.  But implementing the agenda proved far more difficult than passing legislation!  Take, for example, a rather simple prescription, the minimum wage.  Designed to reduce poverty, it in fact increased unemployment!  “Black and white youth unemployment had run about the same until the middle of the 1950s, 8 to 11 percent.  But when Congress raised the federal minimum wage by a third in 1956, unemployment rose far higher among black teenagers than among whites, to 25 percent” (p. 183).  The War on Poverty flooded communities with money that counterproductively encouraged irresponsibility, enabling men to avoid work.  When you could get $200 a month from welfare, why work hard to earn the same amount?     

Equally vain were the Great Society’s housing programs.  Facing depressed sections in the nation’s great cities, progressives pressed for federally-funded housing projects.  After all, Walter Reuther had declared:  “The choice before the people of every major urban center is simple and clear.  It is build or burn.”  Government housing for the needy had long been a progressive ideal, and their projects revealed a distinctive architectural aesthetic.  Consider what was erected in Washington, D.C. to house the newly-created Department of Housing and Urban Development.  It was, architecturally, a monument to “Brutalism,” a movement celebrating massive, concrete, featureless, geometric structures.  But to most Americans it signified a “brutalist” bureaucratic obsession.  “No matter what experts said, ‘brutalist’ had to mean what it sounded and looked like, possessing brute power” (p. 230).  To deal effectively with city slums, old neighborhoods were razed and replaced with soaring, sterile concrete structures—“projects” designed to improve living conditions for the impoverished.  Yet with a rapidity impossible to imagine these “projects” in St. Louis, Chicago, and elsewhere became cages of squalor and crime.  They would be, in a rather short time, simply demolished. 

But unlike the brutalist housing projects, Great Society programs persisted.  President Nixon tinkered a bit with some of them but dared not seek to reverse them.  Indeed, he pursued policies, such as wage and price controls in 1971, that were flagrantly socialistic!  Ronald Reagan, both as Governor of California and President of the United States, spoke frequently and passionately against some of them, but Democrats successfully obstructed most all of his proposals.  Half a century later, Shlaes says, with trillions of dollars expended, one can only look back at the Great Society and lament its many failures.

* * * * * * * * * * * * * * * * * * * * *

In The Age of Entitlement:  America Since the Sixties (New York:  Simon & Schuster, c. 2020; Kindle Edition), Christopher Caldwell provides a helpful lens with which to understand current developments in America.  He begins by noting how deeply the ‘60s shaped subsequent decades.  Indeed:  “For two generations, ‘the sixties’ has given order to every aspect of the national life of the United States—its partisan politics, its public etiquette, its official morality.  This is a book about the crises out of which the 1960s order arose, the means by which it was maintained, and the contradictions at its heart that, by the time of the presidential election of 2016, had led a working majority of Americans to view it not as a gift but as an oppression” (p. 3).  This was because many of the “reforms” pushed through in that decade “came with costs that proved staggeringly high—in money, freedom, rights, and social stability” (p. 6).

Caldwell’s disillusionment provides a stark contrast to the ‘60s utopian optimism.  Following the traumatic assassination of John F. Kennedy, the welfare state rapidly expanded—Medicare, Medicaid, Civil Rights and Voting Rights acts—and was expected to fulfill the aspirations of the “best and the brightest” who engineered it.  Most importantly, Caldwell argues:  “Civil rights ideology, especially when it hardened into a body of legislation, became, most unexpectedly, the model for an entire new system of constantly churning political reform” (p. 5).  Here the law of unintended consequences held true, for the “changes of the 1960s, with civil rights at their core, were not just a major new element in the Constitution.  They were a rival constitution, with which the original one was frequently incompatible,” and we are in the midst of a titanic struggle which will determine which will prevail:  “the de jure constitution of 1788, with all the traditional forms of jurisprudential legitimacy and centuries of American culture behind it; or the de facto constitution of 1964, which lacks this traditional kind of legitimacy but commands the near-unanimous endorsement of judicial elites and civic educators and the passionate allegiance of those who received it as a liberation.  The increasing necessity that citizens choose between these two orders, and the poisonous conflict into which it ultimately drove the country, is what this book describes” (p. 6).

In particular, the march toward desegregation, launched by the Supreme Court’s Brown v. Board of Education of Topeka ruling in 1954, inevitably eroded the constitutionally guaranteed freedom of association.  Equality, rather than freedom, became imperative!  Inevitably, the “sanctity of private property” was softened whenever racial discrimination called for correction.  Though some legislators, debating the civil rights laws, feared unanticipated consequences (e.g. mandated school busing, lowered school admission standards, hiring quotas, etc.), they were dismissed as devotees of an antiquated social system.  Nevertheless, many of their fears materialized, and lawmakers “who opposed the legislation proved wiser about its consequences than those who sponsored it” (p. 22).  Rather quickly civil rights leaders and federal bureaucrats moved from eliminating segregation to calling for widespread social and economic changes.  Then, only two months after LBJ signed the Voting Rights Act, the Watts neighborhood in Los Angeles exploded in a deadly race riot, revealing that more than “civil rights” was at stake.  

In fact, more than delivering justice to the black population was envisioned by the progressives now governing the nation.  “Not just excluded and exploited Southern blacks but all aggrieved minorities now sought to press their claims under this new model of progressive governance.  The civil rights model of executive orders, litigation, and court-ordered redress eventually became the basis for resolving every question pitting a newly emergent idea of fairness against old traditions:  the persistence of different roles for men and women, the moral standing of homosexuality, the welcome that is due to immigrants, the consideration befitting wheelchair-bound people.  Civil rights gradually turned into a license for government to do what the Constitution would not previously have permitted. It moved beyond the context of Jim Crow laws almost immediately, winning what its apostles saw as liberation after liberation” (p. 34).  So “women’s liberation” hitched its wagon to the civil rights movement, demanding “equality” for the sexes.  Consequently, while in 1960 married and unmarried women shared similar attitudes regarding most everything, today they differ in most all things!  Feminists vigorously promoted contraception, abortion, and full equality in the marketplace.  But they also unleashed “irresistible demands for further sexual freedoms.  Just as Americans were getting comfortable with the things feminism had meant to Betty Friedan and her followers (liberation from household drudgery and loneliness, a fair shake in the workplace, equal dignity elsewhere), feminism began showing signs of what it would blossom into half a century later (gender studies, queer theory, a questioning of all rules about sex)” (p. 56).  Such “freedoms” deeply changed the culture.  

Another culture-changer was the war in Vietnam, beginning with “an act of presidential deceit,” the Gulf of Tonkin Resolution.  But within four years the war had proved so unpopular that everyone running for president in 1968 promised to extricate the country from what seemed to be a quagmire.  Militarily the war might have been won, but politically it was lost—particularly among the younger elites.  Thus a Harvard anti-war student said:  “On the one hand we were angry about the war, about racism, about the countless vicious acts we saw around us.  But on the other hand, we viewed America as one great wasteland, a big, monstrous, mechanized, air-conditioned desert, a place without roots or feeling.  We saw the main problem, really, as:  THE PEOPLE—the ways they thought and acted towards each other.  We imagined a great American desert, populated by millions of similar, crass, beer-drinking grains of sand, living in a waste of identical suburban no-places. What did this imagined ‘great pig-sty of TV watchers’ correspond to in real life?  As ‘middle-class’ students we learned that this was the working class—the ‘racist, insensitive people.’  Things already going on at the time of the Vietnam War inclined privileged people to look on ‘average’ Americans as the country’s problem” (p. 78).  

The counterculture evident in this student’s lament asserted itself and spread its tentacles into every crack in America.  An alienated elite would ultimately dominate virtually all important institutions (schools, media, churches) and demand societal transformation funded by the taxpayer.  Endless funding of proliferating anti-poverty, anti-racist, anti-sexist bureaucracies continued, and not even Ronald Reagan could arrest it.  “Having promised for years that he would undo affirmative action ‘with the stroke of a pen,’ lop the payments that LBJ’s Great Society lavished on ‘welfare queens,’ and abolish Jimmy Carter’s Department of Education, he discovered, once he became president, that to do any of those things would have struck at the very foundations of desegregation. So he didn’t” (p. 110).  Reagan tacitly complied with the “second constitution” created by the civil rights movement, which led, by the end of the century, to increasingly strident racial politics.  

This was manifestly evident in the metastasizing power of  “affirmative action” and “political correctness”—important planks in the nation’s new constitution, largely shaped by judicial decrees.  It is now clear that by passing the 1964 civil rights laws Americans “had inadvertently voted themselves a second constitution without explicitly repealing the one they had” (p. 172).  In fact:  “Affirmative action was deduced judicially from the curtailments on freedom of association that the Civil Rights Act itself had put in place.  Political correctness rested on a right to collective dignity extended by sympathetic judges who saw that, without such a right, forcing the races together would more likely occasion humiliation than emancipation.  As long as Americans were frightened of speaking against civil rights legislation or, later, of being assailed as racists, sexists, homophobes, or xenophobes, their political representatives could resist nothing that presented itself in the name of ‘civil rights.’ This meant that conflict, when it eventually came, would be constitutional conflict, with all the gravity that the adjective ‘constitutional’ implies” (p. 172).

One of the ultimately disastrous consequences of this shift surfaced in 1992 when President George H.W. Bush, following race riots in Los Angeles, signed a Housing and Community Development Act.  “It inaugurated the process we have seen at many junctures in this book:  the sudden irruption of civil rights law and diversity promotion into an area from which it had been mostly absent, in this case mortgage finance” (p. 178).  This act opened the gates to “the financial crisis that, in the following century, would nearly destroy the world economy under the presidency of Bush’s even more reckless son” (p. 179).  Sandwiched between the two presidents Bush, Bill Clinton manipulated the mortgage finance system, denouncing “the dearth of private housing credit in poor, black, urban neighborhoods” fomented by racist white bankers, and demanding low mortgage rates for blacks buying homes.  In Caldwell’s judgment:  “Sometime between the passage of Lyndon Johnson’s civil rights laws and the long Bush-Clinton march through the country’s financial institutions, the victims’ perspective had won. Now any inequality was an injustice, and one did not need a clear account of what had caused it to demand redress from the system” (p. 180).

Another realm dramatically and unexpectedly changed was the institution of marriage.  Other than a few gay activists, no one imagined it possible that same-sex marriage would ever be legally imposed on the nation by a Supreme Court mandate (Obergefell v. Hodges) in 2015.  But homosexuals adroitly fused their “liberation” agenda with the “radical feminist cause of delegitimizing” traditional, heterosexual marriage “and the traditional idea of masculinity” it implied (p. 216).  Gay activists wanted “not just tolerance but a conferral of dignity. . . .  Civil rights was always this way:  dignity was an integral and non-negotiable part of what was demanded, and a government interested in civil rights must secure it, no matter what the cost in rights to those who would deny it” (p. 217).  “As Rosa Luxemburg had written of the Russian Revolution, ‘The real dialectic of revolution stands the parliamentary cliché on its head:  The road leads not through majorities to revolutionary tactics, but through revolutionary tactics to majorities’” (p. 225).

Justice Antonin Scalia saw this clearly, dissenting from Obergefell, declaring it to be undemocratic.  “‘A system of government that makes the People subordinate to a committee of nine unelected lawyers,’ Scalia wrote, ‘does not deserve to be called a democracy.’  He called the decision an upper-class ‘putsch,’ noting that every single member of the Supreme Court had gone to either Harvard Law School or Yale Law School, and concluded:  ‘The strikingly unrepresentative character of the body voting on today’s social upheaval would be irrelevant if they were functioning as judges, answering the legal question whether the American people had ever ratified a constitutional provision that was understood to proscribe the traditional definition of marriage.  But of course the Justices in today’s majority are not voting on that basis; they say they are not’” (p. 229).  Writing for the majority, Justice Kennedy had “explicitly repudiated certain conceptions of democracy that had until recently been sacrosanct.  ‘It is of no moment whether advocates of same-sex marriage now enjoy or lack momentum in the democratic process,’ he wrote.  Unless someone was expecting the Court to apologize for Brown v. Board of Education, this thwarting of majority rule in the name of civil rights was what the Supreme Court was for” (p. 229).  Kennedy, of course, was enforcing the “second constitution”—the living constitution of Al Gore, not the original constitution of Antonin Scalia.  

With amazing rapidity the practical ramifications of Obergefell became evident.  Bakers were brought to trial for refusing to bake cakes for gay weddings.  Transgender students insisted they should use restrooms of their choice or compete as athletes in accord with their self-definition.  “A terrible irony of civil rights, obvious from the very outset but never, ever spoken of, was making itself manifest . . . .  The civil rights approach to politics meant using lawsuits, shaming, and street power to overrule democratic politics.  It encouraged—no, it required—groups of similarly situated people to organize against the wider society to defend their interests.  Now it became clear that the members of any group that felt itself despised and degraded could defend its interests this way” (p. 232).  

327 Re-Writing American History

When we talk about “culture wars” we rarely think about historians as armed and significant partisans!  But they frequently are!  This is patently evident in the “1619 Project” recently launched by the New York Times, which promises to “reframe American history” by examining the nation’s past through the singular prism of slavery.  This, says Princeton historian Allen Guelzo, “is not history; it is conspiracy theory.  The 1619 Project is not history; it is ignorance.”  The Times editorial staff is, however, replicating the scenario portrayed in George Orwell’s 1984, wherein history was routinely rewritten and “every record has been destroyed or falsified, every book has been rewritten, every picture has been repainted, every statue and street and building has been renamed, every date has been altered.”  It’s bad enough that Americans know little history—three-fourths of the people in one study could not name the three branches of our constitutional system and one-fourth couldn’t name even one!  But, even worse, they’re being deliberately misinformed by schools and books and media committed to fundamentally transforming the nation by destroying its memory.    

For nearly 20 years I routinely taught survey courses in American History.  Then the university changed its core requirements, excluding these courses, so I rarely taught them thereafter and rather lost track of the texts being used in them.  But I did, now and then, hear of a textbook widely used in many high schools and universities—Howard Zinn’s A People’s History of the United States:  1492 to the Present (New York:  HarperCollins, c. 1980).  I secured and read a copy, dismissing it as an egregious propaganda polemic disguised as history.  On the very first page I noted Zinn’s dishonest deletions from a Christopher Columbus quotation.  Equally evident was his undisguised Marxism.  Turning to his chapter on Indian Removal—a topic I know quite well—I was astounded by his repeated errors:  placing the Chickasaws in North Carolina, the Creeks in Mississippi, calling eastern Oklahoma an “arid land, land too barren for white settlers,” and labeling Sequoyah a Cherokee chief!  My negative appraisal was widely shared by many distinguished historians, some of whom wrote scathing reviews denouncing the book for its selective quotations, factual errors, and overtly biased polemics.  Even a Marxist-oriented historian, Eugene Genovese, found it so flawed he refused to review it!  Another noted historian, Arthur Schlesinger, called Zinn “a polemicist, not a historian.”  And Harvard’s Oscar Handlin, reviewing the book in the American Scholar, called it a “fairy tale” with “biased selections” that “falsify events.”  He said the book “conveniently omits whatever does not fit its overriding thesis.”  

Nevertheless, ignoring the warnings of such eminent historians, many high school (especially) and college teachers have used the book as a basic text, and its theses rather quickly entered the minds of radicalized youngsters (some of whom—e.g. Bernie Sanders and Alexandria Ocasio-Cortez—now serve in Congress).  Rather revealingly, in 2006 Zinn praised the Vermont senator for giving us an “accurate picture” of the problems this nation faces, primarily the gap separating the rich and poor.  So it’s good to have a thorough analysis of A People’s History of the United States—Mary Grabar’s Debunking Howard Zinn: Exposing the Fake History That Turned a Generation against America (Washington:  Regnery History, c. 2019; Kindle Edition).  She writes because she believes Zinn has deeply damaged this nation by “convincing a generation of Americans that the nation Abraham Lincoln rightly called ‘the last best hope of Earth’ is essentially a racist criminal enterprise built on murdering Indians, exploiting slaves, and oppressing the working man.  It obviously needs to be replaced by something better.  And of course, Zinn has the answer:  a classless, egalitarian society.  Yes, what Zinn is selling is the very same communist utopian fantasy that killed more than a hundred million human beings in the twentieth century” (#81).  

Zinn’s influence in popular culture was evident in a film starring Matt Damon, “Good Will Hunting.”  In one scene Damon’s character says:  “If you want to read a real history book, read Howard Zinn’s A People’s History of the United States.”  Interestingly enough, while in elementary school Matt Damon was Zinn’s neighbor.  And Damon was reared by a single mother, an education professor deeply committed to “social justice.”  As a ten-year-old Damon “took the family copy of the newly published People’s History to school and read from it to his class for Columbus Day” (p. 122).  Damon’s endorsement has been duplicated by luminaries such as Alice Walker, Marian Wright Edelman, Mumia Abu-Jamal, Bill Moyers, and Jane Fonda; by mainstream newspapers (the New York Times, the Boston Globe, the Nation, and the Washington Post); by broadcast outlets (The Daily Show, NPR); and even by the American Historical Association and the Organization of American Historians!  The book is widely used as a text in many schools, and even if students don’t read Zinn’s book they frequently encounter him quoted in other materials written for their age group.  Even the prestigious College Board, designing questions for the Advanced Placement U.S. History exam, now “promotes Zinn’s version of history by including his books in AP teacher-training seminars” (#140). 

One of the few public officials daring to oppose Zinn’s version is former Indiana Governor Mitch Daniels, now president of Purdue University, who “questioned the use of Howard Zinn’s book to teach children in Indiana public schools.”  The governor proposed denying credit for any course using Zinn’s text and wondered “‘how do we get rid of it before any more young people are force-fed a totally false version of our history?’  He called A People’s History ‘a truly execrable, anti-factual piece of disinformation that misstates American history on every page.’”  In response:  “Ninety outraged Purdue professors signed onto an open letter” defending Zinn and claiming to use his text in their syllabi and scholarly writings (#4800).  Though Daniels was supported by the National Association of Scholars’ President Peter Wood and some prominent journalists, both the American Historical Association and the Organization of American Historians sided with Zinn.  Ultimately, Daniels had to acknowledge he was outgunned and backed away from his efforts. 

Zinn’s influence is dramatically evident in the nation’s treatment of Columbus Day.  He portrayed the “Admiral of the Ocean Sea” as a brutal, genocidal conqueror, unworthy of the respect he enjoyed for 500 years.  Consequently, the radical, violent group Antifa has called for a “Deface Columbus Day,” and street gangs have thrown red paint on his statues.  “In New York City, the large bronze statue in Columbus Circle at the corner of Central Park has had ‘hate will not be tolerated’ scrawled on the base and Columbus’s hands painted red.  And the transformation in Americans’ attitudes toward the man who discovered America wasn’t limited to a few vandals.  Besides the physical attacks, there were continual demands for the government to take down the statue.  Zinn is the acknowledged inspiration behind the current campaign to abolish Columbus Day and replace it with ‘Indigenous Peoples’ Day.’  High school teachers cite his book in making the case for the renaming to their local communities” (#527).  Some sixty major cities (including Columbus, Ohio) and six states have obsequiously followed the leader and replaced Columbus Day with Indigenous Peoples’ Day.  Grabar demonstrates how maliciously and mendaciously Zinn depicts Columbus—especially emphasizing the ellipses in his Columbus quotations, deviously designed to delete sections disproving his assertions.  You need no Ph.D. to know how a person’s position can be utterly misrepresented by simply linking together phrases taken out of context.  Yet this is Zinn’s modus operandi!  There’s also little evidence that Zinn actually read primary, eyewitness sources, including Columbus’s journals or the works of Bartolome de Las Casas (the great defender of the Indians who condemned much that the Spanish did in the New World but also said many positive things about Columbus).  Doing some careful sleuthing, Grabar contends that Zinn simply lifted his account of Columbus from a book written for high school students by Hans Koning, one of his friends (and a fellow anti-Vietnam War activist).  

Koning was a novelist who occasionally worked as a journalist, but he was not a historian.  He was, however, a doctrinaire socialist who had helped (along with Noam Chomsky and Zinn) found the War Resist organization to oppose America’s presence in Vietnam.  His book viciously smeared the explorer, and it “is the source for Zinn’s indictment of Columbus, which is the opening gambit of A People’s History.  The first five-and-a-half pages of A People’s History of the United States are little more than slightly altered passages from Columbus:  His Enterprise.”  Grabar points out the passages in Zinn that duplicate passages from Koning.  “Zinn lifts wholesale from Koning the very same quotations of Columbus.  He also includes an attack on the historian Samuel Eliot Morison, just like Koning—complete with references to the Vietnam War.  That’s a rather odd coincidence, given that both Zinn and Koning were purportedly recounting the fifteenth-century discovery of America” (#604).  Adding to his many sins, Zinn was clearly a plagiarist!

But his take on Columbus bore fruit.  Illuminating this, Grabar describes how differently recent presidents have portrayed him.  In his final Columbus Day proclamation, George H.W. Bush praised the “‘one man who dared to defy the pessimists and naysayers of his day [and] made an epic journey that changed the course of history.’”  A year later Bill Clinton praised not Columbus but “‘the mutual discovery of Europeans and Native Americans and the transformations, through toil and pain, that gave birth to brave new hopes for a better future.’”  Then Barack Obama, in 2009, lamented the fact that “‘European immigrants joined the thriving indigenous communities who suffered great hardships as a result of the changes to the land they inhabited’” (#1548).  Whether or not any of the presidents had read him, Zinn’s views percolated through the schools and popular media to significantly alter presidential pronouncements.  To understand Zinn and his biases, Grabar gives us some biographical details, emphasizing his deep immersion in the Communist Party following WWII.  While studying at Columbia University, he taught part-time at several nearby colleges and also “taught a class in Marxism at the Communist Party headquarters in Brooklyn.  That’s according to his FBI file.  Zinn’s Communist activities came to the attention of the FBI beginning in 1948 when an informant reported that Zinn had told him that he was a member of the Communist Party and attended meetings five nights a week” (#1150).  The FBI also noted he worked for the Henry Wallace presidential campaign in 1948 and was “a member of the CPUSA (Communist Party USA) from at least 1949 to mid-1953” (#1157).  Though he subsequently denied being a Party member, much evidence indicated he surely was.  

In 1956, still working on his PhD, Zinn landed his first full-time teaching position at Spelman College, a Christian school which served as a “finishing school” for black women in Atlanta.  But rather than work to advance the college’s mission, which was strongly Christian, Zinn determined to change it!  He scoffed at the school’s mandatory chapel requirement, labeling it a “pompous and empty ritual,” and Marian Wright Edelman, a Spelman student at the time, remembered her “shock” when Zinn declared he didn’t believe in Jesus Christ.  As the civil rights movement gained momentum he worked to involve his students in it.  Though fellow professors tended to see him as a “rabble rouser,” many of his students found him inspiring and energetically joined off-campus protests of various sorts.  Rather than giving tests in his courses he granted credit for off-campus protests, leading to brief jail stints for some of his students.  He easily found impressionable youngsters willing to follow someone who nurtured their adolescent rebelliousness and hostility to administrative authority.  Consequently, in 1963 he was fired.  

Moving north to Boston, he found a teaching position at Boston University (once a paragon of Methodist orthodoxy) and promptly promoted radical anti-war and civil rights protests.  Though he did virtually no serious scholarly work, he proved to be extremely popular with students.  One of his famous Spelman students, Alice Walker, recalled how her peers “swooned” over him, and at BU “his rhetoric inspired tears in draft resisters and in young women reading Black Boy for class.  Zinn’s classes routinely filled up and had students waiting on overflow,” and “one of his students was so inspired that he would go on to commit a portion of the fortune he earned later to establishing the Zinn Education Project” (#1413).  His students neither took tests nor wrote research papers, nor did any of them fail.  Instead they were credited for working in community organizations and interviewing members of various oppressed minorities.  Whatever course he taught, he used the lectern as a pulpit to promote his vision of social justice and enlist his students in pursuing it.  He had little contact (personal or written) with his peers in the academic world, preferring to regale young people in classes, rallies, and teach-ins.  

He especially delighted in denouncing America as a “racist” nation.  This is quite evident whenever he treats American Indians—always portraying them as “noble savages” brutalized by invading Europeans or westward-moving American pioneers.  And, if possible, the African slaves were even more mistreated—evidence that “‘there is not a country in world history in which racism has been more important, for so long a time, as the United States’” (#2002).  To dramatize his thesis Zinn dealt cavalierly with the facts, misrepresenting “the slaves’ truly horrific suffering for his own purposes.  For example, he claims that ‘perhaps one of every three blacks transported overseas died’” when, in fact, “according to the best quantitative evidence, 12 to 13 percent of slaves died in transit from Africa to the Americas during the history of the Middle Passage.  Sometimes a larger percentage of the slave ship’s crew died on the voyage.  In the Dutch slave trade, one in five crewmen died at sea.  But it suits Zinn’s purpose to exaggerate the true numbers and to ignore the historical context of a time and place when life was more perilous for all” (#2018).  He grants the reality of African slavery, but just as he romanticized the “noble savage” Indians he also waxed nostalgic about the “communal,” “gentle” African tribal cultures.  He grants that Africans enslaved Africans in Africa, but he insists it was “a kinder, gentler kind of slavery”—rather like that of feudal serfs in Medieval Europe!  It was in the New World, Zinn declares, with its capitalistic structures, that slavery became truly odious!  And the Civil War, he says, was fought “to perpetuate a racist capitalist state,” not to free the slaves!  The “great liberator,” Abraham Lincoln, was basically a “cowardly racist politician beholden to powerful money interests,” and little he did merits commendation (#2263).  To Zinn, it was the radical abolitionists such as John Brown, not statesmen such as Lincoln, who merit praise.    

Amazingly, neither the Yankees in the Civil War nor the GIs in World War II garner Zinn’s endorsement.  “Through a series of four long, leading questions about ‘imperialism, racism, totalitarianism, and militarism,’ Zinn insinuates that the ‘enemy of unspeakable evil,’ ‘Hitler’s Germany,’ was no worse than the United States and her allies.  Imperial Japan, too, was a victim of American aggression” (#2401).  In Zinn’s telling, Americans fought not to defeat the Axis powers but to escalate American imperialism.  As with Lincoln, Zinn disparages Franklin D. Roosevelt.  The interning of Japanese-Americans during the war is equated with Hitler’s concentration camps.  Indeed, he portrays most all American presidents negatively.  They’re all “irredeemable, greedy, capitalist war-mongers.  Zinn’s project is to destroy the credibility of the American presidency—and of America, itself” (#2513). 

Zinn’s treatment of the Cold War was as misleading and biased as his treatment of other wars.  He consistently defended Soviet policies and disregarded any evidence of front groups or communist infiltration in America.  To Zinn, the House Committee on Un-American Activities was purely paranoid, guilty of “‘interrogating Americans about their Communist connections, holding them in contempt if they refused to answer, distributing millions of pamphlets’ that claimed that Communists could be found ‘everywhere—in factories, offices, butcher shops, on street corners, in private business’” (#3145).  But in fact, Grabar reminds us, “they were there.  And, we would add, in classrooms” (#3149).  We now know, thanks to the Venona Papers, that the Communist Party USA was “a fifth column” seeking to destroy this country and that all-too-many government officials, such as Alger Hiss, were secretly working to advance it.  It’s abundantly clear that Ethel and Julius Rosenberg were guilty of espionage, eminently deserving their execution.  But Zinn stoutly defends them, insisting they were victims of a frame-up.  

His slant on the Vietnam War further reveals Zinn’s ideology, for he vehemently sided with the communists doing battle with an evil, capitalist, imperialistic America—repeating in print the speeches he made as an anti-war agitator in the midst of the war.  He portrays Ho Chi Minh as a “reformer” and touts his righteous role as the leader of North Vietnam, conveniently ignoring the fact that in “redistributing” the nation’s wealth he killed tens of thousands of landlords and funneled peasants into collective farms.  (It should be noted that Zinn was intimately involved in the leaking of the “Pentagon Papers” in 1971, personally hiding for a time the documents stolen by Daniel Ellsberg.  Doing so seriously harmed America’s war effort, for the leaked papers persuaded the public [by obscuring the actual progress made between 1968 and 1971] that the war could not be won.)  As usual, relying on clever ellipses, Zinn cited sources he misrepresented.  Thus the valuable work of Douglas Pike, accusing the Viet Cong of “genocide,” is twisted to suggest they were in fact heroic social reformers.  Pike warned that a Communist victory would doom “‘thousands of Vietnamese, many of them of course my friends, to death, prison, or permanent exile,’” warning that if America ever abandoned the South Vietnamese people she would “betray her own heritage.”  Zinn, however, cited Pike’s “book to justify that betrayal, distorting Pike’s analysis to make it appear to support the opposite case” (#4233).  

In her final chapter, Grabar says:  “No assessment of Howard Zinn’s People’s History of the United States would be complete without some consideration of his perverse take on the founding of our nation.”  To him the American Revolution was not really revolutionary!  It failed to achieve what the Bolsheviks did in Russia!  A real people’s revolution would have “smashed the capitalist system and toppled the ‘elite’ to whom he refers” (#4578).  Even worse, as Charles Beard had argued, the Constitutional Convention secured the elite’s control of the country.  Drafted by members of the propertied elite such as Alexander Hamilton and James Madison, the Constitution subjected the people to the rule of the upper class.  Though Beard’s An Economic Interpretation of the Constitution has been thoroughly discredited by historians interested in actuality rather than ideology, Zinn considers it authoritative.  In constructing his “people’s history,” he works “by lying, distorting and misusing evidence, hijacking other historians’ work, and falsifying the facts, as we have seen again and again.  The problem is not that, as Zinn liked to pretend in his own defense, he wrote a ‘people’s’ history, telling the bottom-up story of neglected and forgotten men and women.  The problem is that he falsified American history to promote Communist revolution” (#4731).

Both Zinn’s  autobiography, You Can’t Be Neutral on a Moving Train, and Original Zinn:  Conversations on History and Politics, co-written with David Barsamian, reveal his approach to writing history.  As a “radical,” he wanted to focus on the poor and oppressed.  So he sought to tell us the untold story, the story of the world’s poor, the world’s workers, the world’s homeless, the world’s oppressed, the people who don’t really qualify as real people in official histories.  In his mind, he shared the “radical vision” of Franklin D. Roosevelt, who in 1944 proposed an Economic Bill of Rights, including the right to a good job, a good income, a “decent” home, easy access to medical care, comfort in old age, and a good education.  So Zinn hungers for a world without national boundaries wherein everyone shares equally in the “riches of the planet” and works only a “few hours a day.”  In other words, Marx’s utopian vision can be secured by socialists such as Zinn, and to promote this agenda he wrote his one-sided “people’s history.” 

# # # 

326 Wonders of Light, Water, Fire

In the beginning—at a primordial stage of the creative process making heaven and earth—“the Spirit of God moved upon the face of the waters.”  Planet Earth is rightly called the “watery planet,” for nothing is more evident in all of life than the miraculous properties of water.  So, writing a short treatise as part of his “Privileged Species Series,” Australian biologist Michael Denton treats The Wonder of Water:  Water’s Profound Fitness for Life on Earth and Mankind (Seattle:  Discovery Institute, c. 2017; Kindle Edition).  He rightly employs the word wonder to indicate his thesis, for this is much more than a descriptive text.  He is (as Socrates noted about Theaetetus) by nature “a philosopher; for wonder is the feeling of a philosopher, and philosophy begins in wonder.”  Similarly, Plato’s more scientifically oriented student, Aristotle, said:  “In all things of nature there is something of the marvelous.”

And there are indeed many wonderful aspects to water!  Now and then we’re painfully aware of it.  Especially when desiccated by a drought—or battling flood waters—we know something of its enormous power.  More positively, we know that the slow erosion of rocks makes soil, and moist soil incubates organic life.  Yet there is “another unseen and very different magic” working “inside our bodies.  The same wonder substance that is eroding those rocks and providing life with essential nutrients and minerals is doing something else!”  It’s sustaining our circulatory system.  “With each beat of our heart, water carries to our tissues oxygen and many of those very same nutrients leached from the rocks in the fall.  Water also ferries away the waste products of respiration—carbon dioxide to the lungs, other waste products to the kidneys, and excess heat to the skin, where it is vented from the body.  Our vital dependence on those beautiful tumbling waters for the life-giving minerals she draws from the rocks and our equally vital dependence on the water coursing through our arteries carrying many of those same elements around the body brings us face to face with a revelation as extraordinary as any other in any domain of science.  The one substance, water, is uniquely fit to serve two utterly different vital ends—ends as different as can be conceived: the erosion of rock and the circulation of the blood.  Both are absolutely vital to our existence.  No other substance in nature comes close to having the essential set of properties needed to do these two jobs” (p. 12).

More deeply, the wonders of water indicate that our world—and we ourselves—are no cosmic accidents.  “Through its magic, water sings a universal song of life, and in its special fitness for human physiology it sings a special song of man.  The properties of water show that beings with our biology do indeed occupy a special central place in the order of nature, and that the blueprint for life was present in the properties of matter from the moment of creation” (p. 14).  Uniquely, water exists in three forms—solid, liquid, gas.  Rocks remain rocks and oxygen and helium remain gases.  But water, uniquely, takes various forms on the planet’s surface, and:  “Of all known substances, only water is fit for the hydrological cycle, the delivery system of water to land-based life” (p. 18).  Going into fascinating detail, Denton shows how this cycle works, saying:  “There is a beautiful and elegant teleology in all this.  The same process which draws from rocks the minerals and essential elements for life generates at the same time—in the clays and sands and silts that together form soil with organic debris—an ideal water- and mineral-retaining matrix that provides the means by which the mineral-enriched water can be used by plants” (p. 28).  It also lubricates the movement of the tectonic plates, shifting continents and casting aloft mountains. 

“The notion that the tectonic system is the result of design rises unbidden from the evidence.  How could such an elegant system of integrated elements of unique fitness, which has fashioned the world for life over billions of years, and which transcends in its reciprocal self-formative abilities any artifact created to date, have arisen out of blind collisions of atoms?  And how could the manifold fitness of water, which conveys every impression of having been fine-tuned to turn the wheels, be mere happenstance?” (p. 58).  Indeed:  “The design of such systems, in which the parts are reciprocally self-formative, transcends the design of any artifact or machine ever created” (p. 58).  Illustrating this is the temperature regulation provided by earth’s oceans, which serve as a thermostat, regulating temperatures and conserving water, providing us with a mechanism “without any parallel in human engineering” (p. 75).  Water’s thermal properties sustain and regulate the earth’s climate, “transporting and redistributing heat around the globe.  If either of these two properties did not have the values they do, the entire climate machine would grind to a halt, permanent ice might cover the region where New York currently stands, and all tropical regions would be hellishly hot.  So the thermal properties of water help produce the atmospheric currents . . .  that contribute to ocean currents, which also use water’s thermal properties to better redistribute heat” (p. 103).
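The “two properties” in question are presumably water’s specific heat capacity and its latent heat of vaporization, and a pair of standard physical constants (my addition, not Denton’s text) shows just how anomalous they are:

\[
c_{\text{water}} \approx 4.18\ \mathrm{J\,g^{-1}\,K^{-1}}, \qquad L_{\text{vap}} \approx 2260\ \mathrm{J\,g^{-1}}.
\]

Warming a gram of water by one degree takes roughly nine times the energy required for a gram of iron, and evaporating that gram takes some 540 times the energy of warming it one degree, which is why the oceans can absorb, ferry, and release prodigious quantities of heat without wild swings in temperature.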

There are also currents circulating within living organisms.  “Steven Vogel, in The Life of a Leaf, describes the way water manages to get to the top of tall trees as a phenomenon mirabile dictu (‘wonderful to relate’)” (p. 107).  Two of water’s unique properties are its high surface tension and tensile strength, which enable it to rise 100 feet or more to the tops of tall trees.  As water evaporates from the tree through its leaves, suction lifts fresh water from the tree’s roots.  “It is a basic law of hydraulics that pressure in one part of an enclosed hydraulic system is transmitted to all other parts.  As water molecules are lost from the leaves at the top of the tree, others must enter the roots to take their place” (p. 109).  Vogel “waxes lyrical in contemplating the way it’s done:  ‘The pumping system has no moving parts, costs the plant no metabolic energy, moves more water than all the circulatory systems of animals combined, does so against far higher resistance, and depends on a mechanism with no close analogy in human technology’” (p. 110). 
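The scale of the feat yields to elementary hydrostatics (a back-of-the-envelope figure of mine, not Vogel’s or Denton’s):  merely holding up a water column the height of a 100-foot tree requires a pressure difference of

\[
\Delta P = \rho g h \approx (1000\ \mathrm{kg/m^3})(9.8\ \mathrm{m/s^2})(30\ \mathrm{m}) \approx 2.9 \times 10^{5}\ \mathrm{Pa} \approx 3\ \text{atmospheres},
\]

and the actual tension in the xylem must be greater still to overcome friction along the way.  Only a liquid with water’s extraordinary cohesion and tensile strength can sustain such tension without the column snapping.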

After celebrating water’s role in human physiology and cellular life, Denton concludes his treatise by asking:  “Is there a tale like the tale of water?  Can one conceive of a substance as profoundly purposeful, serving such a diversity of vital ends?  Has any substance remotely like water been described even in the most outré annals of science fiction?  Who might have guessed or imagined in even the most unrestrained flight of fancy that in this simple substance, one of the simplest of nature’s creations, composed of only three atoms—two of hydrogen and one of oxygen—and only a ten-millionth of a millimeter across, there would be so much design?  There are more ends served in these three magic atoms than in any other natural form, and far, far more, and far more marvelous, than in any artifact created by or conceived of by man.  No words can express the wonder of such manifold purpose, so many vital ends, compressed in such a tiny piece of matter.  Water is the matrix of the cell, the blood of the Earth, the maker of mountains, the sustainer of life” (pp. 179-180).  Still more:  “In these extraordinary features, water’s design for life is transcendent!  Nothing in the artificial realm of our own limited designs comes close.  Reason recoils at the notion that such designs could be the result of blind, unseeing processes.  There is no domain in which astronomer Fred Hoyle’s celebrated confession is more appropriate:  ‘A common sense interpretation of the facts suggests… that there are no blind forces worth speaking about in nature’” (p. 183).

* * * * * * * * * * * * * * * * * *

In addition to the Spirit moving upon the face of the waters, God said:  “Let there be light, and there was light.”  So began the creative process, fueled from its inception by the wondrous power of light.  In Einstein’s world, the sole constant throughout the universe is the speed of light.  So it is fitting that when Michael Denton crafted another treatise in the “Privileged Species Series” (celebrating the wonders of the world) he would write about Children of Light: The Astonishing Properties of Sunlight that Make Us Possible (Seattle:  Discovery Institute, c. 2018; Kindle Edition).  As with water, sunshine is such a daily reality in our world that we rarely pause to ponder its splendor.  Though at one time or another we probably studied and appreciated the importance of photosynthesis, whereby plants miraculously transform light into biological beings, we’re doubtless less aware of the sun’s invisible electromagnetic radiation, which supplies an amazing variety of ingredients necessary for earth’s intricate workings.  

Still more:  earth’s intricacies seem perfectly designed for us, her residents, living in a “Goldilocks region” that is just right for us.  We get just enough illuminating light and just enough heat to make this a truly “privileged planet.”  Thus Denton says:  “In addition to being perfectly fit for photosynthesis and hence for our kind of oxygen-utilizing advanced life, sunlight is also just right for high-acuity vision, which depends on another set of extraordinary coincidences in the characteristics of visual light.  And sunlight is just right not only for any type of high-acuity visual device or eye, but just right in terms of its wavelength for beings of our size and upright android design.  What is so significant about the fitness of the Sun’s light for photosynthesis and for high-acuity vision is that these are elements of natural fitness exclusively for our type of life—for beings possessing the gift of sight, breathing oxygen (aerobic), and inhabiting the terrestrial surface of a planet like the Earth” (#147).

The rightness of light for our world is facilitated by a remarkable blend of atmospheric gases (oxygen, nitrogen, carbon dioxide, water vapor and ozone), for the “life-giving light of the Sun must penetrate the atmosphere right down to the ground to work its magic, and a proportion of the Sun’s IR radiation (heat radiation) must be absorbed by and held in the atmosphere to warm the Earth above the freezing point of water and animate the atoms of life for chemistry.  Amazingly, the atmosphere obliges us in this critical task.  But as we shall see, its capacity to let through the right light and absorb the right proportion of heat depends on an additional suite of hugely improbable coincidences in the combined absorption characteristics of the atmospheric gases” (#629).  Indeed, Denton says:  “If I can be excused for expressing the coincidence in animist terms, it is as if the atmosphere were intelligently colluding with the Sun to ensure that only the right light for photochemistry . . . reached the Earth’s surface and that only the ‘right’ proportion of the IR was absorbed to warm the Earth into the ambient temperature range” (#669).  The three most important gases—CO₂, H₂O, and O₂—effectively “ensure—by their collective absorption properties in the atmosphere—the availability of the vital light energy necessary to drive the reaction to completion.  It is as if these three gases were colluding intelligently together to promote their incorporation into the substance of living matter.  Altogether these coincidences convey an overwhelming impression of design.  The improbability that they are the outcome of the blind concourse of atoms is equivalent to the improbability of drawing the same card twice from a stack of 10²⁵ cards stretching from Earth beyond the Andromeda Galaxy.  How else can we describe these coincidences except as miracles of fortuity?” (#903). 
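Denton’s “right light for photochemistry” can be put in numbers with a standard physics estimate (mine, not his):  a photon’s energy varies inversely with its wavelength,

\[
E = \frac{hc}{\lambda} \approx \frac{1240\ \mathrm{eV \cdot nm}}{\lambda},
\qquad\text{so}\qquad
E(700\ \mathrm{nm}) \approx 1.8\ \mathrm{eV}, \quad E(400\ \mathrm{nm}) \approx 3.1\ \mathrm{eV}.
\]

Since the chemical bonds of living matter have energies of roughly one to five electron-volts, visible photons are energetic enough to drive reactions such as photosynthesis yet not so energetic (as ultraviolet rays and X-rays are) that they shatter the very molecules of life.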

Though we see only a tiny bit of the EM spectrum, we are blessed with visual capacities, and our high-acuity eyes are themselves wondrous to behold!  [Many years ago, while teaching at Point Loma Nazarene University, I attended a lecture by Francis Crick, the co-discoverer of DNA.  He spoke fascinatingly about the eye, acknowledging that scientists were still quite puzzled by its complexity, though he was confident they would figure it all out.  I was also struck by his dogmatic declamations regarding the origin of the universe while admitting his mystification at the eye!]  As did Crick, Denton describes the eye and explains its functions.  But unlike Crick he acknowledges the improbability of its emerging through purely materialistic developments.  That we have eyes to see—and that there is light illuminating our world for us to behold—is little short of miraculous, for it requires “the same tiny magic band that has just the right energy levels for photochemistry and detection by bio-matter.”  The probability that it all “just happened” means we have “had to select the same playing card from the stack that stretches that inconceivable distance beyond our nearest neighboring galaxy” (#1555).  Denton notes that most scientists, like Crick, reject his position, for the current naturalistic Zeitgeist hardly allows for any anthropocentric interpretations of our place in the cosmos.  But:  “No matter how unfashionable the notion may be in many intellectual circles, the evidence is unequivocal:  Ours is a cosmos in which the laws of nature appear to be specially fine-tuned for our type of life—for advanced, carbon-based ‘light eaters’ who possess the technologically enabling miracle of sight!  I do admit that the claim—that our existence depends on a profound fitness in nature for our specific form of being—is among the most outrageously ambitious claims in the history of thought.  Could the cosmic dance have really been arranged primarily for beings like us?” (#1745).

* * * * * * * * * * * * * * * * * * * *

Before publishing his works on water and light, Michael Denton wrote the first of his Privileged Species books, titling it Fire-Maker: How Humans Were Designed to Harness Fire and Transform Our Planet (Seattle:  Discovery Institute Press, c. 2016; Kindle Edition).  “Of all the discoveries made in the course of mankind’s long march to civilization,” he says, “there was one primal discovery that made the realization of all this possible.  It’s a discovery we use every day and take completely for granted.  But this discovery changed everything.  Humankind discovered how to make and tame fire.  Darwin rightly saw it as ‘Probably the greatest [discovery], excepting language, ever made by man’” (p. 10). 

Fires for heat and cooking were important to early man, and simple campfires sufficed.  But in order to smelt metals—copper, iron, etc.—something that burned hotter was needed.  And it was discovered:  charcoal!  By burning charcoal in vented kilns one can smelt copper (and alloy it into bronze) as well as iron, giving us the “Bronze Age” and “Iron Age” we study in history.  “Given the range of temperatures in the cosmos and the fantastic diversity of the properties of matter, it beggars belief that the smelting temperatures of metal ores are in reach of the temperatures that can be generated in wood or charcoal fires—a coincidence upon which the whole subsequent development of technology depended” (p. 16).  The elements of earth are just right for us—and we’re rightly designed to use them!  Only humans—conscious, rational, creative creatures with dexterous hands—“could ever have exploited the wonderful fitness of nature for fire and for metallurgy” (p. 17).  We alone are “capable of maintaining and controlling fire, of building kilns, of mining for ores, of felling trees and manufacturing charcoal, and so on” (p. 47). 

Fortunately for us, planet Earth is wondrously endowed with fire-friendly ingredients!  It’s just the right size with just the right mass possessing just the right gravity “to retain permanently the heavier gaseous elements such as nitrogen, oxygen, and carbon dioxide, but weak enough to permit the initial loss of the lighter volatile elements such as hydrogen and helium.  Only on planets of similar mass and size to the Earth’s could there exist an atmosphere containing sufficient quantities of oxygen to sustain fire” (p. 27).  Ours for sure is a Goldilocks planet!  And it contains precious ores!  As Alfred Russel Wallace said:  “‘The seven ancient metals are gold, silver, copper, iron, tin, lead, and mercury.  All of these are widely distributed in the rocks.  They are most of them found occasionally in a pure state, and are also obtained from their ores without much difficulty, which has led to their being utilised from very early times… Each of the seven metals (and a few others now in common use) has very special qualities which renders it useful for certain purposes, and these have so entered into our daily life that it is difficult to conceive how we should do without them’” (p. 32).  To smelt these metals required the right fuel—and we have it:  wood, first as firewood and charcoal, later supplemented by the fossil fuels coal and oil.  Such carbon-rich fuels have powered civilization. 
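
The quoted claim about gas retention can be illustrated with a standard rule of thumb from kinetic theory (my illustration, not a calculation from Denton’s book):  a planet holds onto a gas over geological time roughly when its escape velocity exceeds about six times the mean thermal speed of the gas molecules,

\[
v_{\mathrm{esc}} \gtrsim 6\,\bar{v}, \qquad \bar{v} = \sqrt{\frac{8kT}{\pi m}}.
\]

For Earth, \(v_{\mathrm{esc}} \approx 11.2\) km/s.  At an assumed exospheric temperature near 1000 K, hydrogen (\(m \approx 3.3 \times 10^{-27}\) kg) yields \(\bar{v} \approx 3.3\) km/s, so \(6\bar{v} \approx 20\) km/s and hydrogen escapes, while nitrogen (\(m \approx 4.7 \times 10^{-26}\) kg) yields \(\bar{v} \approx 0.9\) km/s, so \(6\bar{v} \approx 5.2\) km/s and nitrogen is retained, precisely the division between light and heavy gases the passage describes.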

In conclusion:  “Overall, the evidence suggests that the cosmos is uniquely fit for beings of our biology to thrive on a planet like the Earth and to master fire and develop complex advanced technologies.  Surely there could not be an equivalent ensemble of fitness in nature for some other type of life.  Lawrence Henderson made the same point in his classic Fitness of the Environment when he argued that the sorts of ensembles of fitness which make carbon-based life possible are so absurdly improbable that they are almost certainly unique, without any analogue in any other area of chemistry or physics” (p. 66).  “Whatever the ultimate causation may eventually prove to be, as it stands, the evidence of fitness is at least consistent with the notion that the fine-tuning for life as it exists on Earth is the result of design” (p. 67).  “Although the current Zeitgeist would have us believe that humanity is little more than a cosmic accident, one of a million different possible outcomes that happened to arrive and survive on an unexceptional planet, the evidence examined in this short book suggests otherwise—that whatever the causation of the fine tuning, we are no accident of deep time and chance.  On the contrary, as Freeman Dyson famously proclaimed, from the moment of creation ‘the universe in some sense must have known that we were coming’” (p. 69).

* * * * * * * * * * * * * * * * * * * * * *

In 1985 Michael Denton published Evolution:  A Theory in Crisis, seriously questioning the Darwinian theory of evolution through natural selection.  Reading that work helped embolden the late Phillip Johnson to write Darwin on Trial and helped launch what is frequently called the “Intelligent Design” movement.  Though Johnson and others were theists, Denton (an Australian academic) writes purely from a biologist’s standpoint, leaving theological issues in others’ hands.  Just recently he revised and upgraded his position in Evolution:  Still a Theory in Crisis (Seattle:  Discovery Institute Press, c. 2016; Kindle Edition).  “My major goal in this new book,” he says, “is to review the challenge to Darwinian orthodoxy and the support for typology provided by the novelty and extraordinary invariance of the homologs [i.e. primal patterns]” (p. 13).  There are, he argues, perduring “types” of biological structures embedded in and best explaining the natural world.  

Taking this position leads him to align himself with leading 19th century biologists, most especially Richard Owen, who believed in quite specific “laws of biological form” which set limits to the development of species, providing “a few basic designs or Types, just as the laws of chemical form or crystal form limit chemicals and crystals to finite sets of lawful forms.  This view implies that many of life’s basic forms arise in the same way as do other natural forms—ultimately from the self-organization of matter—and are genuine universals.  Structuralism—at least in the form it took in the nineteenth century, and in the version I am defending here—implies that the basic Types of life, and indeed the whole evolutionary progression of life on earth, are built into nature.  Thus, life is no artifact of ‘time and chance,’ as it came to be seen after Darwin, but a predictable and necessary part of the cosmic whole” (p. 15).

Scientists such as Owen were structuralists, while Darwin was a functionalist.  “Where functionalism suggests that function is prior and determines structure, structuralism suggests that structure is prior and constrains function” (p. 19).  When, in 1985, Denton advanced an essentially structuralist view he was very much alone in his advocacy.  But he now says times have changed!  The evidence for a structuralist approach has mounted almost to a cascade supporting Owen’s distinction between homolog (the melody) and adaptive mask (the tuning):  “We think of natural selection as tuning the piano, not as composing the melodies.  That’s our story, and we think it’s the story that modern biology tells when it’s properly construed” (p. 29).  Denton believes:  “along with Owen and many other nineteenth-century biologists, that life is an integral and lawful part of nature and that the basic forms of life are in some sense built into nature.  I see this notion massively reinforced by the evidence of twentieth-century cosmology that the laws of nature are uniquely fine-tuned for life.  Inevitably, therefore, this book is a defense of the typological world-view similar to that subscribed to by many nineteenth-century biologists:  that the taxa-defining homologs represent a special set of natural forms which constitute the immutable building blocks of the biological world.  If the Types (or, more specifically, the homologs which define them) are indeed natural forms, their origin can never be explained by cumulative selection.  Thus, Darwinism is bound to fail as a comprehensive explanation of life” (p. 29).

# # # 

325 Impeachment and the Plot to Remove a President

As the Trump impeachment process gained steam during the past year I perused two scholarly works devoted to explaining precisely what the Constitution provides should Congress decide to impeach and then remove a President from office.  Years ago I’d read and reviewed Raoul Berger’s Government by Judiciary:  The Transformation of the Fourteenth Amendment and commended the depth and perspicuity of its analysis.  Knowing he’d written Impeachment:  The Constitutional Problems (Cambridge:  Harvard University Press, c. 1973), I acquired a copy and found it to be, in the words of Arthur M. Schlesinger Jr., “an admirable and powerful work . . . reliable and illuminating.”  After devoting many pages to English legal history, demonstrating why the British Parliament had developed the process of impeachment, Berger (a Harvard Law School professor) showed how America’s Founders incorporated the process into the Constitution.  Committed to an originalist position, Berger sought to understand the Constitution primarily by considering the explanations of those who actually wrote it.  He especially sought to show what “treason, bribery, and other high crimes and misdemeanors” meant in 1787.  These terms did not necessarily mean criminal behavior, for as Justice Joseph Story (an Associate Justice alongside Chief Justice John Marshall) said decades later, impeachment is:  “‘a proceeding purely of a political nature.  It is not so much designed to punish an offender as to secure the state against gross official misdemeanors.  It touches neither his person nor his property, but simply divests him of his political capacity’” (p. 84).  However, the Founders were determined to preserve the balance of powers they deemed essential for the republic and were especially concerned that the legislative branch might become dictatorial.  “Nothing is clearer than the intention of the Founders to repudiate and reject ‘legislative omnipotence’” (p. 273).  

Though impeachment is surely a political recourse, the Founders said it should be rarely used and then only for demonstrably egregious offenses.  Thus, though the House of Representatives could draft articles of impeachment, it was left to the Senate to decide whether or not to remove a President.  As Alexander Hamilton explained, this was “because what other body would be likely . . . to preserve, unawed and uninfluenced, the necessary impartiality between an individual accused, and the representatives of the people, his accuser.”  Berger adds:  “The Senate was made judge, not in order to lessen the guarantees, but to insure that the accused would not be crushed by the oppressive weight of the House of Representatives.  The President, no less than the lowliest citizen, is entitled to the protection of due process, and the essence of due process is fair play” (p. 277).  In accord with the English tradition, rooted in the Magna Carta, due process and fair play were due everyone.

When Berger wrote his book, only once in American history had a president been impeached—Andrew Johnson, who had defied a law he considered unconstitutional.  Devoting considerable attention to that event, Berger declared the effort by Radical Republicans to remove a President with whom they disagreed patently spurious.  It serves as a historical reminder of “a gross abuse of the impeachment process, an attempt to punish the President for differing with and obstructing the policy of Congress.  . . . . It undermined the separation of powers and constituted a long stride toward the very ‘legislative tyranny’ feared and fenced in by the Founders” (pp. 308-309).  

* * * * * * * * * * * * * * * * * * * * * * *

Cass Sunstein is a Harvard Law School professor who recently published Impeachment:  A Citizen’s Guide  (New York:  Penguin Publishing Group, c. 2019, Kindle Edition).  Though issued amidst the clamor for President Trump’s impeachment, the book is “designed to answer more enduring questions, including:  Why does the U.S. Constitution include an impeachment mechanism?  What’s a ‘high crime or misdemeanor’?  How does impeachment work?  Is impeachment a question of law or politics?” (p. xiv).   As a young man Sunstein had studied the impeachment endeavors targeting President Richard Nixon.  Two  decades later, he was brought to Washington as a distinguished professor to testify before Congress during the Bill Clinton hearings, trying to explain the “high crimes and misdemeanors” phrase in the Constitution.  He also worked within the Clinton White House, preparing to defend the president when he was brought to trial before the Senate.  Sunstein now seeks to take an impartial stance, intent on celebrating “the majesty, and the mystery, of impeachment under the U.S. Constitution” (p. 15).

Rightly interpreting the Constitution, of course, means different things to different folks!  Justices such as the late Thurgood Marshall looked at it as a “living document,” endlessly malleable in the hands of judges who install such things as abortion and same-sex marriage as constitutional rights.  Others, such as Antonin Scalia, took an “originalist” view, trying to come to conclusions based upon what the Founders intended and taking history as the surest guide.  On most issues, Sunstein supports the “living document” approach, but when it comes to impeachment he actually thinks an “originalist” stance is best.  In part this is because only two American presidents had been impeached, leaving few judicial precedents to supply the ongoing tradition a “living document” approach requires.  Still more:  the two actual impeachments (presidents Andrew Johnson and Bill Clinton) provide little guidance, for they both “were unconstitutional, even farcical—case studies in what the United States should avoid” (p. 86).  

Sunstein concludes his treatise noting that Trump adversaries were calling for his impeachment as soon as he was elected.  They were determined to do so and went looking for reasons, ranging from tweets to insulting athletes to denying climate change.  Whatever!  “They did so not because they could point to impeachable offenses, but because they disliked him and they strongly opposed his policies.”  Though Sunstein differs with the President on many issues, he finds him guilty of no impeachable offenses.  So:  “One of the original motivations for this book—not the driving force, but still—was to counteract what seemed to me to be reckless and irresponsible arguments for the impeachment of President Trump before his presidency even got started.  My much larger goals were to correct some recurring misunderstandings of the impeachment clause, which have played a significant role in debates over impeachment at least since the 1990s” (p. 177).  

And my guess is he would find the latest House of Representatives impeachment action sadly misguided—an offense to the majesty of the Constitution he reveres.

* * * * * * * * * * * * * * * * * * * * * * * * * * *

During the past two years I’ve read a number of books devoted to the Democrats’ efforts to impeach and remove President Trump from office.  These include:  Compromised:  How Money and Politics Drive FBI Corruption, by Seamus Bruner; The Russia Hoax:  The Illicit Scheme to Clear Hillary Clinton and Frame Donald Trump, and its sequel, Witch Hunt:  The Story of the Greatest Mass Delusion in American History, by Gregg Jarrett; The United States of Trump:  How the President Really Sees America, by Bill O’Reilly; Resistance (At All Costs):  How Trump Haters are Breaking America, by Kimberley Strassel; Ball of Collusion:  The Plot to Rig an Election and Destroy a President, by Andrew McCarthy; Unmasked:  Big Media’s War Against Trump, by Brent Bozell; Unfreedom of the Press, by Mark Levin; The Red Thread:  A Search for Ideological Drivers Inside the Anti-Trump Conspiracy, by Diana West; and Spygate:  The Attempted Sabotage of Donald Trump, by Dan Bongino.  The authors include lawyers, journalists, and former federal prosecutors.  They all tell essentially the same story, presenting evidence recently and almost totally confirmed in detail by the DOJ Inspector General’s report and by the FISA court’s rebuke of the FBI’s duplicitous behavior.  They critique the effort to remove President Trump from office, though they do so with considerable variety.  Some authors—Andrew McCarthy, Gregg Jarrett, and Mark Levin—provide meticulous documentation; others, such as Kimberley Strassel, are more journalistic in their approach.  Some are Trump supporters; others mainly find the efforts to destroy him unfair and reprehensible.  Taken together, however, they reveal an alarming event:  a malicious effort to nullify a presidential election.  

Rather than try to sum up these various treatises, I’ll examine in some detail one of the most recent and readable books:  Lee Smith’s The Plot Against the President:  The True Story of How Congressman Devin Nunes Uncovered the Biggest Political Scandal in U.S. History (New York:  Center Street, c. 2019).  Nunes, as chairman of the House Intelligence Committee, knows as much as any American could regarding the machinations of the intelligence community, for he was charged with their oversight.  Smith, an experienced journalist who has worked for The Village Voice and The Weekly Standard,  seeks “to present the known, as well as previously unreported, details in the anti-Trump operation.  The basic outline of the story, however, is shockingly simple.  Hillary Clinton’s campaign used political operatives and dirty cops to frame her opponent.  When she lost, Obama officials employed the resources of the federal government to try to topple President Trump.”  This endeavor was widely supported by the media, who “weren’t simply partisan or lazy or complicit” but were “an integral component” of the endeavor.  “All in all, it is a tragic story about criminality, corruption, and a conspiracy of lies at the highest levels of important US institutions that were designed to keep the public safe, such as the FBI, and free, such as the press.  But there is another story running parallel to that account, and that is a story about a small handful of Americans, public servants, who stood up, assumed responsibility, and did the right thing at a crucial time” (pp. 13-14). 

The handful of heroes who did the right thing were Congressman Devin Nunes and the investigative staff he assembled, which “uncovered the biggest political scandal in American history” (p. 14).  (Contributing significantly to their work was Iowa’s Senator Chuck Grassley, Chairman of the Senate Judiciary Committee.)  Leading the investigation was Kash Patel, who had worked as a terrorism prosecutor at the Department of Justice (DOJ).  He’d been outraged by FBI Director James Comey’s exoneration of Hillary Clinton.  “‘He hijacked the Clinton investigation,’ he says.  ‘That was not his call to make.  You don’t go on TV and say, “I, the FBI director, am deciding what is a prosecutor’s decision.”  And by the way, all my colleagues in the national security division, all truly apolitical, every one of us would have taken the Clinton case to a grand jury’” (p. 179).  Patel had also become acquainted with some of the folks he would later investigate—Glenn Simpson, a journalist who had founded Fusion GPS, a firm known for doing “opposition research” (it worked for the 2012 Obama campaign looking for dirt on Mitt Romney) and was hired by the Clinton campaign to find damaging information on Trump; Christopher Steele, a former British intelligence officer with some past Russian connections who often worked for the FBI; Bruce Ohr, a senior Justice Department official, and his wife Nellie, who did research for Simpson.  These folks fed information to the FBI team (including deputy director Andrew McCabe; Peter Strzok, deputy assistant director for counterintelligence; and Lisa Page, McCabe’s special counsel) which orchestrated the “Crossfire Hurricane” investigation of the Trump campaign.  

Early in 2016 they took aim at influential Trump advisors, including retired Lieutenant General Michael Flynn.  He had gained renown for his significant work in military intelligence and was appointed by President Obama to head the Defense Intelligence Agency (DIA) in 2012.  As a devoted reformer and outspoken critic, however, he quickly alienated establishment bureaucrats and offended Obama by challenging both the president’s refusal to release the documents captured in the Osama bin Laden raid and the Iran nuclear deal.  Thus Flynn’s tenure at the DIA ended within two years, and he launched a consulting firm.  He then became involved in the 2016 presidential campaign, advising Republican hopefuls such as Ben Carson, Carly Fiorina, and Donald Trump, because he “was willing to talk to anyone if it would help keep Hillary Clinton out of the White House” (p. 20).  

Two days after the election, President Obama spent significant time defaming Flynn while talking with president-elect Trump.  “Obama had allies throughout the intelligence community, hundreds of them.  And they had their own reasons to go after Flynn.  ‘Flynn was talking about remaking the NSC staff and getting rid of the Obama holdovers to put Trump’s people in there,’ says Nunes. ‘He was going to cut the NSC staff down to a third of its size under Obama.’  Even more significantly, Flynn was going to address the problems with the intelligence community as a whole.  ‘He wanted to remake the entire IC,’ says Nunes.  He had Trump’s ear.  They were going to drain the Swamp” (p. 136).  Flynn’s adversaries then launched a vicious and dishonest campaign to discredit and defame him, “erasing facts” and doctoring photographs in order to suggest the Trump team was aligned with the Kremlin.  He would be the first of Trump’s appointees to be fired—within a few weeks of the inauguration.  Trump thought this would end the Russia controversy, but:  “‘He was getting bad advice from some of his advisers,’” says Nunes.  ‘He didn’t understand that after they got Flynn, they’d have momentum.  After Flynn went down, they believed they could get the president, too’” (p. 147).  

Central to the plot against the president was what came to be known as the “Steele Dossier.”  In April 2016, Hillary Clinton and the DNC “hired Fusion GPS to build a Trump-Russia echo chamber.  Fusion GPS garnered more than $1 million to compile information about Trump’s ties to Russia and distribute it to the press.  By the end of the spring, every major US media organization was involved in pushing the big story about the Republican candidate:  Trump and his associates were tied to Russian and other former Soviet Bloc business interests.  Fusion GPS was the Clinton campaign’s shadow war room—and subsequently became its dirty tricks operations center” (p. 42).  Christopher Steele was recruited (and paid $168,000) to gather information on Trump’s Russian ties and proceeded to pen a number of unsubstantiated allegations and rumors.  “What had started as an opposition research project that had turned up little of substance had transformed into a smear campaign” (p. 74).  Thus the “Steele Dossier” became the main basis for the FBI’s appeal to the FISA court for permission to surveil suspected members of the Trump team.  “But the dossier is not an ‘intelligence’ product.  It’s a fiction, a literary forgery, populated with real characters, but who did not do or say the things attributed to them.  And the dossier’s authors are not intelligence officers but journalists and academics accustomed to running smear campaigns and dirty tricks operations and lying” (p. 286).

The dossier mentioned Carter Page, a volunteer consultant on the periphery of the Trump campaign staff.  A graduate of the Naval Academy with considerable knowledge of and contacts in Russia, Page had frequently worked with the CIA, providing information gained on some of his trips abroad.  Though the Crossfire Hurricane group claimed he was a foreign agent, he’d actually helped the FBI locate Russian agents working in New York.  (We now know the FBI actually altered a CIA document indicating Page had worked with the agency to say he had not worked for it!)  The evidence cited by the FBI was little more than the Steele Dossier, as well as newspaper articles which were based upon Fusion GPS claims.  Page had earlier left the Trump campaign, but the FBI wanted to uncover his past emails and find incriminating evidence.  The dossier claimed Page was involved in a deal worth hundreds of millions of dollars.  But he obviously didn’t have that kind of money, and the claim could easily have been disproved had the FBI wanted to check.  “It was clear the story in the dossier was nonsense” (p. 183).  What Crossfire Hurricane actually wanted was permission to surveil Page in order to get at Trump and, as Nunes says, “‘They were sure they were going to find something, the golden ticket’” (p. 98).  But they found nothing!  And they virtually destroyed an honorable man.  

All of this took place before the 2016 presidential election, when Crossfire Hurricane assumed Hillary Clinton would be elected and their activities safely ignored.  But when Trump was elected “the operation designed to undermine his campaign transformed.  It became an instrument to bring down the commander in chief.  The coup started almost immediately after the polls closed” (p. 103).  On December 6, 2016, President Obama directed CIA director John Brennan to conduct a “review of all intelligence relating to Russia and the 2016 elections” (p. 106).  Brennan almost immediately reported that the Russians had helped Trump win the election—and then leaked that information to a variety of friendly media outlets.  Congressman Nunes immediately saw what was transpiring—an effort to destroy Trump.  “‘I couldn’t have dreamed they’d be that dirty,’ says Nunes.  ‘As soon as we saw they’d abused the FISA process, we opened up the investigation right away because the FISA issues bled into other matters, like how they started the whole investigation.  It was all a setup.’  It was then he realized he’d come across the biggest political scandal in US history.  ‘They used the intelligence services and surveillance programs against American citizens,’ he says.  ‘They spied on a presidential campaign and put it under a counterintelligence investigation so they could close it off and no one else would see what they were doing.  They leaked classified intelligence again and again to prosecute a campaign against a sitting president.  Ninety percent of the press was with them, and the attorney general was out of the picture.’  That meant it was up to Nunes and his team to expose the hoax, get out the truth, and uphold the rule of law” (p. 172).

They found that when President Trump fired Director Comey, the acting director, Andrew McCabe, urged deputy attorney general Rod Rosenstein to name a special counsel.  Comey then leaked a story to a friend, which was printed in the New York Times, claiming Trump had urged him to “drop the Flynn investigation.”  That story, said Rosenstein, justified the appointment of a special counsel—former FBI director Robert Mueller III.  He assembled a dozen anti-Trump prosecutors who did everything possible in the next two years to show how Russia supported Trump and enabled him to win the 2016 election.  “‘It was a team of dirty cops,’ says Nunes.  ‘Andrew Weissmann was the worst.  He already had a history as a hard-core anti-Trump partisan’” (p. 205).  For these upper-echelon bureaucrats, “Trump wasn’t their president.  And the America that had elected him was beneath contempt” (p. 199).  

The Crossfire Hurricane group imagined themselves to be replicating the work of the heroic FBI “deep throat” agent who helped spark the resignation of Nixon.  “‘They all wanted to become the next Deep Throat,’” says Nunes.  And they benefited from “elite teams” of journalists ensconced in the Washington Post and New York Times whose articles largely shaped the media frenzy calling for Trump’s removal from office.  The two papers’ staffs were awarded a joint Pulitzer Prize for “‘deeply sourced, relentlessly reported coverage in the public interest that dramatically furthered the nation’s understanding of Russian interference in the 2016 presidential election and its connections to the Trump campaign, the President-elect’s transition team and his eventual administration’” (p. 274).  These bureaucrats and journalists envisioned a paper coup d’état and knew, as Edward Luttwak says in Coup d’État: A Practical Handbook, that the media are essential:  “‘Control over the mass media emanating from the political center will still be our most important weapon in establishing our authority after the coup’” (p. 193).  “‘The anti-Trump operation,’ says Luttwak today, ‘was a very American coup, with TV denunciations by seemingly authoritative figures as a key instrument.’  The plot against Trump was a bureaucratic insurgency waged almost entirely through the printed word.  It was the ‘Paper Coup’” (p. 193).

But after two years of investigating every lead, the Mueller investigation failed to establish any “Russian collusion” with the Trump campaign.  There was a fully fraudulent endeavor to remove a president, and it failed.  So, almost immediately, the Democrats found another cause célèbre—Trump’s phone call to Ukraine.  Thus the beat of the impeachment drums goes on! 

324 Untethered Minds

As a prototypical, optimistic “progressive” who believed the world was getting better and better and who devoted his life to celebrating biological and societal evolution, H. G. Wells in 1945 wrote a final, deeply pessimistic book, entitled Mind at the End of Its Tether, sorrowing that everything seemed to be flying apart and nothing made sense.  Decades earlier the great Irish poet W. B. Yeats had written an equally doleful poem, “The Second Coming,” lamenting the shape of things to come:  

 Turning and turning in the widening gyre
The falcon cannot hear the falconer;
Things fall apart; the center cannot hold;
Mere anarchy is loosed upon the world,
The blood-dimmed tide is loosed, and everywhere
The ceremony of innocence is drowned;
The best lack all conviction, while the worst
Are full of passionate intensity. 

Such passionate intensity is routinely visible in today’s campus protests, fueled by angry, profanity-spouting youngsters determined to silence controversial speakers.  Sure enough:  “Mere anarchy is loosed upon the world”!  Their adolescent incoherence is thoughtfully analyzed by Mary Eberstadt in Primal Screams:  How the Sexual Revolution Created Identity Politics (West Conshohocken, PA:  Templeton Press, c. 2019).  Screaming youngsters, she insists, are indirectly asking a deeply personal and unanswered if perennial question:  Who Am I?  In the past, living in families and surrounded by stable communities, answering that question was relatively simple.  (For example, I could say I am my father’s son, reared on the high plains, and immersed in the life of a local Church of the Nazarene.)  Today, however, increasing numbers of folks cannot really find roots in such communities and turn to various groups wherein they seek to anchor their identities.  When this turn takes on political dimensions, they embrace “the desires and agendas” of aggrieved factions, providing a base whereby “human beings outside those chosen factions are treated more and more not as fellow citizens, but as enemies to be eliminated by shame, intimidation, and, where possible, legal punishment” (p. 7). 

Allan Bloom had earlier discerned this development in his widely-discussed The Closing of the American Mind, wherein he described students as reared in accord with Rousseau’s prescriptions in Emile, “in the absence of any organic relation between husbands and wives and parents and children.”  Consequently, Bloom said:  “That is it.  Everyone has ‘his own little separate system.’  The aptest description I can find for the state of students’ souls is the psychology of separateness.”  Bloom blamed divorce as the primary reason for such separateness.  Now, thirty years later, we must factor in the astonishing increase of out-of-wedlock births, together resulting in what Eberstadt calls “The Great Scattering.”  Fractured families frequently mean not only missing fathers but fewer (if any) siblings and cousins and grandparents who are part of one’s life.  Still more (a point made in Eberstadt’s How the West Really Lost God):  youngsters without stable families have difficulty believing in and worshipping God.  Without family or faith to tether them to abiding realities, growing numbers of people seek to find their identity in self-selected groups.

Thus we witness the emergence of “identity politics.”  To answer the question “Who Am I” when traditional ways have collapsed, millions of moderns have relapsed into “one of the most revealing features of identity:  its infantilized expression and vernacular” (p. 64).  To speak personally, I have been utterly perplexed while witnessing irrational behavior on university campuses as well as in committee meetings in Congress!  Allegedly educated persons are, in fact, screaming rather than speaking coherently.  There are now “safe spaces” as well as “tiny ersatz treehouse stuffed with candy, coloring books, and Care Bears” on the campuses of the nation’s most prestigious universities (p. 66).  Apparently taking their cues from university professors, thousands of alienated youngsters take solace in identity groups, including feminism, androgyny, the #MeToo movement, etc., etc.

“‘Destroying the family life of highly social, intelligent animals leads inevitably to misery among individual survivors and pathological misbehavior among the group,’ J. M. Coetzee, recipient of the Nobel Prize for Literature, has explained.  He was speaking of elephants, of course” (p. 103).  But it’s also true of humans.  Consequently:  “Identity politics is not so much politics as a primal scream.  It’s the result of the Great Scattering—our species’ unprecedented collective retreat from our very selves.”  Indeed:  “Anyone who has ever heard a coyote in the desert, separated at night from its pack, knows the sound.  The otherwise unexplained hysteria of today’s identity politics is nothing more, or less, than just that:  the collective human howl of our time, sent up by inescapably communal creatures trying desperately to identify their own” (p. 109).

* * * * * * * * * * * * * * *

An important aspect of our current culture is diagnosed by Douglas Murray in The Madness of Crowds:  Gender, Race and Identity (London:  Bloomsbury Publishing, c. 2019; Kindle Edition), explaining that “we have been living through a period of more than a quarter of a century in which all our grand narratives have collapsed” (#36).  Consequently, “We are going through a great crowd derangement.  In public and in private, both online and off, people are behaving in ways that are increasingly irrational, feverish, herd-like and simply unpleasant” (#31).  As Yeats lamented, “Things fall apart, the center cannot hold; / Mere anarchy is loosed upon the world.”  For many centuries the West was nourished by some “grand narratives,” including the heritage of the classical world of Greece and Rome as well as the religious traditions of Judaism and Christianity.  For half-a-century now that story has been shunted aside in favor of a “new religion” best evident in various versions of “‘social justice’, ‘identity group politics’ and ‘intersectionalism’” (#60).  Consequently, “identity politics” provides “the place where social justice finds its caucuses.  It atomizes society into different interest groups according to sex (or gender), race, sexual preference and more” (#66).  The identity groups Murray describes are gays, women, non-white races, and transsexuals.  Much of the book is devoted to detailing illustrations of these four groups, and anyone wanting a very up-to-date journalistic accounting of what’s taking place throughout our world can glean ample information from perusing its pages. 

But the real worth of The Madness of Crowds is the philosophical analysis Murray provides.  All these identity groups share common intellectual roots, which are manifestly Marxist (updated by fashionable academic postmodernists such as Foucault and Gramsci), and all feel they are engaged in a great class struggle.  There are the haves and the have-nots, but today’s exploiters are not so much capitalists as patriarchs.  So:  “At the top of the hierarchy are people who are white, male and heterosexual.  They do not need to be rich, but matters are made worse if they are.  Beneath these tyrannical male overlords are all the minorities: most noticeably the gays, anyone who isn’t white, people who are women and also people who are trans.  These individuals are kept down, oppressed, sidelined and otherwise made insignificant by the white, patriarchal, heterosexual, ‘cis’ system.  Just as Marxism was meant to free the labourer and share the wealth around, so in this new version of an old claim, the power of the patriarchal white males must be taken away and shared around more fairly with the relevant minority groups” (#975).  Thus when we hear about “toxic masculinity” or “white privilege” or “rape culture” we need to remember such slogans are all lethal weapons in our cultural war. 

Many of these phrases are manifestly nothing more than slogans which frequently contradict each other.  But “Marxists have always rushed towards contradiction.  The Hegelian dialectic only advances by means of contradiction and therefore all the complexities – one might say absurdities – met along the way are welcomed and almost embraced as though they were helpful, rather than troubling, to the cause” (#1099).  Those of us perplexed by declarations of men claiming they are women—as irrational as any statement could possibly be—are simply dismissed by the cultural Marxists as “logocentric” (i.e., thinking logically).  As Steven Pinker (a Harvard psychologist) “wrote in 2002, ‘Many writers are so desperate to discredit any suggestion of an innate human constitution that they have thrown logic and civility out the window . . . The analysis of ideas is commonly replaced by political smears and personal attacks . . . The denial of human nature has spread beyond the academy and has led to a disconnect between intellectual life and common sense.’  Of course it had.  . . . . The purpose had instead become the creation, nurture and propagandization of a particular, and peculiar, brand of politics.  The purpose was not academia, but activism” (#1119).

This essentially Marxist narrative has been recently amplified, Murray argues, by social media.  In the blink of an eye the world has been transformed by “a communications revolution so huge that it may yet make the invention of the printing press look like a footnote in history” (#2030).  Thoughtful books and wisely-edited newspapers have smaller audiences today, for Twitter and Facebook postings have superseded them.  “It is there that assumptions are embedded.  It is there that attempts to weigh up facts can be repackaged as moral transgressions or even acts of violence.”  Social media enables anyone “to address everything, including every grievance.  And it does so while encouraging people to focus almost limitlessly upon themselves – something which users of social media do not always need to be encouraged to do” (#2056). 

And not only can one say anything about anything—everything one has ever sent into cyberspace is forever there.  Words or positions which were once quite acceptable may later be used to assail folks.  Something one may have tweeted a decade ago as an adolescent can be uncovered and weaponized to destroy him through excoriation and “public shaming.”  Social media “appears able to cause catastrophes but not to heal them, to wound but not to remedy” (#3280).  Especially absent is any possibility of forgiveness.  We face “the question that the internet age has still not begun to contend with:  how, if ever, is our age able to forgive?  Since everybody errs in the course of their life there must be – in any healthy person or society – some capacity to be forgiven.  Part of forgiveness is the ability to forget.  And yet the internet will never forget” (#3320).  Even the words of one’s father may be resurrected to punish a person, as the American race car driver Conor Daly found out when he lost a sponsorship after it was discovered that, ten years before he was born, his father had used a racially inappropriate word in a radio interview.

To cope with the madness of crowds, the kind of common-sense reasoning Murray provides must be recovered in all segments of our society.

* * * * * * * * * * * * * * *

A significant and largely unintended consequence of the sexual revolution is elucidated by Warren Farrell and John Gray in The Boy Crisis:  Why Our Boys Are Struggling and What We Can Do About It (Dallas, TX:  BenBella Books, Inc., c. 2018; Kindle Edition).  Older men, such as myself, grew up in a time when “masculinity came with a built-in sense of purpose of being the provider-protector (e.g., warrior; sole breadwinner)” (p. 10).  As boys we wanted to grow up and assume the responsibilities of mature adults; we had good reason to do so.  But young men today frequently fail to find such a purpose.

To prove there is in fact a crisis, Farrell considers boys’ mental, physical, and economic health as well as their educational success.  Men kill other men and themselves far more frequently than do their female counterparts.  Indeed, the data are depressing!  Though “only 6 percent of the overall population, black males make up 43 percent of murder victims.  More black boys between ten and twenty are killed by homicide than by the next nine leading causes of death combined” (p. 16).  As many white men have killed themselves as have died of AIDS.  As soon as they enter puberty, boys become far more suicide-prone:  “between ten and fourteen, boys commit suicide at almost twice the rate of girls.  Between fifteen and nineteen, boys commit suicide at four times the rate of girls; and between twenty and twenty-four, the rate of male suicide is between five and six times that of females” (p. 16).  Indeed, “the male-female suicide gap in the United States has tripled since the Great Depression” (p. 273).  “Women cry, men die!”  Men also go to jail in alarming numbers:  of prisoners, “93 percent are male and are disproportionately young” (p. 18).

Though women were in the distant past called the “weaker” sex, that is certainly not true if one considers longevity as a marker of physical well-being, for men and boys are twice as likely to die as their female counterparts of the same age, making for “a greater life-expectancy gap than at any time since World War II” (p. 20).  Indeed:  “Being male is now the single largest demographic factor for early death,” says Randolph Nesse, Director of the Center for Evolution and Medicine at Arizona State University (p. 20).  Young men are alarmingly overweight and unfit.  As another indicator of physical well-being, researchers have tracked an alarming decline in sperm counts.  “Boys today have sperm counts less than half of what their grandfathers had at the same age” (p. 20).  Economically, the picture is equally dreary, especially for men who don’t go to college.  “Over the last forty years, the median annual earnings of a boy with just a high school diploma dropped 26 percent.”  He is 20 percent more likely to be unemployed for significant stretches.  And if a young man lives “in an urban area, he’ll likely live in one of the 147 US cities in which young women under thirty haven’t just caught up to their male peers, but now outearn them (by an average of 8 percent)” (p. 26).  If he had a university degree things would be different, of course, but many men are failing to pursue higher education. 

“Worldwide, reading and writing skills are the two biggest predictors of success.  These are also the two areas in which boys fall the most behind girls.  In the United States, by eighth grade, 41 percent of girls are at least ‘proficient’ in writing, while only 20 percent of boys are.  Many boys used to ‘turn around’ in about their junior or senior year of high school.  Anticipating the need to become sole breadwinner, and therefore gain familial pride, peer respect, and female love, they got their act together.  The expectation of becoming sole breadwinner became his purpose.  No longer.  In one generation, young men have gone from 61 percent of college degree recipients to a projected 39 percent; young women, from 39 percent to a projected 61 percent” (p. 28).  And these well-educated young women almost always refuse to consider less-educated men as potential husbands!

Digging more deeply into the boy crisis, Farrell identifies a lack of purpose as one of its primary causes.  “The Japanese call it ikigai, or ‘a reason for being.’  Japanese men with ikigai are less likely to die of heart disease.  And both sexes with ikigai live longer.  Whether we call it ikigai or sense of purpose, when we pursue what we believe gives life meaning, it gives us life.  Historically, a boy’s journey to prove himself is what gave him that sense of purpose” (p. 46).  To protect and provide for wife and family has, throughout human history, given men ikigai.  But today, in Japan as well as much of the modern world, boys struggle to find it.  Much of this results from men being less and less needed to provide food and shelter for their families.  They also have far fewer heroes to emulate.  “What is a hero?  The word hero derives from the root ser, from which we also get the word “servant” (think “public servant”), as well as slave, and protector.  In Japan and China, the word samurai also derives from the word for servant, saburai.  Billions of boys throughout history have embraced the opportunity to serve and to protect in the hope of being labeled a hero or samurai.  Though the fiercer the enemy, the greater their chance of death, boys were willing to exchange their lives for the label.  They were, in a sense, slaves to the potential honor they might receive if they served and protected their families, villages, or countries” (p. 62).  Occasionally our youngsters see such heroes in action.  Consider the first responders on 9/11—99 percent were male!  In fact 76 percent of the firefighters in the country are volunteers, virtually 100 percent of them men!  So there are heroes in our midst, but too often young boys are fed anti-hero messages in feminist-run schools and popular culture. 

Thus parents need to strategically prepare their sons for adulthood, and that requires preparing them for employment in our digital age.  If boys do well in school, opportunities abound for those who persevere and find a well-paying slot in the economy.  If they’re not academically-inclined, it’s important to help them train for well-paying blue-collar jobs—welding, plumbing, etc.  Participating in athletics is often crucial in helping boys become men.  Farrell provides lots of practical tips for parents (and grandparents) wanting to help their boys mature.  Above all, in a culture celebrating instant gratification and victimization:  “The discipline of postponing gratification is the single most important discipline your son needs” (p. 98).  But practical advice may mean little unless we face “the most important single crisis in developed countries:  dad-deprived children, and especially dad-deprived boys” (p. 102).  Boys reared without an attentive father are inevitably harmed.  If their dads die, boys do OK, for they have memories of good men.  But when they lose their dads through divorce, or never even know them because they were born out-of-wedlock, their stories frequently end poorly.  For those concerned, Farrell provides an appendix listing “some seventy ways that children benefit from significant father involvement—or put another way, seventy-plus ways in which dad-deprived children are more likely to suffer” (p. 117).  They are more likely to fail in school, to join gangs, to go to prison, to lapse into various addictions, to fail in marriage.  To cite only one painful fact:  “Prisons are the United States’ men’s centers (93 percent male).  A staggering 85 percent of youths in prison grew up in a fatherless home.  More precisely, prisons are centers for dad-deprived males—boys who never became men” (p. 120).  In short:  boys without dads do poorly!

But Farrell does more than alert us to problems.  He sets forth quite detailed ways in which dads can help rear healthy boys.  Simply being present in a boy’s life is hugely significant.  Merely interacting with a father boosts a boy’s IQ, strengthens his ability to trust others, reduces aggressive behavior, and enables him to develop rightly.  Stepfathers, unfortunately, have less (if any) positive influence.  Nor do same-sex parents!  Only biological fathers can fill the crucial role of fathering.  Added to being present, good dads should preside over routine family dinners—a remarkably important ritual for children.  They can also enforce behavioral boundaries, whereas moms often set but fail to enforce them.  One boy half-joked, “My mom warns and warns; it’s like she ‘cries wolf.’  My dad gives us one warning, and then he becomes the wolf” (p. 136).  Still another illustration:  women are more likely than men to give underage teenagers alcohol, admitting “that their desire to please trumped what they knew was right” (p. 140).  Dads normally roughhouse with and tease their kids—teaching them important lessons never derived from a woman.  They can lead them on wilderness excursions, camping trips, adventures of various sorts demonstrably valuable for youngsters.  They challenge their kids to accomplish things (whether in sports or school) and allow them to deal with defeats.

Farrell devotes many pages to the problem of divorce—and to ways to cope with it.  He also suggests legal changes to better enable men to be better fathers.  But the main message of The Boy Crisis is just that:  it’s a crisis and it’s devastating our culture.  Though wildly overstating the case, Jed Diamond claims:  “The Boy Crisis is the most important book of the 21st century.  Farrell and Gray are absolutely brilliant,” showing “why our sons are failing.”  Indeed:  “If you care about the very survival of humankind, you must read this book.”

323 “Social Justice” Casualties

In Why Meadow Died: The People and Policies That Created The Parkland Shooter and Endanger America’s Students (Post Hill Press, c. 2019, Kindle Edition), Andrew Pollack maintains that his daughter, Meadow, died not because of guns or NRA deviousness but because permissive school district policies enabled the killer (Nikolas Cruz) to escape proper treatment and unleash his fury on the students of Marjory Stoneman Douglas High School, located in Parkland, Florida.  According to Meadow’s brother, Hunter:  “If one single adult in the Broward County school district had made one responsible decision about the Parkland shooter, then my sister would still be alive.  But every bad decision they made makes total sense once you understand the district’s politically correct policies, which started here in Broward and have spread to thousands of schools across America” (#111).

This is not, of course, to diminish the responsibility of Nikolas Cruz!  He was, as is detailed in three lengthy chapters, a troubled young man.  Indeed:  “There was something profoundly dark and disturbed at the core of Nikolas Cruz’s soul.  Even his mother, Lynda, described her son as ‘evil’” (#1816).  His kindergarten teachers worried about his aggressiveness and fantasies.  He was known to enjoy torturing animals as well as threatening other students, and wherever he went he misbehaved and wreaked havoc.  In and out of special schools designed to help disturbed youngsters, he was frequently identified as a threat to both himself and others.  His teachers and counselors feared him.  “They knew about his obsession with guns and dreams about killing people.  They were so frightened that they took the extremely rare step of contacting his private psychiatrist.  Yet not only did they return him to a traditional high school at an unprecedented speed, they also enrolled him in JROTC, a course in which he would learn to shoot using an air gun that resembled an AR-15” (#2187).  While the killings were taking place, many staff and students suspected Cruz was the killer.  Sheriff’s officers had over the years responded to calls at Cruz’s home a total of 45 times.  But nothing was done to deal effectively with him.  They were all committed to following “the philosophy of the Broward school district, as expressed by Superintendent Runcie:  ‘We are not going to continue to arrest our kids’ and give them a criminal record” (#2554).  

Following the Parkland shooting, many Americans demanded action, and politicians quickly began posturing, promising, and endlessly pontificating.  Responding to the outrage, President Trump set up a “listening session” and invited Parkland parents, including Andy Pollack, to attend.  He spoke briefly and urged that practical steps be taken to prevent further tragedies.  Subsequently the president talked with him and his son, discussing how to make the nation’s schools safer.  Returning to Florida, Pollack determined to memorialize his daughter by building a playground in her memory (Princess Meadow’s Playground) and establishing a nonprofit, Americans for Children’s Lives and School Safety (CLASS).  Yet his efforts were barely noticed amidst the massive national publicity generated by a group of Parkland students who organized a “March For Our Lives” focused singularly on gun control. 

But Pollack knew guns were not the real problem.  So he began an intensive investigation, determined to understand why his daughter had died, and he concluded the main culprit was a pernicious political correctness that pervaded Broward County bureaucracies:  school district officials, mental health providers, and law enforcement officers all failed.  “The only man who could have stopped him, School Resource Officer Scot Peterson, refused to enter the building and actively prevented other officers from entering” (#839).  He drew his gun—and stood still, safely hiding for 50 minutes!  “Ever since Columbine, police have been trained to immediately confront a school shooter,” but Peterson stayed safe!  Five other deputies arrived, donned bulletproof vests, and listened (from safe distances, hiding behind cars or trees) to the gunfire killing kids.  Eleven long minutes passed before some of the deputies dared enter the building, long after the shooter had fled the scene.  Two courageous teachers died trying to protect the students, but law enforcement officers lacked their resolve. 

Especially culpable, in Pollack’s view, was the school superintendent, Robert Runcie, who meticulously followed federal guidelines issued by Barack Obama’s Secretary of Education, Arne Duncan.  He hewed carefully to the agenda promoted by “social justice activist groups” which insisted schools serve “as laboratories for social justice engineering and force politically correct policies into our schools based on the assumption that teachers are too prejudiced to be trusted to do the right things.  One policy is known as ‘discipline reform’ or ‘restorative justice.’  Activists and bureaucrats worried that minority students were being disciplined at higher rates than white students, and rather than recognize that misbehavior might reflect bigger problems and inequities outside of school, they blamed teachers for the disparity.  They essentially accused teachers of racism and sought to prevent teachers from enforcing consequences for bad behavior.  They thought that if students didn’t get disciplined at school, if instead teachers did ‘healing circles’ with them or something, then students wouldn’t get in trouble in the real world.  Superintendents then started pressuring principals to lower the number of suspensions, expulsions, and school-based arrests.  All that actually happened was that everyone looked the other way or swept disturbing behavior under the rug, making our schools more dangerous” (#227). 

To personalize his presentation, Pollack portrays a number of folks intimately involved in the event.  There’s a math teacher, Kimberly Krawczyk, who was almost killed and quickly became disillusioned with the school district’s cover-up endeavors.  And there’s an immigrant father, Royer Borges, who “moved his family from Venezuela to America in 2014 to keep them safe” (#756).  His son was shot and seriously injured, so he hired an attorney to represent him.  Doing research, the attorney found an essay that linked the shooting with Parkland’s progressive educational policies, especially the district’s PROMISE program, which had been heavily funded by wealthy leftists such as George Soros.  PROMISE was proposed and implemented to help “the victims of institutional racism” by refusing to arrest and punish public school students.  When Royer Borges learned about PROMISE’s permissive prescriptions, he “was furious.  He couldn’t believe that public officials had decided that the law shouldn’t apply in schools.  And he couldn’t believe that no one was going after Broward’s leaders for rolling the dice with children’s lives.  It made no sense to Royer why instead of going after these local officials, everyone was marching on Washington, D.C. for gun control.  Venezuela had total gun control.  That’s how the government and the colectivos were able to terrorize the citizens” (#831).

Adding scholarly heft to the book is its co-author, Max Eden, a senior fellow at the Manhattan Institute for Policy Research, who had long researched and written about education and its needed reforms, concluding that a "'social justice industrial complex' had taken hold of American education" (#1010).  Drawing on Obama's 2009 stimulus bill, Arne Duncan used money "to incentivize (some might say bribe) states to follow DCPS's policy lead on test-based teacher evaluations and the new (and much-hated) Common Core academic standards."  Educrats from across the country, attending "woke" conferences and training programs, "learned that the fastest path to career advancement is to fake statistical progress for minority students while passionately decrying privilege and institutional racism" (#1015).  Florida's "Broward County was the standard-bearer for the new approach to school discipline: an aggressive push for leniency on the grounds that racially biased teachers were unfairly punishing minority students" (#1017).  Eden was also deeply distressed by the conduct of the Broward County Sheriff's Department, which was determined to end the so-called school-to-prison pipeline evident throughout the region.  To do so the sheriff joined the school district and its PROMISE program in refusing to arrest adolescents.  Despite multiple calls to Cruz's house and repeated warnings regarding his conduct, he was given a "free pass" that ultimately enabled him to launch his killing spree.

An unexpected hero in Why Meadow Died is a 19-year-old home-schooler named Kenny Preston, who proved to be the most tenacious and perceptive "journalist" writing about the shootings.  When he recognized two of the students killed by the shooter, Preston began studying the incident and was soon appalled by the reactions of Broward County authorities.  He spotted Superintendent Runcie's instant concern to deflect attention from himself rather than mourn the victims' deaths and was distressed by Sheriff Israel's callous response to questions.  "That's when something inside of Kenny flipped.  The bodies of children who had been murdered under Runcie's leadership were still lying on the schoolhouse floor directly behind him, and he had already started politicking" (#1315).  So Kenny Preston dug into the documents he could access on-line and interviewed a number of persons, including "Robert Martinez, a recently retired school resource officer, who told him, 'We all knew some sort of tragedy like this was going to happen in Broward.  You can't just stop arresting kids without expecting something like this.  As officers, our hands were tied.'  More alarming still, Martinez told Kenny that district officials had explicitly told school resource officers not to arrest students for felonies, in addition to the official PROMISE misdemeanors" (#1419).  Kenny's on-line articles proved more perceptive than the mainstream media's coverage, which could do little more than repeat anti-NRA bromides.  In fairness, some of the local Florida papers did more honest work, but the story detailing Why Meadow Died remained largely for her father to tell!

Concluding that the Broward County school board needed to change, Andy Pollack and a group of activists motivated by the Parkland shootings decided to challenge its entrenched power structures.  So they ran candidates who mounted a vigorous campaign.  But all was for naught!  Broward County reelected the seasoned politicians aligned with Robert Runcie, and little was done to address the real problems in the district.  Though he had been non-political (never even voting) before the shootings, Pollack finally realized:  "This happened in a Democrat county with a Democrat sheriff, a Democrat superintendent, and a Democrat school board, implementing Democrat ideas on criminal justice, Democrat ideas on special education, and Democrat ideas on school discipline.  And after Democrat voters gave all these Democrats a resounding vote of confidence in the school board election, the Democrat teachers union president, Anna Fusco, wrote in a Facebook group about our campaign for accountability:  'Now you can all shut up!'  Meanwhile, at the national level, Democrat organizers swooped in and weaponized my daughter's murder for their Democrat agenda and to fund-raise to elect more Democrats" (#6220).

* * * * * * * * * * * * * * * * * * * * * * * * * * *

In Stand Down: How Social Justice Warriors Are Sabotaging America's Military (Washington:  Regnery Gateway Editions, Kindle Edition, c. 2019), James Hasson explains:  "The Army that I entered as a second lieutenant during President Obama's initial years in office was nothing like the Army I left [as a captain] in late 2015" (#7) because of an "eight-year social engineering campaign against our armed forces" (#15) waged by "hard-left ideologues" such as Ray Mabus, Brad Carson, Deborah Lee James, and Eric Fanning, who occupied "some of the most influential national security positions" (#23).  They were all committed to radical feminist and LGBT ideologies and implemented "gender equality" programs, following President Obama's orders.  He had famously promised to fundamentally transform the country, and Hasson believes he certainly did so in the one realm "over which he would exercise nearly complete control," the military.  Illustrating such changes, a 2012 article in Stars and Stripes described how, in one Washington state post, "The Army is ordering its hardened combat veterans to wear fake breasts and empathy bellies so they can better understand how pregnant soldiers feel during physical training" (#2182).  Then, in 2015, "Army ROTC cadets at multiple universities participated in 'Walk a Mile in Her Shoes' events on campuses.  The events—'designed to raise awareness about sexual violence against women'—had male Army cadets replace their combat boots with bright red high-heeled shoes" (#2183).  So Stand Down "is the story of what will be President Obama's enduring legacy:  the sacrifice of the combat readiness of our armed forces to the golden calves of identity politics and progressive ideology" (#94) shaped and driven by homosexual and radical feminist activists.

Such golden calves were installed in the nation's military academies, which have substantially changed during the past 25 years as increased numbers of civilian professors have been hired.  Indeed, Hasson "interviewed academy graduates of all ranks who raised serious concerns about the cultural changes imposed upon the academies from above" (#490), all of whom were alarmed by the incursions of political correctness in these schools.  Symptomatic of the problem is a letter written by Robert Heffington, a retired Army lieutenant colonel who had taught at West Point.  He said:  "I firmly believe West Point is a national treasure and that it can and should remain a vitally important source of well trained, highly educated Army officers and civilian leaders.  However, during my time on the West Point faculty . . . I personally witnessed a series of fundamental changes at West Point that have eroded it to the point where I question whether the institution should even remain open."  He charged that "standards at West Point are nonexistent" and lamented "the academy's failure to enforce the honor code and its lax enforcement of conduct and disciplinary standards."  Changes in West Point's curriculum particularly distressed Heffington:  "The plebe American History course has been revamped to focus solely on race and on the narrative that America is founded solely on a history of racial oppression.  Cadets derisively call it the 'I Hate America Course.'  Simultaneously, the plebe International History course now focuses on gender to the exclusion of many other important themes.  On the other hand, an entire semester of military history was recently deleted from the curriculum . . . at West Point!" (#502).

Turning to the other academies, Hasson finds equally disturbing phenomena, even extending to concerns for "microaggressions"!  Training warriors by worrying about microaggressions seems at best counterproductive, but one finds "safe space" placards adorning office doors of both military and civilian professors at the United States Naval Academy.  "If the signs were stripped of identifying features, you would be hard pressed to distinguish them from those marking the offices of Yale gender studies professors" (#645).  There's even a "Safe Spaces Faculty Rep" entrusted with making sure no midshipman is subjected to offensive words.  At the Air Force Academy, a visiting psychology professor taught a course on "Interdisciplinary Perspectives on Men and Masculinity."  The professor styles himself a feminist and once wrote an article saying, "I challenge you to tell me one way in which the sexes are opposite" (#670).  The academy also deleted the phrase "so help me God" from the oath of enlistment in its cadet handbook as well as the cadet Honor Oath.

Illustrating the harm political correctness has done the military is the "real" story of females graduating from the Army's Ranger School—considered by many "the hardest combat course on the planet."  "For the sixty-two days of the course, candidates train for up to twenty hours a day and subsist on little more than a thousand daily calories" (#1300).  Only a few would-be male Rangers actually make it.  But in 2015 the Army celebrated two women for completing the course.  Then a journalist, Susan Katz Keating, decided to investigate the story and found that the women had been granted special exemptions and treatment, receiving "individualized training" and additional time "in the 'pre-Ranger' screening course despite failing critical tests.  And the instructors felt intense pressure to make sure the women passed, pressure that led to sharp departures from normal Ranger School standards" (#1213).  For example, whereas men were given only 48 hours to recover from stage one of the training before moving on, the women were given two to three months "to regain lost sleep, allow taxed muscles to recuperate, and otherwise recover physically" (#1355).  On-site instructors (speaking anonymously for fear of retaliation) universally commended the women's efforts but lamented "how systematic political pressure forced changes to the legendary Ranger course, damaging its integrity, just as political pressure forced detrimental changes at every level of the military during the eight years of the Obama administration" (#1454).

In 2013 the Obama administration determined to allow women to serve in ground combat units.  Asked to study the issue, the various services prepared reports.  The Marines devised a meticulous study designed to record "injury rates, the speed at which the companies evacuated casualties on the ground, marksmanship scores, and dozens of other measurements."  They thought that, if all-male units proved superior, the Administration would preserve their traditions.  A combat veteran of Afghanistan, former Marine Captain Jude Eden, summarized what they found:  "[A]ll-male units outperformed coed units in 69 percent of the 134 combat tasks. . . . If the figure had been even a mere five percent difference it would have been ample reason to maintain women's exemption, since five percent is easily and frequently the difference between life and death in offensive ground combat.  But in fact the figure was 69 percent!" (#1615).

University of Pittsburgh researchers "conducted a comparative analysis of all-male and mixed-sex infantry units' performance in critical battle drills and corroborated the findings of other teams.  In a thorough analysis of the injury reports from each of the training exercises, the researchers also discovered that the injury rate for female Marines during weight-carrying exercises was more than twice that of their male counterparts" (#1675).  The issue was never whether or not women could fight and die but whether they could "walk up to fifteen miles a day, carry eighty pounds of equipment, and often sleep and tend to bodily functions in austere environments with little to no privacy" (#1933).  In fact, virtually none can!  But the Obama administration cared little for facts.  Senior military officers soon learned "that the administration had no interest in military readiness or lethality.  Instead, it was waging an ideologically driven campaign with an end goal of creating an equal number of male and female generals and the first female chair of the joint chiefs" (#1988).

All available evidence merely confirms common sense:  "there are real and substantial physiological differences between men and women" (#1629).  But who cares!  The Administration, imposing its agenda upon the Marine Corps, was determined to "crack the glass ceiling" by placing women in combat units, opening for them important opportunities for promotion.  Secretary of the Navy Ray Mabus was especially determined to sexually integrate combat units because he was pursuing an "ideologically driven quest for a 'genderless' Navy and Marine Corps."  He directed senior naval commanders to "ensure [that job titles] are gender-integrated . . . removing 'man' from their titles.  Traditional naval and Marine Corps job titles such as 'yeoman' and 'rifleman'—titles that date to the founding of our republic—apparently needed to be changed to reflect a 'gender-integrated' force" (#1875).  Though the job titles were not actually changed, Mabus did manage to redesign uniforms to better fit women, and his broader agenda was enacted, fully in accord with radical feminist dogma.

Concluding his case, Hasson insists the United States still has the finest military in the world, but its strength is eroding.  Political correctness is "hurting our ability to retain talented officers and enlisted troops who entered the military for all the right reasons but find they spend their days acting as bureaucrats implementing societal change" (#2727).  To rectify the problem the author thinks we must immediately reverse many of the Obama policies.  Doing so would enable us to follow the prescription of Revolutionary War hero "Light-Horse Harry" Lee, who "said he could not 'withhold my denunciation of the wickedness and folly' of a government that sent its soldiers 'to the field uninformed and untaught.'  Such a government, he believed, was 'the murderer of its citizens'" (#2816).

322 The Myth of the Dying Church

Several weeks ago I began the Sunday school class I teach during the summer by referencing a recent article in First Things entitled "Belief Limbo," by Ronald Dworkin, lamenting the growing number of folks in America "who are unsure, uninterested, undecided, or just too busy for religion, and who live in 'belief limbo.'"  Since his concerns regarding the decline of religion had been widely diffused by the religious media, I took his pessimism seriously, and we discussed how churches might better evangelize the nation.  Just a few days later I read an article referencing a recent book that refutes many of these notions—Glenn T. Stanton's The Myth of the Dying Church:  How Christianity Is Actually Thriving in America and the World (New York:  Worthy Publications, c. 2019)—so I acquired and rapidly read it.  Stanton is the director of Global Family Formation Studies at Focus on the Family, where he has worked since 1993, and he reminds us that Theodore Beza, John Calvin's successor in Geneva, said:  "[L]et it be your pleasure to remember that the Church is an anvil which has worn out many a hammer."

Stanton begins by acknowledging the influence of various "Chicken Littles" who have persuaded the public that Christianity is declining.  A headline in the Washington Post asserted:  "Christianity Faces Sharp Decline as Americans Are Becoming Even Less Affiliated with Religion."  Similarly, Newsmax declared:  "Christianity Declines Sharply in US, Agnostics Growing:  Pew."  An article posted on BeliefNet lamented:  "Declining Christianity:  The Exodus of the Young and the Rise of Atheism."  National Public Radio, in a celebratory note, said:  "Christians in U.S. on Decline as Number of 'Nones' Grows, Survey Finds."  And that repository of all things properly liberal, the New York Times, intoned:  "Big Drop in Share of Americans Calling Themselves Christian."  And as if the secular doomsayers were not enough, trusted Christian sources often affirm the litany of woe.  One leading Christian author declared:  "Young people are leaving the church in droves," reflected in "staggering numbers" of those who say they no longer believe.  An advertisement in a Christian magazine said:  "This generation of teens is the largest in history—and current trends show that only 4 percent will be evangelical believers by the time they become adults.  Compare this with 34 percent of adults today who are evangelicals.  We are on the verge of a catastrophe."  Then a parachurch organization declared:  "Up to 90 percent or more of Christian kids will leave the church by the time they reach adulthood," and a youth ministry publication warned:  "86% of evangelical youth drop out of church after graduation, never to return."  Unfortunately, many of the folks circulating bad news are in organizations selling books or programs designed to address the problem!  Apologetics is an important discipline, but its practitioners frequently overstate the threats the church faces in order to elicit support.

Given such a plethora of pessimism, many of us may have rather despaired at the prospects for the church!  But Stanton urges us to reconsider:  "I have good news for you:  IT'S SIMPLY NOT TRUE!" (p. xx).  There's certainly little good news for mainline Protestant churches, for they have sustained significant losses.  "Pew's America's Changing Religious Landscape states that between 2007 and 2014, mainline Protestant churches declined by 5 million adult members; taking into account margin of error, that number could be as high as 7.3 million lost members.  Regardless, the loss is massive.  But here is the part you didn't hear.  Churches in Pew's 'evangelical' category continued to grow in absolute numbers by about 2 million between 2007 and 2014" (p. 26).  Stanton finds this good news because he's dived into serious scholarly literature—in-depth analyses by renowned professors, scholarly articles in trustworthy journals, and data-packed studies "from leading mainstream organizations that track church growth and decline numbers" (p. 12).

The percentage of Protestants and Catholics who say their faith "is very important" to them has "increased two percentage points since 2007" (p. 37); they pray daily, join small groups for Bible study, and fully believe it is God's inspired Word.  Stressing the vitality of the faith in today's America, Greg Smith, for example, "has long worked as the associate director of research for the Pew Research Center, one of the most trusted and respected institutions on this topic.  In an interview with Christianity Today a few years ago, Smith was asked by Dr. Ed Stetzer of Wheaton College if evangelicalism was dying.  He said simply, 'Absolutely not,' and went on to explain, 'There's nothing in these data to suggest that Christianity is dying.  That Evangelicalism is dying.  That Catholicism is dying.  That is not the case whatsoever'" (p. 13).  In fact, Evangelicalism is, "if anything, growing."  Its growth is substantiated by an Indiana/Harvard study that finds it increased from 18 percent of the population in 1972 to 28 percent in 2016.  Much of this growth is taking place in nondenominational, independent churches, many of them of the "megachurch" variety.

Turning to the oft-cited "nones," Stanton says it's just a new name for an irreligious or nominally Christian group which has always been part of American culture.  "Let me put it directly," he says:  "The rise in these much-talked-about and fretted-over nones is not people leaving their faith or the church.  They are not a new kind of unbeliever.  They are not actually a new group at all.  These are folks who are simply being more honest and accurate in their description of where they have always been in terms of their belief and practice.  This is who the nones are.  Their rise is not because of some great secularizing upheaval in Americans' faith beliefs and practices.  They are simply reporting their actual faith practices in more candid ways, largely due to new ways in which polling questions have been asked in the last ten years or so."  Wheaton's Ed Stetzer "has given one of the best clarifying explanations of this phenomenon that I've seen.  In USA Today, he wrote that 'Christianity isn't collapsing, it's being clarified'" (pp. 53-54).

Still more:  rather than multitudes of young people rejecting their parents' faith, "nearly 90 percent of kids coming from homes where they were taught a serious faith retain that faith into adulthood" (p. 56).  That collegians may for a time turn irreligious is an old, old story, for young people often demonstrate their independence by rejecting the faith of their fathers.  But, Rodney Stark says:  "That [young adults] haven't defected from the church is obvious from the fact that a bit later in life, when they have married, especially after children arrive, they become more regular attenders.  This happens every generation" (p. 99).  Moreover, Ed Stetzer says the University of Chicago's universally respected General Social Survey (GSS) reveals that "if you look at young [evangelical] adults, eighteen to twenty-nine years old, we are at the highest reported levels since 1972 of regular church attendance among this group.  That's a pretty big deal" (p. 100).  And the reason these young people adhere to the faith is equally big:  parents!  Authentically devout parents enable their children to become devout adults.

Professor Christian Smith, one of the nation's finest sociologists, has for years overseen the National Study of Youth and Religion (NSYR), and he asserts that "parents are huge—absolutely huge—nearly a necessary condition" for a child to adopt a living and lasting faith.  He concludes, "Without question, the most important pastor a child will ever have in their life is a parent" (p. 113).  In fact, "fully 85 percent of teens raised by parents who took their faith very seriously, and lived in a home with consistent faith practices, became young adults who not only had a serious faith, but had the highest levels of religious belief and practice among their peers!" (p. 114).  The data show that effective parents:  1) "take their faith very seriously and live it out in meaningful ways"; 2) establish warm relationships with their kids; 3) encourage them to pray regularly and do so themselves; 4) engage them in and exemplify Bible reading; 5) routinely attend church and take part in its various ministries; 6) celebrate "miracles in their own lives and the lives of others"; 7) encourage children to deal honestly with their doubts and difficulties; 8) stand alongside them when teachers or classmates ridicule or persecute them for their faith; 9) enlist "satellite adults" to model and help them in living out the "family's faith and convictions" (pp. 133-134).

Looking beyond the United States, the state of Christianity around the world is even more encouraging, particularly in the "Global South"—Latin America, Africa, and Asia.  "In terms of sheer numbers, Christianity is flowering around the world and doing so soundly, even dominantly," and probably will do so throughout this century.  "Specifically, the coming two decades will see the world's population of Christians grow from today's 2 billion to a remarkable 3 billion adherents, making Christianity the world's largest faith for at least the next eighty years" (p. 74).  In stark contrast to Europe and the mainline churches in America, churches in the Global South are almost universally "strongly conservative in their theology, ecclesiology, and sexual teachings" (p. 80).

And, most importantly, what's evident in this world-wide church growth is the power and presence of the Holy Spirit, for He has empowered "Christ's church across time and throughout the nations.  He is unstoppable, unquenchable, and inherently life-giving.  He is not nodding off, sickly, or on vacation.  The work of His heart and very character will not be thwarted.  He is God.  To believe the church is dying is to deny these truths and judge God either confused or a liar" (p. 191).  As was evident at Pentecost, "God's Word will not return void.  What the Bible says of the church on its first day will also be true of these churches today:  'And the Lord added to their number day by day those who were being saved' (Acts 2:47).  Church, be of good cheer.  God is true.  Aslan is on the move.  Chicken Little is mistaken.  God's future is bright.  It cannot be otherwise" (p. 193).

* * * * * * * * * * * * * * * * * * * * * * * * * *

For many years Rodney Stark, a professor at Baylor University, has been trying to correct some pernicious errors regarding Church history.  In Bearing False Witness:  Debunking Centuries of Anti-Catholic History (West Conshohocken, PA:  Templeton Press, Kindle Edition, c. 2016), he sought to rectify the record—not to defend the Catholic Church (since he is a Protestant) but to defend history.  To do this he first addresses various "distinguished bigots" (such as Edward Gibbon) posing as scholars who have maliciously slandered Catholics.  "It all began with the European wars stemming from the Reformation that pitted Protestants versus Catholics and took millions of lives, during which Spain emerged as the major Catholic power.  In response, Britain and Holland fostered intense propaganda campaigns that depicted the Spanish as bloodthirsty and fanatical barbarians.  The distinguished medieval historian Jeffrey Burton Russell explained, 'Innumerable books and pamphlets poured from northern presses accusing the Spanish Empire of inhuman depravity and horrible atrocities. . . . Spain was cast as a place of darkness, ignorance, and evil.'  Informed modern scholars not only reject this malicious image, they even have given it a name:  the 'Black Legend.'  Nevertheless, this impression of Spain and of Spanish Catholics remains very much alive in our culture—mere mention of the 'Spanish Inquisition' evokes disgust and outrage" (#68).

Just as much of the "Black Legend" is patently untrue, so other allegations regarding the ignorance and crimes of Roman Catholics need to be disproved.  This includes rightly portraying the Spanish Inquisition, long a whipping boy for cynical critics.  For years Stark had believed the Inquisition illustrated the depravity of the Catholic Church, so "when I first encountered the claim that not only did the Spanish Inquisition spill very little blood but that it mainly was a major force in support of moderation and justice, I dismissed it as another exercise in outlandish, attention-seeking revisionism.  Upon further investigation, I was stunned to discover that in fact, among other things, it was the Inquisition that prevented the murderous witchcraft craze, which flourished in most of Europe during the sixteenth and seventeenth centuries, from spreading to Spain and Italy.  Instead of burning witches, the inquisitors sent a few people to be hanged because they had burned witches" (#128).

Without question the "Spanish Inquisition" is routinely included in anti-Catholic polemics, generally written by zealous Protestants or cynical secularists.  Best-selling books by historians such as Will Durant easily fueled prejudices by declaring that "we must rank the Inquisition . . . as among the darkest blots on the record of mankind, revealing a ferocity unknown in any beast" (p. 110).  Shocking stories about Torquemada's brutality, with estimates of victims killed ranging from hundreds of thousands to millions (including 300,000 burned at the stake), contributed much to the "Black Legend" so beloved by many.  However, Stark says:  "The standard account of the Spanish Inquisition is mostly a pack of lies, invented and spread by English and Dutch propagandists in the sixteenth century during their wars with Spain and repeated ever after by the malicious or misled historians eager to sustain 'an image of Spain as a nation of fanatical bigots'" (p. 111).  Contemporary scholars, scouring Spanish archives, have actually read the "records made of each of the 44,674 cases heard by these two Inquisitions between 1540 and 1700" as well as diaries and letters written in those years.  During the first 50 years, perhaps 1,500 people may have been executed, though the records are sparse.  But during "the fully recorded period, of the 44,674 cases, only 826 people were executed, which amounts to 1.8 percent of those brought to trial.  All told, then, during the entire period 1480 through 1700, only about ten deaths per year were meted out by the Inquisition all across Spain, a small fraction of the many thousands of Lutherans, Lollards, and Catholics (in addition to two of his wives) that Henry VIII is credited with having boiled, burned, beheaded, or hanged" (p. 114).

Dealing with the "Sins of Anti-Semitism," Stark provides an important historical context, showing how Jews have frequently suffered in various historical epochs.  Long before Christianity flourished there were influential Romans, such as Cicero, Seneca, and Tacitus, who manifested anti-Semitism.  In fact:  "The Jews were expelled from Rome in 139 BCE by an edict that charged them with attempting 'to introduce their own rites' to the Romans and thereby 'to infect Roman morals'" (p. 4).  Then, in 70 A.D., the Romans brutally suppressed a Jewish rebellion, destroyed the Temple, and inaugurated a massive Jewish diaspora throughout the Empire.  As the Early Church developed, Jews often played a major role in denouncing and persecuting it.  Thus we find, in both the NT and subsequent Christian writings, many anti-Jewish statements.  But as Christianity triumphed there was relatively little persecution of Jews.  Throughout the Early Middle Ages they enjoyed considerable toleration within Christian communities, but things changed rather dramatically in the 11th century when the Islamic threat precipitated attacks on Jews.

Unfortunately, in the 11th century many Christians became almost morbidly concerned with heresies of various sorts, and Jews often suffered alongside the heretics.  "Unlike Christian heretics such as the Cathars, Waldensians, Fraticelli, and similar groups," however, "the Jews were the only sizeable, openly nonconformist religious group that survived in Europe until the Lutherans did so by force of arms" (p. 19).  Indeed, "no pope in the Middle Ages ever undertook a campaign to convert the Jews," and the distinguished historian Steven T. Katz wrote:  "Though Christendom possessed the power, over the course of nearly fifteen hundred years, to destroy that segment of the Jewish people it dominated, it chose not to do so . . . because the physical extirpation of Jewry was never, at any time, the official policy of any church or of any Christian state" (p. 19).

One widespread myth was popularized by Edward Gibbon when he declared Christianity prevailed in the Roman Empire because emperors and prelates ruthlessly imposed the Faith by persecuting pagans.  Consequently, as Peter Brown said:  "From Gibbon and Burckhardt to the present day, it has been assumed that the end of paganism was inevitable, once confronted by the resolute intolerance of Christianity; that the interventions of the Christian emperors in its suppression were decisive."  But it isn't true.  As Brown continued, large, active pagan communities "continued to enjoy, for many generations, [a] relatively peaceable . . . existence."  All that really happened is that they "slipped out of history" (p. 46).  Solid historical work now shows pagans peacefully coexisted with Christians following Constantine's Edict of Toleration.  Indeed we read, in the Code of Justinian:  "We especially command those persons who are truly Christians, or who are said to be so, that they should not abuse the authority of religion and dare to lay violent hands on Jews and pagans, who are living quietly and attempting nothing disorderly or contrary to law" (p. 47).  As one of the finest contemporary historians, Ramsay MacMullen, emeritus professor of history at Yale University, cited by the American Historical Association as "the greatest historian of the Roman Empire alive today," put it:  "The triumph of the church was not one of obliteration but of widening embrace and assimilation" (p. 61).

Intermingled with misinformation regarding the Inquisition are charges of multitudes of witches being burned.  Feminist "historians" have been particularly aggressive in making such accusations, part and parcel of their assault on evil patriarchs!  "Perhaps no historical statistics have been so outrageously inflated as the numbers executed as witches during the craze that took place in Europe from about 1450 to 1700."  It is sometimes alleged that some nine million witches were burned, often at the hands of Catholic Inquisitors.  But it's all "vicious nonsense," for solid scholarship now shows that perhaps 60,000 witches were actually executed, and these were in Protestant rather than Catholic countries.  Indeed, Henry C. Lea (no friend of Catholics) "agreed that witch-hunting was 'rendered comparatively harmless' in Spain and that this 'was due to the wisdom and firmness of the Inquisition'" (p. 116).

As is evident in the New York Times' recent determination to date America's founding in 1619, when slaves first landed in Virginia, slavery provides formidable fodder with which to attack one's cultural foes.  So too, various historians have asserted that the Catholic Church legitimated and supported slavery.  But in fact slavery had slowly disappeared in the Early Middle Ages as Christianity extended its influence.  Furthermore, the Church's greatest theologian, Thomas Aquinas, said slavery is a sin, and his position "has guided papal policy ever since" (p. 162).  Thus Pope Paul III declared that American Indians "and all other peoples—even though they be outside the faith— . . . should not be deprived of their liberty or their other possessions . . . and are not to be reduced to slavery, and that whatever happens to the contrary is to be considered null and void" (p. 164).  Unfortunately, other popes occasionally departed from this policy, and it had little influence in Spanish and Portuguese colonies, where monarchs defied the popes and slavery flourished for centuries.  "The problem wasn't that the Church failed to condemn slavery; it was that few heard it and most did not listen" (p. 165).  Stark carefully examines the French Code Noir and Spain's Código Negro Español, showing how historians have selectively quoted the documents to disparage the Catholic Church, when in fact the codes set forth much more humane practices than could be found in Protestant colonies.  In America, these two codes helped shape slave-treatment in Louisiana when France (and briefly Spain) controlled the colony, so in 1830 "a far higher percentage of blacks in Louisiana were free (13.2 percent) than in any other slave state" (p. 171).  In fact, in New Orleans "41.7 percent of the blacks were free in 1830," whereas in Charleston, South Carolina, only 6.4 percent were free (p. 172).

Stark's approach to various "myths" stands forth in his chapter titles, setting forth the errors he endeavors to expose:  1. Sins of Antisemitism; 2. The Suppressed Gospels; 3. Persecuting the Tolerant Pagans; 4. Imposing the Dark Ages; 5. Crusading for Land, Loot, and Converts; 6. Monsters of the Inquisition; 7. Scientific Heresies; 8. Blessed Be Slavery; 9. Holy Authoritarianism; 10. Protestant Modernity.

What Stark helps us do is read history more carefully, remaining especially vigilant whenever historians deal with Catholicism—or Christianity for that matter.  Just as Jesus warned, our enemies will “utter all kinds of evil against you falsely” (Mt 5:11).  That such is done is eminently evident in history books!

321 Readable Histories

Beginning with the "father of history," Herodotus, most historians crafted interesting stories designed to appeal to the reading public.  Then, in the 19th century, German historians determined to make their craft more scientific, more fact-focused, writing increasingly for others in the profession who were compiling (in accord with positivistic scientists) a tapestry of information.  Diligently scouring archives and seeking "objectivity" was certainly admirable and useful, but what was too often lost was the literary skill needed to interest general readers.  Fortunately, there are still many histories written (often by journalists as well as scholars with literary skills) that deserve to be considered "readable" histories.  One was given to me by a good friend, Dr. Dean Nelson, who as a journalism professor appreciates effective writing and is a friend of one of its authors (Lynn Vincent).  In Indianapolis:  The True Story of the Worst Sea Disaster in U.S. Naval History and the Fifty-Year Fight to Exonerate an Innocent Man (New York:  Simon & Schuster, c. 2018), Vincent and Sara Vladic tell the story of a flagship of the World War II Pacific fleet, basing their presentation on extensive interviews as well as library research.  When she was sunk, in literally the final days of the war, her loss "was the greatest sea disaster in the history of the American Navy" (p. 2).

Rather than write a strictly chronological account, the authors weave together technical details regarding the USS Indianapolis, personal anecdotes regarding her officers and crew, explanations regarding naval strategy and policy, insights from Japanese sources, and political perspectives regarding America's efforts in WWII.  They thus provide highly detailed, accurate information in an engrossing manner.  For example, they describe the 610' craft herself—construction materials, physical appearance, guns, 250-ton turrets, bulkheads, armor, etc.  They also provide pictures.  Christened in 1932, "Indy was grand but svelte.  Franklin Delano Roosevelt made her his ship of state and invited world leaders and royalty to dance under the stars on her polished teak decks" (p. 1).  One of the Navy's 18 "Treaty Cruisers," the Indianapolis was built to meet "treaty displacement limitations that produced thinly armored vessels shipbuilders referred to as 'tin clads.'"  The men who manned them, however, "often fell in love with their speed and grace," and one of the Indy's sailors, 19-year-old Seaman Second Class L.C. Cox, simply "stood and gawked" when he saw the ship.  "She was colossal.  Sleek.  Magnificent.  He could hardly wait to get aboard" (p. 27).

If the authors can make the details of a warship interesting, it's inevitable they'll be even more winsome when describing the men who manned her.  They provide vignettes of admirals and captains, cooks and gunnery sergeants, Japanese submariners and naval officers.  They not only portray the men but tell about their girlfriends and wives, their local backgrounds and personal proclivities.  A central figure in the story, Captain Charles McVay III, was the son of a Navy officer who'd fought in the Spanish American War and WWI, wherein he commanded two battleships.  Unconcerned with his son's self-esteem, he routinely unleashed "a steady stream of sharp-tongued verdicts on the younger McVay's Navy performance and demanded that he cover himself with glory befitting an admiral's son" (p. 28).  The younger man, like his father, graduated from the Naval Academy, was commissioned in 1920, and rapidly rose through the ranks.  During WWII, he received a Silver Star for gallantry for his action in the Solomon Islands.  He was sensitive to the needs of his men and worked hard to maintain morale.

During her service in the Pacific Theater of WWII, the Indianapolis took part in many battles, including the Battle of Okinawa, where the Japanese fought tenaciously and sent hundreds of suicide-bombers against the American fleet.  The conflict was costly:  "36 ships sunk, 368 damaged, 763 aircraft lost, more than 12,000 soldiers, sailors, and Marines dead, drowned, or missing" (p. 72).  One of the ships struck was the Indianapolis, leaving her with "damage too serious for repairs at sea" (p. 35).  So she limped back to San Francisco to undergo repairs, and her captain, Charles McVay, assumed she'd see no more combat since the war seemed to be quickly approaching its end.  But as soon as the repairs were done McVay was called to a highly-secret meeting and ordered to carry an important shipment across the Pacific.

Though Japanese forces were retreating, it had become evident to Admiral Nimitz and President Truman that any invasion of the home islands might incur ghastly casualties.  Serious discussions in the very highest sectors ensued, and it was debated whether or not to use the recently-tested atomic bomb to force Japan to surrender.  Using an airplane to transport the bomb across the Pacific was considered imprudent, so it was decided to use Indy for the task.  "The contents of the shipment were not to be revealed to anyone aboard Indianapolis, even McVay" (p. 68).  On July 16, 1945, her voyage began.  The cruiser was built for speed, and McVay ordered her to move fast, since he'd "been told that every day we take off the trip is a day off the war" (p. 95).  Ten days later Indy anchored at Tinian Island and unloaded her secret cargo.  It would later be loaded aboard the Enola Gay, which on August 6 would make history by dropping an atomic bomb on Hiroshima.

Having completed her mission, the Indianapolis sailed from Tinian to Guam, preparing to sail on to Leyte, in the Philippines.  No naval officers thought this journey would be particularly risky since it would be on the periphery of the combat zone.  As Commodore "Jimmy" Carter said, "The Japs are on their last legs and there's nothing to worry about" (p. 116).  Some intelligence reports indicated a handful of enemy submarines were prowling about in the area, but they were thought to be some distance from the Indy's projected route.  The ship sailed on July 28 and planned to arrive in Leyte on July 31.  Following established procedures, the ship zigzagged during daylight but resumed base course when it became completely dark.  Then, just before midnight, July 30, a Japanese submarine commanded by Mochitsura Hashimoto launched six torpedoes toward the Indianapolis.  Built for speed, Indy had rather thin armor plates, and some authorities had speculated that even one torpedo could sink her.  The Japanese torpedo "carried a huge explosive payload designed to mortally wound battleships and cruisers" (p. 151).  Two of them struck the Indianapolis, and she rapidly began sinking, disappearing within 12 minutes.

Facing the inevitable, Captain McVay ordered the men to abandon ship.  Many had died in the explosions, of course, but some 800 managed to escape and survive the initial disaster.  Yet their trials had only begun!  Clinging to life rafts and debris, they assumed rescue forces (planes and ships) would soon arrive to save them.  Indeed one of the great questions the authors raise is this:  "How was it possible that no one had known Indianapolis was missing?" (p. 262).  But day after day the survivors scoured the horizon and saw no one coming to their assistance.  The merciless sun seared their bodies and there was little food or water to sustain them.  Then came the sharks!  Since the authors interviewed many of those who survived, their description of these days renders the men's suffering palpable.  Many of them had been injured by the torpedo blasts and quickly expired.  Each day hundreds of them disappeared.  The men prayed fervently.  Many of them behaved courageously.  And, of course, some behaved abominably.  After three days in the water they were spotted by an airplane and help began to arrive, with Captain McVay and the last survivors being rescued on August 3.  They were "emaciated and shark-bitten.  Some had lost as much as forty pounds.  Their skin looked like burned bacon and was pocked with oozing sores.  Many were delirious" (p. 273).  In all, of the 1,200 crewmen manning Indy, only 311 survived (a handful only for a brief time).  Sadly enough, hundreds more would have survived if rescue efforts had come quickly.

Almost as soon as the tragedy transpired, it was necessary to blame someone!  Trying to escape personal accountability, high-ranking Navy officers (preeminently Fleet Admiral King) tried to blame Captain McVay and ordered him brought to trial for the disaster!  Amazingly, he was the only captain of the hundreds of American ships sunk in the war to be brought to trial.  But he was, at the end of 1945, court-martialed, and his naval career effectively ended—he finished his service stuck in an insignificant posting in New Orleans.  At the time—and increasingly as the years passed—many observers thought McVay was punished to protect some of his superiors who were actually responsible.  Thus the authors devote a significant section of the book to describing "the Fifty-Year Fight to Exonerate an Innocent Man," for "to a man, the survivors believed McVay was innocent" (p. 292).  Meticulous researchers and courageous witnesses were able, in time, to provide irrefutable evidence that McVay had been railroaded and was in fact innocent of the charges leveled against him.  But it would not be until George W. Bush was President that McVay was finally exonerated, so in the end his reputation (if not his career) was redeemed.

* * * * * * * * * * * * * * * * * * * * *

Now and then I read a book I wish I could have written.  This is particularly true of John Sedgwick's Blood Moon: An American Epic of War and Splendor in the Cherokee Nation (New York:  Simon & Schuster, c. 2018; Kindle Edition), since it deals with the same material I detailed in my 1976 PhD dissertation, entitled Brother Brother Slew:  Factionalism in the Cherokee Nation, 1835-1865.  (In fact, Sedgwick references my dissertation several times in his footnotes, demonstrating how exhaustively he researched the subject!)  The factionalism I discussed is portrayed by Sedgwick as a conflict between two men who personified it:  "The Ridge—short for He Who Walks on Mountaintops—was a big, imposing, copper-skinned Cherokee, a fearsome warrior turned plantation owner, whose voice quieted any room, and whose physique awed anyone who crossed his path.  Smaller, almost twenty years younger, [John] Ross was descended from Scottish traders and looked like one:  a pale, unimposing half-pint who wore eastern clothes, from laced shoes to a top hat.  If The Ridge radiated the power of a Cherokee who could drop a buck at a hundred paces, Ross could have strolled into an Edinburgh dinner party without receiving undue attention.  Tellingly, The Ridge spoke almost no English, and Ross almost no Cherokee" (p. 3).  Ross and Ridge were one-time friends and allies who fell out under the pressures for removal applied by President Andrew Jackson in the 1830s.  During those years there erupted a "blood feud" which morphed "from personal vendetta to clan war to a civil war that swept through the entire Cherokee Nation before it got caught up in the even greater cataclysm of the American War Between the States" (p. 4).

Sedgwick devotes the first section of his book ("Paradise Lost") to the history of the Cherokees from 1770 to 1814.  (Invoking the word "paradise" to describe that world reveals the author's rather romantic approach to the natives, inasmuch as my reading of the primary sources certainly unveils a great deal of violence, blood feuds, revenge killings, superstition, insecurity, factionalism, etc. that made the Cherokees something less than Edenic peoples!  Anyone wanting a more realistic depiction of Native Americans would do well to read the multi-volume Jesuit Relations, or Kevin Siepel's Conquistador Voices, or Bernard Bailyn's The Barbarous Years.)  During these years the Cherokees watched their territory shrink as a result of conflicts with first the English and then the Americans.  Military conflicts and treaties and trade brought the Indians and Anglo-Americans together in disparate ways, leading to the emergence of a unique Cherokee embrace of the white man's "civilization."

Embodying this transition was The Ridge, formerly known as a great warrior and hunter, who declared:  "The hunting is almost done & we must now live by farming, raising corn & cotton & horses & hogs & sheep.  We see that those Cherokees who do this live well" (p. 69).  He and his wife discarded "the habits of their race" and took up "Christian employments."  Intrinsically industrious, The Ridge soon built a fine house and oversaw a thriving plantation.  He supported the political organization of the tribe, beginning in 1808 with a tribal council passing its first "law."  He sent his children to a nearby Moravian school, insisting they become literate and ready to prosper in the emerging Cherokee Nation.  Initially uninterested in the missionaries' Gospel message, he was in time drawn to "the fall and salvation of man" story they shared.  When the great Tecumseh (a Shawnee from Indiana) came visiting in 1812, he implored the Cherokees to join in his conspiracy, averring "that the Great Spirit was furious to see the Cherokee with the whites' gristmills, cotton clothes, liquor, featherbeds, and house cats" (p. 93).  But The Ridge and most Cherokees declined to join him.

In fact, when the War of 1812 broke out and Andrew Jackson launched an expedition to punish the "Red Stick" Creeks in 1813, The Ridge and many Cherokees joined him, enthusiastically killing and scalping their ancient foes at the Battle of Horseshoe Bend.  For his warrior-skills and courage, The Ridge was made a major—thenceforth to be called, much to his pleasure, "Major Ridge."  In a critical phase of the battle, he killed six knife-wielding Red Stick warriors, and Jackson's ultimate success, culminating in the battle at New Orleans, was facilitated by his Cherokee allies.  Sadly enough, while the Cherokees were helping Jackson, the Tennessee militia had charged through their lands "like an avenging army, stealing horses, slaughtering hogs and cattle, destroying corncribs, tearing down fences, seizing private stores of corn, maple sugar, and clothing and what few possessions the Cherokee could call their own" (p. 115).  Helping Jackson would not, in the long run, help them!  In fact, when they asked him for compensation for these "spoliations," he confiscated some of the Cherokee lands!  As Major Ridge's son John began to realize, "Old Hickory" was also a "snake in the grass."

To establish themselves in the face of the advancing frontier—some 14,000 Indians confronting hundreds of thousands of Anglo-Americans (mainly Georgians) forging westward—the Cherokees rapidly formed a government following the pattern of the U.S. Constitution, the first to be crafted by any Indian tribe.  They launched a national newspaper, The Cherokee Phoenix, thanks to the phenomenal work of Sequoyah, an illiterate genius who single-handedly designed a Cherokee syllabary that enabled adult Cherokees to become literate in a few days.  In fact, "it was so easy to learn that schools didn't bother to teach it, since children could pick it up on their own.  Remember the eighty-six symbols, sound them out, and you had it.  More than a Gutenberg, Sequoyah was a Leonardo, an inventor who created not just an invention, but modernity.  It is hard to find in all of recorded history as dramatic a transformation of a people in such a brief period of time.  It unleashed an outpouring of notes, letters, essays, records, reports, newspapers, Bible translations, books" (p. 146).  Using the weapons of the press and petitions and lawsuits and delegations to Washington, the Cherokees (led by Principal Chief John Ross) endeavored to deal with the white man on his own terms, vowing:  "Not one foot of land in cession."  And Ross was fully supported by Major Ridge, elected as "first counselor to the principal chief, a post that made him, after Ross, the second most powerful man in the nation" (p. 170).

But they faced an implacable foe in Andrew Jackson, who was elected President in 1828.  One of his first concerns, following his inauguration, was passing and implementing the Indian Removal Act.  All Indians east of the Mississippi were to be driven from their homes and resettled in the West.  In resistance, the Cherokees won important legal victories (most notably in the Supreme Court in Worcester v. Georgia) and found much support throughout the United States, especially in religious sectors.  Jackson, however, cared little for courts or public opinion.  "Incredible as it seemed to Ridge and Boudinot, Jackson had indeed decided that this epic Supreme Court ruling was merely John Marshall's opinion, nothing more.  'John Marshall has made his decision,' Jackson was said to have declared, and rather idly.  'Let him enforce it'" (p. 193).  Seeing the writing on the wall, some Cherokees decided it would be wiser to remove to the West (where there was already a settlement of Western Cherokees) on their own rather than wait for the U.S. to force them.  Thus Major Ridge and his extended family, supported by a largely mixed-blood faction, formed what would be known as the Treaty Party—Cherokees willing to negotiate the best removal treaty possible.  They signed a treaty offered them by Jackson's envoy (Rev. John F. Schermerhorn, a former missionary), taking $4.5 million for their eastern lands and getting lands in Indian Territory.  Though John Ross and the national assembly staunchly rejected the spurious "treaty," President Jackson claimed it was legitimate and submitted it to the U.S. Senate for approval.  There Jackson "prevailed by just one vote beyond the two-thirds needed.  So, on May 23, 1836, the New Echota Treaty became the law of the United States—and this was one law that Andrew Jackson had every intention of enforcing" (p. 248).

The Treaty Party (numbering about 1000, including many slaves), led by Major Ridge, his son John, Elias Boudinot (the editor of The Cherokee Phoenix) and his brother Stand Watie, removed on their own.  In what is today northeastern Oklahoma, they settled amongst the "Old Settlers"—Cherokees (including Sequoyah) who had migrated west in the previous two decades.  "It is superior to any country I ever saw in the U.S.," John Ridge declared after he'd had a chance to ride about the territory.  "In a few years it will be the garden spot of the United States" (p. 269).  But when the U.S. Army rounded up and drove west the bulk of the tribe, they were understandably bitter and disillusioned, blaming both the United States and the Treaty Party.  "Of the 15,000 Cherokee who undertook the journey that became universally known as the Trail of Tears, roughly 2,000 died, and countless more simply disappeared en route.  Two thousand more died after they arrived from disease, starvation, and the misery that comes with such suffering" (p. 188).

Their misery prompted thoughts of revenge, and the tribe's "blood law" justified them.  "No one was to sell Cherokee land without official permission.  No one.  John Ridge had written this law himself, at his father's instigation.  The fury had been smoldering for some time, possibly from the moment the ink was dry on the page just after Christmas 1835, now almost three and a half long years before.  Did they not know that the land was not theirs to sell?" (p. 297).  Thus a well-orchestrated plot targeted the leaders of the Treaty Party, and assassins killed Major Ridge, John Ridge, and Elias Boudinot.  The nation descended into bitter factional strife, punctuated by clandestine killings and political turbulence, sometimes rendered dormant by peaceful interludes supported by all sides.  "With each side cloaked in righteousness, the killing went on and on.  Murder became so common, said one Cherokee, that it was like hearing 'of the death of a common dog.'  From the end of 1845 to the end of 1846 [for example], thirty-four killings were recorded, nearly all of them political" (p. 338).

Nevertheless, when the American Civil War broke out, the old Treaty Party folks supported the Confederacy and followed Stand Watie, who was commissioned a colonel, leading a regiment known as the Cherokee Mounted Rifles.  Ultimately Watie became a Brigadier General in the Confederate Army (the highest rank attained by any Indian during the Civil War), fighting a series of battles in Arkansas and Indian Territory.  "Watie, fighting to the end, was the last Indian commander" to lay down arms, "just north of the Texas border.  It was there that Watie officially surrendered, the very last Confederate officer to give up the fight" (p. 392).  The Ross Party, led by John Ross, joined the Union as soon as possible, and many of his followers took up arms and battled for the North in the Cherokee Nation.  "In the Cherokee Nation, the ravages of the American Civil War had been compounded by the internal equivalent.  Six thousand Cherokee, a quarter of the population, had died in the battles that occurred in every corner of the nation, or from the terrible starvation and rampant disease that followed them.  It turned 7,000 more out of their homes to roam the landscape in search of sustenance and shelter.  It widowed a third of all Cherokee wives, orphaned a quarter of the children, killed or scattered 300,000 head of cattle, and drove virtually everyone to depend on the federal government dispensing scant aid from the major forts, chiefly Fort Gibson" (p. 394).

In short:  a Blood Moon shone on the tribe for 30 years wherein “brother brother slew.”

320 New Conversations on Faith and Science

Two months ago I attended a conference hosted by Faith Bible Church, a large church in The Woodlands, Texas, titled “Reasons 2019:  New Conversations on Faith and Science.”  Four speakers were featured, so I read books by each of them before joining others to hear their presentations.  The first presenter was Michael J. Behe, a professor of biochemistry at Lehigh University and one of the leading thinkers advocating the superiority of “Intelligent Design” over “Natural Selection” as the key to understanding the living world.  In 1996 Behe published Darwin’s Black Box:  The Biochemical Challenge to Evolution, arguing “that life was designed by an intelligent agent,” a design fully evident in his own field of biochemistry, which reveals the intricacies of molecular life, the actual basis of life on planet earth.

A decade later he developed his position in The Edge of Evolution, wherein he noted that current orthodoxy in the scientific community defends a Darwinism composed of “random mutation, natural selection, and common descent.”  Of the three, random mutation is most crucial for understanding the emergence of novel life forms, but “except at life’s periphery, the evidence for a pivotal role for random mutations is terrible.”  To assess that claim we need the kind of precise, empirical data found in engineering and anatomy; for this we must plumb the mysterious realms of tiny molecules, proteins, and DNA.  To do so Behe focused on malaria—“the single best test case of Darwin’s theory.”  Because of its widespread devastation, malaria has been carefully studied for a century, and we can see, in 100 years of malaria parasites’ development, what has taken 100 million years in other species.  Amazingly, “the number of malarial parasites produced in a single year is likely a hundred times greater than the number of all the mammals that have ever lived on earth in the past two hundred million years.”  And though mutations have occurred (in humans, rendering some populations less susceptible to the disease, and in the parasites themselves), only minor molecular changes distinguish today’s parasites from their ancestors.  The Darwinian theory simply cannot explain one of the best-documented stories in biology.

Behe has recently published Darwin Devolves: The New Science about DNA that Challenges Evolution (New York: HarperOne, Kindle Edition, c. 2019).  He opens by reflecting on the philosophical questions he first asked as a boy—Where did we come from?  Why are we here?  And in a simple but deeply profound way there are only two possible worldviews addressing such questions, for “the enigma of where nature came from goes back as far as there are written historical records and, with a few lulls, has continued strongly up to the present.”  And despite many variations, “all particular positions on the topic can be considered to be elaborations on either of just two general mutually exclusive views:  (1) contemporary nature, including people, is an accident; and (2) contemporary nature, especially people, is largely intended—the product of a preexisting reasoning mind” (p. 1).  Though the two positions were debated in the Greco-Roman world, the “epitome of science” in antiquity “was arguably the work of the second-century Roman physician Galen, who had a very definite point of view on the origin of nature.  In his book On the Usefulness of the Parts of the Body, . . . Galen concluded that the human body is the result of a ‘supremely intelligent and powerful divine Craftsman,’ that is, the result of intelligent design” (p. 3).  That ancient insight, Behe holds, has been confirmed by the latest scientific understandings of DNA, and he has written this book “to give readers the scientific and other information needed to confidently conclude for themselves that life was purposely designed” (p. 20).

He begins illustrating his case by discussing polar bears, which, Darwinists claim, evolved from brown bears and are uniquely adapted to the stark polar landscape.  “Yet,” he says, “a pivotal question has lingered over the past century and a half: How exactly did that happen?” (p. 16).  Just recently new research techniques have revealed the polar bear’s genetic heritage, and the “results have turned the idea of evolution topsy-turvy” (p. 16).  Scrupulous studies have shown that the genetic mutations differentiating polar bears from nearby relatives were “likely to be damaging—that is, likely to degrade or destroy the function of the protein that the gene codes for” (p. 17).  “It seems, then, that the magnificent Ursus maritimus has adjusted to its harsh environment mainly by degrading genes that its ancestors already possessed.  Despite its impressive abilities, rather than evolving it has adapted predominantly by devolving.  What that portends for our conception of evolution is the principal topic of this book” (p. 17).

Only recently have scientists been able to study life at the molecular level, where “it turns out that, as with the polar bear, Darwinian evolution proceeds mainly by damaging or breaking genes, which, counterintuitively, sometimes helps survival.  In other words, the mechanism is powerfully devolutionary.  It promotes the rapid loss of genetic information.  Laboratory experiments, field research, and theoretical studies all forcefully indicate that, as a result, random mutation and natural selection make evolution self-limiting.  That is, the very same factors that promote diversity at the simplest levels of biology actively prevent it at more complex ones.  Darwin’s mechanism works chiefly by squandering genetic information for short-term gain” (p. 38).  Rather than developing new, more vibrant life-forms, natural selection degrades those that already exist.

To Behe, this points to a fatal flaw in the Darwinian dogma.  In fact, Darwin never showed how “purposeful systems could be built by natural selection acting on random variation.  Rather, he just proposed that they might.  His theory had yet to be tested at the profound depths of life.  In fact, no one then even realized life had such depths” (p. 155).  But now we know something about such depths!  And the more we ponder the mysterious inner workings of molecular life, the more we’re prompted to discern a Mind at work informing it.

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

Following Behe’s evening presentation, the next morning’s session featured a Brazilian biochemist, Marcos Eberlin, who had just published Foresight: How the Chemistry of Life Reveals Planning and Purpose (Seattle: Discovery Institute, c. 2019).  Eberlin is an internationally-acclaimed scientist, and his treatise has garnered endorsements from an impressive variety of world-class scientists.  Thus Sir John B. Gurdon, who won the Nobel Prize in Physiology or Medicine in 2012, recommends Foresight to anyone “interested in the chemistry of life.  The author is well established in the field of chemistry and presents the current interest in biology in the context of chemistry.  I am happy to recommend the work.”  One of his Brazilian colleagues, Rodinei Augusti, says the “book demonstrates that the currently available scientific knowledge increasingly points to the existence of a supreme being who carefully planned the universe and life.  This breakthrough will revolutionize science in the years to come.”

Listening to Eberlin speak, I was both pleased and amazed at his enthusiasm for his work.  As he described some of the intricate, complex processes evident in the natural world, he had an almost childlike joy in showing just how wonderful it all is.  On a cosmic level, it’s amazing that earth, a tiny planet amidst two trillion galaxies, each containing some 100 billion stars, is perfectly placed to nourish life.  Our sun is perfectly sized and exudes just the right amount of energy for earthlings.  Our atmosphere perfectly protects us from harmful radiation, allowing just the right amount of sunlight to reach the earth and promote life.  The earth’s magnetic shield perfectly protects us from solar winds.  The moon stabilizes the earth’s axial tilt, preserving the regular seasons so needed for life to flourish.  Water itself is a most amazing substance, exhibiting some 74 unique properties.  As Eberlin touched on these topics he obviously rejoiced to be alive and well—and able to understand a bit—on this wonderful planet.

Much of Eberlin’s speech emphasized how much scientists have learned in the past decade!  Current technology enables them to probe both the vastness of the universe and the intricacies of the cell in novel and illuminating ways.  In Foresight, he begins by saying, “as plainly as I can:  This rush of discovery seems to point beyond any purely blind evolutionary process to the workings of an attribute unique to minds—foresight” (#118).  Such foresight is evident, for example, in cell membranes, which must protect the cell from external threats, allow nutrients to enter, and expel waste.  “Selective channels through these early cell membranes had to be in place right from the start.  Cells today come with just such doorways, specialized protein channels used in transporting many key biomolecules and ions.  How was this selective transport of both neutral molecules and charged ions engineered?  Evolutionary theory appeals to a gradual, step-by-step process of small mutations sifted by natural selection, what is colloquially referred to as survival of the fittest.  But a gradual step-by-step evolutionary process over many generations seems to have no chance of building such wonders, since there apparently can’t be many generations of a cell, or even one generation, until these channels are up and running.  No channels, no cellular life.  So then, the key question is: How could the first cells acquire proper membranes and co-evolve the protein channels needed to overcome the permeability problem?” (#144).

Were you to try to hire the best engineers in the world to make such a membrane, they “might either laugh in your face or run screaming into the night.  The requisite technology is far beyond our most advanced human know-how.  And remember, getting two or three things about this membrane job right—or even 99% of the job—wouldn’t be enough.  It is all or death!  A vulnerable cell waiting for improvements from the gradual Darwinian process would promptly be attacked by a myriad of enemies and die, never to reproduce, giving evolution no time at all to finish the job down the road” (#164).  Membranes merely protect the cell, of course, and when you study the inner workings of the cell itself you behold wonders within wonders.  Scientists receive Nobel Prizes for describing tiny bits of cellular life, but it’s obvious to Eberlin that “if Nobel-caliber intelligence was required to figure out how this existing engineering marvel works, what was required to invent it in the first place?” (#272).  More Nobel Prizes were recently given to scientists who discovered the tiny nanomachines cells use to repair damaged DNA.  Their incredible “research and engineering sophistication thoroughly deserved” world acclaim.  But, asks Eberlin:  “Are we then to believe that the marvels of engineering that these brilliant scientists discovered were themselves produced by a mindless process?  If discovering the function of these engineering marvels took genius, how much more genius would be needed to create them?” (#836).

Foresight contains Eberlin’s explorations of a multitude of revealing details—“the code of life,” “bacteria, bugs and carnivorous plants,” birds, “the human form,” etc.  He knows his material and writes lucidly, helping readers share his awe at the wonders of the world we live in; to him it is all here because of foresight, which by nature requires intelligence.  “The need to anticipate—to look into the future, predict potentially fatal problems with the plan, and solve them ahead of time—is observable all around us.  It is clear from the many examples in this book that life is full of solutions whose need had to be predicted to avoid various dead-ends.  Put another way, many biological functions and systems required planning to work.  These features speak strongly against modern evolutionary theory in all its forms, which remains wedded to blind processes” (#2066).

So Eberlin concludes his book:  “Nobel laureate J. J. Thomson—one of the giants of early modern physics, the discoverer of the electron, and the father of mass spectrometry, my field of expertise—beautifully conveyed this optimistic, open-ended view of science.  I can think of no better words for concluding a book about a world filled with evidence of foresight, words as true today as when Thomson penned them in the early twentieth century:  ‘The sum of knowledge is at present, at any rate, a diverging, not a converging, series.  As we conquer peak after peak we see in front of us regions full of interest and beauty, but we do not see our goal, we do not see the horizon; in the distance tower still higher peaks, which will yield to those who ascend them still wider prospects, and deepen the feeling, the truth of which is emphasized by every advance in science, that “Great are the Works of the Lord”’” (#2149).

* * * * * * * * * * * * * * * * * * * * * * * * * * * * * * *

The third speaker at the “Reasons 2019” conference was Melissa Cain Travis, a young professor of apologetics at Houston Baptist University.  She has recently published Science and the Mind of the Maker: What the Conversation Between Faith and Science Reveals about God (Eugene, Oregon: Harvest House Publishers, c. 2018, Kindle Edition).  Citing Oxford philosopher Richard Swinburne, she says: “‘I do not deny that science explains, but I postulate God to explain why science explains. The very success of science in showing us how deeply orderly the natural world is provides strong grounds for believing that there is an even deeper cause of that order’” (#102).  She thus argues, developing what she calls a “Maker Thesis,” that “Christian theism uniquely provides a well-rounded account of both the findings and the existence of the natural sciences.  I will argue that not only do scientific discoveries have positive implications for the existence of a Mind behind the universe, they strongly suggest that this Mind intended for human beings to take up the noble project of rational inquiry into the mysteries of nature.  In other words, Christian theism, unlike atheism, offers a sufficient explanation of the observable features of the natural world as well as mankind’s impressive scientific achievements” (#107).

Travis thus begins by critiquing the philosophical naturalism and scientism so evident in many modern scientists’ worldview, citing as evidence evolutionary biologist Richard Lewontin, who dogmatically declared:  “‘It is not that the methods and institutions of science somehow compel us to accept a material explanation of the phenomenal world, but, on the contrary, that we are forced by our a priori adherence to material causes to create an apparatus of investigation and a set of concepts that produce material explanations, no matter how counterintuitive, no matter how mystifying to the uninitiated. Moreover, that materialism is absolute, for we cannot allow a Divine Foot in the door’” (#222).  Lewontin nicely illustrates what G.K. Chesterton noted: “I never said a word against eminent men of science.  What I complain of is a vague popular philosophy which supposes itself to be scientific when it is really nothing but a sort of new religion” (#132).

Countering this new religion, Travis directs us back to an ancient insight famously crafted by Virgil in The Aeneid:  “The moon’s bright globe, the sun and stars are nurtured / By a spirit in them.  Mind infuses each part / And animates the universe’s whole mass.”  And Virgil was just poetically phrasing Cicero’s philosophical position, set forth in The Nature of the Gods:  “What can be so obvious and clear, as we gaze up at the sky and observe the heavenly bodies, as that there is some divine power of surpassing intelligence by which they are ordered?”  Centuries later St. Augustine would blend these Roman thinkers’ views into Christian theology, saying creation is “a great book” we should read carefully.  Indeed: “‘Look carefully at it top and bottom, observe it, read it.  God did not write in letters with ink but he placed what is created itself in front of you to recognize him in; he set before your eyes all these things he has made.  Why look for a louder voice?  Heaven and earth cries out to you:  God made me’” (#510).

Augustine’s philosophical position, Travis thinks, remains true and can be sustained amidst all the details of modern science.  Thus she conducts a knowledgeable tour of both historical and contemporary astronomy, physics, chemistry, biology, etc.  It’s important to note that “Copernicus, Kepler, Galileo, Newton, and Boyle were key players in the scientific revolution, and all five of them saw the attributes of the cosmos as indicators of a wise Creator in whose image we are made” (#1210).  Their position, however, was significantly undermined by Charles Darwin’s evolutionary naturalism, which effectively removed divine design from the universe.  But a growing number of contemporary scientists—such as Michael Behe and Marcos Eberlin—have effectively set forth evidence favoring Intelligent Design.

Thus there took place a “Revival of the God Hypothesis” in 20th-century physics and cosmology.  For example, Max Planck (sometimes called the “father of quantum physics”) “was particularly fascinated by the congruence between the mathematical, law-governed structure of the material world and human rationality; he saw this correspondence as indicative of a designing Mind” (#2256).  Travis finds mathematicians particularly interesting, for they often conclude that inasmuch as the cosmos follows mathematical laws it makes sense to posit a Divine Mind responsible for it all.  To her: “The Maker Thesis has no difficulty explaining the objectivity of mathematical truth, how beautifully mathematics applies to physical reality, and mankind’s corresponding intellectual capacities.  If the cosmos is the creation of a rational Mind in whose image we are made, a Maker who desires our awareness of him, this deep interconnection makes perfect sense.  As Oxford mathematician John Lennox has said, it is ‘not surprising when the mathematical theories spun by human minds created in the image of God’s Mind find ready application in a universe whose architect was that same creative Mind’” (#2816).

* * * * * * * * * * * * * * * * * * * * * * * * * * * *

The fourth and final speaker at the “Reasons 2019” conference was a historian, Michael Newton Keas, who provided a brief survey of his treatise, Unbelievable: 7 Myths About the History and Future of Science and Religion (Wilmington, Delaware: Intercollegiate Studies Institute, c. 2019, Kindle Edition).  As serious scholars know all too well, many “historical truths” are anything but true!  In part that’s because the past is a “story” made up of multiple stories.  And stories can be either true or imaginary!  Thus when scientists tell their stories, detailing their endeavors to understand the cosmos, their renditions “sometimes have implications for belief or disbelief in God or a spiritual heaven.  Too often, however, these stories are false.  They are nothing but myths.  And yet many leading scientists and science writers offer these stories as unassailable truth.  The myths make their way into science textbooks—which is a useful measure of a myth’s influence, as we will see in this book” (p. 2).  The myths further shape minds through popular science programs such as Carl Sagan’s Cosmos TV series and science fiction such as the wildly popular Star Trek.  In fact: “The executive producer of Cosmos 2014 says that he has spent most of his professional life creating myths for the greater truth of atheism” and celebrated “his part in creating ‘atheistic mythology’ in more than 150 episodes of Star Trek: Next Generation” (p. 153).

The myths Keas endeavors to deconstruct involve baseless stories about such things as the “Dark Ages” and pre-modern thinkers’ failures to understand the immensity of the universe and the global shape of planet earth.  Following Carl Sagan’s example, Neil deGrasse Tyson routinely invokes the “Dark Ages” to mock Christians, asserting they believed in the “flat earth” for five centuries.  Doing so, “Tyson, probably the world’s most influential public voice for science, is spreading misinformation about medieval views of the earth’s shape” (p. 43).  Tyson et al. tell demonstrably untrue but emotionally evocative stories about persecuted scientists such as Bruno, Galileo, and Copernicus.  Keas describes how these misleading stories made their way into school textbooks and thence into the public consciousness.  He is particularly persuasive when pointing out how textbooks—so often taken as the final word on whatever they cover—serve as propaganda devices for the regnant worldview.  Doing so, he provides needed clarity, refuting many of the allegedly “scientific” certainties pervasive in our culture.

319 Suspect “Science” – the Low-Fat Diet

Fifty years ago I regularly read Organic Gardening and Prevention—back-to-the-earth publications urging readers to embrace a simpler, more natural approach to life.  The articles contained much dietary advice, some of which I embraced and still follow.  I also heeded the advice set forth in Runner’s World, loading up on carbohydrates as the best fuel for active athletes.  Then when the leading “experts” in nutrition began promoting a “low fat” diet and my primary care physician urged me to embrace it, I more or less ate in accord with its dictates.  Now and then I heard of folks embracing a “low carbohydrate” rather than “low fat” regimen (as in the Atkins diet), but I assumed they were food faddists or kooks of some sort.  After all, the US Department of Agriculture had issued its “food pyramid” that supposedly summed up the nutritional experts’ evidence—the last word on it all.  Eating lots of carbs, following a near-vegetarian diet, and exercising, I believed, was the sure way to good health.

But recently my curiosity was piqued when my stepson and his family embraced the “Keto” diet.  Then I met a man who’d lost nearly 30 pounds following a “low carb” diet, and he suggested I read Nina Teicholz’s The Big Fat Surprise: Why Butter, Meat and Cheese Belong in a Healthy Diet (New York:  Simon & Schuster, Kindle Edition, c. 2014).  I did so and found it both provocative and persuasive.  It’s a very personal book, since Teicholz had for many years devoutly followed the low-fat prescriptions promoted almost everywhere.  But then she moved to New York and found work writing restaurant reviews—and getting free meals!  “Suddenly,” she says, “I was eating gigantic meals with foods that I would have never before allowed to pass my lips:  pâté, beef of every cut prepared in every imaginable way, cream sauces, cream soups, foie gras—all the foods I had avoided my entire life.  Eating these rich, earthy dishes was a revelation. They were complex and remarkably satisfying.  I ate with abandon.  And yet, bizarrely, I found myself losing weight” (p. 2).  That led her to seriously question what she’d been told, and in time she came to believe “that all our dietary recommendations about fat—the ingredient about which our health authorities have obsessed most during the past sixty years—appeared to be not just slightly offtrack but completely wrong. . . .  Finding out the truth became, for me, an all-consuming, nine-year obsession.  I read thousands of scientific papers, attended conferences, learned the intricacies of nutrition science, and interviewed pretty much every single living nutrition expert in the United States” (p. 2).

She discovered that a small coterie of nutritionists, trying to explain skyrocketing numbers of heart attacks, had “hypothesized that dietary fat, especially of the saturated kind (due to its effect on cholesterol), was to blame.  This hypothesis became accepted as truth before it was properly tested” (p. 3).  Critics of the hypothesis—and there were many—found themselves ostracized and effectively silenced, “cut off from grants, unable to rise in their professional societies, without invitations to serve on expert panels, and at a loss to find scientific journals that would publish their papers” (p. 4).  The dissenters, however, have been proven right!  For what Teicholz discovered “was not only that it was a mistake to restrict fat but also that our fear of the saturated fats in animal foods—butter, eggs, and meat—has never been based in solid science.  A bias against these foods developed early on and became entrenched, but the evidence mustered in its support never amounted to a convincing case and has since crumbled away.  This book lays out the scientific case for why our bodies are healthiest on a diet with ample amounts of fat and why this regime necessarily includes meat, eggs, butter, and other animal foods high in saturated fat” (p. 7).

Teicholz begins her presentation with some telling illustrations, including the story of Vilhjalmur Stefansson, who lived for several years with the Inuit natives in the Canadian Arctic a century ago.  They ate virtually nothing but fatty meat and enjoyed good health.  Subsequently he and another man volunteered to duplicate the Inuit diet and thrived for a year (under medical supervision) eating nothing but meat.  Then comes evidence from Africa, where the Masai and other tribal peoples eat little except meat and dairy products without experiencing significant heart disease.  And in India, “Sir Robert McCarrison, the British government’s director of nutrition research in the Indian Medical Service and perhaps the most influential nutritionist of the first half of the twentieth century, wrote that he was ‘deeply impressed by the health and vigour of certain races there.  The Sikhs and the Hunzas,’ notably, suffered from ‘none of the major diseases of Western nations such as cancer, peptic ulcer, appendicitis, and dental decay.’  These Indians in the north were generally long-lived and had ‘good physique[s],’ and their vibrant health stood ‘in marked contrast’ to the high morbidity of other groups in the southern part of India who ate mainly white rice with minimal dairy or meat” (p. 14).

Given such evidence, one wonders why meat and dairy products suddenly became the great culprits to be avoided!  In large part the notion “that saturated fat causes heart disease was developed in the early 1950s by Ancel Benjamin Keys, a biologist and pathologist at the University of Minnesota” (p. 19).  He and his colleagues zeroed in on cholesterol as the primary culprit causing heart disease.  Though it “is a vital component of every cell membrane, controlling what goes in and out of the cell,” it also helped form the “atherosclerotic plaques” which clogged “the arteries until it cuts off blood flow, [and it] was thought at the time to be the central cause of a heart attack” (p. 21).  Keys himself ran tests that involved giving volunteers massive amounts of cholesterol-rich foods without affecting the cholesterol levels in their blood.  “He found that ‘tremendous’ dosages of cholesterol added to the daily diet—up to 3,000 milligrams per day (a single large egg has just under 200 mg)—had only a ‘trivial’ effect” (p. 23).  Disregarding his own research, Keys instinctively knew better: eating fat must make you fat, and cutting fat calories would cut weight.  “‘No other variable in the mode of life besides the fat calories in the diet is known which shows anything like such a consistent relationship to the mortality rate from coronary or degenerative heart disease,’” he declared in 1954.  “If people just stopped eating eggs, dairy products, meats, and all visible fats, he argued, heart disease would ‘become very rare’” (p. 32).

Soon thereafter, in 1955, President Dwight Eisenhower had the first of several heart attacks, and his personal doctor, Paul Dudley White, a Harvard Medical School professor, strongly endorsed Keys’ position.  Speaking to the nation from Ike’s bedside, White explained why heart attacks occurred and urged everyone to stop smoking and eat less saturated fat and cholesterol-laden foods.  Writing a front-page New York Times article regarding the President’s health, White cited Ancel Keys’ “brilliant” work and urged the nation to follow his advice.  In fact, the President had no family history of heart disease and had quit smoking a decade earlier.  He exercised, had normal blood pressure, and his total cholesterol (167) was considered normal.  After his heart attack, however, Ike became “obsessed with his blood-cholesterol levels and religiously avoided foods with saturated fat; he switched to a polyunsaturated margarine, which came on the market in 1958, and ate melba toast for breakfast” (p. 33).  He rarely ate meat or eggs or cheese, but by the end of his presidency his cholesterol registered 259—just days after Ancel Keys appeared on Time magazine’s cover, urging everyone to embrace the heart-healthy diet Ike had been so diligently following.  In Ike’s case, sadly enough, the more he dieted the more cholesterol flooded his system!  He died of heart disease in 1969.

To prove his diet-heart hypothesis, Ancel Keys orchestrated a “Seven Countries” study that seemed to do so.  Yet though frequently cited as evidence, his study at best established “an association between a diet low in animal fats and minimal rates of heart disease; it could say nothing about whether that diet caused people to be spared the disease” (p. 42).  Carefully examined, Keys’ study was full of flaws—nearly fraudulent in some aspects.  But though he had little demonstrable (i.e., clinical) proof, he managed to enlist the American Heart Association in his cause and persuaded the National Institutes of Health to subsidize his research.  Time magazine celebrated him as “Mr. Cholesterol” and he enjoyed virtually unanimous media support, urging folks to eat less meat, drink less milk, and eschew fats of all sorts.

Columnists such as New York Times health writer Jane Brody relentlessly promoted Keys’ diet-heart hypothesis, and everyone of consequence agreed!  Brody urged everyone to follow a low-fat diet and in 1990 published a seven-hundred-page manifesto:  The Good Food Book: Living the High-Carbohydrate Way.  The message was crystal clear:  dietary fat elevated blood cholesterol, which “would eventually harden arteries and lead to a heart attack.  The logic was so simple as to seem self-evident.  Yet even as the low-fat, prudent diet has spread far and wide, the evidence could not keep up, and never has.  It turns out that every step in this chain of events has failed to be substantiated:  saturated fat has not been shown to cause the most damaging kind of cholesterol to go up; total cholesterol has not been demonstrated to lead to an increased risk of heart attacks for the great majority of people, and even the narrowing of the arteries has not been shown to predict a heart attack” (p. 53).

In fact, while Keys was promoting his hypothesis, a multitude of careful clinical studies—“some of the biggest and most ambitious trials of diet and disease ever undertaken in the history of nutrition science” (p. 57)—disputed it.  Triglycerides, not cholesterol, looked like a more probable culprit.  Total cholesterol apparently has little significance, for HDL-cholesterol (the “good” kind) contributes to good health whereas LDL-cholesterol (the “bad” kind) proves deleterious.  Consuming vegetable oils, not animal fats, appeared closely linked to the increased incidence of heart disease.  And carbohydrates, not fat, seemed to actually cause obesity.  One of the most celebrated studies—the Framingham Study—at first seemed to substantiate Keys’ position, but in 1992 a study leader admitted:  “‘the more saturated fat one ate . . . the lower the person’s serum cholesterol . . . and [they] weighed the least’” (p. 67).  More alarmingly, many studies revealed “an extremely uncomfortable fact for the promoters of the diet-heart hypothesis:  people who eat less fat, particularly less saturated fat, appear not to extend their lives by doing so. Even though their cholesterol inevitably goes down, their risk of death does not” (p. 74).  “Another study in Israel followed ten thousand male civil service and government employees for five years and found no correlation between heart attacks and anything they ate.  (The best way to avoid a heart attack, according to the study, was to worship God, since the more men identified themselves as being religious, the lower was their risk of having a heart attack)” (p. 98).

Yet such dissenting studies failed to register with the American public.  In large part this was because the federal government threw its massive weight into promoting the low-fat diet.  In 1977 Senator George McGovern issued a committee report—“Dietary Goals”—which declared that Americans’ diet was harming their health.  Eating too much meat and eggs and dairy products was responsible for “heart disease, cancer, diabetes and obesity,” whereas eating grains, fruit, and vegetables would improve the nation’s health.  Though the Dietary Goals came out of a typically brief Senate hearing—not a rigorous scientific study—it had enormous impact.  “We cannot afford to await the ultimate proof before correcting trends we believe to be detrimental,” said the senators.  “So it was that Dietary Goals . . . without any formal review, became arguably the most influential document in the history of diet and disease.  Following publication of Dietary Goals by the highest elective body in the land, an entire government and then a nation swiveled into gear behind its dietary advice” (p. 120).  Thereafter the Dietary Guidelines for Americans was published, including the USDA food pyramid, which was widely endorsed as a guide to good health.  “Here, then, was the new reality:  a political decision had yielded a new scientific truth” (p. 125).  As of 2010 the USDA was still promoting a plant-based diet—grains, vegetables, fruits and nuts.

Yet the USDA had no good evidence for its edict!  In fact, “the largest and longest trial of the low-fat diet ever undertaken” (the Women’s Health Initiative) demonstrably failed.  “A review in 2008 of all studies of the low-fat diet by the United Nations’ Food and Agriculture Organization concluded that there is ‘no probable or convincing evidence’ that a high level of fat in the diet causes heart disease or cancer.  And in 2013 in Sweden, an expert health advisory group, after spending two years reviewing 16,000 studies, concluded that a diet low in fat was an ineffective strategy for tackling either obesity or diabetes. Therefore, the inescapable conclusion from numerous trials on this diet, altogether costing more than a billion dollars, can only be that this regime, which became our national diet before being properly tested, has almost certainly been a terrible mistake for American public health” (p. 173).  Unfortunately:  “Despite the original good intentions behind getting rid of saturated fats, and the subsequent good intentions behind getting rid of trans fats, it seems that the reality, in terms of our health, has been that we’ve been repeatedly jumping from the frying pan into the fire. The solution may be to return to stable, solid animal fats, like lard and butter, which don’t contain any mystery isomers or clog up cell membranes, as trans fats do, and don’t oxidize, as do liquid oils.  Saturated fats, which also raise HDL-cholesterol, start to look like a rather good alternative from this perspective” (p. 285).

That “good alternative,” Teicholz believes, is conveniently set forth in the Atkins Diet.  Robert Atkins was a cardiologist who helped tens of thousands of patients lose weight.  “Based on his experience treating patients, Atkins believed that meat, eggs, cream, and cheese . . . were the healthiest of foods.  His signature diet plan was more or less the USDA pyramid turned on its head, high in fat and low in carbohydrates.  Atkins believed that this diet would not only help people to lose weight but also fight heart disease, diabetes, and possibly other chronic diseases as well” (p. 287).  As an active physician, however, he had no “scientific studies” to bolster his claims.  He urged academic “experts” to look at his files, but none was interested.  Though many considered Atkins a faddish innovator, his dietary prescriptions actually had a long, impressive history, beginning with William Banting’s 1863 pamphlet, Letter on Corpulence, Addressed to the Public, which sold thousands of copies around the world and enabled him personally to shed unwanted pounds.  Then:  “In the United States, Sir William Osler, a worldwide medical authority in the late nineteenth century and one of the founders of Johns Hopkins Hospital, promoted a variation of the diet in his seminal 1892 medical textbook.  And a London physician, Nathaniel Yorke-Davis, used a version of the low-carbohydrate diet to treat the obese President William Taft from 1905 on, helping him lose 70 pounds” (p. 293).

Scores of other researchers reached the same conclusion, for during the first half of the 20th century it was discovered how profoundly insulin affects body weight.  “The body secretes insulin whenever carbohydrates are eaten.  If carbs are eaten only occasionally, the body has time to recover between the surges of insulin.  The fat cells have time to release their stored fat, and the muscles can burn the fat as fuel.  If carbohydrates are eaten throughout the day, however, in meals, snacks, and beverages, then insulin stays elevated in the bloodstream, and the fat remains in a state of constant lockdown.  Fat accumulates to excess; it is stored, not burned.” However, “the absence of carbohydrates would allow fat to flow out of the fat tissue, no longer held hostage there by the circulating insulin, and this fat could then be used as energy.  A person would lose weight, theoretically, not because they necessarily ate less but because the absence of insulin was allowing the fat cells to release the fat and the muscle cells to burn it” (p. 296).  A small group of scholarly researchers have been compiling compelling evidence regarding the advantages of a low-carb, high-fat diet, though they have as yet failed to dislodge the dominant “consensus” regarding healthy diets.

Yet to Teicholz:  “The sum of the evidence against saturated fat over the past half-century amounts to this:  the early trials condemning saturated fat were unsound;  the epidemiological data showed no negative association; saturated fat’s effect on LDL-cholesterol (when properly measured in subfractions) is neutral;  and a significant body of clinical trials over the past decade has demonstrated the absence of any negative effect of saturated fat on heart disease, obesity, or diabetes.  In other words, every plank in the case against saturated fat has, upon rigorous examination, crumbled away.  It seems now that what sustains it is not so much science as generations of bias and habit—although, as the latest 2013 AHA-ACC guidelines show, bias and habit present powerful, if not impenetrable, barriers to change” (p. 326).

So the low-fat mantra is dutifully repeated in most sectors, and Americans have obediently reduced their consumption of red meat, eggs, and butter.  “Americans continue to avoid all fats:  the market for ‘fat replacers,’ the foodlike substances substituting for fats in processed foods, was, in 2012, still growing at nearly 6 percent per year, with the most common fat replacers being carbohydrate-based” (p. 330).  Yet what they’ve believed lacks credibility.  Ancel Keys declared, in 1952, that heart disease would “become very rare” if folks followed his low-fat diet.  In fact, while following his prescription “Americans have experienced skyrocketing epidemics of obesity and diabetes, and the CDC estimates that 75 million Americans now have metabolic syndrome, a disorder of fat metabolism that, if anything, is ameliorated by eating more saturated fat to raise HDL-cholesterol.  And although deaths from heart disease have gone down since the 1960s, no doubt due to improved medical treatment, it’s not clear that the actual occurrence of heart disease has declined much during that time” (p. 327).

Teicholz concludes her presentation with these sobering words:  “If, in recommending that Americans avoid meat, cheese, milk, cream, butter, eggs, and the rest, it turns out that nutrition experts made a mistake, it will have been a monumental one.  Measured just by death and disease, and not including the millions of lives derailed by excess weight and obesity, it’s very possible that the course of nutrition advice over the past sixty years has taken an unparalleled toll on human history.  It now appears that since 1961, the entire American population has, indeed, been subjected to a mass experiment, and the results have clearly been a failure.  Every reliable indicator of good health is worsened by a low-fat diet.  Whereas diets high in fat have been shown, again and again, in a large body of clinical trials, to lead to improved measures for heart disease, blood pressure, and diabetes, and are better for weight loss.  Moreover, it’s clear that the original case against saturated fats was based on faulty evidence and has, over the last decade, fallen apart.  Despite more than two billion dollars in public money spent trying to prove that lowering saturated fat will prevent heart attacks, the diet-heart hypothesis has not held up. In the end, what we believe to be true—our conventional wisdom—is really nothing more than sixty years of misconceived nutrition research” (p. 330).

* * * * * * * * * * * * * * * * * * * * * * * * * *

Much that Nina Teicholz says in The Big Fat Surprise was earlier set forth, in much more detail and scholarly erudition, by Gary Taubes in Good Calories, Bad Calories (New York:  Knopf Doubleday Publishing Group, Kindle Edition, c. 2007).  “The reason for this book is straightforward,” he says:  “despite the depth and certainty of our faith that saturated fat is the nutritional bane of our lives and that obesity is caused by overeating and sedentary behavior, there has always been copious evidence to suggest that those assumptions are incorrect, and that evidence is continuing to mount. ‘There is always an easy solution to every human problem,’ H. L. Mencken once said—‘neat, plausible, and wrong’” (#216).  That easy solution—the low-fat diet—was promoted by universities and federal bureaucracies, but its adoption has not appreciably improved death rates or overall health, because total cholesterol—the big bogeyman in dietary circles—has little to do with heart disease!  But Ancel Keys, relying on statistical data, had insisted on the contrary.  And he won the day, making the low-fat diet virtually mandatory for folks desiring to live well.  Yet he may well have been wrong, and we’ve all paid the price for his error!