Sunday, September 8, 2024

Incivility 2024

 

Presidential Politics, Two Months Out

            My last blog post, dated July 4, noted President Biden’s very poor performance in a late June debate with Donald Trump.  I speculated that Biden might repair his position in the race if this or that or something else happened.  None of those possibilities materialized.  As everyone knows, something else happened: Biden dropped out and endorsed Kamala Harris.  Mrs. Harris quickly took control of the Democratic National Convention.  Many polls show the race to be breathtakingly close.

            Again: not close in Oregon or in most states.  Oregon is safely blue; its electoral votes will go to Mrs. Harris.  In Oregon, I can safely vote for a protest candidate or write in a name (a way of saying I don’t like either major party candidate) without worrying that my vote might affect the outcome.  At least four of our six Representatives in Congress will be Democrats.  A large majority of seats in the House—at least 250 and perhaps as many as 350—are “safe” seats for one party or the other.  Many Representatives face greater electoral challenges in their party primaries than in the general election.  In primaries, when it’s an intra-party fight, candidates tend to kowtow to extremists, the highly motivated single-issue voters on the edges of their party.  In the general election campaign, the candidate then moderates her views to appeal to voters in the middle.  But in a “safe” seat, the candidate needn’t moderate much.

            That paragraph wandered away from presidential to congressional politics.  Perhaps this little essay isn’t really about the presidential race, but about a larger topic.

            Politics, as I wrote in The Virtue of Civility in the Practice of Politics, is the art and science of making group decisions.  Electoral politics, and government in general, makes certain kinds of decisions for large groups of people.  Family politics, church politics, office politics, union politics—there are many other groups that must make decisions, and in each case those decisions can be made well or poorly.  Most of us can think of poorly made decisions in businesses, families, or schools.  The goal of politics at every level is better decisions; at least that’s what I wrote in 2002, and I still believe it.

            Superficially, people often agree that politics ought to aim at better decisions, but they treat their political opponents as mere obstacles.  We often act as if the goal of politics is winning, perhaps believing that we already know which policy ought to be adopted, which direction we should go.  I still believe that the goal of politics is not winning, but better decisions.

            Too often, candidates describe their opponents as thoroughly wrong, as selfish, as stupid, or as willfully evil.  Sadly, this kind of campaigning may “work,” in the sense that it leads to electoral victory.  But what happens when the slanderer takes office and turns to the actual work of government?  If we have made the political opponent into an enemy, how do we cooperate across party lines?

            Mr. Trump has not pledged to accept the outcome of the election if he loses.  Instead, he repeatedly warns that Democrats will promote voter fraud, and the election will not be fair.  The only possible “fair” outcome, in his mind, results in his victory.  If Mrs. Harris wins, we will almost certainly see multiple legal challenges to results in several states.  I worry that we will also see violence and threats of violence from Mr. Trump’s most extreme supporters.  But my main worry is the underlying polarization and incivility, not Mr. Trump himself.  Age will soon remove Mr. Trump from the political scene, but “trumpy” campaigning may continue.


Thursday, July 4, 2024

After the Biden/Trump Debate

 

Presidential Politics, Four Months Out

 

            A week ago, June 27, Joe Biden and Donald Trump engaged in their first debate of the 2024 campaign.  President Biden performed poorly, strongly reinforcing the widespread belief that he is no longer mentally sharp enough to serve effectively as president.  If he were to win reelection and serve four years, Biden would be 86 before the end of his presidency.  Post-debate polls say that large majorities (more than 60%) of those who watched the debate do not think Biden is up to the job.  Few debate observers, perhaps fewer than 25%, believe Biden would be an effective president four years from now.

            Eleven months ago, I summarized my reflections on the race as follows: (1) There is a significant chance (35-40%) that Trump will be elected in 2024.  And: (2) There is a greater chance (60-65%) that Biden will be elected in 2024.  Obviously, such prognostications are subject to change, as events—events of many kinds, from Supreme Court decisions to foreign wars—change the political landscape.  My estimate of Trump-Biden 2024 has changed greatly.

            Before the debate, the polling consensus pointed to an extremely close election, with Trump leading Biden in most of the “swing” states by tiny margins.  It seems Trump’s chances of winning the election had risen from the 35-40% range to the 50-53% range.  Why?  Events: the Hamas/Israel war (which pulled some progressives away from Biden’s strong and traditional Democratic support for Israel), lingering inflation worries, and continued anxiety about undocumented immigrants.  (Significantly, Trump pushed congressional Republicans to scuttle a bipartisan reform of immigration law.  Trump would rather use “the border crisis” as a campaign bludgeon than allow Biden to sign a reform law.)

            So: even before the debate, the tide was moving in Trump’s direction.

Post-debate polls give Trump a bigger margin in the swing states, somewhere between 3% and 6%.  If the election were held today, Biden might win the popular vote, but it would be extremely close.  Republicans have a built-in advantage when it comes to the Electoral College; if the election were held today, Trump’s odds of winning would probably be around 75%.  Remember, the U.S. is deeply polarized.  At least 40 states are safely “red” or “blue.”  Apart from a genuine landslide, the outcome of the election will rest on a handful of swing states, such as Pennsylvania, Georgia, and Nevada.

Democratic party leaders know all this.  In the week since the debate, three Democratic congressmen have called for Biden to step aside from the campaign.  The New York Times and many other publications have editorialized to the same effect.  There is a storm of controversy going on among Democratic donors, organizers, and politicians.  The decision, of course, can only be made by President Biden.

Theoretically, Biden could repair the damage of his debate performance.  He could meet repeatedly with the press in unscripted settings.  If, in such “live” settings, he showed himself able to think incisively and speak clearly—not just once or twice, but a dozen times before the election—he could reverse the popular picture of a “well-meaning, elderly man with a poor memory” (the words of the special counsel who chose not to indict Biden for his handling of classified documents).  I say “theoretically” because I do not believe Biden could do it.  He is not “senile,” as right-wing pundits are saying, but he is not as mentally able as he was 25 years ago.  The stumbling, rambling Biden of the debate is just as much the real Biden as the man we often see reading from a teleprompter.

If Biden did withdraw, we would be in historically uncharted waters.  It would be difficult, though not impossible, for the Democrats to nominate and unite behind some other candidate.  But the newcomer would enter the race with many handicaps.  With or without Biden, the Democratic candidate will be an underdog for the fall.

Trump did not win the debate; Biden lost it.  The practical result is the same in either case.  Trump’s chances of winning the election are high, and the Democrats can’t do anything about it.  Trump, of course, could throw away his advantage.  For now, though, it seems the election is his to lose.

What should we expect from Trump’s return to the White House?  That’s the question Americans should be considering.

Wednesday, May 1, 2024

The Political Opponent is a Resource

 

Philosophical Bits #4:

Epistemology for Social Creatures

 

            Aristotle famously observed that “man is by nature a social animal.”  Human beings need each other to live; we certainly need each other to live well.  Aristotle said the solitary person who somehow believes he doesn’t need other people is more like a beast (below us) or a god (far above us).  Our social nature has implications for almost every aspect of human life, from economics to psychology and religion.

            Nowhere is this more obvious than in science.  Despite the stereotypical image of the lone mad scientist uncovering some heretofore unknown natural law—an image that shows up in literature since Frankenstein—modern science is most definitely a cooperative social endeavor.  Not only do scientists work in teams (often very large and well-funded teams), they publish their findings in scientific journals open to anyone.  Everyone’s work is checked and confirmed (or disconfirmed) by others.  The current generation of researchers builds on the theories and discoveries of past scientists, so the social nature of science extends over time, not just geography.

            What is true of science is true of epistemology in general.  The “knowledge business,” in almost all its forms, is a cooperative social activity.  A philosopher may think alone, and he may write an essay in solitude (as I am doing now).  But then he exposes his work to others’ criticisms and corrections.  Both humility and realism teach us to take the criticisms of others seriously.  Most of the time, we learn together.

            A skeptic might object: We don’t learn, because “learning” implies objective truth.  In philosophy there are always arguments for and against, so we should never say we “know.”  But extreme skepticism refutes itself, in that it says we know that we cannot know.  It’s better to take skepticism as a warning.  Yes, the pursuit of knowledge is a slow business, filled with almost as many missteps as progress.  But we do learn.

            Now, someone might agree that natural science has progressed in the last six hundred years, greatly increasing our knowledge of biology, chemistry, astronomy, physics, ecology, and so on.  But what about the so-called “soft” sciences?  Have we gained knowledge in economics, sociology, and psychology?  What about the arts and philosophy?  Do we understand beauty or truth or justice any better than our distant ancestors?  It seems that the further we get from the hard sciences, the less confident we are that we are making progress.

            Let’s face it, the critic may say: Philosophical “fads” come and go.  Marxism was an all-encompassing philosophical theory with drastic implications for economics, politics, and morality—but true believers are few.  Positivism dominated philosophy for thirty or forty years, but now it’s dead.  Deconstructionism was all the rage in the 90s, but its influence wanes.  So, maybe there is no progress in philosophy.  Maybe we just recycle old ideas in new guises. 

            Maybe.  But if philosophy is not merely recycling old ideas, we need each other in moral philosophy as much as we do in natural science.

            I’m a moral realist.  That is, I believe there is such a thing as moral truth, which means a person can sincerely believe in some moral proposition and be wrong.  Examples.  Within the last 100 years, well-educated persons in this country have pressed for eugenics laws, believing that modern science (evolutionary theory) supported the conclusion that some people are mentally and/or physically defective.  Others advocated for a socialist revolution, believing that only radical redistribution of ownership of the means of production could lead to economic justice.  Others defended racial segregation in law and custom, believing in the inherent superiority of white people over black people.

            These were all sincere beliefs held by educated people in this country in the recent past.  As a moral realist, I think they were wrong, which means there must be some other view which is right or at least closer to the truth.  Moral truth, if we can get it, is a very great good.  We should pursue it.

            (Notice the self-referential folding of that last paragraph.  The proposition, “We should pursue moral truth,” is itself a moral truth.)

            If philosophical fads come and go, and if well-educated people can sincerely believe moral falsehoods, our pursuit of moral truth will be fraught.  We almost certainly will make mistakes.  But since we are social beings, we have a great resource: other people, especially those who disagree with us.

            Scientists publish their findings, which opens the door to criticism and/or confirmation.  The search for knowledge in moral philosophy should be equally open to criticism.  We need each other.  Those who disagree with us on moral questions, including political questions, are precisely the people we need to hear.  As my friend Ron Mock likes to say, the political opponent is a resource for better decisions.


Monday, April 1, 2024

The Greatness of God

 

Philosophical Bits #3:

Lots of Stars

 

            Astronomy has changed our view of the universe greatly in the last hundred years.  In the early 20th century, Henrietta Leavitt discovered the period-luminosity relation of Cepheid variable stars, which gave astronomers a “standard candle” by which to measure great astronomical distances.  Edwin Hubble, working with the newest and best telescope in the world at Mount Wilson in southern California, used Leavitt’s discovery to prove that many so-called “nebulae” were in fact galaxies outside our Milky Way.  Only in the 1920s was the debate over galaxies (“island universes”) finally settled.  It’s easy to forget that the best astronomers of the 19th century did not know what our schoolchildren are taught as a fundamental fact.

            Hubble also discovered that the redshift—a feature of light from distant galaxies—increases the further a galaxy is from earth.  In the late 1920s and 1930s, astronomers deduced that the universe is expanding, though Hubble himself resisted this conclusion.  Interestingly, the first astronomer to propose the expanding universe theory was Georges Lemaître, a Roman Catholic priest.  Most cosmologists opposed the so-called “big bang” theory because they preferred to believe in an eternal physical universe.  (Even the label, “big bang,” came from Fred Hoyle, who promoted a steady state theory of the universe; in a 1949 BBC radio broadcast, Hoyle derided “this big bang idea.”)  Again, big bang cosmology is something we take for granted now, and we forget that it did not win the day until the 1960s, when the cosmic microwave background radiation was discovered.

My high school science textbook, circa 1970, taught us that the universe has been expanding since the initial singularity and offered as an explanation the “oscillating universe” model, in which the universe expands and collapses over and over.  But then Roger Penrose and Stephen Hawking convinced cosmological theorists that the math didn’t work.  The universe, scientists now believe, is a one-time mega-event that started almost 14 billion years ago and will end many billions of years from now in eventual heat death.  The time frames and distances involved in astrophysics are so big as to beggar imagination.  Still, for many scientists, the idea that the universe had a beginning and will have an end is troubling; it sounds too much like theology.

Current astrophysical theory says there are several hundred billion galaxies in the observable universe, and each galaxy has about 100 million stars.  For comparison’s sake, consider that the earth has more than eight billion people in 2024.  Estimates of the total number of people in all generations on earth range as high as 120 billion.  If we resurrected every human being who has ever lived and assigned galaxies to them as property, each person would “own” five to ten galaxies, with 100 million stars in each.  The stars in the observable universe outnumber all the human beings who ever lived by a factor of 500 million to one.  Or more.

For some people, Big Bang cosmology may sound uncomfortably like theology, and it reinforces aspects of theology that may have been neglected.  Medieval theologians taught that the God of the Bible—a conception of God shared by Judaism, Christianity, and Islam—created the world, including everything that exists (other than God), out of nothing.  God is an infinite being, always existing.  (Christians add that God exists as three persons, eternally loving one another.  Jews and Muslims reject that idea, insisting on the singleness of God.)  However great the universe may be, medieval theologians said that God is infinitely greater.

Since they are human beings, believers are tempted to neglect or forget the greatness of God.  Quite naturally, we focus on the concerns close to us: our physical bodies, our plans and activities, our families and friends, our clans and countries.  We pray about such things, asking God to grant peace and prosperity, health and wholeness.  Focusing on things close to us, we neglect the greatness of God.  But when Jesus taught his disciples to pray, he told them to say, “May your name be honored.”  Our prayers should include awareness of God’s greatness.

It helps to meditate on the stars, on the sheer magnitude of the universe.  What kind of God is this, who knows the stars by name (according to Isaiah)?  Our whole world, teeming with eight billion souls and full of innumerable creatures, is just one tiny planet at the edge of a middle-sized galaxy, one of billions of galaxies.  How “big” must God be to see it all?  And yet: how intimate must God be, to pay attention, as Jesus said, to every bird of the air?  How could God attend to billions upon billions of planets and stars, and also pay attention to all creatures, great and small?  God hears all the prayers of people on Earth, and if there are spiritual beings on any of the billions of other planets, God attends to their prayers too.  What kind of mind is this, who lovingly attends to billions of souls?

This is the God to whom we pray, the God of all the stars.  This is the great God, the God who knows and sees and loves.


Saturday, March 2, 2024

We live this way

Philosophical Bits #2:

Skepticism

 

            But how do I know “this is real”?  Epistemology tries to answer the question: How do I know?  Related questions: What is knowledge?  What distinguishes knowledge from mere belief, even true belief?

            Everyone agrees we would rather have true beliefs than false beliefs.  But a true belief is not the same thing as knowledge.  In the Meno, Socrates points out that some people may adopt a belief, a belief which turns out to be true, without good evidence or by bad reasoning.  They might believe something merely because their hated enemy doesn’t believe it or because no one has proved it false.  They might fall into a belief by accident.  Mere true belief is not enough, says Socrates.  We have knowledge when a true belief is tied down by the “bonds” of good reasons.

            This traditional answer, that knowledge is justified true belief, received scrutiny in the 20th century.  In 1963 Edmund Gettier published a short article, “Is Justified True Belief Knowledge?” in which he gives examples of persons who believe a proposition for good reasons (so the belief is justified), and the proposition is in fact true, but whose beliefs do not seem to be cases of knowledge.  Other philosophers quickly created more examples.  Here’s one: While driving in the Willamette Valley, Debbie sees what looks to be a sheep and forms the belief, “There is a sheep in that field.”  In fact, the animal she sees is a dog wearing a sheep outfit made by a middle-school student as part of a prank.  However, unknown to Debbie, there is another animal, a real sheep, standing in a portion of the field but hidden from her sight by a large sign.  Debbie’s belief is true (there really is a sheep in the field), and her belief is justified (she sees an animal that looks like a sheep), but it seems that her justified belief is true only by accident.  It seems her justified true belief isn’t knowledge.

            Philosophers have tried to repair their understanding of knowledge in a variety of ways.  I won’t discuss the details.  In general, the answer to Gettier problems is that justification must hook up to the true belief in the right way.  But the devil is in the details.  What is the right way?  Disagreement persists on that score.

            Skeptics watch from the epistemological sidelines, as it were.  The players in the epistemology game strive to define knowledge, with the goal of better guiding our pursuit of knowledge.  The game is pointless, say the skeptics.  Knowledge is a chimera.  Epistemology might produce some guidance about the way we adopt or reject beliefs, but we should abandon the hope of attaining knowledge.  Regarding any particular belief, the skeptics say, we must face the truth: we might be wrong.

            What about science?  Four centuries of modern science have dramatically changed our beliefs.  Technology, based on scientific discoveries, has dramatically changed our practical world in thousands of ways: electricity, telephones, antibiotics, plastics, radio, Internet, blood transfusions, cars, telescopes, airplanes, microscopes, genetic tests, photographs, videos, recorded music, man-made fibers, and so on.  Surely science gives us knowledge, and the knowledge given by science has made technology possible.

            The skeptics say no.  Science gives us new beliefs.  Technology based on those beliefs has changed our practices.  But are we guaranteed those beliefs are true?  Do we have knowledge?  No.  We might be wrong.

            Twentieth-century skeptics were certainly aware of modern science.  The scientific method combines empirical observations with theories.  We propose the theories to make sense of the observations, and we use further observations to test the theories.  Therefore, some philosophers said (these philosophers identified themselves as “positivists”), empirical observations must lie at the heart of science.  Scientific and philosophical theories are meaningless, the positivists said, unless they can be verified empirically.  A.J. Ayer’s 1936 book, Language, Truth and Logic, is an enthusiastic presentation of positivism by one of its exponents.  According to Ayer’s “verification principle,” all meaningful statements are either tautologous or verifiable (at least in principle).

            Now, the verification principle is self-referentially incoherent.  It is itself neither tautologous nor verifiable.  In the second half of the 20th century, philosophers rejected it.  Along the way, though, positivism taught an important lesson.

            Positivism inspired a conceptual search for pure empiricism, empirical observations untainted by theory.  What would an “observation statement” look like?  Obviously, if you use a scientific instrument like a telescope, all your observations must be qualified; you didn’t simply look at the stars, you looked at them with a particular instrument, an instrument with a specific description, and that description implicitly drags in a host of assumptions about light, mirrors, the construction of the telescope, and many other things.  And in everyday scientific practice, that’s fine.  But if scientific theories are to be tested by observations, at bottom we need some observations that are theory-free.  What would “pure” observation be like?

            For a decade or two, in the middle of the century, positivist philosophers of science tried to conceptualize theory-free observations.  By the time positivism collapsed (because of its self-referential incoherence), philosophers of science had adopted a truism: there are no theory-free observations.  All our empirical observations are loaded with assumptions.

            Does this mean we don’t have knowledge?  Are the skeptics right?

            For example, can we know that everyday empirical experiences yield truth?  Is the grass really green?  (Fido sees colors differently than we do.  Is color real?)  Is there a tree over there?  Should I modestly say only that it seems to me that there is a tree over there?  Some skeptics would say I know only that it looks like a tree to me; the world outside my mind may be different than what I perceive.

Can we know that other people have minds?  I might know that I have a mind, by direct experience.  But can I know that other people have minds, that they are not cleverly designed robots?  The skeptics would say we can’t know these things.  After all, maybe the world was created five minutes ago by an evil demon who made the world just to deceive me.  For all I know, the whole world is just a figment of my imagination, and everything other than my mind is nothing more than an item in my mental universe.  Against the skeptics, G.E. Moore published a paper in 1939, “Proof of an External World.”  In a public reading of the paper, Moore held up his hand.  “I know this is a hand,” he said.  “And here is another.”  Since Moore and his audience both know these two things, there must be a world external to their minds.

            It is said, perhaps apocryphally, that Ludwig Wittgenstein told Moore this paper was the best thing Moore had written.  Wittgenstein certainly wrestled with the problem.  A collection of his notes, written in the last months of his life and called On Certainty, begins: “If you really do know ‘Here is one hand,’ we’ll grant you all the rest.”

            On Certainty is not a polished book, but a collection of notes, published in 1969 by Wittgenstein’s literary executors, almost two decades after his death in 1951.  Philosophers have struggled with it ever since.  My own notes on On Certainty are almost as long as On Certainty.

            Wittgenstein thought something had gone wrong on both sides.  The skeptics want to say we don’t know everyday facts that we observe or remember.  Moore wanted to insist that he did know he had two hands, and his audience knew this as well.  Wittgenstein thought both sides had lost touch with the “language game” of knowing.

            The meaning of language is in its use, Wittgenstein wrote in Philosophical Investigations.  How do ordinary people talk about knowledge when they are not wrestling with skepticism?  When would we, in real life, say, “This is a hand”?  The language game of knowing includes things like doubting and being sure.

            Example.  Bob asks, “Did you go to the club Thursday?”  Sally replies, “Sure.  I always go.”  Bob: “But they moved the meeting last week.”  Sally: “Oh! That’s right.  I forgot.  I went to the meeting Wednesday.  But I also went to the club Thursday.  I know I did because I had to return a book.”

            This is an unremarkable use of “know,” completely at home in ordinary language.  Both the skeptic and the commonsense realist (Moore) are tempted to take knowing away from such examples.  I think Wittgenstein praised Moore’s essay because “Proof of an External World” shows how the skeptics had gone too far.  But he worried that Moore’s refutation—“I know this is a hand”—was also language gone on holiday.

            Perhaps the best answer to skepticism is that we live this way.  We live in a world where water boils at 100 degrees Celsius and freezes at 0 (an example in Wittgenstein’s notebook).  I am confident that the world of my morning walks is real, but the world of my novel is not.  We cannot “refute” the extreme skeptic, but we live this way.  And the skeptic is one of us.


Sunday, February 4, 2024

This is real

 

Philosophical Bits 1:

Metaphysics

 

            This is real.  I am not merely an actor in a story.  That is, I am not a character in a fiction.  My life may have narrative form, but it is a true story, not made up. 

            I am walking outdoors in Oregon in winter.  The world I see around me—a sidewalk, houses, trees, lampposts, streetlights, cars and trucks, white clouds and dark gray clouds, the broken shafts of morning sunlight—all this is real.

            In contrast there is fiction.  I wrote a sci-fi novel, Castles.  It’s a long thing, to be published in three volumes, with scores of characters, and most of the action takes place on an imagined planet on the other side of the galaxy.  However entertaining or instructive the story, the events in it didn’t really happen; it isn’t true.  (I leave aside for the moment the idea of symbolic truth.)  The people and places in my story aren’t real.     

Consider Alice’s experiences in Through the Looking Glass.  Tweedledum and Tweedledee tell her she is only a character in the Red King’s dream; if the King were to wake, they warn, she would simply vanish.  Alice, of course, rebels against the idea that she isn’t real, and at the end of the story she wakes up.  The Red Queen, whom Alice was angrily shaking, turns out to be a kitten.  But readers of the story are provoked to wonder: Alice really is a figure in a fairy tale, despite her insistence that she is real; how do we know we aren’t also fictional characters?  Lewis Carroll, the author, was a mathematician.  It is no surprise that the layers-inside-layers of the world inside the looking glass provoke us to think.

So how do I know “this,” the world of my morning walk, is real?  Do I slap my foot (or both feet or my hand) against the sidewalk?  What would that prove?  Can I “shake” something or someone (like Alice shaking the Red Queen) and wake myself from a dream?  I’ve had lots of dreams, and I have experienced waking from dreams.  I am extremely confident that I am not dreaming.  It seems I can tell the difference between dream and reality.  Can I?  The Matrix—another story—envisions a dystopian future in which most people are systematically deceived about reality.  I am confident that I do not live in a supercomputer-generated illusion.  Is my confidence misplaced?

            In an Intro to Philosophy course, I introduce students to philosophical jargon.  Every discipline has its own jargon, I reassure the students.  Philosophy’s words are no more intimidating than the technical terms of other fields.  Just think of the strange words you learn in biochemistry or neuropsychology.  You’ll get used to it.  You’ll even begin to use these words yourself. 

Here are two words to begin: metaphysics and epistemology.  In metaphysics we ask: What is real?  In epistemology: How do we know?

Some philosophy students quickly decide, when they learn there are competing theories in both metaphysics and epistemology, that philosophy is entirely a matter of opinion.  Nothing is true or false, right or wrong.  We don’t know what is real or how to know.  I must assure them there is a difference between good philosophy and bad philosophy—and they will begin to recognize the difference when I mark their papers.

Some philosophers, skeptics, would say I should not assert the reality of the world of my walk, but their assertions—made with great confidence at various times in the 20th century—have been rightly undermined.

 

I do not live in the made-up worlds of Through the Looking Glass, The Matrix, or Castles.  I live in the real world, and I am grateful for my existence.


Tuesday, January 2, 2024

Bible Reading in 2024

 

Heroes of Old

 

            For many years—since the early 1990s, at least—I have maintained a regular program of Bible reading.  In odd-numbered years, I read some portion of the New Testament, one or two chapters per day (Luke-Acts in 2023); as a result, I read the selected book(s) twelve times in the year.  In even-numbered years, I read the whole Bible, four or five chapters a day, starting with Genesis right through to Revelation.  Thus, I’ve read through the Bible at least fifteen times.

            I was taught as a child, and I believe as an adult, that God speaks through scripture.  For thirty years I taught full-time at George Fox University, an orthodox/evangelical Christian university that is committed to the authority of the Bible.  I read the Bible, I study the Bible, and I seek to be changed by the Bible.  To speak more precisely, I hope to be changed not just by this collection of ancient books we call the Bible but by the God revealed in the Bible.

            And now it is 2024, time to begin with Genesis again.  It’s January 2, so I read Genesis chapters 6-10.  The text says in verse 2: “. . . the sons of God saw that the daughters of men were beautiful, and they married any of them they chose.”  Verse 4: “The Nephilim were on the earth in those days—and also afterward—when the sons of God went to the daughters of men and had children by them.  They were the heroes of old, men of renown.”

            Such a text creates a problem for Bible readers.  How should we understand it? 

            At George Fox, I have said many times to students, we are committed to the authority of scripture.  That is, we believe what the Bible teaches.  But what the Bible teaches is not necessarily what the Bible says.  It’s relatively easy to quote some passage of the Bible that is false, if we take the words away from their context.  For instance, Psalm 93:1 reads, “The world is firmly established; it cannot be moved.” (The great reformer, Martin Luther, thought this verse disproved the theory that the earth orbits the sun.)  Careful Bible readers know that every passage must be understood in context and according to genre.  Few of us are tempted to take the psalmist’s poetry as an astronomy lesson.  After all, the earlier lines of the same verse say, “The LORD reigns, he is robed in majesty; the LORD is robed in majesty and is armed with strength.”  The psalm teaches about God’s omnipotence and eternal authority, but to do so it says something false about the motion of the earth.

            Back to Genesis 6, the “sons of God” and the “heroes of old.”  I think it is a mistake to read this literally.  The genre here is not poetry, but myth. 

            (The very word, “myth,” creates doubt and consternation for many people.  In academia, a myth is any meta-story that organizes the thinking of a significant group of people.  Thus, we speak of the “myth of inevitable secularization,” “the myth of American exceptionalism,” “the myth of dialectical materialism,” and so on, including myths of various pagan gods.  “Myth” often implies falsity, but in this sense it simply means a meta-story that shapes worldviews.)

            Ancient Israelites were familiar with stories of the great men of the distant past.  Egyptians, Babylonians, Sumerians, Greeks—all the powerful cultures of the ancient world had myths of the men of old.  In the deep past, these stories said, men were bigger (some were giants), they lived longer (thousands of years, in Babylonian myths), and they struck bargains with the gods.  Some men and women were offspring of the gods.  These are the greats of pre-history, the “heroes of old.”

            How should an Israelite think about such stories?  What does our myth say?  The text nods briefly to such stories, as if acknowledging them.  But the acknowledgment is vague.  Who are these “sons of God” who father children by the “daughters of men”?  The text doesn’t say, and there is no good answer.  If the “sons of God” are lesser gods, this text contradicts the emphatic Old Testament teaching that there is only one God.  If the “sons of God” are angels, this text contradicts the teaching of Jesus that the angels don’t marry—and it would make this text singular among all Bible references to angels.  If the “sons of God” are ordinary men, the myth is deflated; it doesn’t explain the “heroes of old” at all.

            The key to this myth, I think, is in verse 3, the verse I haven’t quoted, in between verse 2 and verse 4.  “Then the LORD said, ‘My Spirit will not contend with man forever, for he is corrupt; his days will be a hundred and twenty years.’”

            The God of Israel is in control.  The ancient Israelite could nod to his neighbor: “Yes, you have stories of great men of the past, the demi-gods and heroes.  But my God—the only real God—put a limit to all that.  Find me someone who lives more than 120 years.  We are mortal beings, corrupted beings, who live before the face of God.”

            It seems to me that our text treats the myths of the nations as an opportunity to assert the power and divine authority of Yahweh.  Speculation about the “heroes of old” can produce lots of fun; the Percy Jackson stories are bringing pagan legends to a whole new generation.  But the Bible speaks to people in this world, a world of sin and death, a world that needs redemption/salvation.  Redemption came, not through the heroes of old, but through the Word made flesh.