Few people know this, but our age is an amazing time for people who love philosophy.
When I was in college 30 years ago, philosophy was strictly an academic exercise and there were few resources available for people, like me, who view philosophy more as a way of life or avocation than as a job.
Today, however, all that has changed.
There are three or four excellent “magazines” about philosophy – such as Philosophy Now and The Philosopher’s Magazine – that are filled with funny, off-beat, irreverent articles about philosophical topics. A number of first-rate publishing houses, mostly in the UK, such as Routledge and Blackwell Publishing, produce books aimed at a general philosophical readership.
There are philosophy radio programs such as Philosophy Talk… coffee houses… salons… adult education classes… and literally hundreds of websites for the interested reader. There are even philosophy comic books, such as Logicomix, about the life of British logician Bertrand Russell. It’s simply amazing. It’s a golden age of philosophy, I think.
The irony, however, is that there is still no solid consensus on what, precisely, philosophy actually is. In its historical and etymological sense, philosophy is literally “love (philia) of wisdom (sophia),” and that is always how I have looked upon it. Philosophy, for me, is the attempt to reflect upon experience in order to understand more about life and how we are to live. My aims, like those of Socrates, are primarily practical: I want to understand the world and myself in order to live better.
Today, there are three, perhaps four major “schools” or approaches to philosophy, each with its own journals, intellectual heroes and methodologies. It is one of the scandals of contemporary philosophy that these schools are somewhat incommensurable, meaning they are so different in their approaches and ideals that they are almost incapable of speaking to one another. It’s as though organic chemistry and 17th-century French literature were forced to share the same offices and pretend they are the same discipline (I exaggerate, but you get the point).
The first approach may be called, for lack of a better word, Traditional Philosophy: this is the approach now largely taught only in Catholic universities. It is primarily historical in orientation, a “history of philosophy” style in which students study the thought of, say, the ancient Greeks, and Descartes, the British empiricists, Kant, Hegel and so on. There is very little attempt to think through how the thought of these philosophical greats can be reconciled. The idea appears to be that by working through all of these great thinkers, eventually the student will come to his or her own philosophical conclusions — although there is really no fixed “method” or approach given for doing so. I always think of this as the University of Chicago or Great Books approach. A variation of this approach is Catholic philosophy, including various schools of Thomism (such as the Transcendental Thomism of Maréchal, Karl Rahner and, my guru, Bernard J.F. Lonergan).
The second major approach to philosophy today is what is known as Continental Philosophy. This is the philosophy that is most commonly taught in Europe and, again, in some Catholic universities in the U.S. In practice, it means primarily the philosophical systems of phenomenology, existentialism, so-called “critical theory” and their postmodern descendants. When I was in college, this is what I studied (in addition to traditional philosophy). We read the classic texts of phenomenology as well as such trendy philosophers as Jean-Paul Sartre, Merleau-Ponty, Heidegger, Karl Jaspers, Max Scheler, Edith Stein and others. Today, those names have largely been replaced by those of postmodern French thinkers such as Michel Foucault, Jacques Derrida, Jean Baudrillard and Jean-François Lyotard. While classical Husserlian phenomenology does attempt to “solve” major philosophical problems and actually be a descriptive science, in practice students of Continental Philosophy, like their Traditional Philosophy counterparts, spend much of their time studying the works of individual thinkers and writing papers on aspects of their thought. (Continental Philosophy does, however, take a greater interest in social and political questions.)
The third and allegedly dominant approach to philosophy today is Analytic Philosophy. This is the philosophy most commonly taught in the UK and in major U.S. universities. Built on the foundations laid by British empiricists such as David Hume, Analytic Philosophy appeared in the early 20th century through the work of such thinkers as Bertrand Russell, Gottlob Frege, G.E. Moore and Ludwig Wittgenstein. When I was in college, I found Analytic Philosophy to be mostly unintelligible gibberish. The emphasis on symbolic logic and the solving of trivial intellectual “puzzles” was, to me, an absurd waste of time.
In the past few years, however, I’ve been reading more about Analytic Philosophy and I am now much more impressed. Analytic Philosophy has matured over the past few decades and is now more of a philosophical “style” than it is a collection of doctrines. The style is more like that of my hero, Bernard J.F. Lonergan, in that Analytic Philosophy is much more interested in actually solving philosophical problems than it is in clarifying the thought of past philosophers. Thus, Analytic Philosophy is characterized by a thematic, rather than a “history of philosophy,” approach. It uses or creates a specialized technical vocabulary to elucidate the various “options” available in any given philosophical issue… marshals the evidence in favor or against those options… and then attempts to actually “settle” the issue. It’s actually quite refreshing.
The only problem with Analytic Philosophy from the perspective of a traditional philosopher or “lover of wisdom” is that it’s still focused primarily on trivial problems or mere puzzles (perhaps because those are the easiest ones to “solve”). Academic analytic philosophy is often little more than “chloroform in print,” boring to the point of dispatching its readers into a catatonic stupor. The cure for this tedium has been, over the past several years, the appearance of those popular philosophy journals and publishing houses I mentioned earlier. Precisely because they are aiming at a wider audience, the popular philosophy authors have to turn their attention to the Big Issues that interest real people – and thus are forced by the market to abandon the tedium beloved by academics and use their philosophical skills to address topics people actually care about. An example of how wonderful this can be is a book I am reading right now, Michael Sandel’s magisterial Justice: What’s the Right Thing to Do? It’s clear, concise, lays open the various options available on contentious issues, concerns serious subjects (what is justice?) and doesn’t resort to pretentious displays of symbolic logic to make its points.
These days, I mostly read good Catholic philosophy (such as can be found in the American Catholic Philosophical Quarterly or Method: A Journal of Lonergan Studies) and “popular” analytic books such as Justice or those produced by Routledge. I still can’t read academic analytic philosophy journals. I tried subscribing to Faith and Philosophy, the (mostly analytic) journal of the Society of Christian Philosophers, but found it deadly dull and exhibiting the worst aspects of analytic pretentiousness. Here’s a sample, taken from John Turri’s essay, “Practical and Epistemic Justification in Alston’s Perceiving God” (July 2008, p. 290):
“Alston’s thesis is that putative perceptions of God often justify beliefs about God. A subject S has a putative perception of God when S has an experience e in which it seems to S that God appears to S as P. If, based on e, S forms the “M-belief” that God is P, then S has a justified belief that God is P. An M-belief is a belief that God is P, which is based on a putative perception of God. (I will often substitute ‘q’ for the proposition that God is P.)”
I dunno. My reaction to writing like that is the same as George Will’s: Just because life is absurd that doesn’t mean philosophy should be as well.
I don’t mean to pick on John Turri, who I am sure is a great guy and a lot smarter than I am. But this sort of stuff is meant solely for professional philosophers in universities… and is largely what turns people off to philosophy as an academic discipline. If Socrates had spoken like that, they probably would have forced him to drink hemlock much earlier and philosophy would never have gotten off the ground.
Around 2008, after my book The Politically Incorrect Guide to the Bible came out, I was asked to fly to Ireland to participate in a debate on the existence of God at University College Cork. I had been doing radio interviews for my book and was very comfortable discussing some of the sillier arguments atheists use to attack Christianity or the Bible – for example, that the Bible is full of scientific “errors” and therefore is obviously complete nonsense. Attacks such as these are basic category errors – a comparison of apples and oranges – that are easily refuted.
But despite studying philosophy as an undergraduate, I didn’t really feel qualified to debate the existence of God. Plus, I was super busy with other things and with business projects, about to go on a trip to Rome, and so I politely declined the offer in Ireland.
At the time, Christopher Hitchens, Sam Harris and Richard Dawkins were supposedly going around doing debates, taking on people like Dinesh D’Souza and the Oxford theologian and former scientist Alister McGrath. The impression I got was that Hitchens was simply demolishing the theists with his allegedly rapier-like wit and vast erudition. Also, I have always looked with awe on Oxbridge philosophy – home of such luminaries as Bertrand Russell, Ludwig Wittgenstein, Elizabeth Anscombe and so on – and so I assumed that the UK philosophers would trot out their allegedly superior logical skills, honed by decades of logical analysis, and easily smash the dusty old arguments of theism. (Truth be told, however, Fr. Copleston more than held his own against Lord Russell in their famous 1948 debate on the BBC.)
It turns out that I was utterly deluded. In recent times, I’ve begun systematically collecting recordings of every debate on the existence of God I can lay my hands on and listening to them at my leisure.
I made a shocking discovery. It turns out that the atheists are really, really good at insults but are actually quite poor debaters.
Thus, since Christians and observant Jews are typically polite, they are usually at an extreme disadvantage when “debating” atheists such as Hitchens and Harris – especially when they discover that the “debate” consists in nothing but a half-hour of put-downs, snide remarks and petty insinuations. The atheists insult Christianity, Judaism and religion generally with a nastiness that is almost breathtaking. They belittle. They demean. They insinuate. But the one thing they don’t do is offer intelligent arguments.
In fact, they don’t actually reason at all.
Reasoning, after all, is a systematic questioning of assumptions… a marshaling of evidence… a critical examination of arguments. It is not, primarily, name-calling. When I first started watching these debates, I couldn’t believe what I was hearing. I just assumed the atheists would put forward logical arguments that the Theists would be hard pressed to answer. What I wasn’t prepared for was that the atheists didn’t really marshal arguments at all: they merely sneered. The New Atheists were plainly accustomed to standing up in front of large groups of college students, making snide put-downs that got a lot of laughs and applause; and they were quite good at demolishing arguments made by young earth Creationists and snake-handling fundamentalists. But when faced with genuine Christian intellectuals – such as the philosopher William Lane Craig – they failed utterly to even engage the principal arguments that were made.
For example, when Craig debated Sam Harris on the topic of moral values – whether you can establish the existence of objective moral values without recourse to God – Craig offered three extremely precise reasons why Harris failed to prove the existence of objective moral values in his then-latest book, The Moral Landscape. He offered a detailed, step-by-step critique of why Harris’s argument in that book is, at bottom, logically incoherent.
When it came time for Harris to respond, he didn’t. He didn’t respond to a single one of Craig’s logical arguments. Instead, he simply changed the subject – and fell back on his snide one-liner attacks on the Bible and how stupid Christians are.
I actually felt sorry for Harris because he was so clearly out of his depth. Harris studied philosophy as an undergraduate at Stanford, but his Ph.D. is in the pseudo-science of “neuro-science,” a new interdisciplinary degree that brings together neurology, psychology and a little philosophy in order to discuss Big Ideas without the burden of actually having to think clearly.
In contrast, Craig earned two master’s degrees in theology, a Ph.D. in philosophy at the University of Birmingham in the UK, a doctorate in theology under Wolfhart Pannenberg at the University of Munich, and then, after all that, spent six years doing post-graduate research at the Catholic University of Louvain in Belgium. What’s more, Craig is a professional philosopher in the analytic mode – meaning, he breaks down philosophical subjects into the various possible options, uncovers the logical assumptions in each of the possible options, and then demonstrates how the hidden assumptions in philosophical arguments or claims undermine the point being made or, in some instances, provide the necessary and sufficient conditions for the argument to make sense. As an analytic philosopher, Craig is quite comfortable with rigorous logic, sufficient reason, proof, demonstrations and so on – and so, when doing battle on the field of pure reason and logic, he is able to expose the arguments of the New Atheists (such as they are) as little more than empty rhetoric. Here is how a typical debate between a New Atheist and someone like Craig goes:
Craig: But secondly, the problem that’s even worse is the “ought implies can” problem. In the absence of the ability to do otherwise, there is no moral responsibility. In the absence of freedom of the will, we are just puppets or electro-chemical machines. And puppets do not have moral responsibilities. Machines are not moral agents. But on Dr. Harris’s view, there is no freedom of the will, either in a libertarian or a compatibilistic sense, and therefore, there is no moral responsibility. So there isn’t even the possibility of moral duty on his view. So while I can affirm and applaud Dr. Harris’s affirmation of the objectivity of moral values and moral duties, at the end of the day his philosophical worldview just doesn’t ground these entities that we both want to affirm. If God exists, then we clearly have a sound foundation for objective moral values and moral duties. But if God does not exist, that is, if atheism is true, then there is no basis for the affirmation of objective moral values; and there is no ground for objective moral duties because there is no moral lawgiver and there is no freedom of the will. And therefore it seems to me that atheism is simply bereft of the adequate ontological foundations to establish the moral life.
Now, those were fighting words. Craig just demolished most of the argument in Harris’s most recent book in front of a large audience at the University of Notre Dame. You might think that Harris would be called upon to actually defend his position, to offer reasonable counter-arguments to show why Craig’s attacks were unfair or were missing the point.
But he doesn’t!
Much to my astonishment and disappointment, Harris just reverts to what Atheists do best – which is to change the subject and begin name calling!
Harris: Well, that was all very interesting. Ask yourselves, what is wrong with spending eternity in Hell? Well, I, I’m told it’s rather hot there, for one. Dr. Craig is not offering an alternative view of morality. Ok, the whole point of Christianity, or so it is imagined, is to safeguard the eternal well-being of human souls. Now, happily, there’s absolutely no evidence that the Christian Hell exists. I think we should look at the consequences of believing in this framework, this theistic framework, in this world, and what these moral underpinnings actually would be.
Alright, nine million children die every year before they reach the age of five. ok, picture, picture a, a a Asian tsunami of the sort we saw in 2004, that killed a quarter of a million people. One of those, every ten days, killing children only under five. Ok, that’s 20, 24,000 children a day, a thousand an hour, 17 or so a minute. That means before I can get to the end of this sentence, some few children, very likely, will have died in terror and agony. Ok, think of, think of the parents of these children. Think of the fact that most of these men and women believe in God, and are praying at this moment for their children to be spared. And their prayers will not be answered. Ok, but according to Dr. Craig, this is all part of God’s plan. Any God who would allow children by the millions to suffer and die in this way, and their parents to grieve in this way, either can do nothing to help them, or doesn’t care to. He is therefore either impotent or evil.
This is what passes for reasoned argument among the New Atheists.
Well, I thought, perhaps this is unfair. After all, Dr. Craig is a trained professional philosopher with two doctorates and a lifetime of training as an analytic philosopher. Sam Harris studied “neuro-science.” It’s hardly a fair contest. Dr. Craig is trained in mathematical logic; Sam Harris is trained in schoolyard insults.
So, it seems fair that we compare apples to apples – in this case, an Atheist philosopher versus a Theistic philosopher.
As a result, I started looking for debates between Dr. Craig and some famous atheist philosophers. Much to my delight, I found some! In 2005, it turns out, Dr. Craig debated the British philosopher A.C. Grayling at the Oxford Union on the topic “Belief in God Makes Sense in Light of Tsunamis.”
Perfect. Surely, I thought, an atheist philosopher of Grayling’s stature would mount scary, logically airtight arguments against the existence of God and would demolish Dr. Craig – at least teach him a lesson he wouldn’t quickly forget. I was actually rooting for the atheist side! Dr. Craig reminds me of Thomas Aquinas: His logic is so impeccable you have to attack his premises. He is so relentlessly rationalistic you start to root for the underdog.
But, again, I was quickly disappointed. Dr. Craig made his case in his characteristic analytic style: step by step, premise by premise, pointing out the possible weaknesses in his own argument and helpfully suggesting ways his Atheist opponents could possibly prove him wrong.
He began to remind me of chess masters who are so confident of their abilities that they actually point out to you, in advance, why you probably don’t want to make that move… because in ten moves it will result in checkmate. Alas, Craig’s debate with Grayling was as one-sided as his debate with Sammy “I’m Just So Darn Smarter Than Everyone Else” Harris. Grayling was reduced to stammering… and fell back, as Atheists almost always do, on insults. Dr. Craig started off by explaining the underlying presuppositions of the classic logical argument against God from the existence of evil:
Craig: Traditionally, atheists have claimed that the co-existence of God and evil is logically impossible. That is to say, there is no possible world in which God and evil both exist. Since we know that evil exists, the argument goes, it follows logically that God does not exist. It is this version of the problem of evil that professor Grayling recently defended in his debate with Keith Ward in The Prospect.
So, according to the logical version of the problem of evil, (the two statements on your hand-out):
“(A) an omnipotent, omnibenevolent God exists”
“(B) evil exists”
…are logically incompatible.
The difficulty for the atheist, however, is that statements (A) and (B) are not, at face value, logically inconsistent. There’s no explicit contradiction between them. If the atheist thinks they are implicitly contradictory then he must be som – uh – assuming some hidden premises that would serve to bring out the contradiction and make it explicit.
But, what are those premises? Well, the atheist seems to be assuming two things:
“(1) If God is omnipotent then he can create any world that he desires”
“(2) If God is omnibenevolent then he prefers a world without evil over a world with evil”
The atheist reasons that: since God is omnipotent he could create a world without evil, and since he is omnibenevolent he would prefer a world without evil, therefore if God exists, evil cannot exist.
Dr. Craig goes on to explain that this version of the problem of evil, based on logical incoherence, has been “seriously undermined” by the incisive critique of the philosopher Alvin Plantinga and has fallen out of favor among academic philosophers. He points out that Plantinga has demonstrated that the atheist must show that both of the critical assumptions (1) and (2) are necessarily true in order for the argument to be logically valid. But, Plantinga argues, if it is even possible that human beings have free will then (1) and (2) are not necessarily true.
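The structure Dr. Craig describes can be put in standard modal notation. What follows is my own informal sketch of the textbook formulation, not a quotation from Craig or Plantinga:

```latex
Let $G$ = ``an omnipotent, omnibenevolent God exists'' and $E$ = ``evil exists.''
The logical version of the problem of evil claims
\[
\neg \Diamond (G \wedge E),
\]
i.e., there is no possible world in which $G$ and $E$ are both true. Since $G$ and $E$
are not explicitly contradictory, the atheist needs hidden premises that hold necessarily:
\[
(1)\ \Box\,(G \rightarrow C) \qquad\qquad (2)\ \Box\,(G \rightarrow P),
\]
where $C$ = ``God can create any world he desires'' and $P$ = ``God prefers a world
without evil over a world with evil.'' Plantinga's reply: if it is even \emph{possible}
that creatures have libertarian free will, then there are possible worlds God cannot
guarantee to actualize free of evil, so $(1)$ is not necessarily true --- and without
$\Box(1)$ and $\Box(2)$, the claim $\neg \Diamond (G \wedge E)$ does not follow.
```

The point of the formalization is simply that the atheist's burden is modal: he must establish the premises in every possible world, not merely in the actual one.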
This is what Analytic Philosophy does best: break down arguments into their underlying premises… and then demonstrate what must or must not be true in order for an argument to be logically valid.
Okay, I thought, pretty slick. But now the Atheist team is going to bring in one of their Big Guns – an Oxford philosopher, trained in the same logical jujitsu as Dr. Craig. Surely he’s about to meet his match. Then Grayling spoke. If possible, he was even more meandering and prone to non sequiturs than Sammy Harris, albeit with slightly better manners.
GRAYLING: Um, let me just begin with a remark about the tsunami which, as you know, killed several hundred thousand people – among them small children and elderly people – a great majority of them were not Christians – they were people of other faiths and all faiths – I suppose – and of no faith. So I suppose one would need an assumption to the effect that the deity, if, he/she or it caused it or countenanced it or wasn’t able to stop it, nevertheless it would have – in some sense – to be the same deity for all those people, and if there is a greater good envisaged in the event then it would have to be one that, um, is somehow captured in very different forms in these different faiths. And I leave that point hanging in the air because I think it’s something that we need to bring up a bit later on – remembering that there was a competition between the faiths! After all, a Christian will tell you that the founder of that religion said “I am the way, the truth and the life, no-one comes to the Father but by me”, which seems rather bad news for very many of the people who were swept away by that grave wave.
Once again, the Atheist declines to actually address the topic at hand and simply and quickly changes the subject – in this case, to the multiplicity of religions on earth.
I can’t tell you how disappointed I was by this whole performance.
That’s because there is a part of me that finds airtight logical arguments inherently unpersuasive. Faith, to me, is bigger than logic, bigger than reason. Proving the existence of God from logical arguments seems to me a lot like proving that I love my wife from logical arguments: the very exercise seems a bit inappropriate or even somewhat demeaning. I can imagine approaching my wife and, instead of giving her roses and a box of chocolates on St. Valentine’s Day, proposing the following argument:
A. All men who give their wives presents love them.
B. I give you presents.
C. Therefore, I love you.
If that were how I proved my love for my wife, offering her airtight logical demonstrations, I don’t think I would have stayed married for very long.
The same is true of God. Authentic religion of any kind has a mystical component that bypasses logic or, rather, that makes logic almost unnecessary. In a very real sense, we have an experience of the grandeur of God – an experience of what mystics call the Numinous – that is above and beyond the rational arguments of the human mind. These experiences don’t preclude logic; they just make logic irrelevant. My felt sense of the awesomeness and holiness of Being – of the transcendent power that maintains in existence galaxies as well as my own beating heart – makes me want to fall to my knees. To try to conjure up a logical premise from such an experience to use in an argument seems almost as absurd as trying to do the same thing after a date with my wife.
Don’t get me wrong. I am not belittling logic and reason, or even debates on the existence of God. I just see their limitations. They say that at the end of his life, St. Thomas Aquinas, Christendom’s foremost logician, had a mystical experience and, after that point, he refused to write another word. “All of my writings are as straw,” he supposedly said. The same thing was true of Blaise Pascal, the brilliant French mathematician, scientist and mystic. When he was young, he had a mystical experience of some kind that changed his life. In a frenzy, he scrawled out a description of what had happened to him:
GOD of Abraham, GOD of Isaac, GOD of Jacob
not of the philosophers and of the learned.
Certitude. Certitude. Feeling. Joy. Peace.
GOD of Jesus Christ.
My God and your God.
Your GOD will be my God.
Forgetfulness of the world and of everything, except GOD.
He is only found by the ways taught in the Gospel.
Grandeur of the human soul.
Righteous Father, the world has not known you, but I have known you.
Joy, joy, joy, tears of joy.
Pascal sewed this inscription into his coat and wore it every day for the rest of his life.
All this explains why I am not particularly threatened by logical arguments against the existence of God… and why I can even root for the Atheist team a little. If I were to debate myself, I would never use mystical experience as an argument for God’s existence because it is non-falsifiable, it is an unfair trump card that avoids logical reasoning. But just as my love for my wife is not the result of a logical demonstration, so, too, my faith in God is not the result of a chain of deduction. Reason can perhaps confirm what we know already by faith, but faith is rarely the result of reason. What’s more, I have this sense that the God of Abraham, Isaac and Jacob is not the Prime Mover of Aristotelian logic… and that to argue for his existence, using the paltry weapons of the human mind, seems almost presumptuous. So, that is why I came to the Atheist debates with a relatively open mind.
Although the existence of God is as self-evident to me as the existence of air, I am perfectly comfortable with the notion that his existence may not be provable logically. The Catholic Church holds as a religious dogma that his existence can be proven, but I was and am willing to entertain the possibility that, with the development of new tools of logical analysis, the traditional Theistic arguments for his existence may be found wanting. For example, I have long been persuaded that the modern argument from design, at least as presented by the Intelligent Design movement, can be persuasively overcome. The concept of “irreducible complexity,” used by Intelligent Design theorists such as Dembski and Behe, has been effectively refuted by scientists and philosophers. As a result, not all Theist arguments hold water… and I came to the New Atheist debates with an open mind concerning which arguments were solid and which could be undermined.
What I was wholly unprepared for, however, was the way in which the Atheist team completely abandoned the effort to present logical arguments at all and simply reverted to name calling. As I said, when faced with worthy opponents, such as Dr. Craig or even Dinesh D’Souza, many of the atheist debaters gave up any effort to mount rational arguments and just started making snide remarks. These remarks sometimes got a laugh – even I chuckled at some of them – but what they didn’t do was make any sort of rational case. It’s gotten so bad that the enfant terrible of the New Atheists, the popular science writer Richard Dawkins, has refused to debate William Lane Craig. In typical New Atheist fashion, he doesn’t offer reasons but only insults: He asked colleagues in the philosophy department at Oxford, and “no one” had heard of Dr. Craig. Thus, Dr. Craig is too small a fish to face such an intellectual giant as himself. Even many atheists are now embarrassed by Dawkins’s refusal to debate Craig.
Fans of Game of Thrones, the blockbuster fantasy novels and now an HBO television series, will recognize “Winter is Coming” as the motto of House Stark, the noble family that rules over the cold northern regions of the Seven Kingdoms in the mythical land of Westeros.
For some time now, I’ve been walking around cryptically muttering this phrase to myself as it becomes increasingly possible that Barack Obama and his gang of Big Government apparatchiks will win re-election.
Of course, like most sane people I don’t trust the media’s polls more than I can spit. However, I do trust gamblers… and the Intrade odds in London now have Obama with a 78.8% chance of winning. The bettors believe that it will be another razor close election but that Obama, given all the advantages of incumbency and his willingness to literally buy votes with offers of free telephones and free colonoscopies, will squeak by.
And that means winter is coming…
Obama campaigned on the idea that he wanted to “spread the wealth around,” and that is precisely what he’s been doing: Borrowing trillions of dollars from the working middle class (who will have to pay the money back with future tax increases) and giving it all to key Democrat constituencies, including auto workers, government bureaucrats, teachers unions and, most of all, Wall Street banksters.
Without spending a penny of his own campaign money, therefore, Obama starts out with nearly half of the voting public already in his pocket — millions of state, city and federal government workers who expect to retire at age 50 with six-figure pensions and are outraged if they have to pay so much as a $10 co-pay for a doctor’s visit.
This entrenched government class will vote for any politician that promises to keep writing bigger and bigger checks… no matter how much debt that means saddling our children and their children with over the coming fifty years.
And make no mistake: the debt is mounting fast. In just four years, Barack Obama has borrowed an almost unimaginable $5 trillion from future workers – more than every other U.S. administration in history… combined.
According to the Congressional Budget Office, the current unfunded pension and health liabilities of the U.S. government now exceed $50 trillion… and U.S. debt could grow to nearly twice the size of the U.S. economy by 2037.
Of course, it goes without saying that none of this is an endorsement of Mitt Romney. Romney, like most establishment Republicans, is merely Obama Lite.
No national Democrat or Republican politician will take the steps necessary to avert fiscal collapse – such as drastically reducing government pensions, eliminating wasteful and redundant government programs, closing down half of U.S. military bases overseas, and reforming Medicare. At best, Romney, like Ronald Reagan, will only “slow the growth” of government spending, not make any real cuts.
Like the long winters in Game of Thrones, the coming fiscal winter will be long, cold and, for people who don’t drive around in Hollywood limousines, increasingly desperate.
The U.S. economy will continue to sputter… unemployment could climb into double digits… and an increasingly authoritarian government will use its new omnipresent Big Brother surveillance tools to track down “tax cheats” (meaning most of the working population) and squeeze more and more revenue out of fewer and fewer working people.
It will also mean a reenergized Kulturkampf against the Christian churches in general and the Catholic Church in particular: Once reelected, the Obama Administration will do everything in its power to force Catholic hospitals to perform abortions and sterilization procedures… force Catholic charities to accommodate gay marriage and adoption… and force Catholic schools to provide contraception and safe sex education. The emphasis, as always with the current administration, will be on force.
Yet, “Winter is Coming” does not mean the world is coming to an end.
Americans (and others) will adjust to the winter. They will do what citizens living under bullying regimes have always done: They will ignore the government whenever possible… cope with its nutty demands when they must… and live their lives as best they can in a stagnating economy. Rich people, like Obama’s Wall Street and Hollywood cronies, won’t mind the massive tax increases and high unemployment. There will still be plenty of money to be made for the agile and politically well-connected (just ask the founders of Solyndra). But poor people will mind.
However, nothing lasts forever in the world of politics and economics… and every disaster, including every corrupt regime, carries with it the seeds of its own overcoming.
Just as the American people quickly soured on the economic misery, crime and social unrest that Jimmy Carter and the big-spending Democrats brought with them in the 1970s, so, too, another four years of trillion-dollar annual deficits will eventually prove so unsupportable that the whole system collapses under its own weight.
The government will eventually default on its obligations, the credit rating of the U.S. will fall (again), and even big-spending politicians will recognize that things can’t continue as they are. As the old joke has it, socialism collapses when you run out of other people’s money. That’s what the people of Spain and Greece are discovering right now. That is what California is about to discover.
But in the meantime, winter is coming. It’s time to prepare. Dress warm.
NOTE: The author of Game of Thrones, the talented and very rich George R.R. Martin, is an avowed Democrat partisan who thinks Obama is “the most intelligent president we’ve had since Jimmy Carter.”
In the 1990s, a new generation of young, less ideologically driven, often female anthropologists and scholars made it their business to investigate prehistoric European religious cultures – and, when they did, they made an astonishing discovery: the religion of the Great Goddess was made up out of whole cloth.
“The evidence is overwhelming that Wicca is a distinctly new religion, a 1950s concoction influenced by such things as Masonic ritual and a late-nineteenth-century fascination with the esoteric and the occult, and that various assumptions informing the Wiccan view of history are deeply flawed,” wrote Charlotte Allen in The Atlantic. “Furthermore, scholars generally agree that there is no indication, either archaeological or in the written record, that any ancient people ever worshipped a single, archetypal goddess…”
In fact, according to Philip G. Davis, professor of religion at the University of Prince Edward Island and author of The Goddess Unmasked: The Rise of Neopagan Feminist Spirituality, much of what passes for “feminist spirituality” today is largely the creation of just one man (not one woman), an eccentric British bureaucrat named Gerald B. Gardner (1884-1964).
An avid Rosicrucian and occasional nudist, Gardner claimed to have learned the “Old Religion” from an ancient “coven” near his home in Highcliffe, Dorset, but Davis and other scholars who examined his unpublished papers now conclude that no such group ever existed. Rather, Gardner synthesized the Romantic musings of such 19th and early 20th century cranks as Charles G. Leland (author of Aradia, or the Gospel of the Witches), Margaret Murray (author of The Witch-Cult in Western Europe), Robert Graves (author of The White Goddess) and the infamous occultist Aleister Crowley.
What’s more, when women scholars began searching the records and relics of ancient societies for the peaceful religion of the “Great Goddess,” they discovered (to their dismay) that ancient pagans were a far cry from the feminist encounter groups back at Harvard Divinity School, lighting incense to “the Divine Sophia.”
“Searching for female images of the Divine, [religious feminists] inevitably turned to ancient pagan goddesses such as Isis of Egypt and Ishtar of Babylonia, and, in the process, adopted the romantic notion that the societies that worshipped them held women, sexuality, and nature in high regard,” writes Judith Antonelli, a religiously observant Jew and feminist author in Boston. “There’s just one problem: The fairy tale isn’t accurate. It whitewashes the male supremacy and militarism of ancient paganism, falsely attributing the origin of these phenomena to ‘the Hebrews.’ In the new goddess myth, Egypt and Babylonia are portrayed as benevolent, peaceful, and matriarchal societies, despite the fact that sexual abuse and exploitation, ritual castration, phallus worship, and even human sacrifice were all integral aspects of their religious traditions. Do women who are enchanted by Isis, for instance, know that worship of her involved the annual drowning of a young virgin girl in the Nile to assure a plentiful harvest? Do devotees of Ishtar realize that many of her priestesses were simply temple slaves who were branded with a star (Ishtar’s symbol) just like the animals that were dedicated to her?”
Actually, it gets worse.
Mainstream anthropologists now concede that there is no historical or archaeological evidence whatsoever that a true matriarchal society – one in which political power lay primarily in the hands of women – ever existed anywhere on earth, even among goddess-worshiping pagans.
There have been matrilineal societies, of course – such as traditional Judaism, ironically enough – in which children are identified primarily through the mother’s line. But even in these matrilineal societies, males have dominated.
In fact, it was only after Christianity was introduced in Western Europe – with its egalitarian ethos and pervasive cult of the Blessed Virgin Mary – that women first gained the respect and dignity which, Christian theologians insisted, God ordained from the beginning of creation.
“One of feminism’s irritating reflexes is its fashionable disdain for patriarchal society, to which nothing good is ever attributed,” writes the iconoclastic feminist scholar and lesbian intellectual Camille Paglia in her now-classic text, Sexual Personae. “But it is patriarchal society that has freed me as a woman. It is capitalism that has given me the leisure to sit at this desk writing this book. Let us stop being small minded about men and freely acknowledge the treasures their obsessiveness has poured into culture… we could make an epic catalog of male achievements, from paved roads, indoor plumbing, and washing machines to eyeglasses, antibiotics and disposable diapers… If civilization had been left in female hands, we would still be living in grass huts.”
Sentiments such as these are, of course, heresy among today’s aging gender feminists for whom “patriarchy” is the original, if not the only, sin.
It is certainly debatable whether testosterone-driven male “obsessiveness” was a necessary stage in the development of human civilization, but what is not debatable is that the status of women significantly improved (and almost exclusively in western Europe) as Christianity became the dominant religion of the Roman Empire. The reason for this was the example set by both the Jewish and Christian scriptures.
The Rights of Women in the Ancient World
The truth that many contemporary feminists don’t want to face is that, with a few rare exceptions, women never had anything close to equal rights with men throughout the long history of the ancient world.
The evidence for prehistory is mixed. Certainly, small tribal groups may well have exhibited a more easy-going familiarity between the sexes and a consequent quasi-equality. Then again, there is also evidence that some prehistoric societies were, like gorilla societies, brutally patriarchal, dominated by the strongest (alpha) males who simply took what they wanted – food, women, the best cave — with nary a please or a thank you.
Once we arrive in the period when records were kept, however, the status of women was definitely more limited.
In the Homeric epics, there are numerous powerful female figures – the goddesses and characters such as Circe and even the long-suffering, eternally chaste Penelope. But in Greek society itself, women had few if any rights. In the famous democracies of classical Greece, women had no vote. They could not sue in law courts. They couldn’t own property. They were rarely seen in public. As in ancient Japan, the Greeks expected their wives to be seen and not heard… although they did appreciate the company of prostitutes (hetairai) and concubines (pallakai), who appear to have had some influence.
As we saw in another essay, male children were more highly valued throughout the ancient world than female children, who were frequently killed through the widespread practice of infanticide – a grim sociological reality that gender feminists, through their advocacy of abortion-on-demand, have unintentionally brought back. Strangely enough, ancient Egypt — with its powerful queens — seems to have granted women more equality. Late Egyptian marriage contracts give women more rights.
In Rome, women had more rights than in ancient Greece, including the right to divorce, but very little political power. There were no female Roman senators or emperors. Not until the legalization of Christianity in Byzantium did true empresses appear — Irene (A.D. 752-803), Zoe (978-1050) and, of course, the formidable Theodora (984-1056). (The wife of Augustus, Livia Drusilla, functioned as “co-empress” in a sense and was an adviser to her son, Tiberius, but was not a true empress.)
Women in the Hebrew Bible
Women were generally treated better in ancient Israel and in early Judaism than in most pagan societies, but patriarchy still reigned supreme. There are passages in the Hebrew Bible in which women are portrayed as seductresses (Delilah in Judges 16, Potiphar’s wife in Genesis 39), or as inherently untrustworthy.
Many feminists insist that the story of Adam and Eve places most of the blame on Eve… even though it’s clear from the text, and from their mutual punishment, that the blame lies on both equally. Part of the punishment for disobeying God is that the man will “rule” or “dominate” his wife – although this also makes clear that such male domination was not what God willed in the beginning but is a consequence of sin. Certainly, the Mosaic Law contains many provisions that strike modern people as unjust and discriminatory against women… such as the provision that a bride (but not a groom) discovered not to be a virgin be stoned to death (Deut. 22: 13-21)… the fact that men (but not women) were allowed to have multiple spouses… the rule that property be passed on to male heirs but not to female ones (Numbers 27: 8-11)… the provision that an oath by a man is legally binding but not that of a woman if it is contradicted by her father or husband (Numbers 30)… and so on.
On the other hand, however, there are many passages in the Hebrew Bible that emphasize the fundamental equality of women with men – as well as their ingenuity, compassion and courage.
In the Hebrew Bible, women routinely outsmart the men and take the initiative. You have to be a singularly ideological feminist not to see the humor and pathos in the story of Abraham and his beautiful, powerful, rich and determined wife Sarah. She is a woman of such beauty that, when she and Abraham flee to Egypt, Abraham tells the Egyptians that she is his sister, not his wife, because he fears the Egyptians will simply kill him and take her for themselves – and indeed, Sarah attracts the attention of none other than the Pharaoh himself. Yet Sarah remains childless into her old age – and so she, and not Abraham, proposes that her tired old husband sleep with the young Egyptian servant-girl Hagar (the first recorded act of surrogate motherhood) so Abraham will have an heir and all their property will not be deeded to their bondservant. Inevitably, the nubile Hagar begins to act haughtily towards her old mistress – and Sarah is not a happy woman. “You are responsible for this outrage!” the Bible records Sarah screaming at Abraham. “I myself gave my maid to your embrace; but ever since she became aware of her pregnancy, she has been looking on me with disdain. May the Lord decide between you and me!”
Hardly a docile wife.
Finally, when Abraham informs his wife that God will perform a miracle and that she will conceive a son after menopause, she laughs – a bit at her old husband’s expense. Sarah says, “Now that I am so withered and my husband is so old, am I still to have pleasure (edna)?”
Here’s what sucks about life: You wake up in your crib, confused and more than a little dazed, and then spend the next 20 or 30 years trying to figure out what to do with yourself.
You mostly do what you’re told. You learn how to read, play sports, try to attract members of the opposite sex. In your 20s, you look for some kind of job – and maybe decide to settle down, get married and have kids. But life, as they say, doesn’t exactly come with an operating manual – whatever people may say about the Bible or other holy texts. Even if it did, no one would have time to read it.
We catch what wisdom we can on the fly, from our friends mostly, a little from our parents, more than we care to admit from ideals dreamed up for us on television and in movies.
It isn’t very much to go on.
And then problems start coming, fast and furious. Life is harder, more complex, than TV would have us believe. Kids get sick. Spouses get angry, bored or indifferent. Work is not exactly a dreamland of creativity and fun.
Before you know it, you hit 30 and the bills are getting scary, the problems even harder. You still have dreams but you’re concentrating on survival, keeping your head above water. The big things you want to do with your life will just have to wait for a while. Reality bites.
But then, before you know it, you’re 40! Holy smokes! Someone close to you dies, a parent perhaps. You get sued. You get arrested for drunk driving. Your spouse cheats on you. Your business goes bust. You get seriously hurt in an accident – or develop a life-threatening disease.
It is often then, perhaps for the first time, that you realize you might need help – that you’ve done the best you could but perhaps it’s time to reflect a little, to question the assumptions that have guided your life up until now, to reevaluate where you are and where you are going. For some, this process begins early; for others, later. But eventually we all realize that we need more wisdom than is available to us on HBO… or on our favorite blog.
That’s the purpose of religion – and why parents desperately try to keep their teenage children connected, in whatever way is possible, to a religious community. Religion is nothing more than the repository of humanity’s accumulated life wisdom, won over millennia of trial and error and haphazard encounters with the Infinite.
Many people, scared away from religion by TV evangelists or other nightmare experiences, look for wisdom elsewhere. Maybe in politics, or pop psychology, or the Law of Attraction. Or they decide, like Descartes, that the only wisdom worth having is what they can discover for themselves. Fair enough.
But parents instinctively sense that their children will someday need better advice than what is available on TV talk shows or the Internet – and so drag their reluctant offspring to synagogue, or to church on Sunday, or to Mormon institute classes, or to their Zen sitting group.
They want to at least put them into contact with a bigger community of shared spiritual values, with the great prophets and mystics of their religious tradition, with its saints and even sinners, with the ideals and ideas that shaped their own souls and which can help guide them through the treacherous rapids that are real life.
Children, of course, are bored silly by this. How can the Bible compete with YouTube?
Mark Twain once famously described the Book of Mormon as “chloroform in print,” but, with all deference to Joseph Smith, it’s fair to say that’s an equally accurate description of most holy texts.
I say this as someone who actually loves the Bible and spent a decade of graduate study learning more about it. But it takes time, and more than a little study, to appreciate the wisdom and beauty in the Bible – an odd anthology of ancient Hebrew and Greek writings utterly removed from the reality of 21st century modern industrial society. Expecting a modern teenage boy to be moved by, and gain wisdom from, the Torah or St. Paul’s Letter to the Romans is like expecting him to be similarly inspired by the Analects of Confucius or the U.S. Constitution. It’s asking a lot, usually too much.
Yet parents know they only have so much time to introduce their children to what matters in life, and most of what matters – in terms of marriage and family, birth and death, God and our purpose on earth – is found in the teachings of the world’s great religions. However, because the accumulated life wisdom that is found in religious tradition can really only be appreciated much later in life – when you actually need it — the best most parents can hope for is to introduce their children to the sources of this wisdom, the religious communities in which it is found and passed down, and do their best to give their children warm fuzzy feelings about the community so they will return to it later as adults.
In other words, smart parents recognize that children are often bored out of their heads by Sunday Mass, or Hebrew school, or whatever, but do their best to find activities and communities that have enough fun and sociability in them that their children are not put off forever.
Catholics are pretty bad at this, in my experience. It takes considerable skill and wisdom to trick teenagers into learning about a religious heritage – and most efforts, quite frankly, are pathetic. I learned this first-hand when teaching a Confirmation Class at my local Catholic parish, to fourteen very bored fourteen-year-olds. A teenager’s mind is on sex and maybe sports, not religious doctrine.
Evangelicals seem better at this, what with Christian rock, Young Life, Campus Crusade for Christ, Bikers for Christ and hip urban Christian magazines. Still, even they have trouble. I don’t know for sure, but Mormons seem able to keep their kids connected. Jews, too. Yet baby boomer ex-hippies struggle to hand on their Zen or Hare Krishna beliefs to their offspring (my first paid magazine article was on the children of the Hare Krishnas)… who rebel and join strange cults, like the Greek Orthodox Church. Just kidding.
My point is that all knowledge is cumulative, a series of insights that are passed on, over the centuries, and which lay the groundwork for further insights. We learn from and build upon the past.
The life wisdom that human beings need, spiritual knowledge, is the same. To ignore the spiritual wisdom of our religious traditions is to perpetually reinvent the wheel each generation, to start over from scratch.
That doesn’t mean we shouldn’t question how the wheel was made in the past, or wonder if we could re-design it, or see if there might be alternatives to the wheel that would work better. That is the nature of human knowledge, to constantly test inherited insights against current problems. But we gotta start somewhere, and that somewhere, for most people, is the collective wisdom found in the world’s great spiritual and religious traditions.
My advice: Go to church… or shul… or your local Scientology seminar… or your mother’s Zen sitting group. Go to whatever spiritual tradition you were born into until you find something better.
There are many different ways of life, of course, and each person has to choose the way that fits his or her personality and intuitions about what life is all about and how to be happy. There is the way of the adventurer. The way of the businessman. The way of the scholar or priest. There is the way of the artist or mystic. There is even Gurdjieff’s Way of the Sly Man, the secret mystic who lives like an ordinary businessman. But because I studied Aristotle at a young age, I’ve always been persuaded that, when considering how to “structure” your life, you should consider how best to use your God-given talents. In his Nicomachean Ethics (1095a15–22), Aristotle said that happiness (eudaimonia) comes from the full exercise of your powers… from using your gifts… and I’ve always thought that is true. And that is why, for me personally, one of the dominant themes of my life has always been what I call balance – the attempt to arrange your life, insofar as it’s possible, so that you are able to use as many of your abilities as possible. Perhaps this is actually the Way of the Dilettante, but I prefer to think of it as the Way of the Renaissance. In other words, I wanted a life in which I could marry, raise a family, think, create, study, make money, travel, play sports, stay in shape, play music, read books and, in general, pursue my interests and passions. I tried to make choices, as life went on in its haphazard way, that created the conditions in which this multi-faceted, balanced life would be possible.
For example, I knew early on that I wanted to be my own boss – both because I would probably make more money with my own businesses and because it meant I would have more free time, the ability to travel, the ability to see my children growing up, and so on. As a result, I never really pursued any sort of corporate job or career. This has its disadvantages, of course. We’ve always had to pay for our own health insurance and medical costs, for example – and to this day we marvel when friends complain about their $30 “co-pays” and rising insurance costs. We paid for each of our children’s deliveries, about $5,000 each, almost literally in cash. We’ve also had to create our own Defined Benefit Pension Plan with its myriad federal regulations, its mandatory reporting requirements, its frequent demands for cash, and what I like to call the “adult supervision” of a professional pension fund administrator – a delightful woman, the “dragon lady,” who is an Orthodox Jew and who does her best to keep us out of trouble with the Feds.
But overall, being self-employed, in my opinion, gives you many more opportunities for the “full exercise of your powers” – and for happiness – than working in a nine-to-five corporate job. I am revisiting all these issues afresh because, as I write these words, my eldest son is plotting his own career trajectory in the corporate world of high finance – and I marvel both at his ambitious determination and at the assumptions underlying his plotting. His path – trying to fashion a career in a corporate setting – is so utterly alien to my own way of life that I am only now appreciating the stubborn but quite deliberate choices that went into ours.
Another part of living a balanced life is making money – not a lot of money, perhaps, but enough to provide a safe and comfortable home, in a quiet and secure neighborhood, and so that you can afford such luxuries as sports teams, music and language lessons, health care, good schools and so on. If you want to marry and raise children – which, for most people, is the most realistic path to becoming a decent human being and whatever enlightenment is granted us on this earth — a minimum amount of money is a requirement. The practical upshot of this, for me, was that I didn’t want to choose businesses or jobs that would make me too poor. I’ve never really been all that materialistic (as anyone who sees the old truck I drive or my clothes would confirm) but I do like to travel, buy books, study Aikido and philosophy in my spare time, and provide educational opportunities for my children. This meant that my wife and I had to figure out how to make money – and thus becoming a starving artist wasn’t a choice I was prepared to make. I admire artists for their single-minded dedication to their art… and I actually would encourage anyone with serious talent to pursue art or music as a career choice… but you still have to earn a living, artist or no.
When young people call me up, as some do, and ask me if they should become writers, I always say the same thing: Absolutely! It’s the best way of life in the world! My only caveat is that, to be happy, most people will want to marry and have children, to exercise all of their powers – not just their artistic ones – and that you therefore have to balance your artistic pursuits with the need to make money and provide a comfortable home. You want to enjoy your body and stay in shape. Play tennis or softball. Go to yoga classes. This is self-evident to many people but not to all, especially not to all of my children. When you are young and idealistic, you want to give yourself over to a great artistic passion or project – to spend years working on plays that never get produced, or a great novel, or painting, or a rock band. In your early twenties, that’s what you should do – test out your abilities and explore different ways of making a living. But if you want to have a happy life, you need to know that you have to balance the desire for creative pursuits with the need to make a decent living – not to “sell out” but in order that you can “exercise your full powers,” so you’re able to become a full human being.
Again, I am only thinking about these issues because I have so many children. But I really do believe balance in life is essential, perhaps even a key to happiness – even if you decide that your talents lie in science, or engineering, or medicine. For example, my eldest daughter is thinking about becoming a doctor. My wife likes this idea because her sister is a doctor and she likes the economic security that being a doctor can provide to women, especially in an increasingly competitive global economy. I think that’s great, of course, and will do everything I can to help my daughter through medical school, if she decides to pursue that course. My only caution to her would be to strive for balance – to think about how to balance the demands of a medical career with the needs and expectations of family life, her musical talents, her passion for swimming and athletics. Medicine is a fairly demanding and monomaniacal profession… but I know it’s possible to build a balanced life as a doctor, as my younger brother and my sister-in-law have proven. But it takes effort and deliberate choices.
Anyone who has struggled with the arcane texts of contemporary analytic philosophers will appreciate this delightful cartoon on YouTube.
For a beach philosophizer like myself, it doesn’t get much better than “The Nature of Existence,” the quirky little documentary on the Meaning of Life that is opening this weekend. Filmmaker Roger Nygard wrote down the 85 toughest questions he could think of about the meaning of life — and then set out with a camera crew to ask them of such luminaries as Indian holy man Sri Sri Ravi Shankar (The Art of Living), professional atheist polemicist Richard Dawkins (The God Delusion), 24th generation Chinese Taoist Master Zhang Chengda, Stanford physicist Leonard Susskind (co-discoverer of string theory), wrestler Rob Adonis (founder of Ultimate Christian Wrestling), confrontational evangelist Brother Jed Smock, novelist Orson Scott Card (Ender’s Game), director Irvin Kershner (Star Wars: The Empire Strikes Back), Stonehenge Druids Rollo Maughfling & King Arthur Pendragon and many more. The result is his interesting little film:
It goes without saying that Supreme Court nominee (soon to be justice) Elena Kagan is a charming, intelligent, well-spoken woman who, unlike most politicians, gives every impression of being a genuine “moderate” in her views, someone who understands the complexities involved in great social issues and who is willing to acknowledge that people of good will could disagree with her.
Yet there was one point in the confirmation hearings today that revealed an ideological seed that, I fear, will grow into something quite disturbing during the 30 years or more she will be seated on the bench. And that is her apparent agnosticism towards the existence of unalienable natural or human rights.
Questioned by Sen. Tom Coburn, R-Okla., about whether she believed in “unalienable rights,” such as those referenced in the Declaration of Independence, Kagan replied quite firmly that she did not.
“You should not want me to act in any way on the basis of such a belief” in people’s rights outside the Constitution and laws, Kagan said. “I think you should want me to act on the basis of law.”
This exchange reveals that Kagan is, as many liberals today are, a believer in what is known as legal positivism. Legal positivism was a highly influential theory of jurisprudence throughout the first half of the twentieth century. But the horrors of World War II and Communist and Nazi totalitarianism made many law professors rethink whether it is a good idea to teach the doctrine that what is legal is whatever the State says is legal.
After all, Adolf Hitler rose to power through Germany’s democratic and legal institutions. The summary executions and brutalities of the Communist regimes were “legal” in the sense that the State authorized and approved them.
Much of what has gone wrong in western law over the last 150 years — from the approval of slavery in the Dred Scott decision to the legalization of abortion in Roe v. Wade — stems from this fundamental, anti-Christian belief that basic human rights do not really exist, that the State may grant, or take away, whatever rights and “privileges” it deems necessary.
In contrast to the “new” theories of rights advocated by Hobbes, Bentham and others, the classic Judaeo-Christian view (expressed most succinctly in natural law theory) has always been that governmental elites must answer to a higher law than mere human legislation.
When governments repeatedly transgress these fundamental human rights, it is the right of the people, as Jefferson put it in the Declaration of Independence, “to throw off such government and to provide new guards for their future security.”
Governments that fail to respect the “unalienable rights” endowed by God are tyrannies and, therefore, illegitimate.
Once again, these ideas stem, not from atheistic philosophers, but from Christian theologians reflecting upon the truths found in the Bible.
The notion of a Divine Law above mere human law was expressed clearly by Thomas Aquinas in his Summa Theologica, ratified by John Calvin in his Institutes, and summarized succinctly by Sir William Blackstone in his Commentaries on the Laws of England (1765), one of the chief sources used by Jefferson (and all the Colonists) in crafting the new American government.
According to Blackstone, civil law is given, not to create rights, but to protect already pre-existing natural rights.
The primary object of law, he says, is to maintain and regulate those “absolute rights of individuals … such as would belong to man in a state of nature, and which every man is entitled to enjoy, whether in society or out of society.”
“Those rights then which God and nature have established, and are therefore called natural rights, such as are life and liberty, need not the aid of human laws to be more effectually invested in every man than they are; neither do they receive any additional strength when declared by the municipal laws to be inviolable. On the contrary, no human legislature has power to abridge or destroy them, unless the owner shall himself commit some act that amounts to a forfeiture.”
This great tradition of classical natural right — which extended from the biblical prophets and the teaching of Christ through the medieval scholastics and Protestant divines up to the U.S. Declaration of Independence—was challenged directly by what is sometimes called “political atheism.”
A long string of anti-Christian thinkers—first Machiavelli, then Thomas Hobbes, and finally Jeremy Bentham and John Austin—rejected the notion of human rights as nothing more than, as Bentham put it, “anarchical fallacies.”
The English political philosopher Thomas Hobbes (1588–1679) advocated a strong totalitarian government (the “leviathan”) as the only way to save human beings from themselves. Famously describing human life as “solitary, poor, nasty, brutish, and short,” Hobbes insisted that the reality of human interaction was that of “war of every one against everyone.”
From this, he says, it follows that “nothing can be unjust. The notions of right and wrong, justice and injustice have no place [in the state of nature]” (Leviathan, 13.13).
The only hope for a modicum of peace and civilization, Hobbes thought, was for individuals to surrender irrevocably their natural rights to a totalitarian state (such as an absolute monarch). Fear of the “leviathan” would force selfish and violent men to maintain order and limit their crimes. For this reason, Hobbes rejected the Christian notion that individuals could ever disobey immoral laws or criticize the State in any way.
Nevertheless, Hobbes accepted the classic notion, developed in the early Middle Ages, that government derives its powers from the “consent of the governed.” This “social contract” idea did influence the American Founders. But for Hobbes, the “consent,” once made by a majority, is irrevocable and cannot be changed or transferred without the permission of the original sovereign.
It sometimes happens that a sovereign puts to death an innocent man, he says, but, in essence, this is simply the price people must pay for having a government. “Nothing the sovereign representative can do to a subject, on what pretense soever, can properly be called an injustice or injury, because every subject is author of every act the sovereign does,” he says. For this reason, “tyranny” is merely an empty word for a monarchy that someone dislikes … just as “oligarchy” is merely a word used for an aristocracy that someone dislikes.
Needless to say, the American Founders didn’t accept Hobbes’s notion of absolute, unquestioning obedience to the State. Rather, they worked out an entirely new theory of government that combined the best of both the classic Christian theory of natural rights and the notion, drawn from the early Middle Ages, that the legitimacy of a government was the result of the “consent of the governed.” But unlike Hobbes, the Founders believed that, when a government violated a people’s natural rights “endowed by their Creator,” then it was the “duty” of the people to “alter or abolish it” and to “institute new government.”
Not surprisingly, not everyone took to this new theory of government. Totalitarians throughout history, whether monarchists or communists, dislike the idea that the power of government can be constrained in any way.
The English philosopher Jeremy Bentham (1748–1832), who witnessed the American Revolution, rejected entirely the notion of natural rights and the entire natural law tradition. His particular targets were Sir William Blackstone and John Locke. The idea of natural rights, he said, “is simple nonsense: natural and imprescriptible rights, rhetorical nonsense — nonsense upon stilts.” The founder of the movement called “legal positivism,” which is still influential today, Bentham believed that the only rights that truly exist are rights created by the civil government. If the government hasn’t granted the right, he said, it’s not a right but a wish. Whereas Aquinas and other Christian thinkers insisted that a civil law that violates the law of God is not a true law and can be justly disobeyed, Bentham and the legal positivists insist that this is not the case. According to John Austin, another influential founder of legal positivism, morality and law have nothing to do with one another. The validity of a law lies only in the fact that it is proclaimed by a sovereign. The influence of Thomas Hobbes is evident throughout legal positivism. The power of the state is virtually limitless.
This is the disturbing philosophical pedigree, if you will, to Elena Kagan’s off-hand remark that she doesn’t recognize any natural rights “not found in the Constitution or in law.” In other words, if the State doesn’t say it’s wrong… it isn’t wrong. If the State says it’s okay to do something, it’s okay.
After the horrors of the Nazi era, we know that isn’t true. Elena Kagan knows it, too. And that’s why, as nice as she is, what she said is so disturbing to anyone who believes in human rights.
Jacob Weisberg wrote a thoughtful piece on the various factions on the Right. It was published on Newsweek’s dying blog:
Here’s my favorite paragraph:
The GOP’s new Western tone harks back to Goldwater’s disastrous but transformational presidential campaign of 1964. Goldwater didn’t care about religion—he was a Jewish Episcopalian who once said that Jerry Falwell deserved a kick in the nuts. He wasn’t focused on racial politics; there weren’t many black people in Arizona then. What mattered to him was limiting government and preserving liberty. To Goldwater, political freedom was inseparable from economic freedom, a view distilled in his most famous phrase: “Extremism in the defense of liberty is no vice.”
It was great analysis until the end… when Weisberg predictably proclaimed that what America needs is “a conservatism that hasn’t been in evidence lately—a version that’s not Western, Southern, or Eastern, but instead tolerant, moderate, and mainstream.”
In other words, a “conservatism” that is anything but conservative… a sort of Country Club Republicanism that most Americans don’t want and which consistently loses.
One of my favorite Catholic blogs, Lex Christianorum, had an interesting post yesterday on my philosophical guru, Bernard J.F. Lonergan. Lonergan is accused of being… the horror!… a Kantian in disguise. Of course, this comes as no surprise to anyone who knows Lonergan or his work. It’s the standard slam against almost all of the so-called Transcendental Thomists (Marechal, Rahner, Coreth) and is somewhat justified. Nevertheless, I find Lonergan to be the best philosophical synthesis between classical Thomism and the new statistical methods of modern science — far better, in my view, than the work of Jacques Maritain. Here is just one of Lex Christianorum’s digs. The post discusses a new book, By Nature Equal: The Anatomy of a Western Insight by John E. Coons & Patrick M. Brennan.
Their [Coons' and Brennan's] radical reinterpretation of the tradition is not of their own breeding, but comes from their devotion to the theological method of Bernard Lonergan (1904-1984), whose system must be viewed with suspicion as it led him to dissent the Church’s teaching on artificial contraception as contained in Pope Paul VI’s encyclical Humanae Vitae, implying that somewhere, somehow Lonergan got it badly wrong. Lonergan despaired of any ability of the human mind to comprehend objective being or objective good (as traditionally understood as the correspondence of the mind to the external reality, veritas est adaequatio rei et intellectus). So he escaped like any good Kantian would from the effort. He withdrew, much like a tortoise into its shell, or perhaps better, imploded, sort of like gravity and light into a black hole, into a subjective objectivity, or objective subjectivity, clearly confusing both dimensions of reality. The objective retracts into or collapses into the subjective, and, in all but name, objectivity becomes subjectivity.
This is tremendously unfair to Lonergan, I think. Funny, but unfair. Lonergan’s great insight was that objective knowledge is not, as we imagine, “taking a look” at the “really real,” but is the fruit of the act of judgment. Human knowing is a grasp of the virtually unconditioned, a judgment of fact that something is or is not so.
A week ago, I was scrambling to get my daughters off to a major swim meet at six o’clock in the morning, when I got a strange email: I was being invited to come to the Philosophical Society at University College Cork, Ireland, and participate in a debate about “whether this house would” reject atheism. The debate would be in mid-March, just a few weeks away.
I was flabbergasted! I couldn’t for the life of me figure out why they had invited me, of all people. I have no qualifications as a philosopher (my B.A. in philosophy, as impressive as it is, doesn’t quite cut it). I haven’t written a book about atheism. My only academic work has been in Biblical studies, and even that is pretty popular.
Of course, my wife’s first reaction, upon reading the email, was to say, “Well, plainly they must be desperate…”
But I was tempted. The university was offering to pay all my expenses… plane ticket and beer money… and I’ve always wanted to see both Cork and its rather famous university. What’s more, the “Philosoph,” as the UCC Philosophical Society is known, has had quite a few famous speakers and debaters address it, including most Irish politicians and even the prime minister.
However, in the end, sanity prevailed. The debate fell at almost exactly the same time that I am taking my whole extended family on a 10-day trip to Rome. If I accepted, I would think about nothing else for three weeks and would get nothing done. It was just too much. I reluctantly declined.
Alas, that hasn’t stopped me from thinking about what I would have said. Why should “this house” reject atheism? Here are a few ideas off the top of my head:
1. That This House Would Reject Atheism because atheism is, by its very nature, irrational, and universities should not encourage more irrationality than they already do. Logically speaking, atheism is what is known as a “universal negation” — such as “there are no gods” or “there are no purple toads” — that is impossible to prove empirically. (That’s one of the few things I remember from the Introduction to Logic class I took in college.) Therefore, precisely because atheism is illogical, an assertion of empirical fact without empirical evidence, it should be rejected as the official doctrine of any university regardless of its religious orientation.
2. That This House Would Reject Atheism because atheism, unlike agnosticism or theism, promotes the closing of the human mind rather than the opening up of it. Atheism asserts dogmatically that the case for God is not “unproven” but CLOSED — Science has “settled it,” rather like climate change — and this closed-mindedness represents the very antithesis of the scientific spirit that universities are supposed to inculcate in the young and foolish.
3. That This House Would Reject Atheism because atheism, in its most popular contemporary forms — as represented by such authors as Richard Dawkins, Sam Harris, Christopher Hitchens and Daniel Dennett — offers primarily insults, not arguments. The so-called “new atheist” writers merely belittle the vast majority of mankind as intellectually dim-witted compared to, well, themselves. “Atheism is nothing more than the noises reasonable people make when in the presence of religious dogma,” said Sam Harris in The End of Faith, which is the closest he actually gets to mounting a philosophical argument of any kind. Philip Roth once said that “Roman Catholicism would insult the intelligence of a gorilla,” just as Clarence Darrow, the famous scourge of creationism, said he didn’t believe in God “for the same reason I don’t believe in Mother Goose.” Daniel Dennett famously proposed that atheists call themselves “brights” to contrast themselves with the “dims” who believe in God. Universities should not promote an ideology that merely insults or belittles the vast majority of human beings on the planet.
4. That This House Would Reject Atheism because atheism has, historically and more ominously, been a force for intellectual intolerance and, at times, political oppression. The atheism of the French Revolution led inexorably to the guillotine… just as the atheism of Stalin, Mao and Pol Pot led to concentration camps and genocide. The New Atheists, like Sam Harris and Dawkins, while not as murderous as their intellectual forebears, nevertheless display the same authoritarian streak. “I hope to show that the very ideal of religious tolerance — born of the notion that every human being should be free to believe whatever he wants about God — is one of the principal forces driving us toward the abyss,” Harris writes (The End of Faith, p. 15). Universities should not embrace any ideology that asserts human beings should NOT be free to believe whatever they wish about God.
5. That This House Would Reject Atheism because atheism’s version of empiricism represents a crabbed, discredited 19th century “scientism” that rejects out of hand virtually every intellectual endeavor not conducted with lab equipment. In philosophical terms, atheism adopts a positivistic epistemology that is intellectually indefensible (if it can’t be quantified or measured with instruments, it doesn’t exist or isn’t worth thinking about). Universities in the 21st century should not revert to philosophical doctrines that the vast majority of actual scientists, philosophers and university professors no longer accept.
6. That This House Would Reject Atheism because atheism’s principal argument — that order and complexity in the universe can be better explained as the result of chance than intelligence — merely asserts what it must prove. Anyone who has studied Richard Dawkins’s writings knows this. Dawkins’s primary argument is that the overwhelming fact of order, complexity and design in the universe is merely an ILLUSION that can be explained as the result of random events. However, he never actually provides that explanation. Instead, all Dawkins does is show that the theory of evolution offers a plausible account of how biological change could occur through random forces — and then merely asserts that a similar mechanism must exist for the physical universe (what he terms a “crane”), even though he concedes none is known. It is just as logically plausible that order and complexity in the universe are the result of intelligence as it is that they are the result of a yet-to-be-determined random mechanism. Universities should not promote as established fact what is, instead, merely a hypothesis.
7. That This House Would Reject Atheism because atheism’s secondary argument, that an appeal to an intelligent creator involves circular reasoning, is logically fallacious. This secondary argument is the closest Dawkins comes to an actual philosophical argument (Hitchens and Harris don’t even bother with philosophy: they merely throw insults and glib one-liners). Dawkins claims that, while postulating the existence of an intelligent creator does make at least a little intellectual sense, given the existence of so much interlocking complexity in the universe, nevertheless such a hypothesis is intellectually indefensible because it merely begs the question, “Well, then who created God?” But in fact it does no such thing: If I find a baby on my doorstep, and I assert that SOMEONE must have put him there, the fact that I cannot explain who put that someone in a position to put the baby there does not justify me asserting that NO ONE put the baby there… that the baby must have appeared randomly out of nowhere.
8. That This House Would Reject Atheism because atheism, at its very root, involves a fundamental rejection of an empirical, even metaphysical fact that all intelligent human beings (except atheists) accept: that nothing happens without a cause. To accept atheism is to believe that things happen without purpose, certainly, but also without any cause whatsoever: The universe “just is,” as Dawkins puts it. While this is logically possible, it contradicts the overwhelming evidence of our own eyes and even of all scientific inquiry: Virtually everything we know about DOES have a cause. Science was born precisely because the Christian belief in divine “laws” gave early modern scientists the stamina to search for underlying causes they were certain MUST exist. Had western Europe embraced atheism, as the Chinese embraced the mysterious Tao, science as we know it would never have been born. Universities should recognize the role that religious belief has played in the intellectual development of mankind… rather than merely belittling it as atheism does.
I would like to briefly examine the claim, made by advocates of Neo-Darwinism and others, that advances in contemporary systems theory now give a rational explanation for the development of highly complex structures in the universe without recourse to the hypothesis of a Divine Creator.
Further, I will show that such claims, while purporting to be based on the evidence of empirical science, are, as certain postmodern philosophers of science have shown, metaphysical assertions. I will offer a few brief remarks on how advances in the mathematics of complex systems (illustrated by cybernetics and so-called chaos theory) actually can be reconciled with a theory of theistic evolution. Finally, I will discuss how the “critical realist” philosophy of the Canadian Jesuit cognitional theorist and theologian, Bernard J.F. Lonergan, offers a coherent response to the dogmatic scientism of the neo-Darwinists, on the one hand, and the simplistic “pseudo-science, relativism and nihilism” of postmodern philosophy on the other. You do not have to throw out the baby of logical coherence and rationality with the bath water (rightly critiqued by postmodern theorists) of metaphysical naturalism and scientism.
The Blind Watchmaker
Many contemporary Christians, especially those without training in mathematics, the metatheory of logic or the philosophy of science, are under the impression that the teleological argument for the existence of God has been definitively refuted by new developments in cybernetic systems theory, fractal geometry and evolutionary biology. This refutation is symbolized, in popular culture, by the widely influential book, The Blind Watchmaker, written in 1986 by the British zoologist Richard Dawkins. Dawkins purports, and is purported by many others, to have delivered an analytical coup de grâce to the classic “argument from design” as formulated, for example, by the 18th century theologian William Paley. Paley argued that, just as a watch is far too complex and functional to have simply sprung into existence by chance, and so provides indubitable evidence of the existence of an intelligent watchmaker, so, too, the universe’s far greater complexity and functionality are proof of purposeful design by a Divine Watchmaker.
Au contraire, says Dawkins. The complexity and apparent functionality of the universe only give the illusion of design and planning. In reality, the intricate complexity inherent in the universe’s systems is merely the result of blind, unconscious natural forces. “There may be good reasons for belief in God, but the argument from design is not one of them,” he writes.
“Despite all appearances to the contrary, there is no watchmaker in nature beyond the blind forces of physics, albeit deployed in a very special way. Natural selection, the blind, unconscious, automatic process which Darwin discovered, and which we now know is the explanation for the existence and apparently purposeful form of all life, has no purpose in mind. It has no mind and no mind’s eye. It does not plan for the future. It has no vision, no foresight, no sight at all. If it can be said to play the role of watchmaker in nature, it is the blind watchmaker.”
Advanced Systems Theory and Evolution
Dawkins’s assertion that random mutations alone explain what he calls “cumulative selection” – the gradual evolution of more and more complex biological structures – has seemingly been buttressed in recent years by rapid developments in systems theory, aided, of course, by the analytical tools used in creating new supercomputers. For our purposes, systems theory actually has two relevant components.
(1) Chaos theory, pioneered by such scientists as Edward Lorenz, is the scientific study of simple, nonlinear, dynamic systems that give the appearance of random activity but which are actually the result of simple deterministic forces. A practical example of chaos theory is fractal geometry and the study of snowflakes, which show how simple processes can give rise to apparently random variations of immense complexity.
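The central claim here, that a simple deterministic rule can generate behavior indistinguishable from randomness, can be made concrete in a few lines of code. The logistic map below is a standard textbook example from chaos theory (my own illustration, not code drawn from Lorenz’s work): the same one-line rule, iterated from two starting points that differ by one part in a million, quickly produces trajectories that bear no resemblance to one another.

```python
# Logistic map: x_next = r * x * (1 - x).
# A fully deterministic one-line rule that, for r near 4, behaves
# chaotically: its output looks random and shows "sensitive
# dependence on initial conditions" (the butterfly effect).

def logistic_trajectory(x0, r=3.9, steps=50):
    """Iterate the logistic map from x0 and return the whole trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)
b = logistic_trajectory(0.200001)  # starting point differs by one millionth

# The rule is deterministic: the same seed always gives the same path.
assert logistic_trajectory(0.2) == logistic_trajectory(0.2)

# Yet the two nearly identical seeds diverge dramatically within 50 steps.
print(max(abs(x - y) for x, y in zip(a, b)))
```

Nothing random ever happens in this little program, yet after a few dozen iterations the two trajectories have completely parted ways. That is exactly the phenomenon chaos theory studies: apparent randomness produced by simple deterministic forces.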
(2) Cybernetics, pioneered by the American mathematician Norbert Wiener and the Hungarian-born mathematician John von Neumann (d. 1957), and carried forward by the Nobel Prize-winning chemist Ilya Prigogine at the University of Brussels, is the scientific study of what are called “self-organizing systems.” Self-organizing systems are complex assemblies that generate simple emergent behaviors. Practical applications of self-organizing systems studies can be found in the study of cellular automata (self-reproducing systems), neural networks (artificial learning), genetic algorithms (evolution), artificial life (agent behavior), fractals (mathematical art) and physics (spin glasses).
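Cellular automata, the first item on that list, offer a small concrete sketch of the idea (a generic illustration of my own, not code from von Neumann or Prigogine): each cell in a row updates according to a trivial local rule involving only itself and its two neighbors, yet the global pattern that emerges can be strikingly intricate. Rule 90, used below, grows a Sierpinski-triangle fractal from a single live cell, tying together the cellular automata and fractals just mentioned.

```python
# Elementary cellular automaton: a row of 0/1 cells, where each cell's
# next state depends only on (left neighbor, itself, right neighbor).
# The 8 possible neighborhoods index into the bits of a rule number 0-255.

def step(cells, rule):
    """Advance a row of 0/1 cells one generation under the given rule."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

width = 31
row = [0] * width
row[width // 2] = 1  # start from a single live cell in the middle

# Rule 90 (each cell becomes the XOR of its two neighbors) draws a
# Sierpinski-like fractal triangle out of this trivially simple rule.
for _ in range(15):
    print("".join("#" if c else "." for c in row))
    row = step(row, 90)
```

Fifteen printed generations are enough to see the self-similar triangle take shape: immense apparent complexity, all of it generated by a rule you can state in one sentence.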
Interestingly enough, systems theory is not really the stalwart ally that advocates of a blind, random universe believe it to be. And in fact, many Neo-Darwinist theoreticians now recognize this. The inability of Darwinist and Neo-Darwinist theories to convincingly explain the origin of life from non-life is part of the reason why “self-organizing systems” are among the hottest topics in the philosophy of science. Further, analysts who study self-organizing systems often insist that they resist reductionist explanations, indeed that the properties that emerge are not explicable from a purely reductionist viewpoint. This is why systems theory has been so enthusiastically embraced by advocates of process theology: it provides for a rigorous scientific study of the complex processes of nature and yet does not reject the existence of a Divine Intelligence that set these processes in motion in the first place.
In other words, systems theory, like any branch of science, can be viewed as merely the rigorous, mathematically-based description of actual processes that exist in nature. It describes precisely how these processes work themselves out in practice – simple forces giving rise to seemingly random, complex structures (chaos theory) and complex systems giving rise to simple behaviors (self-organizing systems). Neo-Darwinists want to pretend that these bare empirical descriptions alone constitute a rational explanation for the complexity of the universe, but of course that goes far beyond the scope of systems theory as an empirical, descriptive discipline.
The Philosophical Temptation
That is why, when all is said and done, Dawkins, like many scientists before him, can’t resist abandoning science for philosophy. The crux of Dawkins’s argument in favor of a blind, random universe is not, as he imagines, scientific analysis but a metaphysical assertion.
Dawkins’s rejection of theism is actually the old objection that recourse to an original “first cause” is essentially a circular argument. After hundreds of pages in which he attempts to show how the complex structures of nature are the result of natural selection and random mutation, he must, in the end, resort to a philosophical argument. “To explain the origin of the DNA/protein machine by invoking a supernatural Designer is to explain precisely nothing, for it leaves unexplained the origin of the Designer,” he says. “You have to say something like, ‘God was always there,’ and if you allow yourself that kind of lazy way out, you might as well just say, ‘DNA was always there,’ or ‘Life was always there,’ and be done with it.”
But Dawkins, like many scientists before him, is making a fundamental epistemological error here. The inability to explain one reality (e.g., God) does not, in and of itself, free one from the necessity of explaining other realities. If that were the case, then one should abandon science altogether. Advocates for the argument from design assert that it is illogical, and contrary to all observable phenomena, to assert that something can happen without a cause. That human beings cannot, at this stage, explain what caused God does not logically mean that we can rationally assert that things happen without a cause. If Dawkins can prove that a sophisticated robot factory exists that can produce, blindly, a perfectly made watch – and scientists and engineers can describe in detail the complex processes by which the robot factory produces these watches – that does not answer the obvious question of who or what made the robot factory. It merely begs the original question.
If anything, chaos theory and its related disciplines are only further strengthening this fundamental metaphysical axiom that all things must have a cause, showing how the apparently random behavior of certain natural processes is not, in fact, random at all – it only appears to be. Chaotic systems appear disorderly, perhaps random, but are not. Underneath their random behavior lies an order and a pattern that, with the aid of new supercomputers, can now, for the first time, actually be tracked mathematically. It was Lorenz’s discovery that, as his famous metaphor put it, the flapping of a butterfly’s wings in Ecuador may affect weather patterns in Alaska. The Alaskan weather patterns may appear random, and without cause, but that is only because of the inability of human minds to know all of the deterministic processes involved.
Advocates of Neo-Darwinism and so-called creation science rarely agree on anything, but they are often united in their contempt for what is called theistic evolution. Dawkins asserts that any attempt to bring God into the scientific picture is “transparently feeble” because “science” can show how organized complexity arises spontaneously. As we have seen, science does no such thing: It merely describes the processes by which complex systems arise, without explaining what set these processes in motion in the first place. Creationists, for their part, object that theistic evolution is, in effect, incoherent, an ungodly pact with the devil in which Christians compromise their fundamental belief in divine providence. Typically, theistic evolution is described as evolution guided by God. But, creationists argue, this is a contradiction in terms: If it is evolution, then it is a theory of change in which natural processes are governed by random chance. If it is theistic, then change occurs through divine guidance.
But this presents a false dichotomy. As some of the early “fundamentalist” theoreticians saw (A.C. Dixon, Louis Meyer, R.A. Torrey), there is nothing inherently anti-theistic in a theory of Creation by which God created the universe using evolutionary processes. Christians have long accepted the notion, in physics and chemistry, that there exist observable, seemingly deterministic laws of nature. What is the essential difference between laws which govern atomic particles and, say, the complex DNA encoding by which a single cell develops into a newborn child?
Moreover, it is not even clear, from a logical standpoint, why a theistic worldview could not accommodate elements of randomness as part of the universe’s physical processes – why, contrary to Einstein’s famous assertion, God could not play dice.
Purpose, design and planning do not, in and of themselves, rule out an element of randomness. Indeed, randomness can be part of a design and purpose. College officials may plan and organize a football game – to be played according to fixed, unvarying rules – and yet require, as part of their plan, that the first kick-off be determined by a random flip of a coin. God, for His part, could conceivably create a universe in which randomness can and does occur – not least in the free choices of spiritual beings not entirely bound by deterministic forces. In other words, even if Quantum Theory (to take one example) is somehow able to prove the existence of irreducibly probabilistic laws – in which random events simply occur apparently without a cause – that could still be seen within the boundaries of natural laws established by a Divine Creator.
This is what the Canadian Jesuit theologian Bernard J.F. Lonergan set out to show in his classic work Insight: A Study of Human Understanding. Lonergan thought through the implications of a shift from a classical to a statistical worldview, from a mechanistic cosmology to one in which universal order is constituted by emergent probability. Lonergan argued that a world process, governed by schemes of recurrence best described by the laws of probability, is still a world of design and purpose. Intelligence can both discern, and, ultimately, create, an underlying purpose in an aggregate of systems – a system of systems – that operate seemingly independently.
Systems theory and chaos theory have, in fact, proven Lonergan’s basic point: Systems are fundamentally “schemes of recurrence” that, while often appearing random and best described by statistical probability, nevertheless exhibit patterns of cumulative complexity.
In the end, therefore, we begin where we started. Popularizing scientists such as Dawkins are justly proud of their new analytical tools. As a methodological starting point, science can and should proceed according to naturalistic presuppositions – lest every scientific mystery be explained away as “God does it.” The purpose of science is to describe the mechanisms discoverable in nature, to discern the patterns observable in what appears to be, to unaided human eyes, random or disorganized events. Chaos theory… and Ilya Prigogine’s self-organizing systems… have demonstrated just how unfathomably complex the processes of nature actually are.
But science, by its very nature, must recognize that its descriptive theories do not, ultimately, explain the origin of the universe. They only describe how the universe works, not how it came into existence or for what purpose. It is the task of the philosophy of religion, and of systematic theology, to learn from new disciplines such as chaos theory and to propose a new rational synthesis that takes the discoveries of these disciplines into account and integrates them into classical Christian affirmations about creation. It is by no means clear that we live in a random universe, but if we do, Christian theology can show how the Creator can work His purposes through the “schemes of recurrence” of emergent probability just as He could under the old laws of classic Newtonian mechanics.
Relevance for Apologetics
Ultimately, Christian apologetics must face up to the intellectual challenges posed to it by the culture in which it is operating – and that culture, in the West at least, is dominated by increasingly sophisticated computer technologies and disciplines that call into question both the simple-minded determinism of 19th century modernist science and the “head in the sand” anti-science attitudes of postmodern “critics.” Young people, born with Nokia cell phones in their hands, and struggling with the challenges of mastering ever-more-complex technologies, know that postmodern philosophers are not serious when they deny the existence of objective facts.
Just as there are no atheists in fox holes, so, too, there are no sincere postmodern theoreticians in the cancer ward. When the postmodern theologian is sitting on the examination table, and her physician is explaining that she could have (a) a brain tumor requiring immediate surgery to save her life; or (b) a headache, requiring an aspirin, it’s a good bet that this postmodern theologian will NOT explain to the doctor that, in fact, she rejects the “foundationalist” premises of his scientific “practices,” that reality is really a social construct, and that just because a tumor is “true for him,” it doesn’t follow that it is necessarily true for her. Instead, she will probably demand more tests – thus proving to everyone, including her students, that when push comes to shove she very much believes in objective reality over and above what she thinks about it. She even believes in absolute truth – because, if she takes an aspirin rather than undergoing surgery – and makes the WRONG choice – she will probably die. In her case, at least, the truth matters. Her life depends upon it.
In a similar way, a Christian apologetics that does not display at least as much conviction will not persuade anyone. That is why it is important that theologians today meet the challenges posed by contemporary science and not flee from them into a postmodern humanist ghetto. As I have attempted to argue in this paper, such flight is unnecessary. We have the intellectual resources to meet the challenges posed by contemporary systems theory, evolutionary biology and quantum physics. We do not have to accept either a simplistic naturalism, advocated by proponents of neo-modernism, or a simplistic postmodern relativism and skepticism. While critiquing the excesses of 19th century modernist science, we do not have to throw out the baby of truth with the bath water of scientism and naturalism.
I just finished reading Leo Damrosch’s magisterial 2005 biography of Jean-Jacques Rousseau (Jean-Jacques Rousseau: Restless Genius) and I’ve been thinking a lot about how Rousseau’s vision ties in, or doesn’t tie in, with the problems of modern urban society. (Full disclosure: My wife hates Rousseau because he forced his lifelong mistress, Therese Levasseur, to give up their five children to foundling homes and then had the temerity to instruct women on why they should breastfeed their children and raise them according to his precepts.)
Rousseau, born in Switzerland in 1712, was basically a professional vagabond and loafer who ran away from his home in Geneva at the age of 16, was almost entirely self-taught, and who earned his living through menial jobs, copying musical manuscripts and writing books that both titillated and outraged most of Europe. Rousseau’s basic argument is that “civilization,” far from being an engine of progress and advancement, is actually a corrosive, even destructive force.
Rousseau was original in that he went against what everyone believed about social advancement, the value of science and art, technology and so on. Things aren’t getting better and better as the Enlightenment philosophes taught; they are actually getting worse and worse. And nothing is getting worse quite like human beings themselves — who, Rousseau taught, are slowly degenerating from centuries of living in cramped, ugly cities, from bad nutrition, and from the demands that social life imposes.
Rousseau was thus the world’s first hippie.
He championed a more “natural” lifestyle free from the artificial constraints and absurd duties that society demands. Much of what the modern world believes about human beings — from the importance of child-centered education to an emphasis on “authenticity” and natural foods — comes from this strange and highly original thinker.
Although denounced by both Protestant and Catholic religious authorities for his departures from Christian orthodoxy, Rousseau remained, to the chagrin of his agnostic friends, an obstinate believer throughout his life; and his vision of an original “wholeness” and perfection in nature is a kind of secular version of the creation story in Genesis.
Rousseau, like Christian theologians, believed that mankind was created good… but that, through the actions of men and women, that natural perfection became disfigured. Here is how Rousseau explains it in his strange book on education, Emile:
Everything is good as it leaves the hands of the Author of things; everything degenerates in the hands of man. He forces one soil to nourish the products of another, one tree to bear the fruit of another. He mixes and confuses the climates, the elements, the seasons. He mutilates his dog, his horse, his slave. He turns everything upside down; he disfigures everything; he loves deformity, monsters. He wants nothing as nature made it, not even man; for him, man must be trained like a school horse; man must be fashioned in keeping with his fancy like a tree in his garden.
Powerful stuff! I’ve always thought that our (my!) modern obsession with health can be seen in a Rousseau-like light, as a kind of primal “therapy” to correct the imbalances, weaknesses and deformities that our indolent modern lifestyles have bequeathed to us.
Rousseau was well aware that his “natural man” may never have actually existed… and that in reality primitive life may have been the way Thomas Hobbes described it (nasty, brutish and short)… but he imagined what human beings might have been like free from the artificial conveniences of cities and bad food.
He imagined “natural man” as strong, free, healthy, honest and direct. As imagined in his strange romantic novel Julie, Rousseau wanted to help people to get back, in a sense, to Middle Earth, to a time before the furnaces of Mordor destroyed the natural beauty of Man and his environment. Who can’t sympathize, at least a little, with this primeval longing?
The new atheist crusaders (such as Christopher Hitchens, Sam Harris and Richard Dawkins) like to pretend that the concept of universal human rights just popped out of thin air in the 17th and 18th centuries, the creation of the agnostic and atheist thinkers of the French Enlightenment.
But the truth is precisely the opposite: The recognition of universal human rights is one of the preeminent legacies of the Bible and the two religions, Judaism and Christianity, centered around it.
We forget that the great English political philosopher John Locke – widely credited with working out the first systematic theory of natural (human) rights in modern times – based most of his arguments on Biblical precedents.
In his First Treatise of Government, which is more Biblical exegesis than philosophy, Locke argued that human rights are not privileges dispensed or withdrawn at the discretion of the State. Rather, they are gifts from God which no prince or potentate, no state or sovereign, may take away.
Thomas Jefferson relied primarily upon Locke’s insights, and not those of French Enlightenment thinkers, when penning the Declaration of Independence — which, for the first time, proposed founding a state upon this fundamental, God-given, Biblically-based idea: “We hold these truths to be self-evident: that all men are created equal and are endowed by their Creator with certain unalienable rights…”
Some empirical evidence that respect for human rights grew out of the Biblical heritage emerges when you compare the “freedom” rankings produced by the international democracy watchdog organization Freedom House – co-founded in 1941 by Eleanor Roosevelt — with the percentage of the population in each country ranked as Christian by the CIA. (The CIA designation refers more to “nominal” rather than “practicing” Christians but nevertheless is illuminating when it comes to the cultural context that produces civil liberties.)
Each year, Freedom House publishes a survey that attempts to measure the degree of democracy and freedom in every nation of the world, producing “scores” that represent the levels of political rights and civil liberties in each state and territory – from 1 (most free) to 7 (least free). Out of 194 countries and territories surveyed for 2006, 73 countries (38 percent) were rated Free, 54 (28 percent) were rated Partly Free, and 67 (34 percent) were rated Not Free. (This is a marked improvement over 1980, when only 23.9% of nations were rated Free… 24.8% were rated Partly Free… and 51.3% were Not Free.)
Among the countries ranked as the most free (1) and with the highest respect for civil liberties (1) are Australia (66% Christian), Austria (78.3%), the United States (79%), Canada (66%), Costa Rica (92%), Belgium (100%), Chile (100%), Denmark (98%), France (90%), Finland (86%), Germany (68%), Great Britain (71.6%), Ireland (93%), Iceland (93%), Norway (90.1%), Portugal (98%), Spain (94%), Switzerland (78.9%), Sweden (87%), Italy (90%) and New Zealand (79.5%).
These are not fixed absolutes, of course. There are exceptions.
Haiti, for example, is listed as 96% Christian by the CIA yet has among the very worst records for human rights and political freedoms. The same is true of Rwanda: Rated 93.6% Christian by the CIA, it scores a 6 out of 7 for political freedom and a 5 for civil liberties. Some Latin American countries, just emerging from years of civil war or military dictatorship, have higher Christian populations but somewhat restricted freedom. For example, El Salvador, which is 83% Roman Catholic, is rated “free” but only scores a 3 for civil liberties. Mexico, which is 95% Christian despite its historically anti-Christian government, is rated 2 for political freedom and civil liberties.
But at the opposite end of the spectrum, those countries with the smallest percentage of Christians are rated overwhelmingly “not free” by Freedom House and are among those with the worst ratings for civil liberties by far – but again, with a few interesting exceptions. Almost all of the Islamic countries have very small Christian populations and rank near the bottom when it comes to political freedom and civil rights – including Saudi Arabia (0% Christian and no political freedom), Sudan (5% Christian and no political freedom), Libya (3% Christian and no political freedom), Iran (1% Christian and no political freedom), and so on.
Current Communist regimes, such as China (4% Christian), Cambodia (0%), North Korea (0%), Laos (1.5%) and Vietnam (7.2%), also have very low Christian populations and virtually no freedom whatsoever.
Interestingly enough, although some of the former Communist states are still ranked as “not free” or “partly free,” including Russia (only 15% Christian) and Albania (30%), a number of former Communist countries with sizable Christian populations are now ranked near the top in terms of civil liberties and political liberty. Once these countries were freed of Soviet military domination, they quickly adopted laws protecting political liberty and basic human rights. These include Bulgaria (83.8% Christian), which scores in the top rank for political freedom and a 2 for civil liberties; Poland (91.2% Christian), which now scores 1 for both civil liberties and political freedom; Hungary (74% Christian) and Lithuania (85%), which now score 1s as well; and Romania (99%), which scores 2s.
There are also some countries that are neither Christian nor communist but which nevertheless score badly in terms of civil rights and political freedom, including Bhutan (0% Christian), rated 6 for civil liberties and 5 for political freedom; Nepal (0.2% Christian), rated 6 for political freedom and 5 for civil liberties; the Maldives (0% Christian), rated 6 for political freedom and 5 for civil liberties; Guinea (8%), rated 6 for political freedom and 5 for civil liberties; and Malaysia (7%), which scores 4s.
Finally, there are a handful of countries with extremely low Christian populations but which nevertheless score high in terms of political freedom and civil liberties. These are Israel (2%), which scores 1 for both political freedom and civil liberties; Japan (0.7%), which also scores 1s; Taiwan (4%), which scores 1s; South Korea (26%), which scores 1s; and India (2.3%), which scores 2s.
Clearly, therefore, a sizable Christian population is not a requirement for civil liberty and political freedom, but you could still make the case that those non-Christian societies that have a solid record on human rights and political liberty benefited from prolonged contact with, and influence by, Christian nations.
Israel is a special case because respect for fundamental human rights and political freedom is a preeminently Jewish cultural legacy, one that is implicit in the Torah and which Israel bequeathed to Christianity. Japan, of course, had its western-style democratic government more or less imposed upon it by U.S. Occupation Forces following its defeat in the Second World War – but what was imposed by force has now taken root and grown into a distinctly Japanese style of liberal democracy. India, which was a colony of Great Britain for more than 175 years, and which today still prides itself on its membership in the Commonwealth and its record as preeminent cricket champions, is today a federal republic with a president, prime minister, a bicameral Parliament and a legal system based on English common law. While only 2.3% Christian, India has adopted many of the cultural values of liberal democracy and retains, like other members of the Commonwealth, remarkably strong ties to Britain.
In conclusion, therefore, we can say that the enemies of Christianity, Judaism and the Bible have it exactly backwards: Far from being a threat to liberal democracy and political freedom, the biblical heritage is, in fact, the intellectual matrix out of which both arose.
The values and beliefs that permeate the Bible — the notion that all human beings are equal in the eyes of God and that no king or ruler may claim unquestioned obedience — were the proximate cause for the development of a religious theory of liberty and the recognition of universal human rights. The atheist crusaders’ claim that commitment to Biblical religion results in intolerance and oppression is refuted by the Freedom House rankings themselves. In fact, with a few exceptions, the countries on earth that practice freedom of religion and social tolerance are those with large Christian or Jewish populations.
So if, as Albert Einstein insisted, Biblical religion was the necessary intellectual precondition for the gradual development of scientific method, how did the myth of the “scientific revolution” come about?
One reason: For the past 400 years, the partisans of irreligion – from the Marquis de Sade to Sam Harris and Richard Dawkins – have deliberately misrepresented the way science actually developed in the West as part of their ideological crusade against Judaism and Christianity.
What’s worse, the partisans of atheism have been intellectually dishonest in the extreme: They have tried to take credit for the development of science when, in fact, they had little if anything to do with it.
Many of the most ideological and dogmatic atheist crusaders, though they continually invoke science to justify their own philosophical assumptions and declarations, were not scientists themselves.
In fact, many of the most famous anti-Christian polemicists of the last 200 years – who sought to use science to justify their unbelief – never themselves set foot in a laboratory or conducted a single field observation.
That includes the Marquis de Sade (a writer), Percy Bysshe Shelley (a poet), Friedrich Nietzsche (a philologist by training), Algernon Swinburne (a poet), Bertrand Russell (a philosopher), Karl Marx (a philosopher), Robert Ingersoll (a lecturer), George Bernard Shaw (a playwright), Vladimir Lenin (a communist revolutionary), Joseph Stalin (a communist dictator), H.L. Mencken (a newspaper columnist), Jean-Paul Sartre (a philosopher), Benito Mussolini (a fascist dictator), Luis Buñuel (Spanish filmmaker), Clarence Darrow (a lawyer), Ayn Rand (a novelist), Christopher Hitchens (a journalist), Larry Flynt (a pornographer), George Soros and Warren Buffett (investors), and Penn and Teller (magicians).
In dramatic contrast, most of the true giants of empirical science – the people who founded entire scientific disciplines or who made landmark scientific discoveries – were primarily devout Christians who believed that their scientific studies, far from being in conflict with their religious faith, were ultimately dependent upon it.
In his book, The God Delusion, atheist crusader Richard Dawkins once again tries to reclaim Einstein for atheism, citing quotations at length in which Einstein denied belief in a personal God, but the truth is that Einstein was struggling to enunciate a middle position between atheism and classic theism and couldn’t seem to make up his mind how to describe it. “There is every reason to think that famous Einsteinisms like ‘God is subtle but he is not malicious’ or ‘He does not play dice’ or ‘Did God have a choice in creating the Universe?’ are pantheistic, not deistic, and certainly not theistic,” Dawkins writes. “‘God does not play dice’ should be translated as ‘Randomness does not lie at the heart of all things.’ ‘Did God have a choice in creating the Universe?’ means ‘Could the universe have begun in any other way?’ Einstein was using ‘God’ in a purely metaphorical, poetic sense.”
Perhaps. Yet when Einstein was explicitly asked whether he believed in “Spinoza’s God” – meaning an impersonal Deistic God – this is what he said:
“I can’t answer with a simple yes or no. I’m not an atheist and I don’t think I can call myself a pantheist. We are in the position of a little child entering a huge library filled with books in many different languages. The child knows someone must have written those books. It does not know how. The child dimly suspects a mysterious order in the arrangement of the books but doesn’t know what it is. That, it seems to me, is the attitude of even the most intelligent human being toward God. We see a universe marvelously arranged and obeying certain laws, but only dimly understand these laws. Our limited minds cannot grasp the mysterious force that moves the constellations.”
Not an orthodox Jew, certainly, but hardly a snide atheist ideologue along the lines of Dawkins, Christopher Hitchens, or Sam Harris, either.
To sum up: We have two rival claims.
On the one hand, we have scientific (let’s be charitable) amateurs – from Nietzsche and Ingersoll to Christopher Hitchens and Sam Harris – insisting that science and Biblical religion are fundamentally incompatible.
On the other hand, we have the greatest minds in the history of science, the people who actually made most of the discoveries that created modern science to begin with – folks like Galileo, Sir Isaac Newton, Gregor Mendel, Max Planck, Louis Pasteur, Werner Heisenberg, and even Albert Einstein – who insist not only that religion is not at odds with science, but that Biblical religion is what made science possible in the first place.
Whom should we believe?
Should we believe the attorney Clarence Darrow, who said “I don’t believe in God because I don’t believe in Mother Goose” … or should we believe Albert Einstein who said, “My religion consists of a humble admiration of the illimitable superior spirit who reveals himself in the slight details we are able to perceive with our frail and feeble mind”?
Frankly, in the great debate over religion and science, faithful Christians and Jews stand with the more enlightened half – those who make the actual discoveries in science.