Saturday, March 31, 2018

John Updike on the Resurrection

The American novelist John Updike (1932-2009) was not only a great writer but also something of a paradox. The recipient of two Pulitzers and many other prestigious awards, he wrote stories that some consider at least mildly pornographic, stories that reflect his own marital infidelities, yet he seems nevertheless to have been devoutly Christian.

A poem he wrote in 1960 titled Seven Stanzas at Easter reflects his piety. Updike makes the point that if one is a believer, one should really believe. No wishy-washy liberal Protestantism for him. The resurrection of Christ was either an actual, historical, physical return to life of a man who had been actually, historically, physically dead, or else the whole story doesn't really matter at all.

None of this "Jesus' body actually, permanently decomposed, but he rose in the sense that his spirit lived on in the hearts of his followers" nonsense for Updike. Either it happened objectively or Christianity is a fraud.

Seven Stanzas at Easter

Make no mistake: if He rose at all
it was as His body;
if the cells’ dissolution did not reverse, the molecules
reknit, the amino acids rekindle,
the Church will fall.

It was not as the flowers,
each soft Spring recurrent;
it was not as His Spirit in the mouths and fuddled
eyes of the eleven apostles;
it was as His flesh: ours.

The same hinged thumbs and toes,
the same valved heart
that–pierced–died, withered, paused, and then
regathered out of enduring Might
new strength to enclose.

Let us not mock God with metaphor,
analogy, sidestepping transcendence;
making of the event a parable, a sign painted in the
faded credulity of earlier ages:
let us walk through the door.

The stone is rolled back, not papier-mâché,
not a stone in a story,
but the vast rock of materiality that in the slow
grinding of time will eclipse for each of us
the wide light of day.

And if we will have an angel at the tomb,
make it a real angel,
weighty with Max Planck’s quanta, vivid with hair,
opaque in the dawn light, robed in real linen
spun on a definite loom.

Let us not seek to make it less monstrous,
for our own convenience, our own sense of beauty,
lest, awakened in one unthinkable hour, we are
embarrassed by the miracle,
and crushed by remonstrance.

Have a meaningful Resurrection Day.

Friday, March 30, 2018

A Parable for Good Friday

Some time ago I did a post based on a remark made by a woman named Tanya at another blog. I thought that Good Friday might be a good time to run the post again, slightly edited.

Tanya's comment was provoked by an atheist at the other blog who had issued a mild rebuke to his fellow non-believers for their attempts to use the occasion of Christian holidays to deride Christian belief. In so doing, he exemplified the sort of attitude toward those with whom he disagrees that one might wish all people, atheists and Christians alike, would adopt. Unfortunately, Tanya spoiled the mellow, can't-we-all-just-get-along, mood by manifesting a petulant asperity toward, and an unfortunate ignorance of, the traditional Christian understanding of the atonement.

She wrote:
I've lived my life in a more holy way than most Christians I know. If it turns out I'm wrong, and some pissy little whiner god wants to send me away just because I didn't worship him, even though I lived a clean, decent life, he can bite me. I wouldn't want to live in that kind of "heaven" anyway. So sorry.
Tanya evidently thinks that "heaven" is, or should be, all about living a "clean, decent life." Perhaps the following tale will illustrate the shallowness of her misconception:
Once upon a time there was a handsome prince who was deeply in love with a young woman. We'll call her Tanya. The prince wanted Tanya to come and live with him in the wonderful city his father, the king, had built, but Tanya wasn't interested in either the prince or the city. The city was beautiful and wondrous, to be sure, but the inhabitants weren't particularly fun to be around, and she wanted to stay out in the countryside where the wild things grow.

Even though the prince wooed Tanya with every gift he could think of, it was all to no avail. She wasn't smitten at all by the "pissy little whiner" prince. She obeyed the laws of the kingdom and paid her taxes and was convinced that that should be good enough to satisfy the king's demands.

Out beyond the countryside, however, dwelt dreadful, Orc-like creatures who hated the king and wanted nothing more than to be rid of him and his heirs. One day they learned of the prince's love for Tanya and set upon a plan. They snuck into her village, kidnapped Tanya, and sent a note to the king telling him that they would be willing to exchange her for the prince, but if their offer was refused they would kill Tanya.

The king, distraught beyond words, related the horrible news to the prince.

Despite all the rejections the prince had experienced from Tanya, he still loved her deeply, and his heart broke at the thought of her peril. With tears he resolved that he would do the Orcs' bidding. The father wept bitterly because the prince was his only son, but he knew that his love for Tanya would not allow him to let her suffer the torment to which the ugly people would surely subject her. The prince asked only that his father try his best to persuade Tanya to live safely in the beautiful city once she was ransomed.

And so the day came for the exchange, and the prince rode bravely and proudly bestride his mount out of the beautiful city to meet the ugly creatures. As he crossed an expansive meadow toward the camp of his mortal enemy he stopped to make sure they released Tanya. He waited until she was out of the camp, fleeing toward the safety of the king's city, oblivious in her near-panic that it was the prince himself she was running past as she hurried to the safety of the city walls. He could easily turn back now that Tanya was safe, but he had given his word that he would do the exchange, and the ugly people knew he would never go back on his word.

The prince continued stoically and resolutely into their midst, giving himself for Tanya as he had promised. Surrounding him, they pulled him from his steed, stripped him of his princely raiment, and tortured him for three days in the most excruciating manner. Not once did any sound louder than a moan pass his lips. His courage and determination to endure whatever agonies to which he was subjected were strengthened by the assurance that he was doing it for Tanya and that because of his sacrifice she was safe.

Finally, wearying of their sport, they cut off his head and threw his body onto a garbage heap.

Meanwhile, the grief-stricken king, his heart melting like ice within his breast, called Tanya into his court. He told her nothing of what his son had done, his pride in the prince not permitting him to use his son's heroic sacrifice as a bribe. Even so, he pleaded with Tanya, as he had promised the prince he would, to remain with him within the walls of the wondrous and beautiful city where she'd be safe forevermore.

Tanya considered the offer, but decided that she liked life on the outside far too much, even if it was risky, and besides, she really didn't want to be in too close proximity to the prince. "By the way," she wondered to herself, "where is that pissy little whiner son of his anyway?"

Have a meaningful Good Friday. You, too, Tanya.

Thursday, March 29, 2018

Orwell and Huxley

In 1985 Neil Postman wrote a book titled Amusing Ourselves to Death, which has become something of a classic of cultural criticism. Its message seems just as timely today as it was more than thirty years ago. Here's the introduction:
We were keeping our eye on 1984. When the year came and the prophecy didn't, thoughtful Americans sang softly in praise of themselves. The roots of liberal democracy had held. Wherever else the terror had happened, we, at least, had not been visited by Orwellian nightmares.

But we had forgotten that alongside Orwell's dark vision, there was another - slightly older, slightly less well known, equally chilling: Aldous Huxley's Brave New World. Contrary to common belief even among the educated, Huxley and Orwell did not prophesy the same thing. Orwell warns that we will be overcome by an externally imposed oppression. But in Huxley's vision, no Big Brother is required to deprive people of their autonomy, maturity and history. As he saw it, people will come to love their oppression, to adore the technologies that undo their capacities to think.

What Orwell feared were those who would ban books. What Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism.

Orwell feared that the truth would be concealed from us. Huxley feared the truth would be drowned in a sea of irrelevance. Orwell feared we would become a captive culture. Huxley feared we would become a trivial culture, preoccupied with some equivalent of the feelies, the orgy porgy, and the centrifugal bumblepuppy.

As Huxley remarked in Brave New World Revisited, the civil libertarians and rationalists who are ever on the alert to oppose tyranny "failed to take into account man's almost infinite appetite for distractions". In 1984, Huxley added, people are controlled by inflicting pain. In Brave New World, they are controlled by inflicting pleasure. In short, Orwell feared that what we hate will ruin us. Huxley feared that what we love will ruin us.

This book is about the possibility that Huxley, not Orwell, was right.
George Orwell (1903-1950)
Aldous Huxley (1894-1963)
Of course it's possible that they were both right, that they were each "seeing" one side of the totalitarian coin.

In an age when we suffer separation anxiety if we're unable to access our devices for more than a few minutes, an age filled with the trivialities of an entertainment culture which distract us from thinking about what really matters in life, an age when only half the population cares enough to vote and only half of voters care enough to educate themselves on who the candidates are and what they'll do if elected, an age when the centuries-long Islamic war against the West has been resuscitated while the West deludes itself into thinking that the current crisis is just an aberration, an age when families and faith are alike disintegrating, an age when too many schools don't teach anything worth learning and too many students don't read anything worth reading, an age when our society is increasingly balkanized along racial, ideological, and ethnic lines - in such an age we are more than at any time in our history a society dazed on Huxleyan soma and vulnerable to Orwellian tyranny.

Wednesday, March 28, 2018

On the Possibility of Miracles

The Christian world prepares to celebrate this Sunday what much of the rest of the Western world finds literally incredible, the revivification of a man 2000 years ago who had been dead for several days. Modernity finds such an account simply unbelievable. It would be a miracle if such a thing happened, moderns tell us, and in a scientific age everyone knows that miracles don't happen.

If pressed to explain how, exactly, science has made belief in miracles obsolete, and how the modern person knows that miracles don't happen, the skeptic will often fall back on an argument first articulated by the Scottish philosopher David Hume (d. 1776). Hume wrote that miracles are a violation of the laws of nature, and since firm and unalterable experience tells us that there has never been a violation of the laws of nature, it follows that any report of a miracle is most likely false. Thus, since we should always believe what is most probable, and since any natural explanation of an alleged miracle is more probable than that a law of nature was broken, we are never justified in believing that a miracle occurred.

It has often been pointed out that Hume's argument is circular: it bases the claim that reports of miracles are unreliable on the belief that there has never been a reliable report of one. But we can conclude that there has never been a reliable report only if we know a priori that all historical reports of miracles are false, and we can know that only if we know that miracles are impossible. Yet we can only know they're impossible if we know that all reports of miracles are unreliable, which is where we began.

But set that dizzying circularity aside. Set aside, too, the fact that one can say that miracles don't happen only if one can say with certainty that there is no God.

Let's look instead at the claim that miracles are prohibitively improbable because they violate the laws of nature.

A law of nature is simply a description of how nature operates whenever we observe it. The laws are often statistical. For example, if molecules of hot water are added to a pot of cold water, the molecules will tend eventually to distribute themselves evenly throughout the container so that the water achieves a uniform temperature. It would be extraordinarily improbable, though not impossible, nor a violation of any law, for the hot molecules on one occasion to segregate themselves all on one side of the pot.
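To get a feel for the numbers involved, here's a back-of-envelope sketch of my own (not anything from Hume or the physics literature), which simply assumes each molecule is independently equally likely to occupy either half of the pot:

```python
from fractions import Fraction

def prob_all_one_side(n):
    """Chance that n independent molecules, each equally likely to be
    in either half of the pot, all end up in the same half."""
    # Either all n land in the left half or all n in the right half.
    return 2 * Fraction(1, 2) ** n

print(prob_all_one_side(10))   # 1/512 with only ten molecules
```

With just ten molecules the odds are already 1 in 512; a real pot holds trillions of trillions of molecules, so the probability of spontaneous segregation, while never zero and never contrary to any law, is unimaginably small.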

Similarly, miracles may not violate the natural order at all. Rather they may be highly improbable phenomena that would never be expected to happen in the regular course of events except for the intervention of Divine will. Like the segregation of warm water into hot and cold portions, the reversal of the process of bodily decomposition is astronomically improbable, but it's not impossible, and if it happened it wouldn't be a violation of any law.

The ironic thing about the skeptics' attitude toward the miracle of the resurrection of Christ is that they refuse to admit that there's good evidence for it because a miracle runs counter to their experience and understanding of the world. Yet they have no trouble believing other things that also run counter to their experience.

For example, modern skeptics have no trouble believing that living things arose from non-living chemicals, that the information-rich properties of life emerged by random chaos and chance, or that our extraordinarily improbable, highly-precise universe exists by fortuitous accident. They ground their belief in these things on the supposition that there could be an infinite number of different universes, none of which is observable, and in an infinite number of worlds even extremely improbable events are bound to happen.

Richard Dawkins, for example, rules out miracles because they are highly improbable, and then in the very next breath tells us that the naturalistic origin of life, which is at least as improbable, is almost inevitable, given the vastness of time and space.

Unlimited time and/or the existence of an infinite number of worlds make the improbable inevitable, he and others argue. There's no evidence of other worlds, unfortunately, but part of the faith commitment of the modern skeptic is to hold that these innumerable worlds must exist. The skeptic clings to this conviction because if these things aren't so then life and the universe we inhabit must have a personal, rather than a mechanistic, explanation and that admission would deal a considerable metaphysical shock to his psyche.

Nevertheless, if infinite time and infinite worlds can be invoked to explain life and the cosmos, why can't they also be invoked to explain "miracles" as well? If there are a near-infinite series of universes, a multiverse, as has been proposed in order to avoid the problem of cosmic fine-tuning, then surely in all the zillions of universes of the multiverse landscape there has to be at least one in which a man capable of working miracles is born and himself rises from the dead. We just happen to be in the world in which it happens. Why should the multiverse hypothesis be able to explain the spectacularly improbable fine-tuning of the cosmos and the otherwise impossible origin of life but not a man rising from the dead?

For the person who relies on the multiverse explanation to account for the incomprehensible precision of the cosmic parameters and constants and for the origin of life from mere chemicals, the resurrection of a dead man should present no problem at all. Given enough worlds and enough time it's a cinch to happen.

No one who's willing to believe in a multiverse should be a skeptic about miracles. Indeed, no one who's willing to believe in the multiverse can think that anything at all is improbable. Given the multiverse everything that is not logically impossible must be inevitable.
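The arithmetic behind this "inevitability" is straightforward. As a sketch of my own (not drawn from any skeptic's argument): if an event has per-world probability p, the chance it happens in at least one of n worlds is 1 - (1 - p)^n, which climbs toward certainty as n grows without bound.

```python
import math

def prob_at_least_once(p, n):
    """Chance an event of per-trial probability p occurs at least once
    in n independent trials: 1 - (1 - p)**n, computed via log1p/expm1
    so it stays accurate when p is tiny and n is astronomically large."""
    return -math.expm1(n * math.log1p(-p))

print(prob_at_least_once(0.5, 1))        # ~0.5, as expected for one coin flip
print(prob_at_least_once(1e-30, 1e32))   # effectively 1.0: the freak event becomes a near-certainty
```

Plug in any fixed nonzero probability, however minuscule, and let the number of worlds run to infinity: the result approaches 1. That is exactly the logic that makes the multiverse a universal solvent for improbability, resurrections included.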

Of course, the skeptic's real problem is not that a man rose from the dead but rather with the claim that God deliberately raised this particular man from the dead. That's what they find repugnant, but they can't admit that because in order to justify their rejection of the miracle of the Resurrection they'd have to be able to prove that there is no God, or that God's existence is at least improbable, and that sort of proof is beyond anyone's ability to accomplish.

If, though, one is willing to assume the existence of an infinite number of universes in order to explain the properties of our universe, he should have no trouble accepting the existence of a Mind out there that's responsible for raising Jesus from the dead. After all, there's a lot more evidence for the latter than there is for the former.

Tuesday, March 27, 2018

Campus Moral Absolutism

The Atlantic's Jonathan Merritt argues that our culture's fling with moral relativism, a fling that's persisted for at least sixty years, is over. Borrowing from a column by the New York Times' David Brooks, Merritt maintains that, at least among Millennials, we're seeing what may be described as a New Absolutism.

He may be right, but I think the absolutism he sees descending upon us like a smog is not really an absolutism at all, but rather an emotivist power play.

I'll explain shortly, but first some excerpts from Merritt's column:
In The New York Times last week, David Brooks argued that while American college campuses were “awash in moral relativism” as late as the 1980s, a “shame culture” has now taken its place. The subjective morality of yesterday has been replaced by an ethical code that, if violated, results in unmerciful moral crusades on social media.

A culture of shame cannot be a culture of total relativism. One must have some moral criteria [by] which to decide if someone is worth shaming.

“Some sort of moral system is coming into place,” Brooks says. “Some new criteria now exist, which people use to define correct and incorrect action.”

America’s new moral code is much different than it was prior to the cultural revolution of the 1960s and 70s. Instead of being centered on gender roles, family values, respect for institutions and religious piety, it orbits around values like tolerance and inclusion. (This new code has created a paradoxical moment in which all is tolerated except the intolerant and all included except the exclusive.)

Although this new code is moral, it is not always designated as such. As Brooks said, “Talk of good and bad has to defer to talk about respect and recognition.”
To be sure, there is a wave of moral absolutism sweeping academia which sees things like racism, sexism, support for Israel, and reservations about gay marriage and global warming as absolutely wrong. But simply to point this out and then conclude that relativism (actually, a more accurate term here would be "moral subjectivism") is dead is to miss the fatal weakness lurking in the moral passion to which Merritt alludes.

That weakness lies in the fact that the moral fervor with which the above positions are often held on campus and in the media today has no ground in any objective moral referent. These moral convictions are based on nothing more substantial than the ardent feelings of those who hold them. As such they may be regarded as absolutes by those convinced of their rightness, but, if so, they are arbitrarily chosen absolutes, which is to say they're not really absolutes at all.

For any moral principle to be absolute it has to be objectively grounded in something which transcends one's own feelings, indeed which transcends humanity altogether. Otherwise, how do we adjudicate between the passionate feelings of one person and the passionate feelings of another? We can't, of course, unless we have some standard to which we can compare those disparate passions.

Lacking such a standard, when we say something is wrong all we're saying is that it offends my personal preferences, to which someone might well ask, "Why should your preferences be the standard of right and wrong for everyone else?"

The new moral absolutism to which Merritt and Brooks advert is not absolutism at all. It's simply a form of narcissistic subjectivism, or egoism, which presumes that anything which transgresses my personal moral preferences is wrong for everyone and anyone to do.

You, let's say, think it's right to help the poor. I, let's say, think we should adopt social Darwinism and let the poor fend for themselves. You insist I'm wrong. I ask why am I wrong? You say because I'm being selfish and greedy. I ask why selfishness and greed are wrong. You say because they hurt people. I ask why it's wrong to hurt people. You reply that I wouldn't want people to hurt me. I respond that that's true but it's not a reason why I should care about hurting others. You give up on me, judging me hopelessly immoral, but what you haven't succeeded in doing is explaining why it's wrong of me to let the poor suffer. You've simply given expression to your own subjective feelings about the matter.

For there to be any objective moral duties there has to be a transcendent moral authority from which (or whom) all moral goodness is derived. Take away that authority, as our secular society seems eager to do, and all we're left with is emotivism - people insisting that their emotional reactions to events are "right" and contrary reactions are "wrong," but lacking any non-arbitrary basis for making such judgments.

This is not to say that if one believes in a transcendent authority, a God, that one will know what's right. Nor is it to say that even if one knows what's right one will do what's right. What it does say is that unless there is a God, or something very much like God, there simply is no objective right or wrong, and certainly no absolute moral duties.

As any meaningful belief in God fades from our social and cultural landscape, belief in good and evil, right and wrong, will also fade until it's as insubstantial as the grin of the Cheshire cat in Alice in Wonderland.

Monday, March 26, 2018

What Are They Afraid of?

Materialists deny that there's anything that cannot be explained in terms of material substance and the laws of physics. The incredibly complex, information-laden biological world, they argue, as well as the astonishingly fine-tuned cosmos which life inhabits are both the unintended product of chance and physics.

Their metaphysics allows for no mind or intelligence behind it all nor, they insist, is there any reason or evidence for supposing that any such non-material, non-natural intelligence exists.

William J. Murray offers a dissenting view in a comment at Uncommon Descent. He poses an interesting thought experiment. Suppose, he says,
we landed on a planet in a different star system, and on an otherwise barren planet we found a massive, self-sustaining and self-replicating metallic machine comprised mainly of alloyed materials found nowhere else on the planet other than as a material manufactured by the machine for its own duplication and repair, what would be the materialist’s reaction? Further, what if the machine was run by a library of code and a code-processing system?

Would they accept the machine as evidence of non-human intelligent design? If they found no archaeological evidence on the planet supporting the idea that an intelligent race of beings at any time in history constructed that machine, would they turn to naturalistic explanations? Would they insist that somehow humans had been there before and left the machine without any other trace of their presence?

Or, would they come to the conclusion that an intelligent agency of some sort designed and built the machine, even though they didn’t know what that intelligent agency was?
At the very least anyone in this situation would have good reason for supposing that some non-human intelligence had been at work on that planet. But we ourselves are in an analogous situation on this planet since we are surrounded by much the same sort of machines, albeit biological machines, as Murray describes.

Or consider the fine-tuning of the universe:
[W]hat if we were exploring space in distant areas of the galaxy and came across a habitat floating in space, perfectly balanced to be self-supporting for a rich and diverse spectrum of life. Let’s say this habitat is enclosed by some form of unknown energy with no apparent source. Everything in the habitat is finely tuned for the flourishing and preservation of that life.

Would the materialist conclude that there must be countless other such habitats floating around, produced by some as yet unknown unintelligent process, each tuned differently and most not capable of supporting life? Or, would they conclude that some intelligent agency must have designed and built the habitat for the purpose of sustaining life? Would they insist other humans must have built it? Would they ever even imagine that a non-human intelligence might be responsible?
Of course they would unless they were so obsessively determined to avoid that conclusion, unless they found that conclusion so repugnant that they'd refuse to accept it no matter how much evidence were available to them. But then, of course, they would be behaving irrationally, perhaps even insanely. One might conclude that the materialist's refusal to accept the evidence that Murray describes is motivated by fear of what that evidence points to.

Murray again:
[M]aterialists contort themselves as if they are headed toward some horrible, painful experience simply by admitting this [that the universe appears to be intelligently designed].

They do similar such contortions when confronted by the hard problem of consciousness and the problem of subjective morality, when admitting that objective morality must exist, and admitting that consciousness exists beyond the material, commits them to no [particular] spiritual or religious doctrine whatsoever. It’s just admitting what evidence indicates and what is logically necessary....

Why fight it tooth and nail? Why contort, obfuscate, and run from these things? Why deny the obvious and the logical to the point of saying such foolish things like “consciousness is an illusion” or “morality is subjective”, when they cannot even act or speak as if such things are true?
Good questions. What are they afraid of?

Saturday, March 24, 2018

Still the Center of a Storm

University of Pennsylvania law professor Amy Wax is back in the news. The leader of Black Lives Matter in Philadelphia is calling for her to be fired for remarks she made recently that BLM insists are racist.

BLM leader Asa Khalif told the school it must fire Wax or else face major disruptions on the Philadelphia campus. Wax's offense was to have claimed in a radio interview with Brown University Professor Glenn Loury that “I don’t think I’ve ever seen a black student graduate in the top quarter of the class and rarely, rarely in the top half. I can think of one or two students who’ve scored in the top half of my required first-year course.”

Penn Law Professor Amy Wax
Ms. Wax has stirred up controversy before. Last summer I wrote about the firestorm she generated when she claimed that “single-parent, antisocial habits, prevalent among some working-class whites; the anti-‘acting white’ rap culture of inner-city blacks” as well as “the anti-assimilation ideas gaining ground among some Hispanic immigrants,” are symptoms of decline in the lower echelons of our socio-economic structure.

In a follow-up interview Wax noted that Anglo-Protestant cultural norms are superior. “I don’t shrink from the word, ‘superior,’” she said, adding that “everyone wants to come to the countries that exemplify” these values and that “everyone wants to go to countries ruled by white Europeans.”

Khalif insists that, “Anyone with the types of beliefs she holds teaching black and brown students is a danger to them and their future. We are unwavering in our one demand that she be fired. Based on her beliefs and the things she has said, she is a threat to black and brown students.”

Unfortunately, Khalif's threat to disrupt the campus is exactly the wrong way such matters should be handled in a civilized society in which liberty is cherished.

In a healthy, intellectually mature environment, such as a major university is supposed to be, the first question that would be asked is whether what Professor Wax alleges about minority achievement at her law school is in fact true. Is it indeed the case that black and brown students rarely finish in the top 50%? Surely it would be easy enough to find out.

If it is true, then the next step should be to ask why that is so and focus on the causes of black and brown underperformance rather than seek to punish the person who calls attention to it. On the other hand, if it's not true that black and brown students fall disproportionately outside the top tiers of their class then Prof. Wax should be confronted with the actual statistics by university administrators and invited to publicly recant her claim.

If she's honest and is shown her mistake, she will. If she doesn't, and can offer no compelling reasons for refusing, then, and only then, should she be suspected of harboring some sort of animus against minority students.

To refuse this sensible course of action and opt instead for demands that a woman be denied her livelihood, to engage in threats of disruption and perhaps worse, to choose to create an Orwellian climate of fear and forced conformity such as prevails on many campuses today, a climate in which everyone is afraid to say anything, no matter how true or reasonable, that someone else might be offended by, is to revert to the tactics of 20th century fascism and communism.

It is a favored tactic of oppressors who wish to suppress ideas they don't like to keep them from being openly discussed. It's a tactic employed by people who couldn't care less about liberty or truth, who don't even believe there are such things, and who are interested only in promoting whatever is necessary to achieve their own cause.

It's also a tactic employed by those who have little to no confidence that their side of the case can actually stand on its own merits or that the facts of the matter will support them.

Friday, March 23, 2018

Consciousness Deniers

Philosopher Galen Strawson asks what the silliest claim ever made might be and concludes that the answer has to be the claim made by some philosophers that conscious experience is merely an illusion and doesn't "really" exist. In an interesting, albeit rather lengthy, piece at The New York Review of Books he calls this claim "The Denial."

A summary of the argument the contemporary Deniers make against conscious experience looks something like this:
  1. Naturalism entails materialism, which entails that all reality is reducible to matter.
  2. Conscious experience cannot be reduced to matter.
  3. Therefore, conscious experience isn't real.
Strawson is a naturalist and a materialist so he agrees with the first premise, but he avers that the second premise is just silly. We know far too little about the brain, he argues, to say that conscious experience can't be reduced to brain matter. His own argument, then, looks like this:
  1. Naturalism is true and it entails materialism (the belief that all reality is reducible to matter).
  2. Conscious experience is real.
  3. Therefore, conscious experience can be reduced to matter.
I completely agree with the second premise of this argument. Pace the Deniers, the premise can only be denied on pain of incoherence. I agree with Strawson that it seems folly to deny it. If it's an illusion that I'm in pain, for instance, then I'm still experiencing the sensation of pain via the illusion. Thus, even if I'm under the spell of an illusion I'm still having a conscious experience.

Nevertheless, the conclusion of this syllogism only follows if we know that the first premise is true. Strawson seems to beg the question by assuming it is, but that's just a metaphysical preference, a presupposition, an act of faith on his part. It could just as easily be false for all we know since he offers no argument for it.

Here's his own summary of his argument:
Naturalism states that everything that concretely exists is entirely natural; nothing supernatural or otherwise non-natural exists. Given that we know that conscious experience exists, we must as naturalists suppose that it’s wholly natural. And given that we’re specifically materialist or physicalist naturalists (as almost all naturalists are), we must take it that conscious experience is wholly material or physical.

And so we should, because it’s beyond reasonable doubt that experience—what W.V. Quine called “experience in all its richness...the heady luxuriance of experience” of color and sound and smell—is wholly a matter of neural goings-on: wholly natural and wholly physical.
Strawson goes on to describe how other naturalist philosophers have come to deny the reality of conscious experience:
But then—in the middle of the twentieth century—something extraordinary happens. Members of a small but influential group of analytic philosophers come to think that true naturalistic materialism rules out realism about consciousness. They duly conclude that consciousness doesn’t exist.

They reach this conclusion in spite of the fact that conscious experience is a wholly natural phenomenon, whose existence is more certain than any other natural phenomenon, and with which we’re directly acquainted, at least in certain fundamental respects.

These philosophers thus endorse the Denial.

The problem is not that they take naturalism to entail materialism—they’re right to do so. The problem is that they endorse the claim that conscious experience can’t possibly be wholly physical. They think they know this, although genuine naturalism doesn’t warrant it in any way.

So they...claim that consciousness doesn’t exist, although many of them conceal this by using the word “consciousness” in a way that omits the central feature of consciousness—the qualia [i.e. our sensations of color, taste, fragrance, sound, pain, etc.]
Strawson and I agree, then, that qualia, and thus conscious experience, are real, but we disagree over his rejection of the claim that conscious experience cannot be completely reduced to material stuff. It seems to me that qualia are fundamentally different from matter, and it's exceptionally difficult to see how the experience of red, for instance, can be reduced to electrochemical phenomena in the brain.

As the late philosopher Jerry Fodor once said:
Nobody has the slightest idea how anything material could be conscious. Nobody even knows what it would be like to have the slightest idea about how anything material could be conscious. So much for the philosophy of consciousness.
When we experience the sensation of color, or sweetness, or pain, the immediate cause of that sensation is physical or material, but the sensation itself is not. A miniature scientist walking around inside someone's brain while the host is tasting sugar will only observe electrochemical reactions occurring in the neurons and synapses.

She'd see electrons whizzing about, chemicals interacting, and nerve fibers lighting up, perhaps, but however deeply she probed into the host's brain she wouldn't observe "sweetness" anywhere. Likewise, mutatis mutandis, with every other sensation her host might be experiencing.

So, I'd suggest a reformulation of the first syllogism:
  1. Naturalism entails materialism which entails that all reality is reducible to matter.
  2. Conscious experience probably cannot be reduced to matter.
  3. Therefore, Naturalism is probably false.
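For readers who enjoy seeing the logical skeleton made explicit, this reformulated syllogism is a simple modus tollens, and its validity can even be checked mechanically. Here is a minimal sketch in the Lean theorem prover, with the "probably" qualifiers dropped since the point is only the shape of the inference, not its strength:

```lean
-- Propositional skeleton of the reformulated syllogism (modus tollens).
-- N : Naturalism is true.
-- R : conscious experience is reducible to matter.
-- h1 encodes premise 1 (Naturalism entails reducibility of experience);
-- h2 encodes premise 2 (conscious experience is not reducible).
-- The conclusion is the negation of Naturalism.
example (N R : Prop) (h1 : N → R) (h2 : ¬R) : ¬N :=
  fun hn => h2 (h1 hn)
```

The formal validity, of course, settles nothing about the truth of the premises; that is exactly where the disagreement with Strawson lies.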
This argument rests on the truth of the second premise, of course, a premise which Strawson denies. But he can't, or at least doesn't in his essay, give any reason for his denial of the premise other than his a priori conviction that Naturalism is true.

If, though, it's reasonable to think that the second premise is true, and a lot of philosophers, many metaphysical Naturalists among them, are convinced it is, then it's reasonable to accept the conclusion as well: Naturalism is probably false.

Thursday, March 22, 2018

Dems: It'd Be Wrong to Vote for Clinton or Kennedy

This is pretty weird. A headline at HotAir declares that "53% Of Democrats Say It’s Not Acceptable To Vote For A Candidate Who’s Done Immoral Things In His Private Life".

What could these folks have been thinking when they responded to this poll that way? How many people have not done something immoral in their private life? How many people have not harbored unkind thoughts, twisted or flouted the truth, coveted what they had no right to have, yielded to pride or lust or the temptation to eat, drink or smoke to the point that it becomes harmful? Probably no one.

Maybe these respondents, being Democrats, had the sexual transgressions of Donald Trump in mind when they answered the question and were simply opining that it was unacceptable to vote for a man who has had illicit affairs.

Yet how many of them, if they had had the opportunity, would have voted for Bill Clinton or Ted Kennedy in the 90s, or John Kennedy or Lyndon Johnson in the 60s? Probably all of them. Some of the sexual escapades of Clinton and Kennedy were conducted while they were actually in office, and occurred in the White House itself. Not even his most implacable foes on MSNBC or CNN have accused Donald Trump of so tarnishing and degrading the office of the presidency.

So, it's hard to know what to make of the 53% of respondents who said that voting for men like Clinton and the Kennedy brothers is unacceptable when they all, or at least most of them, would have done it themselves.

But that aside, Donald Trump's behavior, at least prior to taking office, has been in some respects awful, and those who've tried to make excuses for it have disgraced and discredited themselves. This is especially true, as former George W. Bush speechwriter Michael Gerson insists in a column at The Atlantic, of conservative Christian leaders.

Not only have they shamed themselves, Gerson argues, they've brought into grave disrepute the whole idea of Christian morality and integrity. They've made themselves look like hypocrites and opportunists, and it's hard to disagree with Gerson about this.

But he goes on to imply that those who voted for Trump are equally guilty of having sold out their values for a mess of political pottage, and here I think he's mistaken. There's a big difference between excusing or winking at immoral conduct and voting for someone who has engaged in it. The first is unconscionable, but in our imperfect world the second might be the best of bad options.

Given the alternatives with which we were presented in 2016, a vote for Trump was not necessarily inconsistent with one's belief that he was/is a moral reprobate and that much of what he has done in his personal life, particularly in his sexual conduct, is vile. Here's a paraphrase of Gerson's catalogue of some of Mr. Trump's more egregious behaviors:

His past political stances (he once supported the right to partial-birth abortion), his character (he has bragged about sexually assaulting women), and even his language (he introduced the words p***y and s******e into presidential discourse) should disqualify him among conservatives, especially religious conservatives. This is a man who has cruelly publicized his infidelities, made disturbing sexual comments about his elder daughter, boasted about the size of his manhood on the debate stage, and his lawyer reportedly arranged a $130,000 payment to a porn "actress" to dissuade her from disclosing an alleged affair.

In most years such a record would indeed make it impossible for someone who believes our leaders should be virtuous to cast a vote for the man, but 2016 wasn't most years.

Mr. Trump's opponent in the race was Hillary Clinton, who is herself at least as deeply compromised as he is. Whatever depravities Donald Trump can with justice be accused of can with equal justice be leveled against Ms. Clinton.

Did Mr. Trump before entering politics speak disparagingly about women as sex objects? Ms. Clinton actually sought to destroy the credibility of women who publicized her husband's treatment of them as sex objects.

Has Mr. Trump used vulgar language? Ms. Clinton was notorious among her secret service detail for her verbal abuse of them, abuse which was liberally salted with F-bombs.

Has Mr. Trump been dishonest in his public statements? Ms. Clinton has repeatedly lied to the American people about her reckless and illegal handling of classified documents when she was Secretary of State, among other things.

Did Mr. Trump treat his primary opponents abominably? Ms. Clinton, with the complicity of the DNC, rigged the Democratic primaries to ensure that her opponent, Bernie Sanders, could not win.

Add to this abysmal record of moral failure the corruption of the Clinton Foundation and Global Initiative, her disregard for national security, and her own perhaps dubious sexual history and the argument that voters had a clear choice between a morally superior and a morally inferior candidate in the last election is risible.

The choice in 2016 was either to vote for one of two very flawed candidates or not vote at all. For those who deplored Ms. Clinton's politics, it was clear that refraining from participating in the election would ensure that she gained the Oval Office and that we'd have another four to eight years of a mushy economy, little job growth, no relief from Obamacare, more progressive ideologues appointed to the judiciary, more onerous government regulations imposed on businesses and individuals, more erosion of our fundamental religious and constitutional liberties, more scandal in governmental agencies, more disastrous foreign policy blunders, and Bill Clinton back in the White House.

Mr. Trump, on the other hand, despite often acting like an adolescent septuagenarian in the primaries, promised he would do three things: stop illegal immigration, restore our economy to a sound footing, and appoint judges whose rulings would be based on the law and the Constitution rather than on their personal ideological bias. Despite determined opposition, he has made progress toward accomplishing all three.

So when writers like Michael Gerson, a Christian conservative, castigate Christian leaders for excusing Mr. Trump's character flaws, I'm right with him, but when he argues that those flaws should have made it impossible for those who profess a Christian value system to support him with their vote, I want to ask Mr. Gerson, what was the alternative?

Wednesday, March 21, 2018

No Womb, No Say

Whenever the topic of abortion or the laws regulating it arises someone can always be counted upon to inform any male dissenters from the contemporary status quo that since they can't get pregnant they have no business advocating legal restrictions on a woman's access to abortion or even speaking out on the issue.

The argument is specious, of course, but that doesn't matter to those who employ it since it packs an emotional wallop which obscures its logical inadequacies.

Even so, it's worth the effort to unwind its shortcomings, and Roland C. Warren does just that in a recent column at The Federalist.

Warren, who is the president of a large network of pregnancy centers called Care Net, writes that,
I have heard this challenge to men so often that I have coined it the “no womb/no say” perspective. In short, since a man does not have a womb to carry an unborn child, he should have no say in what happens to an unborn child in the womb.
There's an irony in the attempt by pro-choice organizations, which otherwise exclude men from having an opinion on the issue, to nevertheless recruit men to support abortion rights. There's an even greater irony, as Warren observes, in the fact that the Supreme Court, which in 1973 discovered the right to terminate a developing human being hidden away somewhere in the shadows of the Constitution, was composed of a group of old white men.

Anyway, Warren defines the “no womb/no say” principle as the claim that, "Unless one is impacted by an issue or action in the most direct way, one should have no agency in making decisions about that issue or action", and proceeds to demonstrate that the claim is absurd:
Should a woman who is a stay-at-home mom and, therefore, makes no income outside the home, have a say on tax policy? After all, she doesn’t directly pay taxes for an income. Or, should someone who does not own a gun or has never been injured by a gun have a say in what our nation’s gun laws should be? Again, a non-gun owner is not going to be directly impacted if the access to guns is limited.

[C]onsider the Civil War. The South was primarily an agrarian society that, in large measure, was structured and directly dependent on slave labor. Indeed, a key aspect of the South’s “states’ rights” argument was that since the North’s society and economic system would not be as directly impacted by the abolition of slavery, the North should have no say. Indeed, “no slaves/no say” was the South’s proverbial battle cry.

Also consider the issue of voting rights in the United States. From our nation’s founding, voting rights were limited to property owning or tax paying white males, who were about 6 percent of the population. So the notion was “no property/no say.” And even when voting rights were extended to other men, women were excluded. Why? Because the view held by many men was that women were not and should not be as directly involved in the economic and civil aspect of American society as men.

Consequently, these men held a “womb/no say” perspective when it came to voting rights. Well, the Women’s Suffrage movement challenged this perspective, and in 1920, with the passage of the 19th Amendment to the Constitution, women were given the right to vote … by men.
Of course, it could be objected that in each of these examples people are, or were, affected by the policies and that's what entitles them to voice an opinion on them, but surely abortion doesn't just affect the mother and her unborn child (who might well be male, after all). A policy which allows 1.5 million unborn children to be killed every year is surely a policy that affects all of us, if not directly then indirectly.

To argue that because men can't get pregnant they have no business expressing an opinion on the morality of the policy - unless they hold the "correct" opinion - is just as fatuous as telling someone today that because they're never going to risk being enslaved they have no right to voice an opinion on the morality of slavery.

Tuesday, March 20, 2018

Rolling the Dice

From time to time we've talked about the argument for an intelligent designer of the universe based on cosmic fine-tuning (okay, maybe a little more often than just "from time to time").

Anyway, here's a four-minute video by Justin Brierley on the subject that serves as a nice primer for those not wishing to get too bogged down in the technical aspects of the argument:
Brierley is the host of the weekly British radio show Unbelievable, which is also available as a podcast. Each week Justin brings together believers and unbelievers to talk about some issue related to matters of religious faith. The discussions are almost always pleasant and informative, and Justin does an excellent job moderating them. They're usually what such conversations should be like, but too often aren't.

If you'd like to sign up for the podcast or browse the archives of past shows, which have featured discussions on almost every topic related to religious belief, you can go to the Unbelievable website here. For those readers who might prefer a slightly more elaborate explication, try this post and the debate it links to.

Monday, March 19, 2018

Some Thoughts on Last Week's Walkout

Last week, thousands of students across the nation, rightly alarmed by the fact that they're sitting ducks on their gun-free campuses for any deranged nihilist who wants to be famous, walked out in protest of gun violence. One certainly sympathizes with their concern, at least the concern of many of them.

I qualified that last sentence because, despite the media's apotheosis of the students and their demonstration, decades of teaching high school students have made me both cautious and curious.

I wonder, for example, how many of the young men who joined those demonstrations went home that afternoon and sat down to play violent video games in which the object is to kill as many virtual enemies as possible.

I wonder, too, how many students who walked out of their classes just wanted to get out of school or to stick their thumb in the eye of school authorities or were otherwise indifferent to the fears that motivated their classmates.

Many schools, of course, condoned the walkouts, but I wonder how enthusiastically schools and the media would've supported these students had they been demonstrating against the vast number of people slaughtered in our nation's abortion clinics or the hundreds of people killed every year by illegal immigrants.

I wonder what the media would've said about the exploitation of five to twelve year olds by adults who used children as props in these demonstrations had the cause they were protesting been, say, the devastation wrought on their families by liberal policies like easy divorce and other fallout from the sexual revolution.

I wonder, also, how Hollywood celebrities can piously deplore gun violence in this country while making their living performing in movies which glorify precisely that kind of violence.


I wonder, finally, how many of the politicians and journalists who used this walkout as another lever to weaken the voters' resistance to disarming the citizenry themselves employ bodyguards or carry weapons.

In other words, although I'm sure thousands of those students were sincerely and justly concerned about the terrible carnage that's been visited upon our schools over the last decade or two, I nevertheless wonder how much hypocrisy we were treated to last week by some of their schoolmates and faculty and even more by our media and politicians.

Saturday, March 17, 2018

Why We Celebrate St. Patrick's Day

Millions of Americans, many of them descendants of Irish immigrants, celebrate their Irish heritage by observing St. Patrick's Day today. We're indebted to Thomas Cahill and his best-selling book How The Irish Saved Civilization for explaining to us why Patrick's is a life worth commemorating.

As improbable as his title may sound, Cahill weaves a fascinating and compelling tale of how the Irish in general, and Patrick and his spiritual heirs in particular, served as a tenuous but crucial cultural bridge from the classical world to the medieval age and, by so doing, made Western civilization possible.

Born a Roman citizen in 390 A.D., Patrick had been kidnapped as a boy of sixteen from his home on the coast of Britain and taken by Irish barbarians to Ireland. There he languished in slavery until he was able to escape six years later. Upon his homecoming he became a Christian, studied for the priesthood, and eventually returned to Ireland where he would spend the rest of his life laboring to persuade the Irish to accept the Gospel and to abolish slavery.

Patrick was the first person in history, in fact, to speak out unequivocally against slavery and, according to Cahill, the last person to do so until the 17th century.

Meanwhile, Roman control of Europe had begun to collapse. Rome was sacked by Alaric in 410 A.D. and barbarians were sweeping across the continent, forcing the Romans back to Italy, and plunging Europe into the Dark Ages.

Throughout the continent unwashed, illiterate hordes descended on the once grand Roman cities, looting artifacts and burning books. Learning ground to a halt and the literary heritage of the classical world was burned or moldered into dust. Almost all of it, Cahill claims, would surely have been lost if not for the Irish.

Having been converted to Christianity through the labors of Patrick, the Irish took with gusto to reading, writing and learning. They delighted in letters and bookmaking and painstakingly created indescribably beautiful Biblical manuscripts such as the Book of Kells which is on display today in the library of Trinity College in Dublin. Aware that the great works of the past were disappearing, they applied themselves assiduously to the daunting task of copying all surviving Western literature - everything they could lay their hands on.


For a century after the fall of Rome, Irish monks sequestered themselves in cold, damp, cramped mud or stone huts called scriptoria, so remote and isolated from the world that they were seldom threatened by the marauding pagans. Here these men spent their entire adult lives reproducing the old manuscripts and preserving literacy and learning for the time when people would be once again ready to receive them.


These scribes and their successors served as the conduits through which the Graeco-Roman and Judeo-Christian cultures were transmitted to the benighted tribes of Europe, newly settled amid the rubble and ruin of the civilization they had recently overwhelmed.

Around the late 6th century, three generations after Patrick, Irish missionaries with names like Columcille, Aidan, and Columbanus began to venture out from their monasteries and refuges, clutching their precious books to their hearts, sailing to England and the continent, founding their own monasteries and schools among the barbarians and teaching them how to read, write and make books of their own.

Absent the willingness of these courageous men to endure deprivations and hardships of every kind for the sake of the Gospel and learning, Cahill argues, the world that came after them would have been completely different. It would likely have been a world without books. Europe almost certainly would have been illiterate, and it would probably have been unable to resist the Muslim incursions that arrived a few centuries later.

The Europeans, starved for knowledge, soaked up everything the Irish missionaries could give them. From such seeds as these modern Western civilization germinated. From the Greeks the descendants of the Goths and Vandals learned philosophy, from the Romans they learned about law, and from the Bible they learned of the worth of the individual who, created and loved by God, is therefore significant and not merely a brutish aggregation of matter.

From the Bible, too, they learned that the universe was created by a rational Mind and was thus not capricious, random, or chaotic. It would yield its secrets to rational investigation. Out of these assumptions, once their implications were finally and fully developed, grew historically unprecedented views of the value of the individual and the flowering of modern science.

Our cultural heritage is thus, in a very important sense, a legacy from the Irish. A legacy from Patrick. It is worth pondering on this St. Patrick's Day what the world would be like today had it not been for those early Irish scribes and missionaries thirteen centuries ago.

Buiochas le Dia ar son na nGael (Thank God for the Irish), and I hope you have a great St. Patrick's Day.

Friday, March 16, 2018

Stephen Hawking, R.I.P.

Perhaps no contemporary scientist acquired the cultural cachet and fame that Stephen Hawking managed to accrue during his career as a physicist. He even had a movie made about his life during his lifetime.

Hawking died Wednesday at the age of 76, which was itself an amazing achievement since he had suffered since the 1960s from ALS, a degenerative nerve disease that usually claims its victims long before they reach their seventies. Hawking, however, was fortunate in the quality of his medical care, the love of the people around him, and his own strength of will and humor.

A piece at New Scientist offers a good overview of his life and his contribution to cosmology. It opens with this:
Stephen Hawking, the world-famous theoretical physicist, has died at the age of 76.

Hawking’s children, Lucy, Robert and Tim said in a statement: “We are deeply saddened that our beloved father passed away today.

“He was a great scientist and an extraordinary man whose work and legacy will live on for many years. His courage and persistence with his brilliance and humour inspired people across the world.

“He once said: ‘It would not be much of a universe if it wasn’t home to the people you love.’ We will miss him for ever.”

The most recognisable scientist of our age, Hawking holds an iconic status. His genre-defining book, A Brief History of Time, has sold more than 10 million copies since its publication in 1988, and has been translated into more than 35 languages. He appeared on Star Trek: The Next Generation, The Simpsons and The Big Bang Theory. His early life was the subject of an Oscar-winning performance by Eddie Redmayne in the 2014 film The Theory of Everything.

He was routinely consulted for oracular pronouncements on everything from time travel and alien life to Middle Eastern politics and nefarious robots. He had an endearing sense of humour and a daredevil attitude – relatable human traits that, combined with his seemingly superhuman mind, made Hawking eminently marketable.

But his cultural status – amplified by his disability and the media storm it invoked – often overshadowed his scientific legacy. That’s a shame for the man who discovered what might prove to be the key clue to the "theory of everything", advanced our understanding of space and time, helped shape the course of physics for the last four decades and whose insight continues to drive progress in fundamental physics today.

Hawking’s research career began with disappointment. Arriving at the University of Cambridge in 1962 to begin his PhD, he was told that Fred Hoyle, his chosen supervisor, already had a full complement of students. The most famous British astrophysicist at the time, Hoyle was a magnet for the more ambitious students. Hawking didn’t make the cut. Instead, he was to work with Dennis Sciama, a physicist Hawking knew nothing about. In the same year, Hawking was diagnosed with amyotrophic lateral sclerosis, a degenerative motor neurone disease that quickly robs people of the ability to voluntarily move their muscles. He was told he had two years to live.

Although Hawking’s body may have weakened, his intellect stayed sharp. Two years into his PhD, he was having trouble walking and talking, but it was clear that the disease was progressing more slowly than the doctors had initially feared. Meanwhile, his engagement to Jane Wilde – with whom he later had three children, Robert, Lucy and Tim – renewed his drive to make real progress in physics.
There's much more about Hawking's scientific work at the link. His philosophical ideas seemed sometimes ill-informed, as in his book The Grand Design, but he was clearly a scientific genius and an amazing human being.

Thursday, March 15, 2018

Dialogue "Partners"

I’ve been reflecting lately on why some people, even some friends, are difficult to have a conversation with. There are some people with whom it's easier to dialogue via email than it is face-to-face, and I've been wondering why that is, exactly.

I've arrived at the conclusion that, in my experience, there are at least six types of conversation "partners," and five of them are hard to engage with in any meaningful way. Readers may be able to come up with more than six types; I don't claim my taxonomy is exhaustive. But here are the six that I've encountered:

One type consists of those who seem constantly distracted while you're talking to them. In the middle of what you're trying to say they keep looking away as if something else has captured their interest: staring out the window, fidgeting with their phone, or calling out to acquaintances who happen by. This is behavior we've all come to expect from unmannered children, but it's disconcerting to have to endure it from an adult. The person is either rude or suffers from ADD.

A second type is the individual who expatiates interminably on whatever the topic of the moment may be, never permitting you an opportunity to insert even the slightest contribution to the matter beyond a grunt of assent now and then. You can scarcely utter a word before your dialogue "partner" seizes hold of the conversation again. A discussion with this person consists of him talking and you listening. You're expected to essentially play the role of audience to his monologue. This conversational type combines rudeness with narcissism.

A third type is the fellow (or lady) who appears to be listening to you but who is in fact mulling over in his mind what he wants to say as soon as you shut up. Like the previous two, this individual doesn't much care about what you think, only about what he thinks.

Then there's the individual who gets angry, aggressive, or defensive as soon as you offer a dissenting opinion to whatever he or she happens to say or believe. Some people simply cannot brook any disagreement, no matter how politely expressed. The previous three types may be tolerable (barely) if taken in infrequent doses, but this type rarely is. This person is just unpleasant to try to talk to, at least whenever the dialogue turns to matters upon which there are divergent points of view.

Another type of interlocutor is the one who dismisses your opinions with a disdainful gesture or joke, or changes the subject, or otherwise treats your words as though none of them are really worth listening to. This maneuver establishes them in their own eyes, perhaps unconsciously, as superior or dominant, somewhat like type two. The individual is not only arrogant but, like all the other preceding types, rude as well.

The last type is the individual who genuinely makes an effort to listen to you, to understand what you're saying and gives you time to develop your thought fully before responding. They're a pleasure to spend time with and one always looks forward to conversing with, and learning from, such people.

I know we all sometimes take on the aspect of each of these types. We dialogue differently with different people and sometimes fall into one or more of these roles, even in the same conversation, but wouldn't it be wonderful if we, and everyone else, were more like the last type most of the time?

Wednesday, March 14, 2018

The End of Secularism

Some years ago a professor of political science named Hunter Baker came out with a fine book titled The End of Secularism. The problem Baker addressed in his work was the largely successful attempt in the latter part of the 20th century to purge religious sentiments from the public square and to instill in our everyday life an assumption of, or bias toward, secularism.

His main argument is that politics simply cannot be separated from religion, that the secularist alternative is neither neutral nor desirable, and that it will ultimately fail. Secularism should be seen not as the only reasonable occupant of the public square but rather as one competitor among others jockeying to be heard in the marketplace of ideas.

Secularism is not to be confused with the separation of church and state. The latter refers to institutional independence. The former refers to the separation of religion from public life. Separation of church and state is a good thing for everyone involved. The separation of religion from public life is not.

Baker argues persuasively that the courts have erred in seeing the First Amendment as a prohibition of religious expression in taxpayer subsidized spaces. The establishment clause ("Congress shall make no law respecting an establishment of religion...") was not about religious freedom at all. It was about who had jurisdiction in church/state matters.

The Founders were saying that the role of religion would be a matter for the several states to resolve each for themselves, and that the federal government had no business injecting itself into what was a state matter. Each state was to be free to develop its own relationship with religion in whatever way it chose without the federal government telling it what it could or couldn't do.

Of course, that's not how our courts have chosen to interpret the Amendment. They've ruled, in effect, that the First Amendment is a mandate for secularism, which actually privileges one religious view - secularism - above all others.

Baker traces the uneasy history of church/state relations from the early Roman church to the present and attributes the rise of secularism in the West to three main 19th-century developments: the emergence of German higher criticism, the publication of Darwin's Origin of Species, and the schisms wrought by the Civil War and slavery in both the nation and the church. The clincher, though, was the Scopes Trial in 1925 and its aftermath.

The trial was a humiliation for Christian fundamentalism and launched secularism on a trail of victories for the next sixty years that made it seem invincible.

Today, however, the picture is much different. Since the latter part of the 20th century secularism has come under intense scrutiny by "its own advocates, conservative Christians, other conservative religionists, and postmoderns." The critique of secularism includes the irony that secularists have absorbed as their own the values of our common Christian heritage even as they claim that secular thinking is actually the source of these values.

Baker spends several pages on Stanley Fish's critique of the secularist project and why it is doomed to fail. For example, "Bracketing off religion does not solve the problem of toleration. It just disadvantages one set of orthodoxies from interacting with the many secular orthodoxies roaming free in a liberal society." This is true. It also privileges the secularist orthodoxies by essentially insulating them from criticism by banning the opponents most likely to present the most powerful critiques - religious opponents - from the public square.

We must exclude religious reasons and motivations from our public discourse, the secularist argues, because we need to allow only viewpoints that are accessible to everyone and held by everyone in the public arena. The assumption, however, that secular viewpoints are somehow metaphysically neutral is a fraud. The secularist is no more disinterested than is the religious citizen and for him to claim that he should be allowed to judge what passes for legitimate discourse is like permitting a baseball pitcher the prerogative of calling the balls and strikes.

All public discourse reduces to two fundamental visions of reality. One maintains that the universe is the product of a rational, personal, and good creator and the other holds that everything is a result of chance and impersonal forces. The secularist wants to rule the former out of court and allow only the latter in the public square, but conveniently, the latter view happens to be his own. Postmodern thinkers like Fish have been particularly adept at pointing out the self-serving nature of the attempt to establish a monopoly for one's own view while maintaining the pretense of neutrality.

Baker makes the interesting observation that although secularism serves essentially the same role in the Democratic Party that religion serves in the GOP, the media, though eager to report on the influence religion has among Republicans, rarely reports on the influence secularism has among Democrats. One never hears, for instance, how the Democrats have "shored up their base among the unchurched, atheists and agnostics."

We're often reminded that schools must not teach religious values, but secularist values like environmental attitudes and fads, tolerance, opposition to racism, sexism, and homophobia are all deemed perfectly legitimate topics for taxpayer-funded schools. In other words, taking Judeo-Christian religion out of the public square does not leave the square religion-free. Rather, it leaves secularism as the only religion to be allowed a voice in our public deliberations.

There's much more in Baker's relatively short (194 pages) book, and I recommend it to anyone interested in Church/State issues and the role of religion in public life.

Tuesday, March 13, 2018

George Berkeley at 333

Yesterday was the 333rd anniversary of the birth of Irish philosopher George Berkeley (1685). I recently came across a very good essay on Berkeley and his philosophy by Ken Francis at New English Review. Berkeley is famous for his philosophical idealism. He denied the existence of material substance, holding that all that really exists are minds and the ideas they perceive.

Francis begins by mentioning the English clergyman Ronald Knox, who sought to mock Berkeley's philosophy:

[T]he philosopher George Berkeley did not believe in the existence of the material world .... In fact, he did not believe in the physical existence of the entire universe for that matter (or should it be ‘non-matter’?). Knox’s limerick to Berkeley went something like this:
There was a young man who said God
Must find it exceedingly odd
To think that the tree
Should continue to be
When there’s no one around in the quad.
An anonymous reply to Knox’s limerick added:
Dear Sir, your astonishment’s odd;
I am always about in the quad
And that’s why the tree
Will continue to be
Since observed by yours faithfully, God.
There's much of interest in Francis' article which touches on other philosophers (Plato, Locke, Kant) besides Berkeley. He poses, for instance, this puzzle:
Are the words that you are now reading on this page, including the backdrop to wherever you are reading, part of the conceptualized reality of a Supreme Entity, of which our collective consciousness is a manifestation? Berkeley believes that every-day objects are such a manifestation with multiple visual aspects that can change depending upon the circumstances. This also brings into play the problem of appearance and reality.

Take for example a fingerprint. If asked to describe one, the obvious answer would be that it’s a small black blob, about two-inches in circumference, with whirly lines going through it. The philosopher Bertrand Russell would call this favouritism, as we tend to view objects from an ordinary point of view under usual conditions of light, but the other colours/shapes which appear under other conditions have just as good a right to be considered real.

In other words, if we look a little closer, through a powerful microscope, our conventional idea of what a fingerprint looks like takes on a whole new meaning. For here we see something that resembles a huge mountain range, a kind of dark grey version of the Himalayas.

And if we stand back from the fingerprint, say about 20 feet, it looks like a tiny black spot (without the whirly bits). The same applies to everything else we perceive—from tables and chairs to mountains and oceans. But we describe most everyday objects from the very convenient distance, usually a couple of feet away, of a human perceiver, with its meaning relative to how such a perceiver thinks.
So, is reality observer-relative? Does what a thing is like depend upon how we perceive it? If so, doesn't our perception in some sense actually establish the reality that something has?

These are fascinating questions. I encourage you to read Francis' entire article at the link, and don't forget to wish the Reverend Berkeley a happy 333rd.

Monday, March 12, 2018

The "Old Enough" Argument

I should say up front that I don't in principle oppose raising to 21 the age at which someone can purchase certain firearms.

That may surprise, and even miff, some readers, but I'm not sure that such a restriction is any more an infringement on our Second Amendment rights than is prohibiting 14-year-olds from buying certain weapons, or prohibiting anyone at all from buying a fully automatic rifle or a grenade launcher.

Having said that, though, I think there's an amusing irony in the arguments made by some liberals in favor of raising the age at which certain weapons can be purchased.

Before I explain that, I should note that Florida has just passed such a restriction, and although many Democrats voted against it, they did so only because the legislation provides for a program to train school faculty to carry a firearm. They generally favored raising the age at which a person can buy a weapon.
Florida Gov. Rick Scott (R.) signed new state gun restrictions into law on Friday, including raising the minimum age to buy a gun to 21 and instituting a three-day waiting period for all firearm purchases.

The new law also banned the sale of bump stocks in Florida and allowed police to ask judges to confiscate weapons from those deemed a threat to themselves or others, as well as granted monies for the training and arming of school personnel, the Miami Herald reports:
Scott signed the bill despite his opposition to creation of a program that allows school personnel to carry concealed weapons on campus.

Family members of all 17 Parkland victims signed a statement supporting passage of the legislation.

The Coach Aaron Feis Guardian Program, named in memory of the assistant football coach at the school who died protecting students from gunfire, will create a $67 million program for county sheriffs to train school personnel to neutralize an active school shooter.
I trust that the county sheriffs in Florida know a bit more about how to train people to neutralize an active shooter than the sheriff of Broward County apparently does, but in any case, there's an irony in hearing liberal Democrats argue that the age at which guns can be purchased should be raised to 21.

Back in the 1970s, when the voting age was lowered from 21 to 18, it was liberal Democrats who led the way, and one of their chief arguments was that if someone was old enough at 18 to go to Vietnam and fight and possibly die then he/she was surely old enough to vote. The argument struck many as silly, but like other silly arguments before and since, it carried the day with our politicians, and the voting age was duly lowered.

Eighteen-year-olds were henceforth to be considered old enough and mature enough to be entrusted with the great responsibility of choosing our nation's leaders.

Forty-some years later, we find liberal Democrats rejecting almost the same argument they made in the 1970s. Even though 18-year-olds may be old enough and mature enough to go off to Iraq and heaven-knows-where-else to fight and possibly die, we now hear liberals say, they're not old enough to buy a weapon here at home.

Yet, if they're old enough to go to war, and thus old enough and mature enough to vote, why are they not also old enough and mature enough to own a weapon?

Maybe someone should ask them.

As I said above, I myself have no objection in principle to raising the age at which certain kinds of weapons can be purchased, but then neither have I ever been impressed with the "old enough to fight and die" argument, nor could I ever see the wisdom of lowering the voting age to 18.

Saturday, March 10, 2018

The Art of the Witty Insult

A couple of recent posts touched on the loss of civil discourse in our public square and reminded me of a Viewpoint post from long ago which catalogued a sampling of famous quotes that raise the act of insult to an art form. Some of them are funny, and they're all clever. Enjoy:

"He has all the virtues I dislike and none of the vices I admire." -- Winston Churchill

"I have never killed a man, but I have read many obituaries with great pleasure." -- Clarence Darrow

"He has never been known to use a word that might send a reader to the dictionary." -- William Faulkner (about Ernest Hemingway)

"I've had a perfectly wonderful evening. But this wasn't it." -- Groucho Marx

"I didn't attend the funeral, but I sent a nice letter saying I approved of it." -- Mark Twain

"He has no enemies, but is intensely disliked by his friends." -- Oscar Wilde

"I am enclosing two tickets to the first night of my new play; bring a friend, if you have one." -- George Bernard Shaw to Winston Churchill. "Cannot possibly attend first night; will attend second, if there is one." -- Winston Churchill's response

"I feel so miserable without you; it's almost like having you here." -- Stephen Bishop

"He is a self-made man and worships his creator." -- John Bright

"I've just learned about his illness. Let's hope it's nothing trivial." -- Irvin S. Cobb

"He is not only dull himself; he is the cause of dullness in others." -- Samuel Johnson

"He is simply a shiver looking for a spine to run up." -- Paul Keating

"He had delusions of adequacy." -- Walter Kerr

"Why do you sit there looking like an envelope without any address on it?" -- Mark Twain

"His mother should have thrown him away and kept the stork." -- Mae West

"Winston, if you were my husband, I would poison your coffee!" -- Lady Astor to Winston Churchill at a dinner party. "Madam, if I were your husband, I would drink it!" -- Winston Churchill, in response

"Some cause happiness wherever they go; others, whenever they go." -- Oscar Wilde

Friday, March 9, 2018

Alinsky's Rule #12

Students sometimes wonder why and how our politics have gotten so nasty. Actually, this is not a recent development. Many conservative commentators trace the nastiness back to the ugly slanders to which Supreme Court nominees Robert Bork and Clarence Thomas were subjected by Senate Democrats, particularly Ted Kennedy and Joe Biden, in the late 80s and early 90s.

Liberal pundits, on the other hand, point to the impeachment of Bill Clinton by Republicans as another event that created wounds so deep that it may be decades before our public discourse ever recovers.

Whatever the more public catalysts may have been, it seems that a lot of people in politics today have taken rule #12 in Saul Alinsky's Rules for Radicals to heart: "Pick the target. Freeze it, personalize it, and polarize it. Cut off the support network and isolate the target from sympathy. Go after people and not institutions. (This is cruel, but very effective. Direct personalized criticism and ridicule works.)"

It's worth noting that Alinsky was a left-wing radical writing a manual for progressive activists. There's nothing that I'm aware of in all of conservative literature that comes anywhere close to Alinsky's adjuration to dehumanize and degrade one's political opponents. One searches in vain through Edmund Burke, William Buckley, Barry Goldwater, Russell Kirk, or even Rush Limbaugh and Mark Levin for anything that matches in cruelty or odiousness Alinsky's rule #12.

Yet Alinsky is the left's tactical lodestar and guru. Barack Obama incorporated his teaching into his work as a community organizer. Hillary Clinton wrote her undergraduate senior thesis on him. Alinsky is must-reading for leftist activists in good standing. Thankfully, not all of them follow him, but evidently enough do that our political discourse has been gravely coarsened by his more devoted disciples.

It's perhaps no coincidence that the Left's attacks over the years not only on Bork and Thomas, but also Ronald Reagan, Ken Starr, George Bush, Dick Cheney, Joe the Plumber, Sarah Palin, Carrie Prejean, the tea partiers and town hall protesters, and, more lately, Donald Trump and anyone who voted for him, were and are so personal, vicious, and vile. Like Alinsky said, smearing and dehumanizing one's opponent works, especially if the media plays along.

Anyone who gets in the way of the progressive agenda can expect to be vilified, insulted, libeled, slandered, and ridiculed. It's the tactic their esteemed mentor urged them to employ and Alinsky's votaries, or at least too many of them, employ it with distressing gusto.

Unfortunately, we're likely to continue seeing this ugliness in our politics until the media and the public demand an end to it. Meanwhile, it's imperative that those of us on the right who value civil discourse ensure that our side, no matter how angered we are by the behavior of the left over the last two decades, doesn't fall into the same destructive, degrading rhetorical cesspool, and that we dissociate ourselves from those on the right who may already be swimming in it.

It's appropriate to shine a light on this depraved behavior when we encounter it, but let us not succumb to the temptation to respond to it in kind.

Thursday, March 8, 2018

Projection

Annafi Wahed describes herself as a tiny, talkative young Asian woman who spent four months on Hillary Clinton's 2016 campaign staff. To the consternation of her friends, she chose to attend CPAC, an annual gathering of politically active conservatives.

As Andrew Klavan tells it, when she set out for the event her liberal friends expressed actual concern for her safety, as if she were descending into a den of violent ruffians.

What Wahed found instead completely surprised and confounded her friends' stereotypes. She writes about her experience in a column for the Wall Street Journal (subscription required):
Where some saw a circus, I saw a big tent. I spoke with Jennifer C. Williams, chairman of the Trenton, N.J., Republican Committee and a transgender activist. Twenty feet away, I spoke with a religious leader who opposes same-sex marriage.

While a panelist touted capital punishment, several attendees crowded the Conservatives Concerned About the Death Penalty booth. Hours after President Trump recast Oscar Brown Jr.’s song “The Snake” as an ugly anti-immigrant parable, several influential Republicans were asking me, a naturalized citizen, how they can support my startup.

In retrospect, I’m embarrassed at how nervous I was when I arrived. I found myself singing along to “God Bless the USA” with a hilariously rowdy group of college Republicans, having nuanced discussions about gun control and education policy with people from all walks of life, nodding my head in agreement with parts of Ben Shapiro’s speech, and coming away with a greater determination to burst ideological media bubbles.

Among liberals, conservatives have a reputation for being closed-minded, even deplorable. But in the Washington Republicans I encountered at CPAC, I found a group of people who acknowledged their party’s shortcomings, genuinely wondered why I left my corporate job to join Mrs. Clinton’s campaign in 2016, and listened to my arguments before defending their own positions....

As I look back on all the people who greeted me warmly, made sure I didn’t get lost in the crowd, and went out of their way to introduce me to their friends, I can’t help but wonder how a Trump supporter would have fared at a Democratic rally. Would someone wearing a MAGA hat be greeted with smiles or suspicion, be listened to or shouted down?
Judging by the frequent behavior of the Left toward opinions they don't like, it's doubtful those wearing MAGA hats would get a polite hearing. Conservatives, including the MAGA hat crowd, have, over the last two years, endured being shouted at, cursed, reviled, spit upon, beaten and ostracized by those who claim for themselves the mantle of tolerance.

Some on the Left have actually called for the imprisonment and even death of those who disagree with them on climate change and other issues.

Matt Vespa at TownHall.com answers Wahed's question this way:
[L]iberals would shout down Trump supporters and conservatives at their gatherings, things will be thrown at them, and they would be called racists. With the exception of Ms. Wahed, today’s liberals cannot share space, have relationships with, or even be near someone who voted for Donald Trump and the Republican Party. We’re anathema. Period.

Yet, it shows how the ideological roots of the two sides yield entirely different results. The values of the Republican Party are grounded in the rule of law, respect for life, the family, and equalizing opportunities. It’s about hard work and the principles of freedom that are grounded in our founding and the Constitution.

There is nothing that’s race-specific about freedom, love of country, and ensuring everyone has the opportunity to make it in America. Anyone can be a conservative, which probably explains why the Left is so aggressive in trying to paint the movement as racist, and call any person of color who identifies as such a race traitor, confused, or someone acting against their own interest. It’s abject nonsense.
Where Vespa says "Republican" I'd prefer to stay with "conservative", and Ms. Wahed is certainly not the only liberal who would treat her political opponents with civility, but otherwise what he says is a pretty accurate description of how vicious our politics has become.

The friends who warned Ms. Wahed about mingling with conservatives were not only stereotyping but also indulging a very human tendency to engage in projection. They knew how their side often treats its political opponents and assumed that the other side would surely behave the same way.

It's nice that Ms. Wahed found that the stereotype promoted on the Left isn't the reality.

Wednesday, March 7, 2018

Socrates

Socrates (470–399 BC) was one of the most influential philosophers in all of human history. He himself never wrote anything, but his unique mode of discourse, which came to be known as the “Socratic method,” remains one of the great teaching styles and modes of inquiry still in use today.

Dr. Paul Herrick has written a good overview of Socrates' style, as well as the details of his trial and death, at Philosophy News. Here are some excerpts from his discussion of the Socratic Method:
At some point around the middle of his life...Socrates became convinced that many people think they know what they are talking about when in reality they do not have a clue. He came to believe that many people, including smug experts, are in the grips of illusion. Their alleged knowledge is a mirage.

Similarly, he also saw that many believe they are doing the morally right thing when they are really only fooling themselves—their actions cannot be rationally justified.

As this realization sank in, Socrates found his life’s purpose: he would help people discover their own ignorance as a first step to attaining more realistic beliefs and values. But how to proceed?

Some people, when convinced that others are deluded, want to grab them by their collars and yell at them. Others try to force people to change their minds. Many people today believe violence is the only solution. None of this was for Socrates. He felt so much respect for each individual—even those in the grips of illusion and moral error—that violence and intimidation were out of the question. His would be a completely different approach: he asked people questions. Not just any questions, though.

He asked questions designed to cause others to look in the mirror and challenge their own assumptions on the basis of rational and realistic standards of evidence. Questions like these: Why do I believe this? What is my evidence? Are my assumptions on this matter really true? Or am I overlooking something? Are my actions morally right? Or am I only rationalizing bad behavior?
This may not seem like such a big deal, but it is. Most of us have no desire to question our beliefs about important matters like religion or politics, and when someone does question us our response is often to get defensive and to just shout louder than the other person until the exchange ends in anger. We see a form of this when college students shout down speakers with whom they disagree and refuse to let them speak (for a couple of recent examples see here and here).

Such behavior is not just rude and intellectually immature, it's a signal that the shouters have no good reasons for believing what they do and deep down realize that their beliefs can only prevail if the other side is denied a hearing. The cause of truth is ill-served by such tactics, but then the thugs who engage in this behavior aren't really interested in truth in the first place.

Herrick continues:
Looking in the mirror in a Socratic way can be painful. For reasons perhaps best left to psychologists, it is easy to criticize others but it is hard to question and challenge yourself. There are intellectual hurdles as well. Which standards or criteria should we apply when we test our beliefs and values?

Socrates, by his example, stimulated a great deal of research into this question. Over the years, many criteria have been proposed, tested, and accepted as reliable guides to truth, with truth understood as correspondence with reality.

These standards are collected in one place and studied in the field of philosophy known as “logic”—the study of the principles of correct reasoning. Today we call someone whose thinking is guided by rational, realistic criteria a “critical thinker.” Our current notion of criterial, or critical, thinking grew out of the philosophy of Socrates.

So, moved by the pervasiveness of human ignorance, bias, egocentrism, and the way these shortcomings diminish the human condition, Socrates spent the rest of his life urging people to look in the mirror and examine their assumptions in the light of rational, realistic criteria as the first step to attaining real wisdom. Knowledge of your own ignorance and faults, he now believed, is a prerequisite for moral and intellectual growth.

Just as a builder must clear away brush before building a house, he would say, you must clear away ignorance before building knowledge. As this reality sank in, his conversations in the marketplace shifted from the big questions of cosmology to questions about the human condition and to that which he now believed to be the most important question of all: What is the best way to live, all things considered?

Socrates’s mission—to help others discover their own ignorance as a first step on the path to wisdom--explains why he expected honesty on the part of his interlocutors. If the other person does not answer honestly, he won’t be led to examine his own beliefs and values. And if he does not look in the mirror, he will not advance. For Socrates, honest self-examination was one of life’s most important tasks.
When our most deeply-held beliefs are at risk, when we're confronted by compelling challenges to those beliefs, honesty is often difficult. Not only are our convictions at stake but so is our pride. It's humbling to have to acknowledge that we've been wrong about a belief we've held. We resort to all manner of diversion, obfuscation and fallacy in order to escape the conclusion our interlocutor's argument may be leading us toward. We resist it, we refuse to believe it, regardless of the price we must pay for that refusal in terms of our intellectual integrity.

There's an old ditty that captures the psychology of this well: "A man convinced against his will is of the same opinion still."

Socrates himself encountered this resistance to having one's beliefs challenged and paid with his life for having discredited the certainties of very proud and vain men. You can read about what happened to him in Herrick's column at the link.