Wednesday, September 10, 2025

Maybe the Best Empirical Evidence for Life After Death (Pt. I)

Gary Habermas has been investigating reports of Near Death Experiences (NDEs) since 1972. NDEs are experiences that people claim to have had while flat-lined from some physical trauma before being resuscitated. They were either very near death or had actually died.

Habermas believes the evidence for the survival of persons after death is overwhelming and is the best argument against materialism, the belief that there are no immaterial substances, like minds or souls, in the universe.

He contributed a fascinating chapter on this subject to an anthology of articles on the nature of the mind titled Minding the Brain, which can be found online in three installments.

I'm going to summarize these installments over the next couple of days. In the first installment, Habermas discusses those NDEs for which there is corroborating empirical evidence of the person's claims. He writes,
[C]orroborated data may indicate the presence of human consciousness during times where neither the heart nor brain registered any observable measurements. Further, such experiences apparently took place during NDEs that involved substantiated observations that almost certainly could not have been made from the person’s bodily location, even if they had been fully conscious, healthy, and observing their surroundings at that time.

[Medical research has established that] heart stoppage initiates the immediate and measurable elimination of upper brain activity just seconds later during the persistence of this state. The cessation of lower brain activity occurs just very slightly afterwards.

Therefore, verified NDE data that occur within this time frame are exceptionally crucial in indicating the potential presence of consciousness beyond the quantifiable existence of the central nervous system.
Habermas states that there are well over 150 cases of empirically verifiable NDEs in the medical literature in which the experience occurred precisely within the moments when measurable heart and brain activity were absent.
This is a major argument for continued consciousness during these moments. ... [T]he sheer unlikelihood of mistakenness or deceit of one sort or another in every one of the dozens of different situations makes appeal to mistake or deceit seem incredible, particularly when the described occurrence happened precisely during those minutes rather than before or after.
He gives a few examples in the chapter in Minding the Brain. Here are a couple:
One case involved a shoe found on a hospital roof. It was reported from Hartford, Connecticut, by Kenneth Ring and Madelaine Lawrence. The resuscitated patient claimed to have had an NDE in which she floated above her body and then watched the resuscitation attempt going on beneath her. Then she experienced being “pulled” through several floors of the hospital until she emerged near the building’s roof, where she viewed the Hartford skyline. Looking down, she then observed a red shoe.

When nurse Kathy Milne heard the story, she reported it to a resident physician, who mocked the account as a ridiculous tale. However, in order to ascertain the accuracy of the report, he enlisted a janitor’s assistance, and was led onto the roof, where he found the red shoe!

Kristle Merzlock was a young girl who nearly drowned and was resuscitated by a doctor. Upon regaining consciousness three days later, her intensive care nurses initially heard her recollection of having visited heaven, guided by an angel. Though there was no way to verify the angel, Kristle also testified that, although she was unconscious and hooked up in the hospital, she was “allowed” to observe her parents and siblings some distance away, at home for the evening.

She provided exact details regarding where each person was located in the house, identifying the specific things they were doing, as well as the type of clothes that they were wearing. For instance, she identified that her mother was cooking roast chicken and rice for dinner. All of these particulars were subsequently confirmed very soon afterwards.
He concludes the excerpt with this:
In addition to the above cases, there is a large number of reported and documented distance NDEs said to have occurred in the absence of any measurable heart or brain activity. A number of the cardiac arrest cases include some of the strongest evidential scenarios.

Once again, as in the previous category of corroborated observation inside the room, it is exceedingly unlikely that every last one of another dozen cases exhibiting neither apparent heart nor brain activity can be meaningfully accounted for by data learned through other means, misperception or deception, coincidences, or mistakes, especially when the events described occurred precisely within the time interval of the medical crisis rather than subsequently.
One of the most stunning examples of a verifiable NDE of this latter sort isn't actually mentioned by Habermas in this excerpt but is recounted by neurosurgeon Michael Egnor in his recent book The Immortal Mind. It's the story of a 35-year-old woman named Pam Reynolds, who underwent surgery in 1991 to remove a brain aneurysm. The surgery was performed by the world's leading aneurysm specialist, Dr. Robert Spetzler.

The procedure required that Reynolds' body be chilled to 60°F, that her heart be stopped, and that all the blood be drained from her brain. A high-frequency noise was directed into her ears, which prevented her from hearing any ambient conversation and allowed the doctors to know when her brain activity had completely shut down. She was essentially brain-dead.

Dr. Spetzler then carved a window into her skull to expose her brain. Afterwards, Reynolds described being pulled out of her body and watching the procedure from a vantage point like sitting on the surgeon's shoulder. She was able to accurately recount conversations among staff she heard during the procedure and describe surgical tools that were not unwrapped until after she was unconscious.

There appears to be no good materialist explanation for her experience, an experience that was thoroughly documented by Dr. Spetzler.

More tomorrow.

Tuesday, September 9, 2025

The Overton Window

You may have heard of the "Overton Window" but been unsure what it refers to. An article by William F. Marshall at Townhall.com explains where the term comes from and what it means.

Marshall writes,
A term that became popular in the last decade or two is the Overton Window Principle. It was a theory conceived in the 1990s by a brilliant young engineer-turned-lawyer, Joseph Overton, who was a libertarian and executive in a libertarian think tank, the Mackinac Center, in Michigan.

While working in fundraising for the Mackinac Center, Overton developed his Window principle, while explaining to audiences the purpose of think tanks. The essential idea is that there is in every society a “window,” or range, of thoughts and policies that are generally acceptable to the wider population.
Ideas that were formerly considered unthinkable become thinkable and then acceptable by constantly pressing them onto the culture. As the idea gets discussed more it becomes less outlandish, more "reasonable," and the "window" opens wider. We've seen this happen with, among other things, divorce, gay marriage, abortion, and more recently, the cluster of notions attached to transgenderism.

Marshall goes on:
Overton died in 2003 at a tragically young age (43) in an aircraft accident, but had he lived to see our society today, the free marketeer that he was probably wouldn’t appreciate the results of his theorizing. The rise of socialist and communist politicians in America, like Zohran Mamdani, Bernie Sanders, and Alexandria Ocasio-Cortez reveal that the window of acceptable economic policies, particularly for young Americans, is shockingly anti-free market.

But the ideas that were perhaps most outlandish were those involving transgenderism and race-based preferential policies in the form of Diversity, Equity and Inclusion.

In a shockingly brief period of time, the Left managed to convince many, if not most, Americans that we must accept absurdist notions, such as a person’s sex not being defined by the chromosomes which inhabit every cell of their bodies. We were told that a person’s gender was “assigned at birth.” And that the genitalia that tells mom and doctor whether you are a baby boy or baby girl is just an arbitrary construct, and the real factor determining your gender is how you feel.
The window of acceptable ideas surrounding transgenderism has opened very wide indeed. At the turn of the millennium it would've been unthinkable to suggest that men could get pregnant, that men should be allowed to compete in sports against women, or that they should be permitted in women's restrooms and locker rooms.

The left has been persistent in prying that window open ever wider, and one wonders how much wider it can go.

Monday, September 8, 2025

Three Lies About Israel

In a piece at the Wall Street Journal (paywall) Bernard-Henri Lévy discusses three lies about the Israeli war with Hamas that are perpetrated by the Western media and other international organizations. Here's his lede:
Foolish notions about Israel are being repeated morning and night. Relayed by high international authorities. Validated by respectable organizations, among them Action Against Hunger, which I helped found. They are echoed by one of my friends, the Israeli author David Grossman, who 30 years ago helped me launch La Règle du Jeu, a journal that supports every struggle for freedom. Yet they remain foolish notions.
The first example is the lie that Israel is “reoccupying” Gaza. Lévy writes (slightly edited):
This isn’t the strategy of the Israel Defense Forces or the position of the government. Whether one likes Mr. Netanyahu or not, the least one can do is listen.

On Fox News, on Aug. 7, he said he wants the IDF to drive out Hamas and take control of Gaza—but “without administering it” and handing it over to a “civil government” as soon as possible. “We do not want to keep Gaza, we do not want to govern Gaza.”

How could he be clearer?

One can be horrified by this war. One can find that it causes, on both sides, far too many victims. One can wonder if everything is being done to bring back the hostages. But one can’t say just anything. One can’t keep repeating, ad nauseam, in a perpetual “J’accuse!” tone that Israel is occupying Gaza.
The second lie is that Israel is using famine as a “weapon of war” in Gaza. Lévy says,
Look at the hundreds of trucks that have passed inspection, with Israelis ensuring they are not also carrying weapons. These trucks don’t wait at border crossings. They wait inside Gaza. And then what happens?

At one point, the United Nations agencies charged with aid distribution couldn’t prevent them from being looted by Hamas or gangs linked to it. Then Israel, with the U.S., set up the Gaza Humanitarian Foundation, to which the U.N. agencies replied: No thanks, the starving can wait, we won’t touch that bread, we will not collaborate with those people (i.e. the Israelis).

Then Israel agreed to work with the official agencies if they purged their ranks of Hamas activists. The agencies replied that Gaza was too dangerous.

Then Israel offered to open humanitarian corridors 10 hours a day in areas including Al-Mawasi, Deir al-Balah and Gaza City. The agencies demanded Israeli military convoys, even as they accused the Israeli military of engineering a famine.
The third lie is that Israel is committing “genocide” in Gaza. Here's Lévy:
To say “genocide” means a plan—a deliberate, targeted initiative to destroy a people. That isn’t what the Israeli army is doing.

Perhaps it is waging the war badly—though who would do better in an asymmetric conflict when the enemy’s goal isn’t to minimize casualties on its own side but to maximize them, so that every martyr is a trophy and a reason to continue a fight whose aim is not a state for the Palestinians, but no state at all for the Israelis?

A genocidal army doesn’t take two years to win a war in a territory the size of Las Vegas. A genocidal army doesn’t send SMS warnings before firing or facilitate the passage of those trying to escape the strikes. A genocidal army wouldn’t evacuate, every month, hundreds of Palestinian children suffering from rare diseases or cancer, sending them to hospitals in Abu Dhabi as part of a medical airlift set up right after Oct. 7. To speak of genocide in Gaza is an offense to common sense, a maneuver to demonize Israel, and an insult to the victims of genocides past and present.
Even a casual observer can see that the Gazan people are suffering terribly, but the fault for that lies entirely with the party they elected to govern them, Hamas, and that party's military wing.

If Hamas had not invaded Israel and slaughtered over a thousand Israelis, this war would never have started. If Hamas were today to surrender their arms and hostages, this war would immediately cease.

The Israelis do not want this war. They fight because it's a matter of their national survival. They fight because the barbarians are literally at the gates of their homes.

Saturday, September 6, 2025

Our Pagan Future

In an interview at Science and Culture neurosurgeon Michael Egnor discusses how the cumulative evidence for Near Death Experiences has reinforced his belief in the existence of an immaterial, immortal soul.

The interview is interesting and not overlong, but toward the end Egnor asserts his belief that materialism (the belief that everything that exists is reducible to material stuff) and atheism (the lack of belief in God) are fading and giving way to a 21st-century paganism.

He states:
That said, I believe that atheism and materialism are on the wane. I believe that New Atheism and materialism were never viable metaphysical or scientific perspectives. They are vacuous nonsense. They were meant to open a door — to discredit the Christian understanding of nature and man — to let in paganism, which is an incredibly malignant force that is on the rise in the West.

Child sacrifice — abortion and the castration and sexual mutilation of mentally ill children, sexual grooming of children in schools, drag queen story hour for preschoolers, as well as “assisted suicide” and euthanasia for despondent people, rampant pornography and sexual perversion, and the ubiquitous moral relativism that plagues our culture … these are hallmarks of paganism.

Paganism, unlike atheism or materialism, is a religious perspective that is appealing to billions of people, and in fact, is in a real sense the natural religion of man. It was paganism, not atheism and materialism, that Christianity vanquished, and it is paganism that is returning.

I am very concerned about paganism, which is clearly on the rise in our culture. I believe that materialism and atheism were never meant to be enduring philosophies — they have never been viable ways of understanding man and nature. They are intellectually and morally vacuous. Materialism and atheism were meant to discredit Christianity and open the door to paganism, which is enduring and poses an enormous threat to countless souls and to Western civilization.
One might ask who, or what, "meant" for Christianity to be discredited and paganism to be revived, but set that question aside. John Daniel Davidson argues that paganism is the endpoint of a loss of Christian consensus in his book Pagan America: The Decline of Christianity and the Dark Age to Come.

Davidson writes:
There is no secular utopia waiting for us in a post-Christian world now coming into being, no future in which we get to retain the advantages and benefits of Christendom without the faith from which they sprang. Western civilization and its accoutrements depend on Christianity.
Many others have made the same point in recent years, Tom Holland in his book Dominion being one of the most noteworthy, since Holland is himself a secular man.

Davidson does not limit the definition of paganism just to worshippers of the gods of pre-Christian societies, but rather defines it as "an entire system of belief, which holds that truth is relative and that we are therefore free to ascribe sacred or divine status to the here and now, to things or activities, even to human beings if they're powerful enough....power alone determines right."

His definition is similar to that of the early church fathers who considered anyone who wasn't a Jew or a Christian to be a pagan.

In any case, I'm not sure that paganism is the only possible consequence of a loss of a vibrant Christianity in the West. It may well be the short-term consequence, but in the long term it seems to me that a spiritually effete West will be overwhelmed by Islam.

Whether that would be better than paganism I leave to the reader to decide, but it seems likely that if Christianity continues to lose purchase with modern Western people one or both of those outcomes will be our fate.

Friday, September 5, 2025

Were Americans Better Off Under Biden or Trump?

An interesting comparison between Donald Trump's first term and Joe Biden's only term reveals that Americans were economically much better off under Trump than under Biden. John Hinderaker at Powerline blog summarizes the study which was done by The Committee to Unleash Prosperity.

Here's what they found:
  • Under President Trump the average family gained ten times more income than under Joe Biden. Every income group did better under Trump than Biden — by a wide margin.
  • Real median household income grew ten times more under Trump ($6,400) than under Biden ($550).
  • Real median household income grew more in Trump’s first four years in office ($6,400) than it did in Obama’s eight years in office and Biden’s one term in office combined ($5,550).
  • Income for lower-income Americans (the income cutoff to be in the bottom 25%) ROSE by $3,959 under Trump and fell under Biden (-$170). The Biden policies made the poor slightly poorer because of the high inflation over that (2020-2024) period.
  • The lowest income category of Americans had the largest gains of income under Trump. The bottom 25% gained 10% in income, the median household gained 8% and the richest 25% gained 7%.
Hinderaker writes:
I testified before the Joint Economic Committee of Congress on the effects of the Tax Cuts and Jobs Act of 2017. The numbers were impossible to argue with, and Democratic members of the committee didn’t try to dispute them. And the Trump administration’s deregulatory efforts also strengthened economic growth and personal incomes.

Liberals purport to be puzzled by the swing of middle and working class voters to the GOP. There is no mystery: millions have learned from their own experience that conservative policies work better, for them, than liberal policies.

Liberal policies are great for thin slices of the elite population and for some welfare recipients and public employees, and bad for pretty much everyone else.
It'll be interesting to see the numbers from Trump's second term.

Thursday, September 4, 2025

The Crucial Importance of Marriage

One of the troubling sociological trends in our society has been the growing gap between the upper and lower classes. An important fact researchers have discovered about this gap is the shift in attitudes toward marriage among members of the lower socio-economic class.

This is not a new topic here at Viewpoint, but a piece by Glenn Stanton reminds us of the disturbing trend occurring among the low-income segment of our population, a trend that's found, by the way, among all racial groups. Here are some key excerpts:
Just 70 years ago, social mobility and protection from poverty were largely a factor of employment. Those who had full-time work of any kind were seldom poor. Fifty years ago, education marked the gulf separating the haves from the have-nots. For the last 20 years or more, though, marital status has increasingly become the central factor in whether our neighbors and their children rise above, remain, or descend into poverty. The research is astounding.

Charles Murray of the American Enterprise Institute explains in his important book “Coming Apart: The State of White America” that in 1960, the poorly and moderately educated were only 10 percent less likely to be married than the college educated, with both numbers quite high: 84 and 94 respectively. That parity largely held until the late 1970s.

Today, these two groups are separated by a 35 percent margin and the gap continues to expand. All the movement is on one side. Marriage is sinking dramatically among lower- and middle-class Americans, down to a minority of 48 percent today. No indicators hint at any slowing. It’s remained generally constant among the well-to-do. This stark trend line led Murray to lament, “Marriage has become the fault line dividing America’s classes.”

Jonathan Rauch writing in the National Journal, certainly no conservative, notes that “marriage is displacing both income and race as the great class divide of the new century.” Isabel Sawhill, a senior scholar at the center-left Brookings Institute, boldly and correctly proclaimed some years ago that “the proliferation of single-parent households accounts for virtually all of the increase in child poverty since the early 1970s.” Virtually all of the increase!
We spend fortunes on various programs to rescue poor women and children who are drowning in the quicksands of impoverishment, but the remedy for poverty is almost incredibly simple and inexpensive:
Professor Bill Galston, President Clinton’s domestic policy advisor and now a senior fellow at Brookings, explained in the early 1990s that an American need only do three things to avoid living in poverty: graduate from high school, marry before having a child, and have that child after age twenty. Only 8 percent of people who do so, he reported, will be poor, while 79 percent who fail to do all three will.

These disturbing family-path trends are unfortunately true for millennials, as well.
One troubling aspect of this problem is that young people are increasingly failing to follow Galston's prescription:
A recent report on this topic focusing on millennials reports that 97 percent of those who follow the success sequence—earn at least a high-school diploma, work, and marry before having children—will not be poor as they enter their 30s. This is largely true for ethnic minorities and those who grew up in poor families. But sadly, fewer millennials are keeping these things in order, compared to their Boomer and Xer forebears.
It's astonishing, given the human predilection for personal well-being, and the enormous emphasis placed on well-being in our modern world, that so many people are spurning the most reliable means of achieving it:
A consistent and irrefutable mountain of research has shown, reaching back to the 1970s and beyond, that marriage strongly boosts every important measure of well-being for children, women, and men. Pick any measure you can imagine: overall physical and mental health, income, savings, employment, educational success, general life contentment and happiness, sexual satisfaction, even recovery from serious disease, healthy diet and exercise. Married people rate markedly and consistently better in each of these, and so many more, compared to their single, divorced, and cohabiting peers.

Thus, marriage is an essential active ingredient in improving one’s overall life prospects, regardless of class, race, or educational status.

Only 4 percent of homes with a married mother and father are on food stamps at any given time. But 21 percent of cohabiting and 28 percent of single-mother homes require such public assistance. Likewise, 78 percent of married people own their own home—a central goal in achieving the American Dream—while only 41 percent of cohabiting adults and 44 percent of singles do. Data indicates that marital status boosts home ownership more than home ownership increases marital opportunities.

Even women entering marriage between the conception and birth of their first child, regardless of class, education, and race, benefit from a greater standard of living by the following percentages.
  • 65 percent over a single mother with no other live-in adult
  • 50 percent over a single mother living with a non-romantic adult
  • 20 percent over a single mother living with a man
The economic and other benefits to children of growing up in an intact family are profound, and when there are several generations of children raised in such families those benefits are compounded by having two sets of grandparents who often have the financial wherewithal to provide assistance to their children and grandchildren with mortgages, cars, college tuition, as well as counseling and guidance through the vicissitudes of life.

The benefits of marriage also extend to men who become fathers:
The advantages of growing up in an intact family and being married extend across the population. They apply as much to blacks and Hispanics as they do to whites. For instance, black men enjoy a marriage premium of at least $12,500 in their individual income compared to their single peers. The advantages also apply, for the most part, to men and women who are less educated. For instance, men with a high-school degree or less enjoy a marriage premium of at least $17,000 compared to their single peers.

Marriage generates wealth largely because marriage molds men into producers, providers, and savers. Singleness and cohabiting don’t. Nobel-winning economist George Akerlof, in a prominent lecture more than a decade ago, explained the pro-social and market influence of marriage upon men and fathers: “Married men are more attached to the labor force, they have less substance abuse, they commit less crime, are less likely to become the victims of crime, have better health, and are less accident prone.”

Akerlof explains this is because “men settle down when they get married and if they fail to get married, they fail to settle down.” This is precisely why every insurance company offers lower premiums on health and auto insurance to married men. Settled-down men also work more, earn more, save more, and spend more money on their families than on themselves. They boost the well-being of women and children in every important way.
It's tragic that the belief that the traditional two-parent family is obsolete or unnecessary has gained currency today. From the fact that it's not perfect nor as prominent as it was fifty years ago, however, it doesn't follow that it's either unnecessary or obsolete. It still remains the best soil in which to grow flourishing men, women, and children and a strong, prosperous society.

Wednesday, September 3, 2025

Is Mind the Ultimate Reality?

Why do some physicists think that the material universe is somehow dependent for its existence upon minds? Why do they think that an observation somehow creates a reality which didn't exist prior to the observation?

The following video illustrates a classic experiment that some say proves that materialism, the belief that matter is the fundamental reality, is false. The experiment is compatible with the view that mind is the most fundamental substance and that matter is a product of an observing mind.
One commenter at the YouTube site for this video asserts that all the theories seeking to explain the existence of the universe distill to three possibilities:

1. Either the universe(s) has always existed in one form or another and thus never needed creating.

2. Or the universe(s) created itself from nothing where nothing previously existed.

3. Or that a divine entity has always existed and created it through an act of will.

He goes on to say:
Each of these alternatives is equally outrageous and impossible to believe but one MUST be true. I like to think the first one is true.
I don't agree that these are all equally hard to believe. I think the second is much harder to believe than the other two. Be that as it may, the commenter favors the first as a matter of metaphysical preference, which is another way of saying that he doesn't really want a divine creator to exist.

Why he's averse to that alternative, he doesn't say, but I think a lot of people, whether theists or naturalists, share his basic outlook. What they believe about the universe, their fundamental worldview, is not a matter of logic or compelling reasons.

It's more a matter of taste, or subjective preference, or aesthetics, and it's very difficult, especially in this pragmatic, postmodern age, to persuade someone whose belief is based on a matter of personal preference to abandon it for an alternative belief, especially the belief that there really is a Mind that undergirds all of reality.

Tuesday, September 2, 2025

The Eye

For about the last century Darwinian naturalists have cited the eye's design as evidence against the existence of an intelligent designer. This is surprising because the eye is an exquisitely engineered organ, but the argument of the Darwinians has been that there are several design flaws in the eye's structure that any competent engineer would have avoided.

One of the alleged flaws is that the rod and cone cells in the retina face backward rather than forward, which would seem to minimize the amount of light that reaches them. As such, the eye seems to reflect sub-optimal engineering, and, the argument goes, sub-optimal structures are what we would expect given that naturalistic evolution is a blind, rather haphazard process. They're the very opposite of what we would expect were the structure intelligently constructed by a competent designer.

As the short video below illustrates, however, the backward facing cells are actually an ingenious way to optimize vision and not a defective design at all.

The video also makes short work of the claim that complex eyes evolved over very long periods of evolutionary time by numerous successive short steps. In fact, the very earliest eyes found in the fossil record are just as complex as the eyes found in organisms today. If eyes did evolve, the process must have been very rapid and thus, it's reasonable to assume, somehow intelligently directed.

Indeed, the only basis there can be for ruling out an intelligent agent guiding the process is an a priori commitment to metaphysical naturalism, but why privilege naturalism in such a way if there's evidence to suggest it may be wrong? Yet people do it all the time as this famous quote from geneticist Richard Lewontin reveals:
Our willingness to accept scientific claims that are against common sense is the key to an understanding of the real struggle between science and the supernatural. We take the side of science in spite of the patent absurdity of some of its constructs, in spite of its failure to fulfill many of its extravagant promises of health and life, in spite of the tolerance of the scientific community for unsubstantiated just-so stories, because we have a prior commitment, a commitment to materialism [i.e. naturalism].

It is not that the methods and institutions of science somehow compel us to accept a material explanation of the phenomenal world, but, on the contrary, that we are forced by our a priori adherence to material causes to create an apparatus of investigation and a set of concepts that produce material explanations, no matter how counter-intuitive, no matter how mystifying to the uninitiated. Moreover, that materialism is absolute, for we cannot allow a Divine Foot in the door.
As Lewontin's declaration of loyalty to naturalism illustrates, it's not science as such that conflicts with the notion of intelligent agency at work in biology. The conflict is between two metaphysical worldviews, naturalism and theism. Lewontin is acknowledging that his choice to embrace naturalism is a subjective philosophical preference, a preference akin to a personal taste and not based on any empirical evidence at all. He embraces naturalism for no reason other than that he has a deep metaphysical, and perhaps psychological, aversion to theism.

Anyway, give the video a look:

Monday, September 1, 2025

Against Raising the Minimum Wage

Note: This post is a rerun of one originally written just before the Covid-19 pandemic devastated the restaurant industry, but it's still relevant today:

On Labor Day perhaps it's appropriate to revisit the debate over raising the minimum wage.

On the surface raising the minimum wage to $15 an hour seems like a simple solution to help unskilled, poorly educated workers struggling with poverty, but, like most simple solutions, raising the minimum wage has unintended consequences that hurt the very people it's supposed to help.

An article by Ellie Bufkin at The Federalist explains how raising the minimum wage has actually harmed many workers, especially in the restaurant industry.

New York state, for example, passed a law several years ago requiring that businesses offer mandatory paid family leave and pay every employee at least $15 an hour, almost twice the previous rate. The results were predictable and indeed were predicted by many, but the predictions went unheeded by the liberal New York legislature.

Bufkin uses as an illustration a popular Union Square café called The Coffee Shop, which is closing its doors in the wake of the new legislation. The Coffee Shop employs 150 people, pays a high rent, and, under the Affordable Care Act, must provide health insurance.

Now that the owner must pay his employees twice what he had been paying them he can no longer afford to stay in business:

Seattle and San Francisco led New York only slightly in achieving a $15 per hour minimum pay rate, with predictably bad results for those they were intended to help.

As Erielle Davidson discussed in these pages last year, instead of increasing the livelihood of the lowest-paid employees, the rate increase forced many employers to terminate staff to stay afloat because it dramatically spiked the costs of operating a business.

Davidson noted that,
Understaffed businesses face myriad other problems [in addition to] wage mandates. Training hours for unskilled labor must be limited or eliminated, overtime is out of the question, and the number of staff must be kept under 50 to avoid paying the high cost of a group health-care package. The end result is hurting the very people the public is promised these mandates will help.

Of all affected businesses, restaurants are at the greatest risk of losing their ability to operate under the strain of crushing financial demands. They run at the highest day-to-day operational costs of any business, partly because they must employ more people to run efficiently.

In cities like New York, Washington DC, and San Francisco, even a restaurant that has great visibility and lots of traffic cannot keep up with erratic rent increases and minimum wage doubling.

When the minimum wage for tipped workers was much lower, employees sourced most of their income from guest gratuities, so restaurants were able to staff more people and provided ample training to create a highly skilled team. The skills employees gained through training and experience then increased their value to bargain for future, better-paying jobs.

Some businesses will lay off workers, cut back on training, not hire new workers or shut down altogether. A Harvard study found that a $1 increase in the minimum wage leads to approximately a 4 to 10 percent increase in the likelihood of any given restaurant folding.
How does this help anyone other than those who manage to survive the cuts? When these businesses, be they restaurants or whatever, close down it's often in communities which are "underserved" to start with, and the residents of those communities wind up being more underserved than they were before the minimum wage was raised.

Moreover, raising the minimum wage makes jobs heretofore filled by teenagers and people with weak qualifications more attractive to other applicants who are at least somewhat better qualified.

Workers who would've otherwise shunned a lower wage job will be hired at the expense of the poorly educated and unskilled, the very people who most need the job in the first place and who were supposed to be helped by raising the minimum wage.

Despite all this our politicians, at least some of them, still think raising the minimum wage is a social justice imperative, even if it hurts the people it's supposed to help.

Or perhaps the politicians know it's a bad idea, but they see advocating a mandatory increase in wages as a way to bamboozle the masses into thinking the politician deserves their vote.

Saturday, August 30, 2025

The "Rational" Man

Philosopher and novelist Iris Murdoch, in her book The Sovereignty of Good (1970), describes in vivid accents the modern man who prides himself on his rational approach to life, unencumbered by the silly superstitions believed in by gullible religious people. The modern rational man, typified in her telling by someone like the 18th century philosophical icon Immanuel Kant, is a man who ...
...confronted even with Christ turns away to consider the judgement of his own conscience and to hear the voice of his own reason . . . . This man is with us still, free, independent, lonely, powerful, rational, responsible, brave, the hero of so many novels and books of moral philosophy. The raison d’être of this attractive but misleading creature is not far to seek . . . .

He is the ideal citizen of the liberal state, a warning held up to tyrants. He has the virtue which the age requires and admires, courage. It is not such a very long step from Kant to Nietzsche, and from Nietzsche to existentialism and the Anglo-Saxon ethical doctrines which in some ways closely resemble it.

In fact Kant’s man had already received a glorious incarnation nearly a century earlier in the work of Milton: his proper name is Lucifer.
Lucifer? Why such a harsh judgment? Perhaps because the modern, "rational" man believes only what science and his senses tell him. The rational man looks at himself and his fellows as little more than flesh and bone machines, animals, whose only real "purpose" is to reproduce, experience pleasure and avoid pain.

He regards morality as an illusion. His reason affords him no basis for caring about the weak or the poor, no basis for human compassion, no particular point to conserving the earth's resources for future generations.

Whereas Kant thought that reason dictated the categorical imperative - i.e. the duty to treat others as ends in themselves and not merely as a means to one's own happiness - the fact is that reason, unfettered from any divine sanction, dictates only that each should look to his or her own interests.

In practice modern man may care about the well-being of others, but he must abandon his fealty to science and reason to do so because these provide no justification for any moral obligations whatsoever.

Indeed, the purely rational man is led by the logic of his naturalism to the conclusion that might makes right. The pursuit of power frequently becomes the driving force of his life. It injects his life with meaning. It leads him to build abattoirs like Auschwitz and Dachau to eliminate those deemed less powerful and less than human.

Would Kant have agreed with this bleak assessment? No, but then Kant wasn't quite in tune with the modern, rational man. Kant believed that in order to make sense of our lives as moral agents we have to assume that three things are true: We have to assume that God exists, that we have free will, and that there is life beyond the grave.

Take away any of those three and morality reduces to a matter of one's individual preferences and tastes.

The modern man, of course, rejects all three of those assumptions, and in so doing he rejects the notion of objective moral value or obligation. That's why reason has led men to embrace ideologies that have produced vast tracts of corpses, and that's why, perhaps, Murdoch used the name Lucifer to describe them.

Friday, August 29, 2025

Goodbye to the Left

A lot of the discussion surrounding the question of how the Democrat Party can recover its electoral mojo has focused on the ball and chain that is woke ideology and its attendant cancel culture. It reminds me of a post I wrote about this topic some years ago but which still has resonance today. Here it is:

A generation or two ago the last place people would've thought free speech was imperiled was North American universities, yet today the pressure on students to conform to politically correct speech codes in some schools is enormous.

Consider the story of a young grad student named Lindsay Shepherd who attended Wilfrid Laurier University in Canada:
On November 1, 2017, during a first-year undergraduate class Shepherd was teaching, she showed two clips from a public Canadian television channel. The first featured [University of Toronto professor Jordan Peterson], who has been an outspoken opponent of Canadian laws that mandate the use of transgender pronouns.

A heated discussion among the students followed the videos. Later, a student approached an LGBTQ support group, which then filed a complaint with the university’s Diversity and Equity Office. That office requested a meeting with Shepherd on November 8.

Shepherd secretly recorded the meeting, which turned into an interrogation. During the 40-minute circus, university staff (who acknowledged her “positionality” regarding open inquiry), accused her of having created a “toxic climate for some of the students” by playing the clips and approaching the topic neutrally (emphasis mine).

One professor even compared the pronoun debate to discussing whether a student of color should have rights. He also called Peterson a member of the “alt-right” and compared playing a clip featuring Peterson to “neutrally playing a speech by Hitler or Milo Yiannopoulos.” Peterson’s perspective was also rejected as “not valid,” as, apparently, not all perspectives are up for debate.

Shepherd released the recording to Canadian media. Not long afterward, WLU’s president, Deborah MacLatchy, apologized, as did Nathan Rambukkana, a professor and Shepherd’s academic advisor, who was the main antagonist in the meeting. MacLatchy said the meeting did not “reflect the values and practices to which Laurier aspires.”

Shepherd filed a lawsuit in June 2018 against the university, Rambukkana, and several others, for damages of $3.6 million, claiming “harassment, intentional infliction of nervous shock, negligence, and constructive dismissal.” Peterson also filed a lawsuit against Laurier and several university staff.
It's incredible that adopting a "neutral" standpoint on a controversial issue would get an instructor into trouble with her administration in an institution which is putatively committed to free and open inquiry.

It's also ironic because Shepherd considered herself, until this episode, to be a leftist progressive who supports environmental causes and gay marriage. Since then, however, she has published a video on her website which she titles "Goodbye to the Left" in which she explains why she no longer considers herself a leftist although she still retains the same position on many of the social issues she did previously.

The left, however, is so rife with censorship, victimhood culture, and moral righteousness that she no longer feels a part of it. Her video has received almost a million views. You can read more about Shepherd at the link as well as get links to her video and YouTube channel.

Her story is a common one. Liberals, and, of course, conservatives, who value free speech and the free flow of ideas are finding themselves hounded, intimidated, and driven from their jobs and careers by an intolerant, Stalinist left that brooks no challenge to, questioning of, or deviation from its woke dogmas.

The left is fond of purging from its midst anyone who sits just the slightest bit to their right, but by eliminating everyone situated to the right of the progressive mainstream they ensure that the mainstream continually moves leftward toward fascism, communism or some other tyrannical totalitarianism.

The left - not just the extremists in Antifa but also those who populate our college and university faculties and administrations as well as many in the upper echelons of the Democratic party - is a very real threat to the freedoms we take for granted as Americans, freedoms that are today under the greatest assault of any time in our nation's history.

Lindsay Shepherd and many others are unfortunately having to discover this the hard way. Google, for instance, the stories of liberals like Mozilla CEO Brendan Eich, University of Pennsylvania law professor Amy Wax, or Evergreen State College biology professor Bret Weinstein. Or Google software engineer James Damore to see how liberals who have the temerity to express heterodox opinions are harassed and even have their careers ruined by the progressive brown shirts.

You might also google political science professor Scott Yenor or astronomer Guillermo Gonzalez to get an idea of what a modern day inquisition looks like.

President Trump's Department of Education is doing what it can to end this harassment and intimidation on campus, but their efforts will be for naught if the nation simply turns around in 2028 and re-elects politicians who are the enablers of this academic tyranny.

Thursday, August 28, 2025

Immortality on a Chip

There's been a lot of talk in the last couple of years about the possibility of gaining immortality by downloading one's consciousness into some information-storing medium like a computer chip which could then be implanted into another body of some sort.

It sounds interesting given the technological advances in computer power that've been made in recent years, but as the following 11-minute video points out, the obstacles to downloading the contents of one's brain in such a way that the self remains intact are more than daunting.

The narrator of the video, which was recommended to me by one of my students, seems to have a tongue-in-cheek optimism about the prospects of digitizing the brain. There's no reason to think it can't be done, he seems to imply, but as the video proceeds the viewer realizes that the whole point of the video is to show that, in fact, it could never be done.
One wonders, watching this video, how something as astoundingly complex as a brain could have ever evolved by chance, but set that very important question aside.

In addition to all the fascinating technical difficulties that preserving one's consciousness involves there's another major problem that the video doesn't address. The video assumes that the brain is all that's involved in human consciousness, but it's by no means clear that that's so.

Many philosophers are coming to the conclusion that, in addition to the brain, human beings also possess a mind that somehow works in tandem with the brain to produce the phenomena of conscious experience. If this is correct then the problems entailed by downloading the data that comprise the physical brain are child's play compared to the difficulties of downloading an immaterial mind.

Maybe the only way to gain immortality is the old-fashioned way, the way that involves the God that your grandparents told you about.

Wednesday, August 27, 2025

Why Give Israel the Benefit of the Doubt But Not Hamas?

A good friend asked me recently why I seem to believe what the Israelis tell us about their war in Gaza and discount what Hamas tells us. It's a fair question. Here are eight reasons why I tend to give the Israelis the benefit of the doubt and distrust Hamas:

1. Hamas has repeatedly lied in the past. For example, it has lied about the use to which it was putting foreign aid. It lied about the damage done to the Al-Ahli hospital, claiming it was targeted by the Israelis when they knew it was struck by their own errant rocket.

2. The Hadiths encourage Muslims to lie to infidels, especially in time of war. According to the authoritative Arabic text, Al-Taqiyya Fi Al-Islam: “Taqiyya [deception] is of fundamental importance in Islam. Practically every Islamic sect agrees to it and practices it. We can go so far as to say that the practice of taqiyya is mainstream in Islam, and that those few sects not practicing it diverge from the mainstream...Taqiyya is very prevalent in Islamic politics, especially in the modern era.”

Of course, it's possible that the Israelis are lying as well, but we have pretty conclusive proof that Hamas has lied about Israeli massacres. If we're to believe that the Israelis are dissembling, conclusive examples of it need to be offered. Moreover, we need to ask the common sense question, when Israel is accused of deliberately bombing a hospital or gratuitously opening fire on civilians trying to access food aid, what benefit there would be to Israel in doing so. There's considerable benefit to Hamas in trying to persuade the world that Israel has committed war crimes, but no benefit to Israel in actually committing those crimes.

3. Hamas uses its own people, including children, as human shields. Those who have so little regard for the lives of their own people can hardly expect the respect from others upon which trust is based.

4. Hamas, along with many Gazan civilians, reveled in the slaughter of Israeli citizens. October 7th was an orgy of barbarism and savagery. People who would perpetrate such horror, and those who approve of it, do not deserve the trust of civilized people.

5. Hamas runs schools in Gaza, called madrasas, which inculcate into their children a hatred of Jews. The pronouncements of a society based on hate are not credible, at least not with me.

6. Repeatedly, journalists whose reporting from Gaza seems to favor Hamas have been found to be in the employ of Hamas.

7. Israel is an open, democratic society with a vibrant free press, a formidable political opposition, and a history of calling out its leaders when they transgress liberal values. Hamas, on the other hand, is a closed society with no free press, no history of permitting dissent, and no history of liberal values.

8. Hamas has sworn to destroy the state of Israel. Crediting anything Hamas says is like crediting anything Vladimir Putin says.

The question of credibility was put to me in the context of claims of famine in Gaza. Whether there's actually famine in Gaza or not, it's clear that any suffering could end today if Hamas would lay down their arms and leave the Gaza Strip. The miserable conditions that prevail in Gaza have come about because Hamas initiated a war with Israel on October 7th, 2023, and the misery persists because Hamas refuses to stop fighting and release their hostages.

Somewhat relatedly, there's more than a hint of a double standard in the world's willingness to condemn Israel for whatever misfortune befalls Gazan civilians.

After all, the world didn't much care when Hafez al-Assad of Syria killed between 10,000 and 25,000 of his countrymen in the city of Hama in 1982. Nor was there an outcry against the Algerian government in 1991 when it slaughtered some 80,000 Algerians, most of whom were civilians. And where are the impassioned speeches at the U.N. and reports on our nightly news about the murders by Muslims of at least 52,250 Christians in Nigeria since 2009?

Is the salient difference that in each case the murderers were, or are, Muslims, but in the present case Israeli Jews are inflicting the damage?

A final thought: Imagine that the roles were reversed and Hamas had the military might of Israel and Israel was as weak as Hamas. How might we expect Hamas to behave toward starving Israelis? Would the world care? Would Hamas be careful to give warnings about their bombing? Would they go to the trouble of moving large populations around to keep them out of the line of fire? Would they go to the expense of shipping food to the hungry Israelis? Would they care what the rest of the world said? Would the rest of the world say anything?

Actually, their own rhetoric tells us what they'd do - they'd push every last Israeli into the sea. And many in Europe and the U.S., in our universities especially, would cheer.

Tuesday, August 26, 2025

Can We Be Good Without God? (Pt. II)

Yesterday we looked at an argument by biologist Jerry Coyne the gravamen of which was that morality is not contingent upon a transcendent moral authority such as God. I'd like to continue our critique of this argument in today's post.

Coyne writes that,
though both moral and immoral behaviors can be promoted by religions, morality itself — either in individual behavior or social codes — simply cannot come from the will or commands of a God. This has been recognized by philosophers since the time of Plato.

Religious people can appreciate this by considering Plato's question: Do actions become moral simply because they're dictated by God, or are they dictated by God because they are moral? It doesn't take much thought to see that the right answer is the second one.

Why? Because if God commanded us to do something obviously immoral, such as kill our children or steal, it wouldn't automatically become OK.

Of course, you can argue that God would never sanction something like that because he's a completely moral being, but then you're still using some idea of morality that is independent of God. Either way, it's clear that even for the faithful, God cannot be the source of morality but at best a transmitter of some human-generated morality.
Coyne here adverts to the classic Euthyphro dilemma which, contrary to what he thinks, has been discredited by philosophers for centuries (see here and here). It's unfortunate that Coyne is unaware of this, but it illustrates the hazard of experts in one field speaking dogmatically on matters in other disciplines.

But what he says next merits a more thorough response:
So where does morality come from if not from God? Two places: evolution and secular reasoning. Despite the notion that beasts behave bestially, scientists studying our primate relatives, such as chimpanzees, see evolutionary rudiments of morality: behaviors that look for all the world like altruism, sympathy, moral disapproval, sharing — even notions of fairness.

This is exactly what we'd expect if human morality, like many other behaviors, is built partly on the genes of our ancestors.
Assuming this is correct what makes the behaviors he mentions moral or right? If a chimp acted contrary to these tendencies would we think the chimp immoral? Would we call its actions evil or wicked? Why, then - if we're nothing more than hairless apes - do we call humans evil when they torture people or abuse children? We have an aversion to such things, to be sure, but aversion doesn't make something wicked.
And the conditions under which humans evolved are precisely those that would favor the evolution of moral codes: small social groups of big-brained animals. When individuals in a group can get to know, recognize and remember each other, this gives an advantage to genes that make you behave nicely towards others in the group, reward those who cooperate and punish those who cheat. That's how natural selection can build morality.
In other words we should be nice because we've evolved to be nice. This is fallacious. Philosophers since Hume have recognized that one can't derive an ought from an is. Because we've evolved a certain tendency, if indeed we have, it doesn't follow that we have an obligation to express that tendency.

As mentioned yesterday, we've also evolved the tendency to be selfish and mean and a host of other unsavory behavioral traits. Are these behaviors morally right just because they've evolved? Should we consider cruelty good because it's an evolved behavior?

Coyne concludes with this thought:
Secular reason adds another layer atop these evolved behaviors, helping us extend our moral sentiments far beyond our small group of friends and relatives — even to animals.
This is silly. Secular reason says no such thing. What secular reason dictates is that I should look out for my own interests, I should put myself first, and use others to promote my own well-being and happiness.

That may entail that I give people the impression that I care about them in order to get them to assist me in my own pursuit of happiness, but people who are of no use to me are of no value to me. Thus, it'd be foolish of me to sacrifice my comforts to help some starving child in some other part of the world who will never be of any use to me.

Indeed, why, on Coyne's view, is it wrong to refuse aid to victims of poverty and starvation?

Atheistic philosopher Kai Nielson stresses this very point:
We have not been able to show that reason requires the moral point of view or that all really rational persons unhoodwinked by myth or ideology need not be individual egoists or amoralists. Reason doesn't decide here. The picture I have painted is not a pleasant one. Reflection on it depresses me...pure reason...will not take you to morality.
Secular reason and evolution have no answer to the question why we should help those who are in no position to help us, at least none that doesn't reduce to the claim that helping others just makes us feel good. It's an ugly fact about naturalism that its logic entails such conclusions, and either Coyne knows it's ugly and doesn't want his readers to know it, or he has no idea.

In either case he should stick to biology.

Monday, August 25, 2025

Can We Be Good Without God? (Pt.I)

Jerry A. Coyne is a professor in the Department of Ecology and Evolution at The University of Chicago. He's also a prominent atheist who has written a book titled Faith Versus Fact in which he tries to explain why theism is false.

A few years ago he wrote a column for USA Today in which he argued that belief in God is not necessary for one to live a moral life. He complains that:
As a biologist, I see belief in God-given morality as America's biggest impediment to accepting the fact of evolution. "Evolution," many argue, "could never have given us feelings of kindness, altruism and morality. For if we were merely evolved beasts, we would act like beasts. Surely our good behavior, and the moral sentiments that promote it, reflect impulses that God instilled in our soul."
Coyne believes that human morality is a consequence of the evolutionary process coupled with human reason. God is unnecessary.

There are at least four things wrong with Coyne's rejection of the belief that God is in some sense necessary for ethics. First, "God-given morality" is not incompatible with evolution. God could be the ground both of moral value and of evolutionary change.

There is a serious incompatibility, however, between "God-given morality" and Coyne's naturalism, i.e. his belief that the natural world is all there is. If naturalism is true then there is no God and thus no "God-given morality."

Second, no one argues that evolution could not, at least in theory, have bestowed upon us the sentiments Coyne lists. The problem is that if evolution is the source of these impulses then it's also the source of avarice, bigotry, cruelty, etc.

If we believe that evolution has produced all human behavioral tendencies, on what basis do we decide that one set of behaviors is good and the other bad? Are we not assuming a "moral dictionary", so to speak - a standard above and beyond nature by which we adjudicate between behaviors to determine which are right and which are wrong?

If so, what is that standard?

Third, if an impersonal, mindless, random process is the ultimate source of these behaviors it can't in any moral sense be wrong to act contrary to them. If moral sentiments are the product of natural selection and chance chemical happenstances in the brain there's no non-arbitrary moral value to anything.

Right and wrong reduce to subjective likes and dislikes, and that leads to moral nihilism.

Finally, in the absence of God in what sense are we accountable for our actions? And if we're not accountable, if there's no reckoning for how we behave, what does it mean to say that a given behavior is "wrong"? If there are no posted speed limits on a highway and no enforcement, what does it mean to say that one is "wrong" to go as fast as one wishes or thinks is prudent?

The most it can mean is that other people won't like it, but why should anyone care whether others approve of how he or she behaves?

We'll look at another aspect of Coyne's argument tomorrow.

Saturday, August 23, 2025

A Late Summer Miracle

Those who spend time outdoors in the late summer are about to witness an annual "miracle."

Every year an estimated 100–200 million monarch butterflies migrate 2,000 to 3,000 miles between the United States/Canada and Mexico. While there are other populations of monarchs, including in western North America, South America, the Caribbean, and Australia, the population in eastern North America is the best known because of its amazing migration.

Monarch Butterfly
For example, they're the only butterfly species known to make a two-way migration.

They can travel between 50 and 100 miles a day during their 3,000-mile journey to Oyamel fir forests in the Mexican mountains nearly two miles above sea level.

They roost in the trees in a dozen or so of these mountain areas from October to March, often returning to the same tree in which their ancestor roosted the previous year.

In late summer in northeastern North America dwindling food supply and shorter days trigger the Monarch's migratory impulse. A generation that has hatched after mid-August begins the trek south for wintering grounds they've never been to before. Most summering Monarchs live for about two to six weeks, but this migrating generation can live up to nine months.

The migrants travel during the day and roost at night, often in the same trees that previous generations used as roost sites during their migration.

During the summer their range covers close to 400,000 square miles, but when they finally arrive in Mexico they squeeze into territories of less than half a square mile.

Monarchs roosting in Mexican Oyamel fir trees
One of the most amazing aspects of this is that these butterflies, with brains the size of a pinhead, can navigate so unerringly across thousands of miles of terrain. Researchers believe that they use a complex system which involves ultraviolet sunlight, a magnetic compass, the position of the sun and an internal clock.

Their internal clock tells them the time of day. In the morning when the sun is rising they navigate to the west of it. At noon they fly toward it and in the afternoon they fly to the east of it. This strategy keeps them flying due south as depicted in these figures:
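The time-compensated sun-compass strategy described above can be sketched in a few lines of code. This is only a toy illustration of the idea, assuming a 6:00 sunrise due east and an 18:00 sunset due west (roughly true at the equinoxes); real monarch navigation also integrates magnetic and ultraviolet cues:

```python
def sun_azimuth(hour):
    """Rough solar azimuth in degrees (0 = N, 90 = E, 180 = S, 270 = W),
    assuming sunrise at 6:00 due east and sunset at 18:00 due west."""
    return 90.0 + (hour - 6.0) * (180.0 / 12.0)

def offset_from_sun(hour):
    """Signed angle from a south-bound monarch's heading (180 degrees) to
    the sun: -90 at sunrise (sun kept to the east of the flight path),
    0 at noon (fly toward the sun), +90 at sunset (sun kept to the west)."""
    return sun_azimuth(hour) - 180.0
```

The point of the sketch is the compensation itself: the butterfly's ground heading stays fixed at due south only because its internal clock keeps shifting the angle it holds relative to the moving sun.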

Another amazing fact is that the generation that made the long trip from the northeast and over-wintered in Mexico is not the generation that returns to the northeast. This generation begins the trip back in the spring but they reproduce and die along the way.

The second generation continues the migration, but they, too, reproduce and die along the way. It's the third generation that makes it back to the summering grounds in the northeast, but they also reproduce and die, so it is their offspring that begin the cycle all over again in August.

There's an interactive feature here that shows the Monarch's pattern of migration. All of this raises questions:

How does each year's crop of butterflies "know" the route to take to get back to the same trees in Mexico that their ancestors left from when they've never done it before?

How do those butterflies born along the return trip "know" to continue the migratory flight and "know" which direction to take?

What is the source of the information needed for these insects to complete this astonishing journey?

And how would all this have come about through a blind, purposeless process like natural selection and genetic mutation?

Comparisons of the genomes of migratory monarchs with those of non-migratory monarchs have revealed that some 530 genes are involved in migratory behavior. On the Darwinian hypothesis, that means there must have been a minimum of 530 genetic mutations in the history of the species, all of them random, which somehow fortuitously produced the ability to make this astonishing journey successfully.

Moreover, Monarchs are believed to have evolved about two million years ago, so the migrating variety must've split off from the ancestral stock sometime thereafter. Thus, at most, those 530 mutations must've accumulated within the last two million years, a very short time for all that evolution to have taken place - at least it's a very short time if the evolution were unguided by any outside intelligence.

If this all came about naturalistically that would be almost miraculous, which is ironic since naturalism discounts miracles.

It's possible, of course, that this migratory behavior could've evolved by unguided, purposeless processes, in the same sense that it's possible that elephants could've evolved the ability to fly, but it takes a king-sized portion of blind faith to dogmatically insist that it did.

Friday, August 22, 2025

Democrat Difficulties

An article by Scott Pinsker at PJMedia discusses the revelations in the New York Times of the desperate circumstances the Democrats currently find themselves in. The NYT article requires a subscription, but a free subscription may be available.

According to Pinsker's summary, what the Times article reports is that of the 30 states that track voter registration by political party, Democrats lost ground to Republicans in every single one between the 2020 and 2024 elections — and often by a lot.

All told, Democrats lost about 2.1 million registered voters between the 2020 and 2024 elections in the 30 states, along with Washington, D.C., that allow people to register with a political party. (In the remaining 20 states, voters do not register with a political party.) Republicans gained 2.4 million.

The flight away from the Democratic Party is occurring in battleground states, and in both blue and red states.

Moreover, Republicans went from roughly one-third of newly registered voters under 45 to a majority in the last six years. They're also gaining among both men and women, as well as among Hispanics.

Pinsker has a lot more commentary at the link. He closes with a quote from the Times article which cites an election expert who believes the trend is not a temporary fluke:
“I don’t want to say, ‘The death cycle of the Democratic Party,’ but there seems to be no end to this,” said Michael Pruser, who tracks voter registration closely as the director of data science for Decision Desk HQ, an election-analysis site. “There is no silver lining or cavalry coming across the hill. This is month after month, year after year.”

Any hope that the drift away from the Democratic Party would end organically with Mr. Trump’s election has been dashed by the limited data so far in 2025. There are now roughly 160,000 fewer registered Democrats than on Election Day 2024, according to L2’s data, and 200,000 more Republicans.

“It’s going to get worse before it gets better.”
There's nothing permanent in political alignments, but at least in the near term (2026, 2028), things aren't looking good for Democrats.

Thursday, August 21, 2025

On Reading Well

I'd like to share a delightful post sent by a friend. It was written back in 2018 by a man named Bob Trube, and in his post he talks about reading, particularly what he calls "reading well." Trube writes:
Among the resolutions people make each new year is some variant on “read more books.” That’s certainly a goal that I can applaud when the average number of books read by adults is twelve a year (a number skewed by avid readers; most people read about four a year). But I have a hunch that many of these resolutions fare no better than those of losing weight or exercising more, and probably for the same reasons: lack of specific goals that are realistic, forming a habit, social support and a good coach.

I will come back to these but I want to address something I hear less about – reading well.

For a number who read this blog, I don’t have to convince you about the value of reading, and in many cases, you already have good reading habits and exceed that book a month average. And even if you don’t, you probably sense that reading isn’t about numbers of books but part of a well-lived life.

You read not only for amusement or diversion but to better understand your world and how to live your life in it. That can be anything from understanding the inner workings of your computer and how to use it better to a work of philosophy or theology or even a great novel that explores fundamental questions of life’s meaning, living virtuously, or the nature of God.
Trube goes on to list four aspects of reading well:
  1. Reading well is an act of attentiveness. We read well when we read without external and internal distractions. A place of quiet and a time when we are not distracted with other concerns helps us “engage the page.” It also helps to turn off the notifications on your phone or tablet, or better yet, put the electronics in another room.

    Read on an e-reader without other apps if you prefer these to physical books.
  2. Visual media often encourages us to passively absorb content. Books of substance require our active engagement–noticing plot, characters, and the use of literary devices like foreshadowing, allusions and more. Non-fiction often involves following an argument, and paying attention to the logic, the evidence, and whether the argument is consistent.

    Reading well can mean jotting notes, asking questions, or even arguing with the author. Above all it means reflecting on what we read, and how the book connects with our lives.
  3. Reading well over time means choosing good books to read. What is “good”? I’m not sure there is one good or simple answer. There are a number of “great books” lists out there and they are worth a look. You might choose one of those to read this year. One test of a book’s worth is whether people are still reading the book and finding value in it long after its author has passed.

    Also, in almost any genre, there are reviews, websites, and online groups. Over time, you will find sources of good recommendations.
  4. Finally, I’d suggest choosing something to read off the beaten path. Reading authors from other cultures, or a genre you don’t usually read can stretch your horizons. This year, I want to work in some poetry and get around to the Langston Hughes and Seamus Heaney that I’ve had lying around.
He closes with a few thoughts "For those who simply want to read more and get into the reading habit." I encourage you to go to the link and check them out.

I sometimes wonder if reading isn't becoming a lost art, like knitting. Our lives are so full of work and other obligations that we don't have much time to read.

Even during what leisure we may have we're constantly plugged in to some device or other that distracts us and makes reading seem boring by comparison. Yet good books are like vitamins and minerals for the mind. They nourish and enrich us in ways that last for a lifetime.

If you're one who would like to read more, but just can't seem to get into it, check out the tips that Trube gives at his blog. They're very good.

Wednesday, August 20, 2025

Does AI Understand?

One major controversy in the philosophy of mind is driven by the claim that computers can think, or will soon be able to. If that claim is true then it makes it a lot easier to assume that the brain is a kind of computer and that what we call mind is simply a word we use to describe the way the brain functions.

Or put another way, mind is to brain what computer software is to the computer's hardware. This view is called "functionalism."

In 1980 philosopher John Searle published an argument that sought to show that functionalism is wrong and that there's more to our cognitive experience than simple computation. His argument came to be known as the Chinese Room argument and neuroscientist Michael Egnor has a helpful discussion of it at Evolution News and Views. Egnor describes the argument as follows:
Imagine that you are an English speaker and you do not speak Chinese. You've moved to China and you get a job working in a booth in a public square. The purpose of the booth is to provide answers to questions that Chinese-speaking people write on small pieces of paper and pass into the booth through a slot. The answer is written on a small piece of paper and passed back to the Chinese person through a separate slot.

Inside the booth with you is a very large book. The book contains every question that can be asked and the corresponding answer -- all written only in Chinese. You understand no Chinese. You understand nothing written in the book. When the question is passed through the slot you match the Chinese characters in the question to the identical question in the book and you write the Chinese symbols corresponding to the answer and pass the answer back through the answer slot.

The Chinese person asking the question gets an answer that he understands in Chinese. You understand neither the question nor the answer because you do not understand Chinese.

Searle argues that you are carrying out a computation. The booth is analogous to a computer, you are analogous to a CPU, and the information written in Chinese is analogous to the algorithm. The question and the answer written on the paper are the input and the output to and from the computer.
In other words, the computer, like the person in the booth, has no understanding of what it's doing. As Egnor says: "Thought is about understanding the process, not merely about mechanically carrying out the matching of an input to an output according to an algorithm."
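Searle's thought experiment can be compressed into a few lines of code. The "rule book" below is a hypothetical lookup table I've invented for illustration; the point is that the program returns fluent Chinese answers while nothing in it understands a word of Chinese.

```python
# A minimal sketch of Searle's Chinese Room (my illustration, not
# Egnor's): the "book" is just a lookup table, and the "room" matches
# input symbols to output symbols with no grasp of what either means.

# Hypothetical rule book: Chinese questions mapped to Chinese answers.
RULE_BOOK = {
    "你叫什么名字？": "我叫小明。",        # "What is your name?" -> "My name is Xiaoming."
    "今天天气怎么样？": "今天天气很好。",  # "How is the weather today?" -> "The weather is fine."
}

def chinese_room(question: str) -> str:
    """Purely syntactic matching: this 'answers' correctly whenever the
    question appears in the book, yet nothing here understands Chinese --
    which is Searle's point about computation generally."""
    return RULE_BOOK.get(question, "？")  # unknown input: pass back a blank

print(chinese_room("你叫什么名字？"))  # a fluent reply, zero understanding
```

However large the table grows, the operation never changes: symbol in, matching symbol out. On Searle's view, scaling the book up to "every question that can be asked" adds coverage, not comprehension.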

Searle's argument denies that computers "think." They simply follow an algorithm. Since humans do think, however, and do understand, either our brains are not computers or functionalism is not true.

Searle points out that the computation performed by the booth and its occupant does not involve any understanding of the questions and answers provided. His point is that computation is an algorithmic process that does not entail or require understanding, but since we do understand when we perform a computation, human cognition is something qualitatively different from mere computation.

This leads to the question of how a material chunk of meat, the brain, can generate something as mysterious as understanding. If all the material that makes up a brain were placed in a laboratory flask would the flask understand? Would it be conscious?

That human beings are capable of such marvels is evidence that there's more to our cognitive abilities than our material brain. Perhaps that something more is an immaterial mind or soul that's cognitively integrated with the material brain and which the brain cannot function without.

The dominant view throughout the 20th century and into the 21st is that everything reduces to matter and energy. This view is called philosophical materialism, and it denies that there are immaterial substances such as souls associated with persons. Materialism, however, is a view in the throes of a terminal illness. It's dying. A preponderance of evidence has amassed against it, but it's nevertheless still breathing.

Its proponents won't pull the plug, though, because once they grant that immaterial souls or minds exist it becomes a lot harder to resist the conclusion that God exists, and that's a conclusion that terrifies most materialists.

Tuesday, August 19, 2025

Three Options on Creation

The book A Fortunate Universe: Life in a Finely-Tuned Cosmos by cosmologists Luke Barnes and Geraint Lewis discusses the incredibly precise fine-tuning of the forces, parameters and constants that comprise the structure of the universe.

Here's a video trailer that introduces the theme of their book:
The trailer suggests that there are four possible explanations for this incomprehensible level of precision, but for reasons I'll explain in a moment, there really are only three.

The first is that something about the universe makes it a logical necessity that the values cosmologists find are in fact the only possible values a universe could have. There is no reason, however, to think this is the case. There's nothing about the universe, as far as we know, that makes it impossible for gravity or the strong nuclear force, to take just two examples, to have slightly different strengths.

The second explanation is that even though it's astronomically improbable that any universe would be so fine-tuned that living things could exist in it, if there are other universes, all with different parameters, universes so abundant that their number approaches infinity, then one like ours is almost bound to exist. This option goes by the name of the multiverse hypothesis.

The difficulty with this idea is that there's no good reason to believe other universes actually do exist, and even if they do, why should we assume that they're not all replicas of each other? Even if they're all different, whatever is producing them must itself be fine-tuned in order to manufacture universes, so all the multiverse hypothesis does is push the problem back a step or two.

The third explanation is that our universe is the product of a very intelligent agent, a mathematical genius, which exists somehow beyond the bounds of our cosmos.

There are actually two varieties of the third option. One is to say that the designer of the universe is a denizen of another universe in which technology has advanced to the point that it allows inhabitants of that world to design simulations of other universes.

The trailer treats this as a fourth option, but since it posits a designer who resides in some other universe, it's actually a combination of the second and third options and suffers some of the same difficulties as the multiverse hypothesis. It also assumes that computer technology could ever simulate not only an entire cosmos but also human consciousness, which is certainly problematic.

The other version of the third explanation is to assume that the designer of our universe is not some highly accomplished computer nerd in another universe but rather that it is a transcendent, non-contingent being of unimaginable power and intellectual brilliance who is the ultimate cause of all contingent entities, both universes and their inhabitants.

Which of these options is most attractive will vary from person to person, but philosophical arguments won't settle the issue for most people. Human beings tend to believe what they most fervently want to be true, and what they most want to be true is often whatever makes the fewest demands upon their autonomy and their lifestyle.