Tuesday, March 31, 2020

How Scientism Refutes Itself

Philosopher J.P. Moreland has written a fine book titled Scientism and Secularism (2018), from which some of what follows has been borrowed. Scientism is not to be confused with science, but scientists, particularly naturalist or materialist scientists, are often proponents of scientism. Scientism is actually a philosophical view which holds, paradoxically, that science is the only reliable means of apprehending truth.

The late cosmologist Stephen Hawking famously declared in The Grand Design (2010), a book he co-authored with Leonard Mlodinow, that "philosophy is dead" and that life's important questions, at least those that can be answered at all, will henceforth be answered by science.

Hawking is here giving expression to his scientism, the view that all the important questions can either be answered by science or not answered at all, and that the methodologies of science are the only valid path to truth and knowledge. All other ways of knowing must give way to the supreme authority of science, especially the natural, or "hard" sciences like physics and chemistry.

Scientism is a common view, but it carries some serious liabilities, and the notion that science supersedes philosophy is surely false.

There are at least three things wrong with scientism:
  1. It's self-refuting.
  2. It's false that science is the only sure way of knowing truth.
  3. It's false that philosophy is dead. If it were, science would be impossible.
Scientism is self-refuting because the claim that only what is testable by the methods of science can be trusted to be true is itself not a scientific claim but a philosophical assertion. It's not a claim that can be tested through the methods of science. Thus, by its own standard, the foundational claim of scientism cannot be trusted to be true; it refutes itself.

Nor can science be the only way of knowing since there are many other things we can know with at least the same level of certainty as we know any of the deliverances of science.

For example, which do you know with greater certitude, that atoms are the basic building blocks of matter or that torturing children for fun is evil? The latter is not a scientific claim at all; it's a metaphysical claim. Yet most of us are far more sure of its truth than we are of the truth of the claim about atoms.

There are other examples of things we know that do not lend themselves at all to scientific demonstration. For example, I can know: that I took a walk on my last birthday, that I hold certain beliefs about science and philosophy, that I have an itch in my foot, that sunsets are beautiful, that justice is good; and I can know the basic laws of math and logic, e.g. I know that 2 + 2 = 4, and I know that if a proposition (P) entails another proposition (Q) then if P is true so must Q be true.
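
For readers who like to see it spelled out, the last of these rules is simply the classical rule of modus ponens. In standard logical notation (a conventional rendering; nothing in the argument hangs on the symbolism) it looks like this:

$$\frac{P \rightarrow Q \qquad P}{Q}$$

The premises above the line say that P entails Q and that P is true; the conclusion below the line is that Q is true as well.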

Not only do we all know such things, we know them with far more certainty than we know the truth of the claims of scientists about, say, global warming, atomic theory or Darwinian evolution.

Moreover, science depends for its very existence upon a series of assumptions, none of which are themselves scientific. All of them are philosophical, so if philosophy is dead where does that leave science?

Here are some examples:
  1. The law of cause and effect (every effect has a cause).
  2. The law of sufficient reason (everything that exists has a sufficient explanation for its existence).
  3. The principle of uniformity (the laws that prevail in our neighborhood of the universe prevail everywhere in the universe).
  4. The belief that explanations which exhibit elegance and simplicity are superior to those which don't.
  5. The belief that the world is objectively real and intelligible.
  6. The belief that our senses are reliable and that our reason is trustworthy.
All of these are philosophical assumptions that cannot be demonstrated scientifically to be true.

Scientism is a bid by some materialists to assert epistemological hegemony over our intellectual lives and especially over the disciplines of philosophy and theology. However, just as similar attempts in the 20th century, such as positivism and verificationism, fell victim to self-referential incoherence, so, too, does scientism.

The claim that science is uniquely authoritative and that we should all recognize and bow to its supremacy is quite simply false.

Monday, March 30, 2020

True Heroes

One silver lining to the Covid-19 pandemic is that every day we're seeing real heroes, not the spurious Hollywood heroes on our movie screens, but men and women risking, and sometimes losing, their lives to save the lives of others. They are genuinely wonderful people.

What motivates them to do it? An individual's motivation arises from many sources, of course, but what are the deep-rooted tacit assumptions that impel a man or a woman to do what medical professionals are doing today as they seek to mitigate the misery of so many sufferers around the world?

I can't prove it, but I believe that the impetus that drives these men and women all across the globe can be traced back to a passage in the Gospel of Matthew, chapter 25, verses 35-40 where Jesus instructs His followers as to how He wants them to treat the sick, the weak and the poor:
[For] I was hungry, and you gave me something to eat; I was thirsty, and you gave Me drink; I was a stranger and you invited Me in; naked, and you clothed Me; I was sick, and you visited Me; I was in prison, and you came to Me. Then the righteous will answer Him, saying, 'Lord, when did we see you hungry and feed You, or thirsty and give You drink? And when did we see You a stranger and invite you in, or naked, and clothe You? And when did we see You sick, or in prison, and come to You?' And the King [Jesus] will answer and say to them, 'Truly, I say to you, to the extent that you did it to ... even the least of these, you did it to Me.'
In antiquity this was a revolutionary way to see one's responsibility to other people. Other than among some Jews, there really was nothing like it in any ancient society. The idea that anyone had a moral obligation to the “least of these” was a uniquely Christian idea rooted in the teaching of Jesus Christ.

The desire to meet this obligation toward "the least of these" provided the motivation for hospitals, orphanages, almshouses, child labor laws, state welfare, charitable organizations, human rights, etc., all of which eventually emerged first in the Christian West.

In the Graeco-Roman world, by contrast, the poor and weak were often treated with contempt. It was a brutal existence in which human life was very cheap, and the life of the weak and defenseless cheaper still.

In a world which has no concept of the Judeo-Christian God it makes sense to see life this way. This is why nations which officially adopt atheism - communist and fascist - are often murderous and cruel. In the absence of God men are just ciphers, animals to be herded, manipulated and exploited in whatever way benefits those who have power over them.

The 19th century atheist Friedrich Nietzsche saw this clearly. He admired the Roman pagans for their cruelty and their love of greatness and power. He wrote that,
To see others suffer does one good, to make others suffer even more: this is a hard saying but an ancient, mighty, human, all-too-human principle .... Without cruelty there is no festival.
and this:
Those who are cruel enjoy the highest gratification of the feeling of power.
Nietzsche disdained Christian morality, calling it "slave morality." He admired what he called "master morality" which differed from slave morality in that master morality values power and pride whereas slave morality values qualities such as empathy, kindness, and sympathy.

Nietzsche correctly recognized that no secular philosophy offers its adherents any warrant for empathy, kindness and sympathy, especially when directed toward the weak and the poor.

Most secular people, for instance, accept the evolutionary account of man's origin, but evolution is all about survival of the fittest, not the survival of the weak and sickly. Darwin himself was troubled by the recognition that caring for the weak was not in the best interest of the human species. It enhanced neither fitness nor survival.

Secular folk might choose to care for the unfortunate, to be sure, but doing so is simply a matter of personal preference. There’s no duty to do so, no obligation. If they were to treat the sick, weak and poor cruelly there’s nothing in their worldview to tell them that they’re doing anything wrong.

If they do think it’s wrong they’re actually piggy-backing on Christian assumptions about human worth. Nothing in the atheistic worldview obligates one to care for the sick, and nothing in atheism condemns one's refusal to care for them.

Christians care for the weak, the poor and the oppressed because, they believe, every person carries the image of Christ. To care for the sick is to care for Christ.

According to historian Tom Holland, writing in his best-selling book Dominion, when a plague, possibly smallpox, broke out in Rome in 165 A.D. and raged for some fifteen years, many fled the city, leaving the sick to fend for themselves. The Romans were astonished that many Christians stayed behind to nurse those who suffered, some contracting the disease themselves.

The amazing thing to the Romans was that the Christians were complete strangers to those they cared for.

A century later another pestilence spread from Africa throughout the Mediterranean world. On Easter Sunday in 260 A.D., Bishop Dionysius of Alexandria praised the efforts of the Christians, many of whom had died while caring for others during the outbreak. He said:
Most of our brother Christians showed unbounded love and loyalty, never sparing themselves, and thinking only of one another. Heedless of danger, they took charge of the sick, attending to their every need and ministering to them in Christ, and with them departed this life serenely happy; for they were infected by others with the disease, drawing on themselves the sickness of their neighbors and cheerfully accepting their pains.
The Roman Emperor Julian, a 4th century ruler who despised Christianity and who yearned to restore the ancient pagan rites in Rome, was bewildered by the devotion to the poor manifested by the Christians:
How apparent to everyone it is, and how shameful, that our own people lack support from us, when no Jew ever has to beg, and the impious Galileans [Christians] support not only their own poor, but ours as well.
Julian perhaps had in mind people like Basil and Gregory of Nyssa, two wealthy, well-educated brothers in the 4th century who dedicated their lives to helping the poor.

Basil became bishop of Caesarea in what is today central Turkey and, in the midst of an awful famine in 369, built what was essentially the first hospital. It was practically a small city in itself, and the poor were welcomed and treated by Basil himself, who gave refuge and care even to lepers, greeting them with a kiss.

No matter how disgusting and abject the sufferers were, Basil said, Christ was in them.

His brother Gregory became bishop of Nyssa, also in present-day Turkey, and became one of the strongest opponents of slavery in the ancient world.

Among the ancients it was an accepted and common practice to simply abandon unwanted babies by the roadside, leave them on garbage piles, or dump them down drains. Some people would rescue the infants to raise them as slaves. Females were often raised to be sold to brothels.

The practice was taken for granted, except in Jewish enclaves, until the emergence of Christianity.

One of the most influential adversaries of the practice was the elder sister of Basil and Gregory, a woman named Macrina, who would rescue the abandoned girls and take them home to raise as her own daughters.

The standard of care for the sick, the poor, the slave and the defenseless set by Basil, Gregory and Macrina was incomprehensible to their pagan contemporaries, but it was how Christians acted because they saw every person as bearing the image of Christ.

As Christianity gradually spread throughout the West, the Christian attitude toward the sick and poor spread as well. Even those like the atheist physician Dr. Rieux, the hero of Albert Camus' novel The Plague, who in the story strives at great personal cost to help Algerians victimized by an outbreak of bubonic plague, are living off the capital of a Christian understanding of the human person. Nothing in Rieux's atheism provided any ground for his devotion to the dying.

Those who today are giving so much, risking so much, to help the victims of Covid-19 are following in the train of centuries of Christians who were themselves animated by Christ's words in Matthew 25. Even if some of today's healers consider themselves secular or atheistic, even if they are personally hostile to Christianity, they have still, perhaps without realizing it, adopted for themselves an attitude toward the distressed and sick which is Christianity's gift to the world.

But whether or not they realize where their roots lie, these men and women are indeed true heroes.

Saturday, March 28, 2020

Biden and Kavanaugh

In late summer of 2018 Supreme Court nominee Brett Kavanaugh was accused by a woman named Christine Blasey Ford of having jumped on top of her when he was in his late teens. The media talked about nothing else. Kavanaugh was presumed guilty by virtually the entire left-wing media because, as people like Joe Biden were saying, a woman who brings such charges against a man should always be believed.

Kavanaugh's career hung by a very tenuous thread, and his reputation was in tatters.

Now comes word that the same Joe Biden who sternly reproached anyone who would doubt a woman's allegation of sexual assault is himself being accused of an assault even worse than the one alleged against Kavanaugh.

Caution: The following excerpt may strike some readers as gratuitously crude. Please don't read it if you think you might be offended.
Former Vice President Joe Biden has been accused of sexually assaulting a former staffer in 1993 in his Senate office. Tara Reade, who worked in Biden’s office in 1993, accused Biden of touching her, kissing her and penetrating her with his fingers without consent.

Reade made her claims in an interview with Katie Halper, a writer and podcast host, released Wednesday.

“We were alone, and it was the strangest thing. There was no, like, exchange really, he just had me up against the wall,” Reade said.

“His hands were on me and underneath my clothes. And then, he went, he went down my skirt and then up inside it, and he penetrated me with his fingers… and um… He was kissing me at the same time and he was saying something to me,” she also told Halper.

“He got finished doing what he was doing and I, how I was pulled back and he said, ‘Come on man, I heard you liked me.’ And that phrase stayed with me because I kept thinking what I might have said. And I can’t remember exactly if he said ‘I thought’ or if ‘I heard.’ It’s like he implied that I had done this.”

Reade said that she told her mother and brother about the assault. She also told a friend, who she says advised her not to come forward and face a public firestorm. Her mother, who has since passed, told her to report the incident, but her brother also discouraged her from doing so.

The Daily Caller reached out to Biden’s campaign about the allegation Wednesday and has received no response.
There's more to this story at the link.

It will be interesting to see where this goes.

Will the same media forces that were mobilized to destroy Brett Kavanaugh now mobilize once again to destroy Joe Biden, or was the outrage directed at Kavanaugh really an empty pose undertaken purely for political purposes? Will the media try to bury Ms. Reade's allegations under a mountain of Covid-19 reporting, or will they strive to be consistent and fair and pursue the charges against Mr. Biden with the same ferocity with which they pursued the charges made against Mr. Kavanaugh?

We'll know the answers to those questions soon enough, but if I had to guess I'd say that the media will do everything they can to find something else to talk about. To date few news sources have even mentioned this latest scandal:
The reaction to this allegation has largely been muted in major media outlets. As of Friday afternoon, searches of the New York Times, CNN.com, the Los Angeles Times and many other outlets show no articles on the allegations. Additionally, a keyword search of cable news channels turns up no results on major networks like CNN, MSNBC, Fox News or CNBC.
There are some similarities between the Ford and the Reade allegations, but the latter seem much more credible inasmuch as this is not the first time that evidence of Mr. Biden's general creepiness has come forward. See here, here and here.

It'll also be interesting to see the response of those who said they could never vote for Mr. Trump because of his well-documented past treatment of women. Will they now insist that they would never vote for Joe Biden for the same reason? We'll see.

The following video provides more background to this sordid story:

Friday, March 27, 2020

Cuomo's Surprise

At a press briefing Tuesday New York Governor Andrew Cuomo delivered himself of something of a shocker. In making the case that the government should spend whatever it takes to mitigate the Covid-19 crisis he stated emphatically that,
My mother is not expendable, your mother is not expendable and our brothers and sisters are not expendable, and we’re not going to accept the premise that human life is disposable, and we’re not going to put a dollar figure on human life. The first order of business is to save lives, period. Whatever it costs.
The governor, a Democrat (and a Catholic, no less), apparently defines human life as something that magically commences at birth because he certainly isn't very concerned about pre-natal human life. This is the governor who last year signed legislation allowing for babies to be aborted up to the time of birth if the health of the mother was at risk.

Of course, "health of the mother" is a very elastic term which means whatever the mother and her doctor want it to mean. If a mother will be deeply upset if she has to carry the baby to term that could be construed by a sympathetic doctor as a mental health rationale for terminating the pregnancy. The New York law, called the Reproductive Health Act, essentially permits abortion on demand at any time in a pregnancy.

This is also the same governor who in 2014 declared that anyone who believes that human life is not disposable and that cost should not be a concern in saving human lives is not welcome in the state of New York. He referred to such people as "extreme" conservatives because they hold to the inarguable biological position that an unborn child is both human and alive.

Cuomo considers these folks extreme because they believe that those human lives deserve prima facie protection, just as Cuomo believes the life of his octogenarian mother deserves prima facie protection.

In the mind of Andrew Cuomo, some human life is evidently more expendable than other human life.

Thursday, March 26, 2020

Amazing Embryos

Ever since I was an undergraduate biology major I have been intrigued by the mystery of how a zygote (a fertilized egg) develops from a single cell into a multi-cellular embryo and from there to a complete organism.

The reason this is such a profound mystery is that the initial cell somehow "knows" to divide and the daughter cells somehow "know" to form different kinds of cells which somehow "know" to migrate around the embryo and form different kinds of tissue which somehow "know" to integrate with other kinds of tissues to form organs, and so on.

So, how do cells with no brains "know" how to do all this? Where are the instructions located which choreograph this astonishing process and tell all the parts what to do and how to do it, and how are those instructions communicated?

The information is not to be found in either the genome or the epigenome, apparently, so where is it, what is its storage medium, and how is it stored and accessed? What mechanisms control it so that the entire assembly unfolds in a flawless sequence with each step occurring precisely when it must in order to successfully construct an adult organism? And how, exactly, does the zygote "know" to produce, say, a flower rather than a fish, or a bird, or a human?

These questions are fascinating and they emerge again in an article at Uncommon Descent that quotes geneticist Michael Denton:
The earliest events leading from the first division of the egg cell to the blastula stage in amphibians, reptiles and mammals are illustrated in figure 5.4 (in his book Evolution: A Theory in Crisis). Even to the untrained zoologist it is obvious that neither the blastula itself, nor the sequence of events that lead to its formation, is identical in any of the vertebrate classes shown.
The blastula stage is an early step in embryogenesis in which the zygote divides several times to produce a hollow ball of cells. When those cells then invaginate and begin to take on the folded form of the early embryo, biologists call that stage the gastrula.

Denton continues:
The differences become even more striking in the next major phase of embryo formation – gastrulation. This involves a complex sequence of cell movements whereby the cells of the blastula rearrange themselves, eventually resulting in the transformation of the blastula into the intricate folded form of the early embryo, or gastrula, which consists of three basic germ cell layers: the ectoderm, which gives rise to the skin and the nervous system; the mesoderm, which gives rise to muscle and skeletal tissues; and the endoderm, which gives rise to the lining of the alimentary tract as well as to the liver and pancreas....

In some ways the egg cell, blastula, and gastrula stages in the different vertebrate classes are so dissimilar that, were it not for the close resemblance in the basic body plan of all adult vertebrates, it seems unlikely that they would have been classed as belonging to the same phylum.

There is no question that, because of the great dissimilarity of the early stages of embryogenesis in the different vertebrate classes, organs and structures considered homologous in adult vertebrates cannot be traced back to homologous cells or regions in the earliest stages of embryogenesis. In other words, homologous structures are arrived at by different routes.
This is surprising. Different types of animals follow different pathways in building morphological structures such as the arm of a man, the foreleg of a horse, the wing of a bird, and the pectoral fin of a fish - structures that are otherwise believed to be evolutionarily "related."

If they follow different pathways then there must be a different set of assembly instructions for the development of these "homologs," and thus all of the above questions arise again.

There is in the organism, from the time it's just a single cell until it's fully developed, a massive amount of information that programs its development.

The locus, nature, and modus operandi of this information are unknown, but one thing I think can be inferred: If information of such astonishing sophistication controls the progression of the cell's development, it seems very unlikely that that information is the product of blind, impersonal, random processes.

Complex information such as we find in computer code or architectural blueprints is never the product of random processes like genetic mutation, but is always, insofar as we've ever experienced it, the product of a mind.

I leave it to the reader to draw his or her own conclusions. Meanwhile, here's a very good animation of human embryo formation and development:

Wednesday, March 25, 2020

Is Inequality Wrong?

Some time ago the science magazine New Scientist ran a story (paywall) titled Inequality: How Our Brains Evolved to Love it, Even Though We Know it's Wrong. Because the article requires a subscription I didn't read it, so it'd be unfair to assume too much about it, but the title itself is puzzling.


Even though I don't want to assume too much, I am going to assume that the editors of New Scientist, or at least many of their readers, lean metaphysically in the direction of naturalistic materialism. That is, I'm going to assume they hold to the view that nature is all there is and that all of nature is ultimately explicable solely in terms of matter and the laws which govern its behavior.

If I'm wrong in my assumption, I apologize at the outset. But assuming that I'm correct, I have a couple of questions for New Scientist's editors.

Doubtless they explain in the article what they mean by inequality, but whatever is meant by it, how do we know it's wrong? In order to know that X is wrong there must be some objective moral frame of reference to which we can compare X to see if it conforms to that standard. On naturalistic materialism, however, there are no objective moral reference frames, there are only subjective preferences and biases.

On naturalism, when someone says, for example, that racism, murder or political corruption is wrong, all they're doing is emoting. They're saying something like, "I really don't like racism, murder or political corruption."

Moreover, inequality is the natural, expected outcome of the evolutionary process. Evolution by its very nature generates inequalities of all sorts. Why should anyone think that one evolutionary by-product, inequality among humans, is any more or less wrong than any other unless those by-products are being compared to some higher level moral standard? How can we say that kindness is right and cruelty is wrong if both are simply the products of impersonal processes like random mutation and natural selection?

If our fondness for inequality is merely a product of evolution then to declare that it's wrong is a lot like declaring that our fondness for sweet tasting foods is wrong. Nothing that has resulted from a blind, impersonal process like evolution can be either right or wrong. It just is.

We like to think, of course, that the evolution of sympathy or kindness is good and the evolution of greed, racism and aggressiveness is bad, but how can we justify such an assessment in the absence of an objective moral standard? And, to repeat, on naturalism there is no higher moral standard. At bottom, everything is just atoms jiggling in the void, and jiggling atoms are neither moral nor immoral.

Tuesday, March 24, 2020

Truth or Consequences

Jim Geraghty at National Review has published a timeline that documents all of the lies told by Chinese authorities and their lackeys at the World Health Organization about Covid-19 from its onset in early December 2019 through January of this year.

The timeline is too long and detailed to reproduce here, but here are a few salient excerpts. Geraghty has links to all of his claims at the OP:
December 31: The Wuhan Municipal Health Commission declares, “The investigation so far has not found any obvious human-to-human transmission and no medical staff infection.” This is the opposite of the belief of the doctors working on patients in Wuhan, and two doctors were already suspected of contracting the virus.

January 1: The Wuhan Public Security Bureau issued a summons to Dr. Li Wenliang, accusing him of “spreading rumors.” Two days later, at a police station, Dr. Li signed a statement acknowledging his “misdemeanor” and promising not to commit further “unlawful acts.” Seven other people are arrested on similar charges and their fate is unknown.

Also that day, “after several batches of genome sequence results had been returned to hospitals and submitted to health authorities, an employee of one genomics company received a phone call from an official at the Hubei Provincial Health Commission, ordering the company to stop testing samples from Wuhan related to the new disease and destroy all existing samples.”

According to a New York Times study of cellphone data from China, 175,000 people left Wuhan that day. According to global travel data research firm OAG, 21 countries had direct flights to Wuhan. In the first quarter of 2019 for comparison, 13,267 air passengers traveled from Wuhan, China, to destinations in the United States, or about 4,422 per month. The U.S. government would not bar foreign nationals who had traveled to China from entering the country for another month.

January 3: The Chinese government continued efforts to suppress all information about the virus: “China’s National Health Commission, the nation’s top health authority, ordered institutions not to publish any information related to the unknown disease, and ordered labs to transfer any samples they had to designated testing institutions, or to destroy them.”

January 10: After unknowingly treating a patient with the Wuhan coronavirus, Dr. Li Wenliang started coughing and developed a fever. He was hospitalized on January 12. In the following days, Li’s condition deteriorated so badly that he was admitted to the intensive care unit and given oxygen support.

The New York Times quoted the Wuhan City Health Commission’s declaration that “there is no evidence the virus can spread among humans.” Chinese doctors continued to find transmission among family members, contradicting the official statements from the city health commission.

January 18: HHS Secretary Azar had his first discussion about the virus with President Trump....Despite the fact that Wuhan doctors know the virus is contagious, city authorities allow 40,000 families to gather and share home-cooked food in a Lunar New Year banquet.

January 21: The CDC announced the first U.S. case of coronavirus in a Snohomish County, Wash., resident who had returned from China six days earlier.

By this point, millions of people had left Wuhan, carrying the virus all around China and into other countries.

January 22: WHO director general Tedros Adhanom Ghebreyesus continued to praise China’s handling of the outbreak.

January 24: The U.S reported its second case in Chicago. Within two days, new cases were reported in Los Angeles, Orange County, and Arizona. The virus was by now in several locations in the United States, and the odds of preventing an outbreak were dwindling to zero.
Had the Chinese authorities acted appropriately and honestly, we might have avoided most of the suffering and most of the economic misery we're likely to incur from the quarantines we're undergoing. The Chinese authorities need to pay a price for what, through their mendacity and irresponsibility, they've unleashed on the world.

Meanwhile, the West needs to get over its foolish obsession with political correctness and its inordinate fear of offending someone whose ethnicity differs from our own. Bickering over whether it's racist to call the Chinese virus the Chinese virus is asinine. So are sentiments like this:
On February 4th Mayor of Florence, Italy, Dario Nardella, urged residents to hug Chinese people to encourage them in the fight against the novel coronavirus. Meanwhile, a member of Associazione Unione Giovani Italo Cinesi, a Chinese society in Italy aimed at promoting friendship between people in the two countries, called for respect for novel coronavirus patients during a street demonstration. “I’m not a virus. I’m a human. Eradicate the prejudice.”
Maybe that explains, at least in part, why the people of northern Italy are suffering so much today from this contagion.

It's not "prejudice" to avoid those who might be infected. If it were, then many of us today would be prejudiced against our co-workers, friends and even our families.

Nor is it prejudice, or more precisely, ethnic bigotry, to condemn the handling of this disease by the Chinese authorities and to criticize them for stifling and perhaps punishing (executing?) the heroic doctors and other medical personnel who tried to get the truth out to the world.

The long-suffering Chinese people have for over 70 years been cruelly oppressed by communist despots. It's time the world stopped fawning over them and started telling the truth about them.

Monday, March 23, 2020

Varieties of Ethical Subjectivism

A commenter at Uncommon Descent, defending the view that morality has no objective grounding because it's rooted in our evolutionary development, delivers himself of this head-scratcher:
Since the moral fabric is man made, all we are doing is seeing it change, as it has done over the centuries. Sometimes history shows that the change has been for the good, and sometimes for the bad. But since civilization is thriving, it is reasonable to conclude that we have had more wins than losses.
What's puzzling about this is that if morality is man-made then what's the standard by which we can tell whether any change is good or bad? Doesn't this comment tacitly assume that there's an objective reference point, a moral horizon, as it were, by which we can tell whether we're flying upside down or right side up?

In evolutionary terms, of course, there is no objective referent. About that the commenter is correct. On evolution, morality is entirely man-made and therefore purely subjective.

If it's objected that civilizational thriving can serve as a measure of whether a civilization's moral practices are good or bad, we might ask whether the Aztecs and other civilizations which presumably thrived for hundreds, maybe thousands, of years after they introduced human and child sacrifice were doing something good by slaughtering helpless victims by the tens of thousands.

The post at the link cites Lewis Vaughn's Doing Ethics: Moral Reasoning and Contemporary Issues which provides an excellent explanation of the differences between relativism, which Vaughn avers can be subjective or cultural, and emotivism. Some might want to quibble with his terminology, but it's very helpful nonetheless. He writes this:
Subjective relativism is the view that an action is morally right if one approves of it. A person’s approval makes the action right. This doctrine (as well as cultural relativism) is in stark contrast to moral objectivism, the view that some moral principles are valid for everyone.

Subjective relativism, though, has some troubling implications. It implies that each person is morally infallible and that individuals can never have a genuine moral disagreement.

Cultural relativism is the view that an action is morally right if one’s culture approves of it. The argument for this doctrine is based on the diversity of moral judgments among cultures: because people’s judgments about right and wrong differ from culture to culture, right and wrong must be relative to culture, and there are no objective moral principles.

This argument is defective, however, because the diversity of moral views does not imply that morality is relative to cultures. In addition, the alleged diversity of basic moral standards among cultures may be only apparent, not real.

Societies whose moral judgments conflict may be differing not over moral principles but over non-moral facts.

Some think that tolerance is entailed by cultural relativism. But there is no necessary connection between tolerance and the doctrine. Indeed, the cultural relativist cannot consistently advocate tolerance while maintaining his relativist standpoint. To advocate tolerance is to advocate an objective moral value. But if tolerance is an objective moral value, then cultural relativism must be false, because it says that there are no objective moral values.

Like subjective relativism, cultural relativism has some disturbing consequences. It implies that cultures are morally infallible, that social reformers can never be morally right, that moral disagreements between individuals in the same culture amount to arguments over whether they disagree with their culture, that other cultures cannot be legitimately criticized, and that moral progress is impossible.

Emotivism is the view that moral utterances are neither true nor false but are expressions of emotions or attitudes. It leads to the conclusion that people can disagree only in attitude, not in beliefs. People cannot disagree over the moral facts, because there are no moral facts. Emotivism also implies that presenting reasons in support of a moral utterance is a matter of offering non-moral facts that can influence someone’s attitude.

It seems that any non-moral facts will do, as long as they affect attitudes. Perhaps the most far-reaching implication of emotivism is that nothing is actually good or bad. There simply are no properties of goodness and badness. There is only the expression of favorable or unfavorable emotions or attitudes toward something.
I'd probably want to say that all three of these can be subsumed under the heading of subjectivism, i.e. the view that moral judgments are based on individual preferences and feelings and that cultural relativism is simply subjectivism writ large. In any case, the important point is that any moral assertion not based on an objective foundation is purely illusory. It's just a rhetorical vehicle for expressing one's individual tastes and biases and has no binding force on anyone else.

Moreover, there can only be an objective moral foundation if there is a moral authority which transcends human fallibility and weakness. In other words, unless there is a God there can be no objective moral values or obligations on anyone.

This is why moral claims made by non-theists don't make sense. They wish to deny the existence of God and yet implicitly hold views about morality that can be true only if God exists.

Saturday, March 21, 2020

A Ray of Hope?

You've probably heard about this by now, but in case you haven't, there's some reason for hope amidst all the doom and gloom in our news about the Coronavirus. At his press briefing Thursday, President Trump mentioned that there are a couple of pharmaceuticals already on the market which may be effective in mitigating the symptoms of COVID-19. One of these is an anti-malarial drug called hydroxychloroquine:
President Donald Trump confirmed during a news conference Thursday that the FDA has rapidly approved hydroxychloroquine, a drug commonly used to treat malaria, to be prescribed to help treat Coronavirus.

Following the news conference, Trump campaign national press secretary Kayleigh McEnany tweeted, "President Donald Trump announces HUGE news!! Hydroxychloroquine, a drug used to treat malaria, has shown encouraging early results against the coronavirus. By eliminating red tape, President Trump will be able to make this drug available almost immediately."
There's some confusion about whether the FDA has "rapidly approved" the use of hydroxychloroquine, but it's already on the market to treat malaria and will doubtless soon be prescribed for COVID-19. If it is effective it may shorten the length of time we're held hostage to this disease.

According to Gregory Rigano, the project lead on clinical trials for COVID-19 prevention, a recent controlled clinical study has shown that 100% of patients treated with a combination of hydroxychloroquine (HCQ) and azithromycin were "virologically cured" within six days of treatment (see the previous link).

If this is not a false hope but rather a genuine breakthrough, it could be one of the greatest blessings God has ever bestowed upon this country, not only in terms of lives saved but also in terms of sparing us the devastating economic consequences of a prolonged shutdown of so much of this country's - and the world's - large and small businesses.

We should know soon enough whether the president's optimism is justified, but we should certainly hope and pray that it is. It's hard to imagine the consequences of continuing on like this for several more months.

Friday, March 20, 2020

Where Relativism Leads

I've posted this from time to time in the past, but since my classes are talking (remotely, unfortunately) about ethical relativism I thought it'd be appropriate to run it again:

Denyse O'Leary passes on a story told by a Canadian high school philosophy teacher named Stephen Anderson. Anderson recounts what happened when he tried to show students what can happen to women in a culture with no tradition of treating women as human beings:
I was teaching my senior Philosophy class. We had just finished a unit on Metaphysics and were about to get into Ethics, the philosophy of how we make moral judgments. The school had also just had several social-justice-type assemblies—multiculturalism, women’s rights, anti-violence and gay acceptance. So there was no shortage of reference points from which to begin.
Anderson decided to open the discussion by simply displaying, without comment, the photo of Bibi Aisha (see below). Aisha was an Afghan teenager whose mother had died and who was forced at the age of 14 into marriage with a Taliban fighter who abused her and kept her with his animals.

At 18 she fled the abuse but was caught by police, jailed for five months, and returned to her family. Her father then sent her back to her husband's family. To take revenge for her escape, her father-in-law, husband, and three other family members took Aisha into the mountains, cut off her nose and her ears, and left her to die. After crawling to her grandfather's house, she was saved at a nearby American hospital.

Anderson continues:
I felt quite sure that my students, seeing the suffering of this poor girl of their own age, would have a clear ethical reaction, from which we could build toward more difficult cases.

The picture is horrific. Aisha’s beautiful eyes stare hauntingly back at you above the mangled hole that was once her nose. Some of my students could not even raise their eyes to look at it. I could see that many were experiencing deep emotions, but I was not prepared for their reaction.

I had expected strong aversion; but that’s not what I got. Instead, they became confused. They seemed not to know what to think. They spoke timorously, afraid to make any moral judgment at all. They were unwilling to criticize any situation originating in a different culture.

They said, “Well, we might not like it, but maybe over there it’s okay.” One student said, “I don’t feel anything at all; I see lots of this kind of stuff.” Another said (with no consciousness of self-contradiction), “It’s just wrong to judge other cultures.”

While we may hope some are capable of bridging the gap between principled morality and this ethically vacuous relativism, it is evident that a good many are not. For them, the overriding message is “never judge, never criticize, never take a position.”
This is a picture of Bibi Aisha before undergoing a series of reconstructive surgeries at Bethesda Naval Hospital near Washington, D.C. She was deliberately mutilated by family members because she did not want to stay in a marriage to which she did not consent and in which she was treated like livestock. Anyone who would do this to another human being is evil. Any culture which condones it is degenerate, and any persons who cannot bring themselves to acknowledge this, or to sympathize with her suffering, are morally stunted.

The regrettable prevalence of moral relativism in our culture should not surprise us, however. Once a society jettisons its Judeo-Christian heritage it no longer has any non-subjective basis for making moral judgments. Its moral sense is stunted, warped and diminished because it's based on nothing more than one's own subjective feelings. Since no one can say that his feelings are superior to the feelings of the people who did this to Bibi Aisha we not infrequently hear insipidities like, "If it's right for them then it's right," or "It's wrong to judge other cultures."

These are symptoms of moral paralysis, and that paralysis is the legacy of modernity and the secular Enlightenment.

Thursday, March 19, 2020

The Secularist's Moral Confusion

A few years ago Susan Jacoby wrote a piece in the New York Times about a book by Phil Zuckerman titled Living the Secular Life: New Answers to Old Questions. Her review serves up several examples of how to miss the point. As an atheist herself, Jacoby is eager to defend Zuckerman's thesis that one can live a life that's just as morally good, or better, than that of any theist. Belief in God, both Jacoby and Zuckerman aver, is not necessary for the moral life. She writes:
Many years ago, when I was an innocent lamb making my first appearance on a right-wing radio talk show, the host asked, “If you don’t believe in God, what’s to stop you from committing murder?” I blurted out, “It’s never actually occurred to me to murder anyone.”
In addition to the usual tendentious use of the word "right-wing" whenever a progressive is referring to anything to the right of the mid-line on the ideological highway, her answer to the question is a non sequitur. The host is obviously asking her what, in her worldview, imposes any moral constraint on her. To answer that it never occurred to her to do such a thing as murder is to duck the question. The question is on what grounds she would have thought murder to be morally wrong had it occurred to her to commit such a deed. She continues her evasions when she says this:
Nonreligious Americans are usually pressed to explain how they control their evil impulses with the more neutral, albeit no less insulting, “How can you have morality without religion?”
We might want to pause here to ask why Ms. Jacoby feels insulted that someone might ask her what she bases her moral values and decisions upon. Is it insulting because she's being asked a question for which she has no good answer?

Anyway, after some more irrelevant filler she eventually arrives at the nub of Zuckerman's book:
[Zuckerman] extols a secular morality grounded in the “empathetic reciprocity embedded in the Golden Rule, accepting the inevitability of our eventual death, navigating life with a sober pragmatism grounded in this world.”
Very well, but why is it right to embrace the principle that we should treat others the way we want to be treated but wrong to adopt the principle that we should put our own interests ahead of the interests of others? Is it just that it feels right to Zuckerman to live this way? If so, then all the author is saying is that everyone should live by his own feelings. In other words, morality is rooted in each person's own subjective behavioral preferences, but if that's so then no one can say that anyone else is wrong about any moral matter. If what's right is what I feel to be right then the same holds true for everyone, and how can I say that others are wrong if they feel they should be selfish, greedy, racist, dishonest, or violent?

Just because I, or Susan Jacoby, feel strongly that such behaviors are wrong surely doesn't make them wrong. Jacoby seems to be unaware of the difficulty, however:
The Golden Rule (who but a psychopath could disagree with it?) is a touchstone for atheists if they feel obliged to prove that they follow a moral code recognizable to their religious compatriots. But this universal ethical premise does not prevent religious Americans (especially on the right) from badgering atheists about goodness without God — even though it would correctly be seen as rude for an atheist to ask her religious neighbors how they can be good with God.
This paragraph is unfortunate for at least three reasons. First, Jacoby's insinuation that only a moral pervert would reject the Golden Rule (GR) is a case of begging the question. She's assuming the GR is an objective moral principle and then asking how anyone could fail to see it as such, but the notion that there are objective moral principles is exactly what atheism disallows. Indeed, as indicated above, it's what Zuckerman and Jacoby both implicitly deny.

Second, the fact that someone can choose to live by the GR is not to the point. Anyone can live by whatever values he or she chooses. The problem for the atheist is that she cannot say that if someone disdains the GR and chooses to live selfishly or cruelly that that person is doing anything that is objectively wrong. In a Godless world values are like selections on a restaurant menu. The atheist can choose whatever she wants that suits her taste, but if her companion chooses something she doesn't like that doesn't make him wrong.

Third, Jacoby seems to imply that belief in God doesn't make one good, and in fact makes it hard to be good. This is again beside the point. One can believe in God and not know what's right. One can believe in God and not do what's right. The point, though, is that unless there is a God there is no objective moral right or wrong. There are merely subjective preferences people have to which they are bound only by their own arbitrary will.

Morality requires a transcendent, objective, morally authoritative foundation, a foundation which has the right to impose moral strictures and the ability to enforce them. That is, it requires a personal being. If no such being exists then debates about right and wrong behavior are like debates about the prettiest color. They're no more than expressions of personal taste and preference.

Jacoby unwittingly supplies us with an interesting example from which to elaborate on the point:
Tonya Hinkle (a pseudonym) is a mother of three who lives in a small town in Mississippi....Her children were harassed at school after it became known that the Hinkles did not belong to a church. When Tonya’s first-grade twins got off the school bus crying, she learned that “this one girl had stood up on the bus and screamed — right in their faces — that they were going to HELL. That they were going to burn in all eternity because they didn’t go to church.”
Jacoby thinks this was awful, as do I, but why does Jacoby think that what these children did to Tonya's children was wrong - not factually wrong but morally wrong? She might reply that it hurt the little girls, and so it did, but on atheism why is it wrong to hurt people? Jacoby, falling back on the GR, might say that those kids wouldn't want someone to hurt them. Surely not, but why is that a reason why it's wrong to hurt others? How, exactly, does one's desire not to be hurt make it wrong to hurt others? All an atheist can say by way of reply is that it violates the GR, but then she's spinning in a circle. Where does the GR get its moral authority from in a Godless universe? Is it from social consensus? Human evolution? How can either of these make any act morally wrong?

At this point some people might reply that it's wrong to hurt others because it just is, but at this point the individual has abandoned reason and is resorting to dogmatic asseverations of faith in the correctness of their own moral intuitions - sort of like some of those obnoxious fundamentalists might do.

The unfortunate fact of the matter is, though, that on atheism, if those kids can hurt Tonya's children and get away with it, it's not wrong; it's only behavior Jacoby doesn't like, and we're back to right and wrong being measured by one's personal feelings.

It's a common error but an error nonetheless when non-theists like Jacoby and Zuckerman seek to defend the possibility of moral values while denying any transcendent basis for them, and it's peculiar that Jacoby feels insulted when she's asked to explain how she can do this.

Another atheist, Robert Tracinski at The Federalist, makes a related mistake in an otherwise fine discussion of the thought of Ayn Rand. Tracinski explicitly acknowledges that most thoughtful atheists, at least those on the left, embrace moral subjectivism. He writes:
Probably the most important category [Rand] defied is captured in the expression, “If God is dead, all things are permitted.” Which means: if there is no religious basis for morality, then everything is subjective. The cultural left basically accepts this alternative and sides with subjectivism (when they’re not overcompensating by careening back toward their own neo-Puritan code of political correctness).
This is mostly correct except that I'd quibble with his use of the term "religious basis." Morality doesn't require a religious basis, it requires a basis that possesses the characteristics enumerated above: It must be rooted in an objectively existing moral authority - personal, transcendent and capable of holding human beings responsible for their choices. The existence and will of such a being - God - may or may not be an essential element of a particular religion.

Tracinski then says that:
The religious right responds by saying that the only way to stem the tide of “anything goes” is to return to that old time religion.
It's not necessarily a return to "old time religion," or any religion, for that matter, which is needful for eliminating the subjectivity of moral judgments. It's a return to a belief that the world is the product of a morally perfect being who has established His moral will in the human heart and who insists that we follow it, i.e. that we treat others with justice and compassion.

Those beliefs may be augmented by a belief in special revelation and by the whole edifice of the Christian (or Jewish, or Islamic) tradition, but the core belief in the existence of the God of classical theism is not by itself "religious" at all. That core belief may not by itself be a sufficient condition for an objective morality but it is necessary for it.

Which is why people ask the question Jacoby finds so insulting. Put a different way, it's the question how an atheist can avoid making right and wrong merely a matter of personal taste. If that sort of subjectivity is what the secular life entails then its votaries really have nothing much to say, or at least nothing much worth listening to, about matters of right and wrong.

Wednesday, March 18, 2020

Hero or Sinner?

Radio host Dennis Prager once told a story of a woman in Nazi-occupied Poland that raised some interesting moral questions. The story was made into a Broadway play entitled Irena's Vow.

Prager said this about the play:

Playwright Dan Gordon and director Michael Parva have made goodness riveting in the new Broadway play, "Irena's Vow." The Irena of the title is Irena Gut Opdyke, who, at the time of the play's World War II setting, was a pretty 19-year-old blond Polish Roman Catholic to whom fate (she would say God) gave the opportunity to save 12 Jews in, of all places, the home of the highest-ranking German officer in a Polish city.

Ultimately discovered by the Nazi officer, she was offered the choice of becoming the elderly Nazi's mistress or the Jews all being sent to death camps.

As it happens, I interviewed Opdyke on my radio show 20 years ago and again 12 years later, and she revealed to me how conflicted she was about what she consented to do not only because she became what fellow Poles derided as a "Nazi whore" but because as a deeply religious Catholic she was sure she was committing a grave sin by regularly sleeping with a man to whom she was not married and worse, indeed a married man, which likely rendered her sin of adultery a mortal sin.

What she did therefore, was not only heroic because she had to overcome daily fear of being caught and put to death, but because she also had to overcome a daily fear of committing a mortal sin before God.

During the German occupation, Irena Gut was hired by Wehrmacht Major Eduard Rügemer to work in the kitchen of a hotel that frequently served Nazi officials. Inspired by her Christian faith, she secretly took food from the hotel and delivered it to people in the Jewish ghetto.

She also smuggled Jews out of the ghetto into the surrounding forest and delivered food to them there. This, at a time when Poles were being shot just for giving bread to Jews.

Meanwhile, Rügemer asked her to work as a housekeeper in his requisitioned villa.

Once she had become the officer's housekeeper, she hid, unbeknownst to him, a dozen Jews in his basement. During the day, while he was away from the house tending to his duties, she would go out to find food and medicine for the Jews, and they would do the cleaning and other tasks that Irena was supposed to do.

Before the Major returned in the evening they would go back to the basement. This went on for some time until one day Rügemer came home early and discovered what was happening. He was about to report the Jews to the SS, but Irena pleaded with him not to. He then confronted her with the terrible choice mentioned above. You can read more about Irena here.

Here are a couple of questions this story compels us to ponder: Is what Irena did in agreeing to be the Major's mistress wrong or right? Do you, like Prager, see her adultery as heroic and deeply good, or do you see it as wrong and sinful? Or neither? Is your answer one you could explain to someone else, or is it an intuition that you have?

Tuesday, March 17, 2020

St. Patrick's Legacy

Millions of Americans, many of them descendants of Irish immigrants, will celebrate their Irish heritage by observing St. Patrick's Day today. We're indebted to Thomas Cahill and his best-selling book How The Irish Saved Civilization for explaining to us why Patrick's is a life worth commemorating.

As improbable as his title may sound, Cahill weaves a fascinating and compelling tale of how the Irish in general, and Patrick and his spiritual heirs in particular, served as a tenuous but crucial cultural bridge from the classical world to the medieval age and, by so doing, made Western civilization possible.

Born a Roman citizen in 390 A.D., Patrick had been kidnapped as a boy of sixteen from his home on the coast of Britain and taken by Irish barbarians to Ireland. There he languished in slavery until he was able to escape six years later. Upon his homecoming he became a Christian, studied for the priesthood, and eventually returned to Ireland where he would spend the rest of his life laboring to persuade the Irish to accept the Gospel and to abolish slavery.

Patrick was the first person in history, in fact, to speak out unequivocally against slavery and, according to Cahill, the last person to do so until the 17th century.

Meanwhile, Roman control of Europe had begun to collapse. Rome was sacked by Alaric in 410 A.D. and barbarians were sweeping across the continent, forcing the Romans back to Italy and plunging Europe into the Dark Ages.

Throughout the continent unwashed, illiterate hordes descended on the once grand Roman cities, looting artifacts and burning books. Learning ground to a halt, and the literary heritage of the classical world was burned or left to molder into dust. Almost all of it, Cahill claims, would surely have been lost if not for the Irish.

Having been converted to Christianity through the labors of Patrick, the Irish took with gusto to reading, writing and learning. They delighted in letters and bookmaking and painstakingly created indescribably beautiful Biblical manuscripts such as the Book of Kells, which is on display today in the library of Trinity College in Dublin. Aware that the great works of the past were disappearing, they applied themselves assiduously to the daunting task of copying all surviving Western literature - everything they could lay their hands on.


For a century after the fall of Rome, Irish monks sequestered themselves in cold, damp, cramped mud or stone huts called scriptoria, so remote and isolated from the world that they were seldom threatened by the marauding pagans. Here these men spent their entire adult lives reproducing the old manuscripts and preserving literacy and learning for the time when people would be once again ready to receive them.


These scribes and their successors served as the conduits through which the Graeco-Roman and Judeo-Christian cultures were transmitted to the benighted tribes of Europe, newly settled amid the rubble and ruin of the civilization they had recently overwhelmed.

Around the late 6th century, three generations after Patrick, Irish missionaries with names like Columcille, Aidan, and Columbanus began to venture out from their monasteries and refuges, clutching their precious books to their hearts, sailing to England and the continent, founding their own monasteries and schools among the barbarians and teaching them how to read, write and make books of their own.

Absent the willingness of these courageous men to endure deprivations and hardships of every kind for the sake of the Gospel and learning, Cahill argues, the world that came after them would have been completely different. It would likely have been a world without books. Europe almost certainly would have been illiterate, and it would probably have been unable to resist the Muslim incursions that beset it a few centuries later.

The Europeans, starved for knowledge, soaked up everything the Irish missionaries could give them. From such seeds as these modern Western civilization germinated. From the Greeks the descendants of the Goths and Vandals learned philosophy, from the Romans they learned about law, and from the Bible they learned of the worth of the individual who, created and loved by God, is therefore significant and not merely a brutish aggregation of matter.

From the Bible, too, they learned that the universe was created by a rational Mind and was thus not capricious, random, or chaotic. It would yield its secrets to rational investigation. Out of these assumptions, once their implications were finally and fully developed, grew historically unprecedented views of the value of the individual and the flowering of modern science.

Our cultural heritage is thus, in a very important sense, a legacy from the Irish - a legacy from Patrick. It's worth pondering what the world would be like today had it not been for those early Irish scribes and missionaries thirteen centuries ago.

Buiochas le Dia ar son na nGael (Thank God for the Irish), and I hope you have a great St. Patrick's Day even if because of Covid everything is closed and there are no parades.

Monday, March 16, 2020

Language Is Hard to Explain

Geneticist Michael Denton's most recent book, somewhat awkwardly titled Evolution: Still a Theory in Crisis, is a remarkable piece of philosophy of biology. In the book Denton critiques the classical Darwinian view that biological structures develop in organisms gradually over long periods of time because they serve an adaptive function.

This view is called functionalism and Denton argues persuasively that functionalist explanations are simply inadequate to explain a host of structures like the mammalian forelimb, the shapes of leaves, the structure of flowers, and many others.

Another phenomenon that functionalist explanations cannot account for, Denton argues, is the emergence of language in human beings. He recently wrote a synopsis of his argument for Evolution News, of which the following is a part:
In the early 1960s, in one of the landmark advances in 20th-century science, Noam Chomsky showed that all human languages share a deep invariant structure. Despite their very different "surface" grammars, they all share a deep set of syntactic rules and organizing principles. All have rules limiting sentence length and structure and all exhibit the phenomenon of recursion -- the embedding of one sentence in another.

Chomsky has postulated that this deep "universal grammar" is innate and is embedded somewhere in the neuronal circuitry of the human brain in a language organ.

Children learn [human] languages so easily, despite a "poverty of stimulus," because they possess innate knowledge of the deep rules and principles of human language and can select, from all the sentences that come to their minds, only those that conform to a "deep structure" encoded in the brain's circuits.

The challenge this poses to Darwinian evolution is apparent. Take the above-mentioned characteristic that all human languages exhibit: recursion. In the sentence, "The man who was wearing a blue hat which he bought from the girl who sat on the wall was six feet tall," the clauses beginning with "who" and "which" are embedded sentences. Special rules allow human speakers to handle and understand such sentences. And these rules, which govern the nature of recursion, are specific and complex.

So how did the computational machinery to handle it evolve? David Premack is skeptical:
I challenge the reader to reconstruct the scenario that would confer selective fitness on recursiveness.

Language evolved, it is conjectured, at a time when humans or proto-humans were hunting mastodons... Would it be a great advantage for one of our ancestors squatting alongside the embers to be able to remark, "Beware of the short beast whose front hoof Bob cracked when, having forgotten his own spear back at camp, he got in a glancing blow with the dull spear he borrowed from Jack"?

Human language is an embarrassment for evolutionary theory because it is vastly more powerful than one can account for in terms of selective fitness. A semantic language with simple mapping rules, of a kind one might suppose that the chimpanzee would have, appears to confer all the advantages one normally associates with discussions of mastodon hunting or the like.

For discussions of that kind, syntactical classes, structure-dependent rules, recursion and the rest, are overly powerful devices, absurdly so.
There is considerable controversy over what structures in the brain restrict all human languages to the same deep structure....

Yet, however it is derived during development, there is no doubt that a unique deep structure underlies the languages of all members of our species. It is because of the same underlying deep structure that we can speak the language of the San Bushman or an Australian aborigine, and they in turn can speak English.

The fact that all modern humans, despite their long "evolutionary separation" -- some modern races such as the San of the Kalahari and the Australian aborigines have been separated by perhaps 400,000 years of independent evolution -- can learn each other's languages implies that this deep grammar must have remained unchanged since all modern humans (African and non-African) diverged from their last common African ancestor, at least 200,000 years ago.

As Chomsky puts it:
What we call "primitive people"... to all intents and purposes are identical to us. There's no cognitively significant genetic difference anyone can tell. If they happened to be here, they would be one of us, and they would speak English... If we were there, we would speak their languages.

So far as anyone knows, there is virtually no detectable genetic difference across the species that is language-related.
[I]t is not only the deep structure of language that has remained invariant across all human races. All races share in equal measure all the other higher intellectual abilities: musical, artistic, and mathematical ability, and capacity for abstract thought. These also, therefore, must have been present in our African common ancestors 200,000 or more years ago, and must have remained unchanged, and for some reason latent, since our common divergence.

To suggest that language and our higher mental faculties evolved in parallel to reach these same remarkable ends independently in all the diverse lineages of modern humans over 200,000 years ago or more, would be to propose the most striking instance of parallel evolution in the entire history of life and be inexplicable in Darwinian terms.
If I understand Denton aright, these capacities, which all humans share, have been present from the origin of the species, but no alleged non-human evolutionary precursor has them. Thus, these capacities, as remarkable as they are, must have evolved with extraordinary rapidity, appearing full-blown in the earliest human beings.

That's very hard to explain in terms of traditional Darwinian gradualism and functionalism.

It seems the more we learn the more implausible classical naturalistic Darwinism appears to be.

Saturday, March 14, 2020

The Best Prepared Candidate?

In the following video segment, Sky News Australia host Rita Panahi raises questions about Joe Biden's fitness for the presidency, specifically with regard to his mental acuity.

It's deeply ironic that the Democrats who have for three years been questioning Donald Trump's mental health seem to have few qualms about Mr. Biden's apparent liabilities.

The last couple of minutes of this almost seven-minute video are very sad to watch, and I certainly don't want readers to think it's intended to make fun of Mr. Biden's difficulties, but this is the man most Democrats would have voters believe is best prepared to lead America for the next four years. We need to ask whether that's really true.

How is it that a party which is obsessed with diversity, and which so often pats itself on the back for being more "inclusive" than its Republican counterparts, as well as for its greater appeal to the young, is nevertheless offering up to the electorate two rich white men in their late 70s?

One of these men is a wealthy socialist, of all things, who recently suffered a heart attack, never has had a real job outside of politics and has never achieved anything of note within the political sphere beyond getting himself elected.

The other candidate is also a life-long politician who verbally abuses voters, has a reputation for pawing women and young girls, has allegedly embarrassed female secret service agents by skinny-dipping in their presence and who is now apparently suffering frequent bouts of cognitive confusion.

The Democrats have indulged their contempt for Donald Trump for almost four years, but what are they offering American voters as an alternative?

Friday, March 13, 2020

Secrets of the Cell (Pt. V)

Here's the fifth and final installment in the series of videos titled Secrets of the Cell featuring Lehigh University biochemist Michael Behe. In this final episode, Behe argues that the design in nature points powerfully to a designer, an intelligent agent who had purposes in mind in engineering the cell.

The purposeful arrangement of parts is evidence, convincing evidence, that something is designed. We may not know who the designer was, when the designer worked, or how the designer built what we see, but the fact that a system's parts are exceedingly complex and purposefully arranged is an extremely reliable indicator of intentional design:
Together, these five short videos give a very succinct and clear exposition of what's known as Intelligent Design (ID), the argument that there are very compelling reasons to think that life on earth is the product of an incomprehensibly intelligent mind. To see episodes one through four scroll down or click on these links: Episode I; Episode II; Episode III; Episode IV.

Thursday, March 12, 2020

On the Nature of the Soul

Usually when people talk about the soul and life beyond the death of the physical body they draw looks of incredulity and even scorn from fashionably skeptical materialists, but when a scientist as prominent as physicist Roger Penrose talks about it, well, then the skeptics should at least listen.

Penrose's theory is that the soul consists of information stored at the sub-atomic level in microtubules in the body's cells. At death this information somehow escapes the confines of the microtubules and drifts off into the universe. He claims to have evidence to support this hypothesis, and perhaps he does.

I haven't seen the evidence, but I'd like to know how the information "knows" that the body has died and what mechanism controls it. I'd also like to know what the information is about, how it functions without a physical body, and what disembodied information leaking out into the universe "looks" like.

Anyway, I'm not altogether skeptical of Penrose's theory. I've long advocated the view that, if we do have a soul (a substance that's neither physical nor mental - neither body nor mind), it consists of information. In this I'm in agreement with Penrose.

Where I differ from him is that in my view the soul is the totality of true propositions about a person - an exhaustive description of the person at every moment of his or her existence. It's the essence of the person. But whereas Penrose locates the information in cellular microtubules I would place it in the mind of God. In the unimaginably vast database of God's mind there is, so to speak, a "file" containing a complete description of every person who has ever lived, perhaps every thing that has ever existed.

Since the information is located in the mind of God it's indestructible - immortal - unless God chooses to destroy it. Each human being is therefore potentially eternal.

To take this line of thinking one more step, perhaps when our physical bodies die our "file" is "downloaded," in whole or in part, into another body situated in a different world, or at least in a different set of dimensions than what we experience in this world. It would be a different kind of body, perhaps, but a body all the same.

On this view, the soul is not something wraith-like that's contained in us, but rather it's "in" God. As with a computer file, he could choose to delete it altogether or to express it, in whole or in part, in any corporeal "format" he sees fit.

In any case, if this hypothesis is at all close to describing the way things are, the death of our bodies is not the death of us, and, if physical death is not the end of our existence, we're each confronted with some pretty serious implications.

Wednesday, March 11, 2020

Joe the Science Denier

When I was an undergrad majoring in biology I worked for a couple of summers doing research with my faculty advisor, which afforded us the opportunity to have many conversations on all sorts of topics. One thing he stressed in those conversations was the need for any aspirant to the intellectual life to develop a healthy skepticism - the scientific attitude, as he saw it, required skepticism about received opinion and authority in all areas of life: politics, religion, and even science.

My advisor insisted that to doubt and question what we were told by those in authority was the path to truth and independent thinking. This was back in the late sixties and early seventies and my advisor was himself an atheist and a political radical in much the same mold as Bernie Sanders.

I think I absorbed those informal lessons pretty thoroughly, but I eventually came to find his own skepticism ironic: for all the skepticism he professed, he seemed dogmatically certain that his political views were truth itself, that there was no god of the sort that Christians believed in, and that Charles Darwin was as correct as any scientist ever was about what we might today call Molecules to Man evolution.

I was reminded of my advisor, for whom I had a deep respect, and our many conversations as I read a column on skepticism by neurosurgeon Michael Egnor.

Egnor begins by noting that our social betters often regard those of us who have doubts about anthropogenic climate change, or Darwinian explanations of life, or the ability of metaphysical naturalism to offer a satisfactory account of morality, free will, human consciousness, and so on, as "anti-intellectual science deniers."

He then describes for us a typical example of the species, a guy named Joe:
Joe has no scientific education. He’s a truck driver. He works a couple of jobs to support his family, he pays his taxes, coaches his son’s little league team, and goes to church on Sundays. He is anything but a scientific expert, but he does know a few things.

Joe has been told since the 1980s that the world is going to end due to global warming. It sounds like those crazy guys with the placards who say the world is gonna end tomorrow. The earth’s sell-by-date keeps getting pushed forward — polar ice caps were supposed to melt, but didn’t, polar bears were supposed to go extinct, but didn’t, sea levels were supposed to inundate coastal cities, but didn’t, and tens of millions of climate refugees were supposed to perish fleeing the catastrophic heat. Joe’s still waiting.

He is also still waiting for the apocalyptic global cooling he was told about in the 1970s (Joe ain’t no scientist, but he has a good memory). He remembers watching Paul Ehrlich on TV in the late 1960s warning that overpopulation was going to cause billions of people to die of starvation and cause nations to disintegrate over the next couple of decades.

Joe wonders how a scientist could be so wrong and still keep his job and even get elected to the American Association for the Advancement of Science, the United States National Academy of Sciences, the American Academy of Arts and Sciences, and the American Philosophical Society.

Joe knows that if he screwed up his own job like that, he’d be fired before the day was out. But those rules don’t apply to scientists. Joe remembers hearing that DDT and other pesticides were going to kill all birds and give us all cancer. DDT was banned, and lots of people started (again) dying of malaria, and scientists were pretty proud of themselves for getting DDT banned and told people who didn’t want to get malaria to sleep with nets.

Joe remembers being told by scientists in the 1990s that AIDS was going to spread to the heterosexual community and kill millions of Americans. He remembers the panic over Y2K, when nothing happened except that some scientists got big grants to study it.

Joe has heard a lot about the science replication crisis — he doesn’t fully understand it, but he knows that it means that a whole lot of science is basically made up.

Joe remembers his father talking about when the U.S. government sterilized tens of thousands of innocent people against their will because scientific experts insisted that humanity was degenerating due to poor breeding. Joe isn’t exactly sure what eugenics was, but he knows that nearly all scientific institutions embraced it for nearly a century, and Joe suspects that it was just a way to make sure there weren’t too many people like Joe.

Joe doesn’t know what to think about evolution. He believes in God, and knows that it’s obvious that a Higher Power made this beautiful and vastly complex world. He doesn’t have a problem with the claim that animals change over time, but he doesn’t think that scientists should drag his son’s teachers into federal court to force them to teach his boy that there’s no purpose in life.

He thinks we should be able to question science, especially in schools. And he wonders why Darwin’s theory is so certain, since it can’t even stand up to questions from schoolchildren.
Egnor has more to say about the reasons for Joe's doubts about what he hears from the scientific priesthood, but I think we all get the picture. Joe may not know much about how scientists over the course of his lifetime have arrived at their conclusions, but he does know that quite often those conclusions have been very wrong. Maybe a little less trust and a bit more skepticism would've been a good thing and would still be a good thing today.

P.S. For what it's worth, although I included Egnor's paragraph on DDT, I disagree with his apparent disapproval of the ban on this pesticide and am myself glad it's no longer poisoning our ecosystems.

Tuesday, March 10, 2020

A Case for Dualism

Philosophical materialists maintain that the brain is all that's involved in our cognitive experience and that there's no need to posit the existence of an immaterial mind or soul. Moreover, given that brain function is the product of the laws of physics and chemistry, materialists argue that there's no reason to believe that we have free will.

For materialists, "mind" is simply a word we use to describe the function of the brain, much as we use the word "digestion" to refer to the function of the stomach. Just as digestion is an activity and not an organ or a distinct entity in itself, they argue, so the mind is an activity of the brain and not a separate entity in itself.

As neurosurgeon Michael Egnor discusses in this fifteen-minute video, however, the materialist view is not shared by all neuroscientists, and some of the foremost practitioners in the field have profound difficulties with it.

Egnor explains how the findings of three prominent twentieth century brain scientists point to the existence of something beyond the material brain that's involved in human thought and which also point to the reality of free will.

His lecture is an excellent summary of the case for philosophical dualism and is well worth the fifteen minutes it takes to watch it:

Monday, March 9, 2020

Katie's Soul

My classes have begun discussing what philosophers call the mind/body problem, that is, the question whether the brain alone can provide an adequate explanation for our cognitive experience or whether there's justification for believing that something else, an immaterial mind or soul, is also involved.

I did a post some time ago on an article that sheds some very interesting light on this question, and I thought it might be worthwhile to post it again since it ties in with our class discussion. Here it is:

Neurosurgeon Michael Egnor has a fine piece at Plough.com in which he argues against the materialist view that we are simply material beings with no spiritual or mental remainder.

The materialist holds that everything about us that might be attributed to qualities like soul or mind is ultimately reducible to the physical structure of the material brain. Matter and the laws of physics can in principle explain everything.

The opening paragraphs of Egnor's essay call this view into serious question. He writes:
I watched the CAT scan images appear on the screen, one by one. The baby’s head was mostly empty. There were only thin slivers of brain – a bit of brain tissue at the base of the skull, and a thin rim around the edges. The rest was water.

Her parents had feared this. We had seen it on the prenatal ultrasound; the CAT scan, hours after birth, was much more accurate. Katie looked like a normal newborn, but she had little chance at a normal life.

She had a fraternal-twin sister in the incubator next to her. But Katie only had a third of the brain that her sister had. I explained all of this to her family, trying to keep alive a flicker of hope for their daughter.

I cared for Katie as she grew up. At every stage of Katie’s life so far, she has excelled. She sat and talked and walked earlier than her sister. She’s made the honor roll. She will soon graduate high school.

I’ve had other patients whose brains fell far short of their minds. Maria had only two-thirds of a brain. She needed a couple of operations to drain fluid, but she thrives. She just finished her master’s degree in English literature, and is a published musician.

Jesse was born with a head shaped like a football and half-full of water – doctors told his mother to let him die at birth. She disobeyed. He is a normal happy middle-schooler, loves sports, and wears his hair long.

Some people with deficient brains are profoundly handicapped. But not all are. I’ve treated and cared for scores of kids who grow up with brains that are deficient but minds that thrive. How is this possible?
Well, if materialism is true it's hard to see how it could be possible, but if materialism is false then there might be an explanation that includes a soul or mind that's somehow integrated with the brain but which is nevertheless not ultimately explicable in terms of the material stuff that makes us up.

Egnor goes on in his essay to show that mental processes like thoughts and sensations cannot be reduced to physical structures and also to explain why the materialist denial of human free will is almost certainly wrong.

He offers the sorts of arguments that are making it very difficult nowadays to be a consistent materialist. Indeed, some materialists are finding it so difficult to explain phenomena like human consciousness solely in terms of the material brain that they've even taken to denying that consciousness exists, but this seems like madness. After all, doesn't one have to be conscious in order to think about whether consciousness exists?

Evidently, some philosophers will go to any lengths, no matter how bizarre, to avoid having to accept any idea that may lead to the existence of anything that's consistent with a theistic worldview.

Egnor concludes his column with this:
There is a part of Katie’s mind that is not her brain. She is more than that. She can reason and she can choose. There is a part of her that is immaterial.... There is a part of Katie that didn’t show up on those CAT scans when she was born.

Katie, like you and me, has a soul.

Saturday, March 7, 2020

Secrets of the Cell (Pt. IV)

We've been posting the videos in the series titled Secrets of the Cell featuring biochemist Michael Behe. In this, the fourth installment, Behe argues that most evolution is not really progressive at all but actually regressive.

What often happens to produce change in an organism is that pre-existing genes are broken or blunted so that their function is lost. Despite this impairment, however, a survival advantage of some sort is still conferred on the organism.

This is exactly the opposite, though, of what occurs in the Darwinian model.

According to the regnant evolutionary theory, mutations in an organism's DNA actually produce new genes which in turn eventually produce new traits in the species. Starting with primitive molecules, accumulated mutations acted upon by natural selection have, over billions of years, produced every type of living thing we see on earth, including humans. Or so the theory goes.

The difference between the two views is quite significant. If most evolution is really devolution then the question arises as to the origin of the pre-existing genetic material. Where did it come from? Behe will tackle that question in future episodes, but for now this six minute video gives the viewer a good explanation of the process by which most genetic change actually occurs in a species.

To view previous episodes go here for part I, here for part II and here for part III.

Friday, March 6, 2020

Denying That We're Conscious

Philosopher Galen Strawson asks what the silliest claim ever made might be and concludes that the answer has to be the claim made by some philosophers that conscious experience is merely an illusion and doesn't "really" exist. In an interesting, albeit rather lengthy, piece at The New York Review of Books he calls this claim "The Denial."

A summary of the argument the contemporary Deniers make against conscious experience looks something like this:
  1. Naturalism entails materialism which entails that all reality is reducible to matter.
  2. Conscious experience cannot be reduced to matter.
  3. Therefore, conscious experience isn't real.
Strawson is himself a naturalist and a materialist so he agrees with the first premise, but he insists that the second premise is just wrong. We know far too little about the brain, he argues, to assert that conscious experience can't be reduced to brain matter. His own argument, then, looks like this:
  1. Naturalism is true and it entails that all reality is reducible to matter.
  2. Conscious experience is real.
  3. Therefore, conscious experience can be reduced to matter.
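Since the Deniers and Strawson reason from the same first premise, it may help to lay the two skeletons side by side. What follows is a minimal propositional sketch in Lean; the letters N, C and R are my own shorthand for "Naturalism is true," "conscious experience is real," and "conscious experience is reducible to matter," not anything taken from Strawson's essay. Both derivations are formally valid; they differ only in which auxiliary premise each side is willing to grant.

  -- Shorthand (my own labels): N = naturalism/materialism is true,
  -- C = conscious experience is real, R = it is reducible to matter.
  -- Shared premise p1: if naturalism is true, then whatever conscious
  -- experience there really is must be reducible to matter (N → C → R).

  -- The Deniers grant N and ¬R and conclude that consciousness isn't real.
  example (N C R : Prop) (p1 : N → C → R) (hN : N) (hnR : ¬R) : ¬C :=
    fun hC => hnR (p1 hN hC)

  -- Strawson grants N and C and concludes that consciousness reduces to matter.
  example (N C R : Prop) (p1 : N → C → R) (hN : N) (hC : C) : R :=
    p1 hN hC

Logic alone, in other words, settles nothing here; everything turns on which premises one is prepared to accept.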
I completely agree with the second premise of Strawson's argument; pace the Deniers, it can only be denied on pain of incoherence, and it seems folly to try. If it's an illusion that I'm in pain, for instance, then I'm still experiencing the sensation of pain via the illusion. Thus, even if I'm under the spell of an illusion I'm still having a conscious experience.

Nevertheless, the conclusion of this syllogism only follows if we know that the first premise is true. Strawson seems to beg the question by assuming it is, but that's just a metaphysical preference, a presupposition, an act of faith on his part. It could just as easily be false for all we know since he offers no argument for it.

Here's his own summary of his argument:
Naturalism states that everything that concretely exists is entirely natural; nothing supernatural or otherwise non-natural exists. Given that we know that conscious experience exists, we must as naturalists suppose that it’s wholly natural. And given that we’re specifically materialist or physicalist naturalists (as almost all naturalists are), we must take it that conscious experience is wholly material or physical.

And so we should, because it’s beyond reasonable doubt that [our mental] experience...is wholly a matter of neural goings-on: wholly natural and wholly physical.
Strawson goes on to describe how other naturalist philosophers have come to deny the reality of conscious experience:
But then—in the middle of the twentieth century—something extraordinary happens. Members of a small but influential group of analytic philosophers come to think that true naturalistic materialism rules out realism about consciousness. They duly conclude that consciousness doesn’t exist.

They reach this conclusion in spite of the fact that conscious experience is a wholly natural phenomenon, whose existence is more certain than any other natural phenomenon, and with which we’re directly acquainted, at least in certain fundamental respects.

These philosophers thus endorse the Denial.

The problem is not that they take naturalism to entail materialism—they’re right to do so. The problem is that they endorse the claim that conscious experience can’t possibly be wholly physical. They think they know this, although genuine naturalism doesn’t warrant it in any way.

So they...claim that consciousness doesn’t exist, although many of them conceal this by using the word “consciousness” in a way that omits the central feature of consciousness—the qualia [i.e. our sensations of color, taste, fragrance, sound, pain, etc.]
Strawson and I agree, then, that qualia, and thus conscious experience, are real, but we disagree over his rejection of the claim that conscious experience cannot be completely reduced to material stuff. It seems to me that qualia are fundamentally different from matter, and it's exceptionally difficult to see how the experience of red, for instance, can be reduced to electrochemical phenomena in the brain.

As the late philosopher Jerry Fodor once said:
Nobody has the slightest idea how anything material could be conscious. Nobody even knows what it would be like to have the slightest idea about how anything material could be conscious. So much for the philosophy of consciousness.
When we experience the sensation of color, or sweetness, or pain, the immediate cause of that sensation is physical or material, but the sensation itself is not. A miniature scientist walking around inside someone's brain while the host is tasting sugar would only observe electrochemical reactions occurring in the neurons and synapses.

She'd see electrons whizzing about, chemicals interacting, and nerve fibers lighting up, perhaps, but however deeply she probed into the host's brain she wouldn't observe "sweetness" anywhere. Likewise, mutatis mutandis, with every other sensation her host might be experiencing.

So, I'd suggest a reformulation of the first syllogism:
  1. Naturalism entails materialism which entails that all reality is reducible to matter.
  2. Conscious experience probably cannot be reduced to matter.
  3. Therefore, Naturalism is probably false.
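Formally, this reformulation is just a modus tollens on the same shared premise. Here is the corresponding Lean sketch, using the same shorthand as before and setting aside the "probably" qualifiers so that it stays purely propositional:

  -- Same shorthand as above: N = naturalism is true, C = conscious
  -- experience is real, R = it is reducible to matter. Granting C and ¬R,
  -- it is naturalism itself that has to go (modus tollens on N → C → R).
  -- The "probably" hedges in the prose are dropped here for simplicity.
  example (N C R : Prop) (p1 : N → C → R) (hC : C) (hnR : ¬R) : ¬N :=
    fun hN => hnR (p1 hN hC)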
This argument rests on the truth of the second premise, of course, a premise which Strawson denies. But he can't, or at least doesn't in his essay, give any reason for his denial of the premise other than his a priori conviction that Naturalism is true.

If, though, it's reasonable to think that the second premise is true - and a lot of philosophers, many metaphysical Naturalists among them, are convinced it is - then it's reasonable to accept the conclusion that Naturalism is probably false.