Wednesday, May 31, 2017

Useless Beauty

One characteristic of living things that has thrilled everyone who has ever considered it is the astonishing level of beauty they exhibit. Consider, as an example, this bird of paradise:

or this blue dacnis:


Why are living things like birds and butterflies so beautiful? Darwin thought that females selected mates based on their fitness and that this sexual selection caused beauty to evolve as a by-product. This is still the reigning explanation today (although it doesn't explain the beauty of flowers), but as an article by Adrian Barnett at New Scientist explains, not everyone is on board with this explanation, maybe not even Darwin himself. Here's an excerpt:
“The sight of a feather in a peacock’s tail… makes me sick,” wrote Darwin, worrying about how structures we consider beautiful might come to exist in nature. The view nowadays is that ornaments such as the peacock’s stunning train, the splendid plumes of birds of paradise, bowerbirds’ love nests, deer antlers, fins on guppies and just about everything to do with the mandarin goby are indications of male quality.

In such species, females choose males with features that indicate resistance to parasites (shapes go wonky, colours go flat if a male isn’t immunologically buff) or skill at foraging (antlers need lots of calcium, bowers lots of time).

But in other cases, the evolutionary handicap principle applies, and the fact it’s hard to stay alive while possessing a huge or brightly coloured attraction becomes the reason for the visual pizzazz. And when this process occasionally goes a bit mad, and ever bigger or brasher becomes synonymous with ever better, then the object of female fixation undergoes runaway selection until physiology or predation steps in to set limits.

What unites these explanations is that they are all generally credited to Darwin and his book The Descent of Man, and Selection in Relation to Sex. Here, biologists say, having set out his adaptationist stall in On the Origin of Species, Darwin proposed female choice as the driving force behind much of the animal world’s visual exuberance.

And then along comes Richard Prum to tell you there’s more to it than that. Prum is an ornithology professor at Yale University and a world authority on manakins, a group of sparrow-sized birds whose dazzling males perform mate-attracting gymnastics on branches in the understories of Central and South American forests. Years of watching the males carry on until they nearly collapsed convinced him that much of the selection is linked to nothing except a female love of beauty itself, that the only force pushing things forward is female appreciation. This, he says, has nothing to do with functionality: it is pure aesthetic evolution, with “the potential to evolve arbitrary and useless beauty” (emphasis mine).

As Prum recounts, this idea has not found the greatest favour in academic circles. But, as he makes plain, he’s not alone. Once again, it seems Darwin got there first, writing in Descent that “the most refined beauty may serve as a sexual charm, and for no other purpose”. The problem is, it seems, that we all think we know Darwin. In fact, few of us go back to the original, instead taking for granted what other people say he said. In this case, it seems to have created a bit of validation by wish fulfilment: Darwin’s views on sexual selection, Prum says, have been “laundered, re-tailored and cleaned-up for ideological purity”.
The difficulty here, at least for me, is that it doesn't explain why animals would have developed a sense of beauty in the first place. Pair-bonding and reproduction clearly don't require it, since many organisms, including humans it must be said, successfully reproduce without the benefit of physical attractiveness. So why would some organisms evolve a dependence upon it, and what is it in the organism's genotype that governs this aesthetic sense?

Could it be that animals, or at least some of them, are intelligently designed to just delight in beauty?

Tuesday, May 30, 2017

The Knowledge Problem

One's worldview has consequences. If someone embraces a naturalistic worldview then he or she is inclined to think that everything that exists reduces to material stuff and that every phenomenon can be explained, in principle, in terms of physical law. This view is usually called physicalism.

On the other hand, if one is a theist then one is usually disposed to think that there's more to reality than just physical matter. Reality, on this view, includes non-physical, immaterial mind and/or soul. This view is usually referred to as substance dualism.

Thirty-five years ago a philosopher named Frank Jackson posed a clever thought experiment which he believed refuted physicalism. There's an interesting discussion of the strengths and weaknesses of Jackson's thought experiment, known as Mary's Room or the knowledge argument, by Ari N. Schulman at The New Atlantis. Schulman begins by explaining Jackson's argument. He writes:
[T]he thought experiment was proposed by Australian philosopher Frank Jackson in a 1982 paper. It was posed as a challenge to physicalism, the school of thought that holds that the mind is purely material, made solely of the stuff of rocks and meat, fully explicable by physics and chemistry. Physicalists regard the common-sense view that the mind is special — whether because it’s self-aware, can think and feel, or has free will — as an illusion.

Aiming to refute physicalism, Jackson asks us to imagine a scientist named Mary, who is so brilliant that she acquires all of the “physical information there is to obtain” about the workings of vision. Mary, that is, learns everything there is to know about how various wavelengths of light stimulate the retina, the neurology of the visual processing system, how this system interacts with the speech centers to produce spoken descriptions of images, and so on. The catch is that Mary has lived her whole life in a room in which everything is entirely in black, white, and shades of gray, including her books and the TV monitor she uses to investigate the world.

One day Mary is released from her room. For the first time, she sees colors with her own eyes. The question is: Does Mary learn something new?

The intuitive answer for most people is: yes, of course — Mary learns what it is like to see color. She learns about the redness of a rose, the blueness of the sky. But recall that Mary already knew everything physical about vision. So whatever it is that Mary learns is not encapsulated in physical descriptions. We can conclude, then, that there are such things as nonphysical facts about vision, meaning there must also be nonphysical properties of vision. In short, there is something special about the mind, and physicalism must be false.

The conclusion may strike many readers as obvious, but physicalism is the orthodoxy among today’s physicists, biologists, and philosophers of mind. To the physicalists, their position is the beginning and the end of the modern scientific project: in principle, a core metaphysical commitment that distinguishes modern science from its forebears; in its particulars, the final theory that is supposed to await us on the distant day when science is finished.

Peruse the pop-science headlines on any given day and read about how “we now know” that the love you feel looking at your child is actually just oxytocin in the limbic centers of the brain, a development that just happened to help our ancestors outbreed their neighbors, who apparently felt for their children the way we do about a plate of wet bread. The sense that your soul just died is a sign that you are now bending properly along the great de-spiriting arc of history. Thus, the knowledge argument seems to provide a welcome bulwark against the rising tide of physicalism, a relief to those who believe that love is love, whatever else it might also be.

Of course, as clear and intuitive as it first appears, Mary’s Room, like every other question of mind, is not nearly so simple. It would be surprising if a centuries-old philosophical project could be crumbled at its foundations by the kind of lesson taught to four-year-olds in preschool. Fittingly, then, despite three and a half decades of sustained discussion, the knowledge argument has apparently not won a single academic convert to dualism, the opposing set of theories holding that the mind is not entirely physical, or that mental and physical properties are distinct.

Though it is now perhaps the go-to example of an argument against physicalism, philosophers have offered compelling reasons to doubt it. The knowledge argument seems largely to have entrenched the opposing sides, providing each with an ever more elaborate set of rationalizations for its existing views.

Physicalism and dualism are conventionally seen as enemies, and with good reason. Each position is as much a sustained rejection of its opposite as it is a positive program of explanation in its own right. But as we will see, the mutual hostility of modern physicalism and dualism conceals a deeper convergence.

Understanding the thought experiment requires a return to foundational concerns about how we move past subjective experience to achieve objective knowledge of the world. These questions in turn point back to the genuine mystery of mind — of how it is that certain bits of dust, arranged just so, become capable of pondering the infinite.
It's not surprising that the "knowledge argument" hasn't persuaded many, if any, philosophers to abandon physicalism. To give up physicalism (or dualism) is not like giving up one's belief that one's favorite baseball team will win the pennant this year.

To yield on a matter as fundamental as the ultimate nature of the universe would rock one's entire worldview and would have grave implications for one's belief in God. People undergo a revolution in their worldview only as a last resort, and then only with much psychological and emotional agonizing. As long as there are any arguments at all to cling to, those arguments will be tenaciously latched onto to keep one from being blown away in the gale of opposing evidence.

This is not to say that the counter-arguments invoked against Mary aren't good, only that it usually takes more than arguments to persuade someone to abandon a supporting pillar of his or her worldview.

Schulman goes on at length to examine the arguments on both sides of the issue in what is a very thorough and informative essay. Philosophically-minded readers may want to check it out.

Monday, May 29, 2017

Remembering Our Heroes

Memorial Day is a day to remember those who paid the ultimate price in combat for our country, but perhaps I can take a little license and also praise the sacrifices and character of men like those described in these accounts from the war in Iraq. Some of them never came home, but all of them deserve our gratitude and admiration:
A massive truck bomb had turned much of the Fort Lewis soldiers’ outpost to rubble. One of their own lay dying and many others wounded. Some 50 al-Qaida fighters were attacking from several directions with machine guns and rocket-propelled grenades. It was obvious that the insurgents had come to drive the platoon of Stryker brigade troops out of Combat Outpost Tampa, a four-story concrete building overlooking a major highway through western Mosul, Iraq.

“It crossed my mind that that might be what they were going to try to do,” recalled Staff Sgt. Robert Bernsten, one of 40 soldiers at the outpost that day. “But I wasn’t going to let that happen, and looking around I could tell nobody else in 2nd platoon was going to let that happen, either.”

He and 10 other soldiers from the same unit – the 1st Battalion, 24th Infantry Regiment – would later be decorated for their valor on this day of reckoning, Dec. 29, 2004. Three were awarded the Silver Star, the Army’s third-highest award for heroism in combat. When you combine those medals with two other Silver Star recipients involved in different engagements, the battalion known as “Deuce Four” stands in elite company. The Army doesn’t track the number of medals per unit, but officials said there could be few, if any, other battalions in the Iraq war to have so many soldiers awarded the Silver Star.

“I think this is a great representation of our organization,” said the 1-24’s top enlisted soldier, Command Sgt. Maj. Robert Prosser, after a battalion award ceremony late last month at Fort Lewis. “There are so many that need to be recognized.... There were so many acts of heroism and valor.”

The fight for COP Tampa came as Deuce Four was just two months into its yearlong mission in west Mosul. The battalion is part of Fort Lewis’ second Stryker brigade. In the preceding weeks, insurgents had grown bolder in their attacks in the city of 2 million. Just eight days earlier, a suicide bomber made his way into a U.S. chow hall and killed 22 people, including two from Deuce Four.

The battalion took over the four-story building overlooking the busy highway and set up COP Tampa after coming under fire from insurgents holed up there. The troops hoped to stem the daily roadside bombings of U.S. forces along the highway, called route Tampa. Looking back, the Dec. 29 battle was a turning point in the weeks leading up to Iraq’s historic first democratic election.

The enemy “threw everything they had into this,” Bernsten said. “And you know in the end, they lost quite a few guys compared to the damage they could do to us. They didn’t quit after that, but they definitely might have realized they were up against something a little bit tougher than they originally thought.”

The battle for COP Tampa was actually two fights – one at the outpost, and the other on the highway about a half-mile south.

About 3:20 p.m., a large cargo truck packed with 50 South African artillery rounds and propane tanks barreled down the highway toward the outpost, according to battalion accounts.

Pfc. Oscar Sanchez, on guard duty in the building, opened fire on the truck, killing the driver and causing the explosives to detonate about 75 feet short of the building. Sanchez, 19, was fatally wounded in the blast. Commanders last month presented his family with a Bronze Star for valor and said he surely saved lives. The enormous truck bomb might have destroyed the building had the driver been able to reach the ground-floor garages.

As it was, the enormous explosion damaged three Strykers parked at the outpost and wounded 17 of the 40 or so soldiers there, two of them critically.

Bernsten was in a room upstairs. “It threw me. It physically threw me. I opened my eyes and I’m laying on the floor a good 6 feet from where I was standing a split second ago,” he said. “There was nothing but black smoke filling the building.” People were yelling for each other, trying to find out if everyone was OK.

“It seemed like it was about a minute, and then all of a sudden it just opened up from everywhere. Them shooting at us. Us shooting at them,” Bernsten said. The fight would rage for the next two hours. Battalion leaders said videotape and documents recovered later showed it was Abu Musab al-Zarqawi’s al-Qaida in Iraq fighters. They were firing from rooftops, from street corners, from cars, Bernsten said.

Eventually, Deuce Four soldiers started to run low on ammunition. Bernsten, a squad leader, led a team of soldiers out into the open, through heavy fire, to retrieve more from the damaged Strykers. “We went to the closest vehicle first and grabbed as much ammo as we could, and got it upstairs and started to distribute it,” he said. “When you hand a guy a magazine and they’re putting the one you just handed them into their weapon, you realize they’re getting pretty low. So we knew we had to go back out there for more.”

He didn’t necessarily notice there were rounds zipping past as he and the others ran the 100 feet or so to the Strykers. “All you could see was the back of the Stryker you were trying to get to.”

Another fight raged down route Tampa, where a convoy of six Strykers, including the battalion commander’s, had rolled right into a field of hastily set roadside bombs. The bombs hadn’t been there just five minutes earlier, when the convoy had passed by going the other way after a visit to the combat outpost. It was an ambush set up to attack whatever units would come to the aid of COP Tampa.

Just as soldiers in the lead vehicle radioed the others that there were bombs in the road, the second Stryker was hit by a suicide car bomber. Staff Sgt. Eddieboy Mesa, who was inside, said the blast tore off the slat armor cage and equipment from the right side of the vehicle, and destroyed its tires and axles and the grenade launcher mounted on top. But no soldiers were seriously injured.

Insurgents opened fire from the west and north of the highway. Stryker crewmen used their .50-caliber machine guns and grenade launchers to destroy a second car bomb and two of the bombs rigged in the roadway. Three of the six Strykers pressed on to COP Tampa to join the fight.

One, led by battalion operations officer Maj. Mark Bieger, loaded up the critically wounded and raced back onto the highway through the patch of still-unstable roadside bombs. It traveled unescorted the four miles or so to a combat support hospital. Bieger and his men are credited with saving the lives of two soldiers.

Then he and his men turned around and rejoined the fight on the highway. Bieger was one of those later awarded the Silver Star. Meantime, it was left to the soldiers still on the road to defend the heavily damaged Stryker and clear the route of the remaining five bombs.

Staff Sgt. Wesley Holt and Sgt. Joseph Martin rigged up some explosives and went, under fire, from bomb to bomb to prepare them for demolition. They had no idea whether an insurgent was watching nearby, waiting to detonate the bombs. Typically, this was the kind of situation where infantry soldiers would call in the ordnance experts. But there was no time, Holt said.

“You could see the IEDs right out in the road. I knew it was going to be up to us to do it,” Holt said. “Other units couldn’t push through. The colonel didn’t want to send any more vehicles through the kill zone until we could clear the route.” And so they prepared their charges under the cover of the Strykers, then ran out to the bombs, maybe 50 yards apart. The two men needed about 30 seconds to rig each one as incoming fire struck around them.

“You could hear it [enemy fire] going, but where they were landing I don’t know,” Holt said. “You concentrate on the main thing that’s in front of you.” He and Martin later received Silver Stars.

The route clear, three other Deuce Four platoons moved out into the neighborhoods and F/A-18 fighter jets made more than a dozen runs to attack enemy positions with missiles and cannon fire. “It was loud, but it was a pretty joyous sound,” Bernsten said. “You know that once that’s happened, you have the upper hand in such a big way. It’s like the cavalry just arrived, like in the movies.”

Other soldiers eventually received Bronze Stars for their actions that day, too.

Sgt. Christopher Manikowski and Sgt. Brandon Huff pulled wounded comrades from their damaged Strykers and carried them over open ground, under fire, to the relative safety of the building.

Sgt. Nicholas Furfari and Spc. Dennis Burke crawled out onto the building’s rubbled balcony under heavy fire to retrieve weapons and ammunition left there after the truck blast.

Also decorated with Bronze Stars for their valor on Dec. 29 were Lt. Jeremy Rockwell and Spc. Steven Sosa. U.S. commanders say they killed at least 25 insurgents. Deuce Four left the outpost unmanned for about three hours that night, long enough for engineers to determine whether it was safe to re-enter. Troops were back on duty by morning, said battalion commander Lt. Col. Erik Kurilla.

In the next 10 months, insurgents would continue to attack Deuce Four troops in west Mosul with snipers, roadside bombs and suicide car bombs. But never again would they mass and attempt such a complex attack.

Heroics on two other days earned Silver Stars for Deuce Four.

It was Aug. 19, and Sgt. Major Robert Prosser’s commander, Lt. Col. Erik Kurilla, had been shot down in front of him. Bullets hit the ground and walls around him. Prosser charged under fire into a shop, not knowing how many enemy fighters were inside. There was one, and Prosser shot him four times in the chest, then threw down his empty rifle and fought hand-to-hand with the man.

The insurgent pulled Prosser’s helmet over his eyes. Prosser got his hands onto the insurgent’s throat, but couldn’t get a firm grip because it was slick with blood.

Unable to reach his sidearm or his knife, and without the support of any other American soldiers, Prosser nonetheless disarmed and subdued the insurgent by delivering a series of powerful blows to the insurgent’s head, rendering the man unconscious.

Another Silver Star recipient, Staff Sgt. Shannon Kay, received the award for his actions on Dec. 11, 2004. He helped save the lives of seven members of his squad after they were attacked by a suicide bomber and insurgents with rockets and mortars at a traffic checkpoint.

He and others used fire extinguishers to save their burning Stryker vehicle and killed at least eight enemy fighters. Throughout the fight, Kay refused medical attention despite being wounded in four places.
For men like these and the millions of others whose courage and sacrifice have for nearly two and a half centuries enabled the rest of us to live in relative freedom and security, we should all thank God. And for those who never made it back we should ask God's richest blessing on their souls.

Saturday, May 27, 2017

Mathematical Ethics?

I'm currently reading the book Hidden Figures, a delightful true story told by Margot Lee Shetterly about a group of African American women in the 1940s and 50s who worked in aeronautics research as mathematicians. The book has been made into a movie and the story of how their math skills, as well as their pluck, enabled these women to overcome racial and gender barriers is compelling. So I was perplexed when I came across this report from Robby Soave at Reason.com about a math curriculum designed by a group called Teach for America that implies that it's somehow an injustice to expect minority students to learn mathematics as it's traditionally taught.

Soave writes:
Teach for America thinks that math is bound up with "social justice," and has designed a course that makes some startling claims about math. [For example]:
"In western mathematics, our ways of knowing include formalized reasoning or proof, decontextualization, and algorithmic thinking, leaving little room for those having non-western mathematical skills and thinking processes," the training course claims.
Whoever wrote this should be cashiered on the grounds of nincompoopery. So should whoever approved its publication. There's no such thing as "western mathematics." Math is universal. There's not one set of "mathematical skills and thinking processes" for Europeans, another set for Asians, and still another for Africans. Math is applied logic and the laws of logic are not relative to different cultures as though they were like preferences in food or dress. The law of non-contradiction is not a matter of cultural predilection or opinion.

Soave continues to extract more instances of buffoonery from the Teach for America materials:
"Mathematical ethics recognizes that, for centuries, mathematics has been used as a dehumanizing tool....mathematics formulae also differentiate between the classifications of a war or a genocide and have been used to trick indigenous peoples out of land and property."
Mathematical ethics? What could that possibly be? Are there scholars who teach and write about mathematical ethics? Do they explore the ethical implications of imaginary numbers or the moral ramifications of dividing by zero? Mathematical ethics sounds a bit like astrobiology - they're both disciplines without a subject matter.

The rest of the quoted sentence sounds, if anything, even more nonsensical than the notion of a mathematical ethics.

But enough. Soave concludes with this:
I'm open to the idea that math—particularly advanced math—is over-valued as a K-12 subject. There's a good argument to be made that high schoolers should be taking less Algebra II and reading more Shakespeare. But if we're going to teach math, I'm not sure we should be teaching that it's mostly just this bad thing Western countries used to subjugate indigenous peoples, as if that's the main thing you need to know about math.
I agree with him completely about this. I question, for example, the need for academically-oriented high school students to take calculus, a math that even many engineers don't use much. Their time would be better spent taking probability and statistics or taking more history/government, literature, or philosophy, and saving calculus, if they need it, for college.

Soave is also correct in pointing to the absurdity of teaching young people that math is somehow a tool of evil oppression. On the contrary, I can think of no better way for young minority kids to improve their socio-economic prospects in life than to master mathematics. It opens a lot of highly remunerative doors for the student who makes the effort. Just ask many of our Asian-American students - who certainly don't seem to have a problem with "western mathematics" - or read Shetterly's Hidden Figures.

Friday, May 26, 2017

Maybe Closer Than He Thinks

In his bestselling book A Brief History of Thought, French humanist philosopher Luc Ferry asserts that the paramount philosophical question throughout the history of civilization has been the question of how we find meaning in life when death looms for all of us.

He calls this the question of "salvation," and in his book he surveys the answers proffered by the ancient Stoics, Christians, Enlightenment thinkers, Nietzsche and the post-moderns, and comes, after much meandering, to the disappointingly tepid conclusion that salvation consists in loving others until we die. In considering others, he argues, we achieve a kind of transcendence without the metaphysical baggage of God.

Ferry himself anticipates the complaint that his conclusion is unsatisfying:
You might object that compared to the doctrine of Christianity - whose promise of the resurrection of the body means that we shall be reunited with those we love after death - a humanism without metaphysics is small beer. I grant you that amongst the available doctrines of salvation, nothing can compete with Christianity - provided, that is, that you are a believer.
So why does Ferry settle for what is really a hopeless, meaningless "humanism without metaphysics"? Because he simply can't, or won't, bring himself to believe that there's more to reality than matter and energy:
If one is not a believer - and one cannot force oneself to believe, nor pretend to believe - then we must learn to think differently about the ultimate question posed by all doctrines of salvation, namely that of the death of a loved one.
Ferry is correct, of course, that one cannot compel belief, either in oneself or another, but what he might be asked is whether he sincerely wants the Christian message to be true. Regardless of his disbelief, does he hope that he's wrong? Many, if not most, people who claim not to believe that we'll in some sense exist beyond death, or that "we shall be reunited with those we love after death," admit that they do not want such a notion to be true. They don't believe the Christian proclamation, and they don't want the world to be the sort of place where such claims are in fact correct.

The prominent philosopher Thomas Nagel provides us with a good example of this. In his book The Last Word, Nagel writes these words:
In speaking of the fear of religion, I don’t mean to refer to the entirely reasonable hostility toward certain established religions and religious institutions, in virtue of their objectionable moral doctrines, social policies, and political influence. Nor am I referring to the association of many religious beliefs with superstition and the acceptance of evident empirical falsehoods. I am talking about something much deeper—namely, the fear of religion itself.

I speak from experience, being strongly subject to this fear myself: I want atheism to be true and am made uneasy by the fact that some of the most intelligent and well-informed people I know are religious believers. It isn’t just that I don’t believe in God and, naturally, hope that I’m right in my belief. It’s that I hope there is no God! I don’t want there to be a God; I don’t want the universe to be like that.
I suspect that the person, on the other hand, who wants theism to be true, who is open to belief in God, is much more likely to find belief slowly creeping over him than those whom C.S. Lewis describes in The Great Divorce as having "their fists clenched, their teeth clenched, their eyes fast shut. First they will not, in the end they cannot, open their hands for gifts, or their mouths for food, or their eyes to see." Such people refuse to allow themselves even to hope that there really is salvation in the Christian sense.

Ferry doesn't seem to be this sort of person, though. He writes at the end of his book:
I find the Christian proposition infinitely more tempting [than any of the alternatives] - except for the fact that I do not believe in it. But were it to be true I would certainly be a taker.
He seems to recognize, even though he doesn't explicitly say it, that unless what we do in this life matters forever, it doesn't matter at all. If death is the end of individual existence then nothing we do has any genuine meaning or significance.

If Ferry's being honest with his readers and himself, if he really is open to "the Christian proposition," then I suspect he's closer to faith than he might realize.

Thursday, May 25, 2017

The Roots of Hitler's Ethics

About five years ago Richard Weikart published a study on the roots of the moral thinking of Adolf Hitler, a review of which is posted at Evolution News and Views. Here's an excerpt:
One of the most controversial parts of the movie Expelled: No Intelligence Allowed was the segment where Ben Stein interviewed the history professor Richard Weikart about his book, From Darwin to Hitler: Evolutionary Ethics, Eugenics, and Racism in Germany. Darwinists went apoplectic, deriding Stein and Weikart for daring to sully the good name of Darwin by showing the way that Hitler and German scientists and physicians used evolutionary theory to justify some of their atrocities, such as their campaign to kill the disabled.

Some critics even denied that the Nazis believed in Darwinism at all. Weikart challenges his critics to examine the evidence in his fascinating sequel, Hitler's Ethic: The Nazi Pursuit of Evolutionary Progress, which examines the role of Darwinism and evolutionary ethics in Hitler's worldview.

In this work Weikart helps unlock the mystery of Hitler's evil by vividly demonstrating the surprising conclusion that Hitler's immorality flowed from a coherent ethic. Hitler was inspired by evolutionary ethics to pursue the utopian project of biologically improving the human race. Hitler's evolutionary ethic underlay or influenced almost every major feature of Nazi policy: eugenics (i.e., measures to improve human heredity, including compulsory sterilization), euthanasia, racism, population expansion, offensive warfare, and racial extermination.
Once people reject the idea that morality is rooted in an omnipotent, omniscient, perfectly good being the next logical step is to abandon the idea that there's any objective moral standard at all. This leads inevitably to moral arbitrariness and subjectivity, i.e. what's right is whatever feels right to me. Moral subjectivism leads directly to egoism, i.e. the belief that one should put one's own interests ahead of the interests of others, and egoism leads to the ethic of "might makes right".

Hitler's "morality" was completely consistent with his rejection of belief in a personal God. Hitler was who every atheist would also be if they a) had the power and b) were logically consistent. Thankfully, few of them are both powerful and consistent, but in the 20th century some were. Mao, Stalin, and Pol Pot were all atheists who had complete power within their spheres and acted consistently with their naturalistic, materialistic worldview. The consequences were completely predictable.

Wednesday, May 24, 2017

Getting it Not Quite Right

A recent Gallup poll reveals that the number of people who hold to the strict creationist view that God created humans in their present form at some time within the last 10,000 years or so has declined, and the number of people who believe in some form of evolution, whether naturalistic or directed by God, has increased:
The percentage of U.S. adults who believe that God created humans in their present form at some time within the last 10,000 years or so -- the strict creationist view -- has reached a new low. Thirty-eight percent of U.S. adults now accept creationism, while 57% believe in some form of evolution -- either God-guided or not -- saying man developed over millions of years from less advanced forms of life.
Allahpundit, a commentator at HotAir.com, looks at the data and writes:
The pure creationist position was trending downward, then made a big comeback in 2011 for no obvious reason. It’s tempting to call that result an outlier or statistical noise, but the hybrid position of guided evolution polled poorly in the low 30s in 2011 and remained flat in 2014, suggesting a real trend. Now suddenly it’s come surging back. Why? You tell me.
Here's the graph he refers to:



Maybe the answer to Allahpundit's question is that many people who formerly embraced a literal seven-day creation have been persuaded by intelligent design theorists that ID offers a more satisfactory explanation of origins. Many people who may formerly have thought that there was a conflict between evolution and theism might now, after almost three decades of work by intelligent design theorists, believe that the two are compatible. In other words, the survey is stuck with a format of questions that are no longer very helpful. David Klinghoffer, commenting on Gallup's questions, correctly observes:
Since 1982 [Gallup] has been asking:

Which of the following statements comes closest to your views on the origin and development of human beings — 1) Human beings have developed over millions of years from less advanced forms of life, but God guided this process, 2) Human beings have developed over millions of years from less advanced forms of life, but God had no part in this process, 3) God created human beings pretty much in their present form at one time within the last 10,000 years or so?

What I wish they would ask is:

Which of the following statements comes closest to your views on the origin and development of living creatures – 1) Animal and human life arose and developed over billions of years, guided by a designing intelligence, whether God or otherwise, 2) Animal and human life arose and developed over billions of years, by strictly blind, natural processes, unguided by any intelligent agent, 3) God created all animal and human life at one time within the last 10,000 years or so?

Now that would tell you a lot about the state of the evolution debate. But the modern intelligent design movement didn’t exist 35 years ago, so Gallup is stuck in 1982.
The Gallup poll also shows that fewer than one in five Americans holds a secular view of evolution, but that proportion has almost doubled from about 10% in 2000 to about 19% today. Allahpundit makes a compelling argument that this number gives a pretty good idea of the percentage of atheists in the U.S. That number is usually pegged at around 3% to 10%, but it may be much higher. Here's Allahpundit:
The 19 percent figure for evolution without God is interesting in light of [a] recent [study] suggesting there may be many more atheists in the U.S. than everyone believes. Ask people if they think of themselves as “atheist” and chances are no more than three percent will say yes. Ask them if they believe in God without using the A-word and maybe 10 percent will say no.

How many people secretly believe there is no God, though, and are simply reluctant to say so, even to a pollster? [The] study ... divided people into two groups and gave them identical questionnaires filled with innocuous statements (e.g., “I own a dog”) — with one exception. One group had the statement “I do not believe in God” added to their questionnaire.

People in each group were asked to identify how many of the statements were true of them without specifying which ones were true. Then the numbers from the control group were compared to the numbers from the “I do not believe in God” group. Result: As best as researchers can tell from the numerical disparity, 26 percent don’t believe in God, way, way more than most surveys show. I’m skeptical that the number runs quite that high, but the fact that 19 percent told Gallup they believe in evolution without God may mean the number of atheists is higher than the 3-10 percent range usually cited.

After all, how many religious believers are likely to also believe that God played no role in man’s development? Per Gallup, just one percent of weekly churchgoers signed on to that proposition and just six percent of nearly weekly or monthly observers did. “Evolution without God” may be a reasonably good proxy for atheism.
If so, the ratio of theists to atheists in this country is about 4 to 1. Of course, a lot of people can believe or disbelieve in God without acting or thinking consistently with that belief, and many respondents to the study may not have answered the questionnaire in a manner consistent with their true convictions. A similar poll several years ago, for instance, found that 9% of those who claimed to be atheists prayed at least once a week. One wonders how carefully those folks thought about their responses.
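The unmatched-count technique the study used can be sketched in a few lines of code. Here is a minimal simulation of the estimator; all the numbers in it (probability of believing the innocuous statements, the assumed 26% prevalence, group sizes) are illustrative assumptions, not the study's actual data:

```python
import random

random.seed(42)

def simulate_list_experiment(n_per_group=100_000, p_sensitive=0.26, n_innocuous=4):
    """Simulate the unmatched-count ("list experiment") technique.

    Respondents report only HOW MANY statements on their list are true
    of them, never which ones. The treatment group's list carries one
    extra sensitive statement ("I do not believe in God"); the gap
    between the two groups' mean counts estimates the fraction of
    respondents for whom the sensitive statement is true.
    """
    def innocuous_count():
        # Assume each innocuous statement ("I own a dog", etc.) is true
        # for any given respondent with probability 0.5.
        return sum(random.random() < 0.5 for _ in range(n_innocuous))

    control = [innocuous_count() for _ in range(n_per_group)]
    treatment = [innocuous_count() + (random.random() < p_sensitive)
                 for _ in range(n_per_group)]

    # Difference in mean counts = estimated prevalence of the sensitive trait.
    return sum(treatment) / n_per_group - sum(control) / n_per_group

estimate = simulate_list_experiment()
print(f"estimated prevalence: {estimate:.2f}")
```

Because no individual ever admits the sensitive statement directly, the technique sidesteps the reluctance Allahpundit describes, at the cost of extra statistical noise from the innocuous items.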

Tuesday, May 23, 2017

Neither a Right Nor a Privilege

The recently crowned Miss USA, Kára McCullough, was asked during the pageant whether she thought health care was a right or a privilege. Poor girl. Not realizing that the "correct" answer is that it's a "right," she proclaimed it to be a "privilege" and was roundly criticized on social media for her abysmal lack of awareness. She subsequently backtracked the following day, "correcting" her error by switching from "privilege" to "right," which is too bad.

The actual answer is that health care is neither a right nor a privilege as Ed Morrissey explains at Hot Air:
Taken as a whole, the market for health care services and goods is a commodity, and our failure to treat it as such is what’s making it so dysfunctional.

Health care consists of goods and services produced and delivered by highly specialized providers in exchange for monetary compensation. Overall, it’s a commodity, for which the terms “right” and “privilege” are largely meaningless. In an economic sense, health care is no different than markets for other commodities, such as food, vehicles, fuel, and so on. The ability to purchase goods and services depends on the resources one has for compensation for their delivery in most cases....

Rights, as understood by founders, do not require the transfer of goods and services, but come from the innate nature of each human being. The right to free speech does not confer a right to publication, or to listeners. The right to peaceably assemble does not confer a right to confiscate private property in which to gather or to destroy either. The right to bear arms does not require the government to provide guns or ammunition, and so on. Rights do not require government provision....
Morrissey goes on to explain why health care is not a privilege either:
In our form of self-governance and generally free markets, privilege generally refers to licensed access to certain restricted activities involving public assets. The most common of these is a driver’s license, which confers a privilege to use public roads. One does not need a license to drive exclusively on private roads, as anyone who grew up on a farm or ranch can attest. Doctors and lawyers require licenses to practice their professions, so providing health care can be described as a privilege, but we do not require a government grant to consume health care. Anyone who can provide compensation (directly or through third parties by mutual consent) for care can access it. Some providers — notably those maintained by religious communities, who have recently come under fire — don’t even require compensation for access.
There's more from Morrissey on this topic at the link. One thing he doesn't mention in his very helpful piece, though, is the absurdity, in a secular society in which religious beliefs are supposed to have no bearing on public policy, of any talk of "rights" at all.

On a secular view of society the concept of a right is vacuous. Where does a right come from? What confers it upon us? The answer, given the secular viewpoint, is nowhere and nothing. Rights are artificial, imaginary constructs that we invoke in order to give strong expression to our feelings, but there are no such actual entities as rights unless they're conferred by something higher than ourselves.

It may be argued that governments can certainly confer legal rights, as ours does in the Constitution, but this doesn't help those who insist that health care is a right, since health care is mentioned nowhere in the Constitution.

If there is a meaningful right to health care it must somehow be a human right conferred upon us, in the words of Thomas Jefferson, by our Creator, but then those who wish to exclude Creators from the public square cannot have recourse to one when they want to assert that health care is a human right.

We don't have human rights because we exist, or because we're rational, or because we're nice people. To the extent that our rights are inherent in us it's only because we're created in the image of God and God loves us. We belong to Him, and that's what gives us value as persons and a right not to be harmed by others. Shove that underlying premise out of our public life and talk of rights is just silly, empty rhetoric.

One wonders why so many secular folk don't see that they're talking nonsense and how they continue to get away with doing it.

Monday, May 22, 2017

Beetle Origami

One of the countless fascinating examples of engineering in nature that defies explanation in terms of random mutation and natural selection is the ability of insects, such as beetles, to fold and unfold their wings. It's an astonishing ability since the folds are quite complex as this video of a ladybug beetle shows:
If the metaphysical view called naturalism is true, such processes are the result of fortuitous accidents and coincidences throughout the history of beetle evolution, yet one might rightly wonder how accident and coincidence, acting with no goal or purpose in mind, can produce a feature that, were it found in some other context, would certainly be attributed to the design of an intelligent agent.

David Klinghoffer at Evolution News quotes from an article on this phenomenon from USA Today:
Japanese scientists were curious to learn how ladybugs folded their wings inside their shells, so they surgically removed several ladybugs’ outer shells (technically called elytra) and replaced them with glued-on, artificial clear silicone shells to peer at the wings’ underlying folding mechanism.

Why bother with such seemingly frivolous research? It turns out that how the bugs naturally fold their wings can provide design hints for a wide range of practical uses for humans. This includes satellite antennas, microscopic medical instruments, and even everyday items like umbrellas and fans.

“The ladybugs’ technique for achieving complex folding is quite fascinating and novel, particularly for researchers in the fields of robotics, mechanics, aerospace and mechanical engineering,” said lead author Kazuya Saito of the University of Tokyo.
The highlights are mine.

It's truly remarkable that our most brilliant engineers are being taught design by what they are seeing in living things. It's not something that would be expected given a belief in a mechanistic, purposeless, atelic natural world. On the other hand, it's not at all surprising that the natural world would be infused with engineering marvels if the natural world is itself the product of intelligent engineering.

Saturday, May 20, 2017

Tabloid Journalism

The media, particularly the Washington Post, New York Times, CNN, and MSNBC, are yearning to get something on Trump that could topple his presidency. So eager are they to relive the glory years of the early 1970s and Watergate that, in lieu of anything substantial to nail the president with, they've decided that simply making stuff up is not beneath them.

Here's a sample of fabrications by "journalists" at the Washington Post excerpted from a column by Mollie Hemingway:
1. On May 10, the Washington Post‘s Philip Rucker, Ashley Parker, Sari Horwitz, and Robert Costa claimed that:
[Deputy Attorney General Rod J.] Rosenstein threatened to resign after the narrative emerging from the White House on Tuesday evening cast him as a prime mover of the decision to fire Comey and that the president acted only on his recommendation, said the person close to the White House, who spoke on the condition of anonymity because of the sensitivity of the matter.
It turns out, however, that this report was false:
But the “person close to the White House” who made the claim without using his or her name was contradicted by none other than Deputy Attorney General Rod J. Rosenstein himself. The next day he said, “I’m not quitting” when asked by reporters. “No,” he said to the follow-up question of whether he had threatened to quit.
2. On May 10, Ashley Parker wrote:
Last week, then-FBI Director James B. Comey requested more resources from the Justice Department for his bureau’s investigation into collusion between the Trump campaign and the Russian government, according to two officials with knowledge of the discussion.
This story was also false:
The story was based on anonymous sources, naturally, and noted “The news was first reported by the New York Times.” If true, it would support a narrative that Trump had fired Comey not due to his general incompetence but because he was trying to thwart a legitimate and fruitful investigation. Anonymous sources again had something very different to say from people whose comments were tied to their names, who all denied the report. The Justice Department spokeswoman immediately responded that the claim was false....

The next day under oath, acting FBI Director Andrew McCabe repeatedly denied that the probe into Russia was undersourced or requiring any additional funds.

3. On January 26 Josh Rogin reported that “the State Department’s entire senior management team just resigned” as “part of an ongoing mass exodus of senior Foreign Service officers who don’t want to stick around for the Trump era.”
This story was false, too:
The story went viral before the truth caught up. As per procedure, the Obama administration had, in coordination with the incoming Trump administration, asked for the resignations of all political appointees. While it would have been traditional to let them stay for a few months, the Trump team let them know that their services wouldn’t be necessary. The entire story was wrong.

4. Rogin also had the false story that Steve Bannon had personally confronted Department of Homeland Security’s Gen. John F. Kelly to pressure him not to weaken an immigration ban.
False again:
‘It was a fantasy story,’ Kelly said. Of the reporter, he said: ‘Assuming he’s not making it up… whoever his sources are, [they're] playing him for a fool.’

5. This week, the Washington Post reported that President Trump threatened national security during his meeting with Russians last week....
Ditto:
The report was immediately slapped down as false by multiple high-level Trump officials who were present in the meeting, including National Security Advisor H.R. McMaster who said,
The story that came out tonight as reported is false. The president and the foreign minister reviewed a range of common threats to our two countries, including threats to civil aviation. At no time, at no time, were intelligence sources or methods discussed. And the president did not disclose any military operations that were not already publicly known. Two other senior officials who were present, including the Secretary of the State, remember the meeting the same way and have said so. Their on the record accounts should outweigh anonymous sources. I was in the room. It didn’t happen.
If the WaPo keeps this up, pretty soon you'll be able to buy it in the supermarket checkout lane, where it'll be displayed right next to the Globe, Star, and National Enquirer. In fact, all it needs to compete with those tabloids is to go half-size and add a few lurid photos. It's already got the story-telling down pat.

Friday, May 19, 2017

Mann's Miffed

Climate scientist Michael Mann has been waging a Twitter war this week against Scott Adams, the creator of the comic strip Dilbert. Evidently, Mann was not amused by last Sunday's Dilbert which featured a climate scientist who looks pretty much like Mann sounding foolish:


The irony in this is that the punch line in the cartoon has the climate scientist resorting to calling Dilbert a "science denier" simply for questioning the scientist's methodology, yet, in a surprising exhibition of self-caricature, some of the tweeters, including Mann himself, do to Adams the very thing he mocks them for in the strip.
Michael E. Mann (@MichaelEMann) tweeted on 15 May 2017:
Scott Adams ("Dilbert") is an equal-opportunity science denier: Evolution
Irked by the portrait Adams paints of them they express their displeasure by confirming the portrait. The ability to recognize irony doesn't seem to be a prominent feature of their skill set.

Thursday, May 18, 2017

The Nerve of the Guy

If you live in California, have a job, and pay taxes, Governor Jerry Brown would like you to know that you're a freeloader and he's tired of your complaining, writes Kira Davis at Redstate.com:
In a speech in Orange County last week Brown responded to criticism over a new $52 billion tax hike that was hastily rammed through Sacramento by the Democrat supermajority over the objections of taxpayers across the state.

The new taxes include a 12 cent increase in the gas tax (which is already one of the highest in the nation) and an increase in car registration fees averaging $50 per car owner. Some will see their annual registration increase by as much as $180.
California taxpayers are outraged, although I assume none of the angry ones are Democrats. After all, if they keep electing legislators and governors whose philosophy it is to tax and spend, they hardly have room to complain when that's what their elected officials do.

In any case, Governor Jerry Brown wasn't sympathetic to the complaints. He called the beleaguered taxpayers of his state "freeloaders," of all things:
“The freeloaders — I’ve had enough of them,” Brown said, adding that the approved tax and fee hikes bring those charges to the level they were 30 years ago if adjusted for inflation. “They have a president that doesn’t tell the truth and they’re following suit.”
Davis deconstructs Brown's declaration of contempt for the peasants who have had their fill of paying ever more taxes to pay for California's welfare state:
Let’s review this statement, in case you’re having trouble interpreting his meaning. If you work and pay taxes, if you struggle to pay your most basic bills while “owing” the California government a third of your paycheck (or more), Jerry Brown thinks you’re a freeloader.

The 1,000,000 citizens in Los Angeles county alone who collect food stamps provided by taxpayers are not freeloaders. The millions of illegal immigrants being harbored in California’s sanctuary cities to the cost of taxpayers are not freeloaders. Illegal immigrants being provided “free” legal help by the state on the backs of taxpayers are not freeloaders. The bloodsuckers in the Sacramento legislature who get paid $178 a day in per diem funds on top of their bloated salaries just for walking in the door to their job every day are not freeloaders.

No, you – the burdened, law-abiding taxpayer are the freeloader for simply asking the government of California be more fiscally responsible with the money they already have instead of stealing more of your money without your consent to pay for programs that are already funded but have been raided for pet projects and personal enrichment.
This is like the man who thrashed the goose that laid the golden eggs because it only laid one a day. California's state government is going to drive anyone who can to leave the state for more hospitable climes, and the only people left will be those too poor to move.
Davis isn't done. She turns her pen to summarizing the hypocrisy of California's political leadership, including its governor:
Brown scolds the people who make this state work, asking them if they’d rather pay double for borrowing money, as if that isn’t exactly what we’re doing right now after approving borrowing money for vague “drought control” programs and the bottomless black hole of public school funding.

The Governor asking if we should take money from the universities is especially outrageous given that the President of the University of California school system Janet Napolitano is currently embroiled in a massive scandal after it was discovered they deliberately hid $175 million from the state government auditing office even as they requested more state funding for social justice programs and events.

California has become increasingly hostile towards the people who make this state work, and to hear the man who holds the highest office in the state address us with derision and disdain as “freeloaders” is disgusting and deeply insulting (alliteration is my number one symptom of outrage). Instead of thanking us for the work we put in every day to pay for his benefits and the benefits of others who don’t work at all, he chooses to treat us like garbage simply because we dare to demand our hard-earned money be used more wisely.

Brown thinks you’re a freeloader for demanding accountability, but he’s perfectly willing to use taxpayer money to fight against the citizen-led recall of a senator who sold his vote and his soul for a seat at Sacramento’s perpetual, tax-funded lunch table.
JazzShaw at Hot Air wades into the fray with this bit of commentary:
Congratulations, California. You keep electing these same Democrats over and over again and then you act surprised when they make you one of the most heavily taxed populations in the country....

How much more of this are you going to put up with? .... [A]t this point California’s voters appear to be in the political equivalent of an abusive relationship. The Democrats continue to beat you up and then insult and berate you if you dare to complain about it. It’s time to break the cycle of abuse and get yourself a new paramour who treats you right. Nobody else can save you unless you’re willing to take the first steps to help yourselves.
Unfortunately, California is a microcosm of the United States itself. The people on the receiving end of the goodies vote, and their numbers have grown to the point where they have at least as much political clout as those who pay the bills, maybe more. Given the math, it's hard to be optimistic about California's future or, for that matter, that of the United States.

Wednesday, May 17, 2017

It's Just a Bad Idea

Erielle Davidson is an economic research assistant at the Hoover Institution in Palo Alto, California. In a column at The Federalist she cites yet more evidence confirming what anyone with common sense would already know, which is that raising the minimum wage mostly hurts small businesses, especially marginally successful ones, and the often poor people they employ.

She writes:
The [Harvard] paper focused specifically upon the restaurant industry in San Francisco, using data from the review platform Yelp to track the activity and performance of individual restaurants. Researchers Dara Lee Luca and Michael Luca discovered that a $1 increase in the minimum wage leads to approximately a 4 to 10 percent increase in the likelihood of any given restaurant exiting the industry entirely. In economic terms, minimum wage hikes quicken a restaurant’s “shutdown” point.

Luca and Luca found this effect to be more pronounced among the restaurants with lower ratings while essentially nonexistent among five-star restaurants. A $1 increase in the minimum wage increased the likelihood of a 3.5-star exiting by roughly 14 percent, while having zero effects on the restaurants with five-star ratings. In other words, minimum wage hikes disproportionately affect the restaurants that are already struggling in popularity.

Why is this paper stunningly relevant? In an era where liberal-minded folks see increasing the minimum wage as a key way to equalize economic outcomes, studies such as this undercut the ignorant economics those on the Left espouse. Basic economics tell us that increasing the minimum wage will hurt not only firms by increasing their operational costs but also the very workers they presumably fire in the process to keep those operational costs down.

....This particular paper shows that not all firms adjust to hikes by merely raising the prices of the goods and services they provide or firing a fraction of their workers. Some firms shut down altogether, taking their job opportunities with them.
The research indicates that, contrary to the hopes and claims of the Fight For $15 crowd and the Black Lives Matter movement, raising the minimum wage has a negative effect on minority earning power:
Economists have found minimum wage hikes to be unhelpful in reducing inequality and [are often] followed by more low-income workers being laid off, a great number of whom are people of color. The first of a series of scheduled minimum wage hikes in Seattle in 2015 resulted in a 1 percent drop in the employment rate of Seattle’s low-wage workers and preceded the worst job decline for the city since the 2008-09 recession.

According to a 2014 report, only 17 percent of Seattle workers previously making under $15 per hour before the hike were white. The rest were Asian (20 percent), Black (28 percent), and Hispanic (22 percent). Undoubtedly, these folks were disproportionately hurt by the reduction in employment that followed the hikes.

While “racial and economic justice” through forced minimum wage hikes sounds appealing, it just doesn’t work the way its supporters suppose. In fact, quite the opposite. Slapping a $15 minimum wage requirement on a large corporation might “feel good” to angry workers, but those very workers are ultimately the ones who will pay the price.
There's more in Davidson's article at the link. The minimum wage issue is an example of people wanting to implement an easy measure to help the poor without any thought to the consequences of their action on the people it's supposed to help. It's ironic that many of the people fighting hardest to raise the minimum wage are the very people who'll find themselves out of work when it's imposed. San Francisco is raising its city-wide minimum wage this summer, so it'll be interesting to see what happens to the city's low-wage employment numbers when it does.

Tuesday, May 16, 2017

How We Got Here

Philosopher W.T. Stace, writing in The Atlantic Monthly in 1948, gives a concise summary of how we came to be where we are in the modern world, i.e. adrift in a sea of moral subjectivism and anomie. He asserts that:
The real turning point between the medieval age of faith and the modern age of unfaith came when scientists of the seventeenth century turned their backs upon what used to be called "final causes"...[belief in which] was not the invention of Christianity [but] was basic to the whole of Western civilization, whether in the ancient pagan world or in Christendom, from the time of Socrates to the rise of science in the seventeenth century....They did this on the [basis that] inquiry into purposes is useless for what science aims at: namely, the prediction and control of events.

....The conception of purpose in the world was ignored and frowned upon. This, though silent and almost unnoticed, was the greatest revolution in human history, far outweighing in importance any of the political revolutions whose thunder has reverberated around the world....

The world, according to this new picture, is purposeless, senseless, meaningless. Nature is nothing but matter in motion. The motions of matter are governed, not by any purpose, but by blind forces and laws....[But] if the scheme of things is purposeless and meaningless, then the life of man is purposeless and meaningless too. Everything is futile, all effort is in the end worthless. A man may, of course, still pursue disconnected ends - money, fame, art, science - and may gain pleasure from them. But his life is hollow at the center.

Hence, the dissatisfied, disillusioned, restless spirit of modern man....Along with the ruin of the religious vision there went the ruin of moral principles and indeed of all values....If our moral rules do not proceed from something outside us in the nature of the universe - whether we say it is God or simply the universe itself - then they must be our own inventions.

Thus it came to be believed that moral rules must be merely an expression of our own likes and dislikes. But likes and dislikes are notoriously variable. What pleases one man, people, or culture, displeases another. Therefore, morals are wholly relative.
On one point I would wish to quibble with Stace's summary. He writes in the penultimate paragraph above that, "If our moral rules do not proceed from something outside us in the nature of the universe - whether we say it is God or simply the universe itself - then they must be our own inventions."

I think, however, that if our moral rules derive from the universe they're no more binding or authoritative than if they are our own inventions. The only thing that can impose a moral duty is a personal being, one that has both moral authority and the power to hold us accountable for our actions. A being which would possess that kind of authority and power, the power to impose an objective moral duty, would be one which transcends human finitude. Neither the universe nor any entity comprised of other humans qualifies.

In other words, unless God exists there simply are no objective moral duties. Thus, if one believes we all have a duty to be kind rather than cruel, to refrain from, say, rape or child abuse or other forms of violence, then one must either accept that God exists or explain how such obligations can exist in a world where man is simply the product of blind impersonal forces plus chance plus time.

Put simply, in the world of Darwinian naturalism, no grounds exist for saying that hurting people is wrong. Indeed, no grounds exist for saying anything is wrong.

It's not just that modernity and the erosion of theistic belief in the West have led to moral relativism. It's that modernity and the concomitant loss of any genuine moral authority in the world lead ineluctably to moral nihilism.

This is one of the themes I discuss in my novel In the Absence of God which you can read about by clicking on the link at the top right of this page.

Monday, May 15, 2017

The End of Science

In an article at Evolution News, Denyse O'Leary argues that metaphysical naturalism, the view that the cosmos is all there is, all there ever was, and all there ever will be, is actually doing great harm to science.

O'Leary argues that science is based upon the assumption that truth is objective, that it's out there waiting to be discovered, but that metaphysical naturalism actually undercuts that assumption. On naturalism truth becomes much more subjective and malleable. What's true for the scientist is what coheres most comfortably with his worldview. If his worldview is naturalism then any theory that eliminates the need for a God will be embraced, will be considered "science," regardless of the state of the evidence.

O'Leary writes:
Let’s start at the top, with cosmology. Some say there is a crisis in cosmology; others say there are merely challenges. Decades of accumulated evidence have not produced the universe that metaphysical naturalism expects and needs. The Big Bang has not given way to a theory with fewer theistic implications. There is a great deal of evidence for fine-tuning of this universe; worse, the evidence for alternatives is fanciful or merely ridiculous. Put charitably, it would not even be considered evidence outside of current science.

One response has simply been to develop ever more fanciful theories. Peter Woit, a Columbia University mathematician, is an atheist critic of fashionable but unsupported ideas like string theory (Not Even Wrong, 2007) and the multiverse that it supports. Recently, Woit dubbed 2016 the worst year ever for “fake physics” (as in “fake news“). As he told Dennis Horgan recently at Scientific American, he is referring to “misleading, overhyped stories about fundamental physics promoting empty or unsuccessful theoretical ideas, with a clickbait headline.”

Fake physics (he links to a number of examples at his blog) presents cosmology essentially as an art form. It uses the trappings of science as mere decor (the universe is a computer simulation, the multiverse means that physics cannot predict anything…). Conflicts with reality call for a revolution in our understanding of physics rather than emptying the wastebasket.

....The need to defend the multiverse without evidence has led to a growing discomfort with traditional decision-making tools of science, for example, falsifiability and Occam’s razor. And metaphysical naturalism, not traditional religion, is sponsoring this war on reality.... Can science survive the idea that nature is all there is? The initial results are troubling. Where evidence can be ignored, theory needs only a tangential relationship to the methods and tools of science.
In other words, what matters is that metaphysical naturalism be propped up. If that means sacrificing traditional methods and principles in physics, well, then so much the worse for those methods and principles. One reason for the hostility of many scientists toward the theory of intelligent design is not that it is an assault on science (it isn't); it's that it is an assault on metaphysical naturalism. It's the threat to their metaphysics, not to their science, that has many scientists outraged by the inroads made by intelligent design theorists:
What if a theory, such as intelligent design, challenges metaphysical naturalism? It will certainly stand out. And it will stand out because it is a threat to all other theories in the entire system. Merely contradictory or incoherent theories clashing against each other are not a threat in any similar way; there are just so many more of them waiting up the spout.

Could intelligent design theory offer insights? Yes, but they come at a cost. We must first acknowledge that metaphysical naturalism is death for science. Metaphysical naturalists are currently putting the science claims that are failing them beyond the reach of disconfirmation by evidence and casting doubt on our ability to understand evidence anyway.

ID is first and foremost a demand that evidence matter, underwritten by a conviction that reason-based thinking is not an illusion. That means, of course, accepting fine-tuning as a fact like any other, not to be explained away by equating vivid speculations about alternative universes with observable facts. Second, ID theorists insist that the information content of our universe and life forms is the missing factor in our attempt to understand our world. Understanding the relationship between information on the one hand and matter and energy on the other is an essential next discovery. That’s work, not elegant essays.
O'Leary concludes with this:
We will get there eventually. But perhaps not in this culture; perhaps in a later one. Science can throw so many resources into protecting metaphysical naturalism that it begins to decline. Periods of great discovery are often followed by centuries of doldrums. These declines are usually based on philosophical declines. The prevalence of, for example, fake physics, shows that we are in the midst of just such a philosophical decline. It’s a stark choice for our day.
Modern science germinated in the fertile soil of a theistic worldview. Most of the founders of the modern scientific era were theists. They believed that the universe was rational because it was created by a rational God and that because it was rational it could be elucidated by logical, empirical analysis. They also believed that the universe was a suitable subject of study, that there was no sacrilege in examining how it worked because the universe was not itself divine nor "enchanted."

By thinking God's thoughts after Him, these great minds assumed, it was possible to unravel the mysteries of life and the cosmos, but contemporary scientists, having abandoned the metaphysical foundation of the scientific enterprise, have been running on fumes for a century or more and have turned what was traditionally the pursuit of truth into a desperate attempt to prop up their naturalism.

This probably won't end well for science.

Saturday, May 13, 2017

Consciousness and Nonsense

Pascal-Emmanuel Gobry, in a column at The Week, observes that arguments surrounding human consciousness comprise one of the most animated debates in contemporary philosophy. He goes on to note that one reason why consciousness is so vexing to academic philosophers is that "a great many of them are atheists, and the reality of subjective consciousness frustrates an extremist but widely held version of atheistic metaphysics called eliminative materialism."

He writes:
This form of metaphysics takes the position that the only things that exist are matter and mindless physical processes. But in a world of pure matter, how could you have subjective, conscious beings like us?

To someone schooled in the great historical philosophical traditions — which have been largely dismissed following the adoption of post-modernism in the academy — this debate is immensely frustrating. In fact, much of the ongoing conversation about consciousness is self-evidently absurd.

"The scientific and philosophical consensus is that there is no non-physical soul or ego, or at least no evidence for that," writes philosopher David Chalmers. The New York Times backed him up, calling this a "succinct" summation of the status quo. Except that it's not.

First of all, there can be no scientific consensus or evidence about non-physical realities, because science is only concerned with physical realities. As for the "philosophical consensus," well, anyone who knows anything about philosophy knows that there has never been such a thing and never will be. And even if there were, it wouldn't mean anything, since philosophy is not a science; in science, an expert consensus does represent the state of the art of knowledge on a particular issue. In philosophy, it merely represents a fad.
The problem for materialist philosophers is that there's no plausible physical explanation for consciousness. For example, sensations like pain are caused by physical processes in the nervous system and brain, but the sensation itself is not physical. It's not something that can be observed or measured by anyone other than the person experiencing it. How do atomic particles like electrons zipping along neurons produce the sensation of pain or sound or color? What is the bridge between the physical stimulus and the non-physical sensation? No one knows.

Some materialists claim to solve the problem by asserting that the sensations just are the electrochemical stimuli, but not many philosophers are willing to agree that their experience of pain is nothing more than the firing of specific nerve fibers. Surely the agonizing sensation of pain is more than just atoms whizzing about. The sensation of sweetness is something other than a chemical reaction in the brain. The sound of middle C is something other than the vibrations which elicit it.

The 17th-century mathematician and philosopher Gottfried Leibniz argued that if an observer were miniaturized so that he could be inserted into a patient's brain he wouldn't see the brain light up red when the patient looked at a red object. We know today that all he'd see would be the flux of molecules coalescing and coming apart, so where does the red come from?

Gobry continues with a look at the work of philosopher Daniel Dennett:
Another argument on consciousness that enjoys a bit of consensus, especially lately, is that consciousness is an illusion. Our brain constructs models of the world around us and then tricks itself into believing that this is an expression of the world. The foremost proponent of this view is the philosopher Daniel Dennett.

But again, this view is literally nonsense. The concept of an illusion presupposes that there is a subjective consciousness experiencing the illusion.
There's more from Gobry at the link. He's right that Dennett's view seems nonsensical. If consciousness is an illusion then the sensations we have of pain, sound, fragrance, and so on are also illusions. So, too, are our ideas and thoughts. If this is so, then almost our entire experience of the world is an illusion, including Dennett's idea that consciousness is an illusion. What's the point, Dennett might be asked, of writing a book full of ideas which, if true, are themselves illusions? Isn't the very act of trying to persuade someone of the truth of one's illusions a rather vain exercise?

This is a good example, unfortunately, of the silliness into which very intelligent people lapse when they're determined to efface any vestige of the "supernatural" from their metaphysics.

Friday, May 12, 2017

Comey Deserved It

Last October, right before the election, FBI Director James Comey detonated a nuclear chain reaction of caterwauling among Democrats when he announced that his agency was reopening the investigation into Hillary Clinton's handling of classified documents. The Democrats were understandably livid, seeing this as an announcement that could only hurt Ms Clinton's chances in the upcoming election. They called Comey everything from incompetent to dishonest and would have been delighted had President Obama fired him at the time.

Well, President Obama didn't, but President Trump did, and now many of the same people who were outraged at Comey, and who wanted to see him cashiered, are outraged that Mr. Trump did the very deed they wanted done in the first place. It seems that it's not Comey's dismissal that has them in a swivet but rather the fact that it was the nefarious Donald Trump who dismissed him that they find so galling.

Someone on television the other night observed that if these people didn't have double standards they'd have no standards at all.

Be that as it may, none of the reasons Trump's critics adduce to explain their displeasure with his move makes much sense. Removing Comey doesn't stop the investigation into Mr. Trump's alleged "collusion" with Russia, nor was it an unconstitutional power play. Indeed, President Clinton fired his FBI Director, and no one thought he exceeded his authority in so doing.

But all the criticism aside, lost in the media Sturm und Drang is the fact that, as Ben Domenech, publisher of The Federalist, writes, Comey deserved to be fired:
There is a simple fact that makes analysis of the firing of FBI Director James Comey difficult: he deserved to be fired. At any point over the past nine months, prominent members of both parties have contended that Comey had to go. It is far easier to advance a convincing argument that Comey’s behavior over that time represented the wrong course for the FBI Director to take in every single instance, from his decision to hold his press conference, his decision not to recommend indictment, his decision to publicly continue to talk both on and off the record about these matters, his decision to publicly reopen the case in the manner he did, his decision to rely upon a laughable dossier constructed by the President’s political opponents, and his continued decisions regarding what he says in public and private, and what he implies about current investigations. The overall appearance he creates as the head of the FBI has seen an utter collapse in that time from that of a respected independent career official to someone who is viewed fundamentally as a political actor who cares more about his personal image than the department he leads.

At every juncture, Comey might have been better off adopting George Costanza’s approach: just do the opposite, and see what happens.

So here is the problem: James Comey deserved to be fired. But the timing of his firing lends itself to questions about the Russia investigation and conspiracy theories that threaten to send talking heads rocketing into the atmosphere like a thousand wide-eyed Yuri Gagarins. Talk of a coup or a constitutional crisis or comparisons to Richard Nixon’s Saturday Night Massacre overwhelmed the airwaves yesterday, as did utterly unjustified claims from the likes of Jeffrey Toobin that Comey was fired because he somehow had the goods on the President and that the White House intends to replace him with a stooge who will shut down existing investigations into campaign associates.
Domenech elaborates on this last point:
The New York Times editorial page claimed: “Mr. Comey was fired because he was leading an active investigation that could bring down a president.” That is a very bold claim – no such claim appears in The Wall Street Journal or USA Today editorials, who view the dismissal as deserved. The comparisons to a despot rolled in, while the whiplash from the announcement had its best representation in the crowd at the taping of Stephen Colbert, which erupted in hoots and applause at the news of the firing, only to be chided for wrongthink. No, see, you have been sitting here in the studio and not watching CNN, so you do not know this is wrong now.
Colbert's audience apparently thought Comey's firing was a good thing until Colbert managed to teach them the proper opinions to hold on the matter.
The rest of Domenech's column fleshes out his argument that Comey was the wrong man for the job he was chosen to do. As one who only observed Comey's behavior from afar and who was willing to give him the benefit of the doubt - even when he made the incredibly puzzling announcement last summer that, despite Ms. Clinton's obviously felonious mishandling of classified information, the FBI would not recommend an indictment because she didn't mean any harm - I think Domenech is correct. Comey's behavior has been both befuddling and bizarre, and has tarnished the reputation of the FBI for integrity and competence. It was past time for him to move on to the next phase of his life.

Thursday, May 11, 2017

Treat Everyone Equally

Historically, the civil rights movement was largely a struggle to gain equal opportunity for minorities. All that was being asked for was a chance to show that African Americans could compete with everyone else if given the chance. Eventually, however, the movement morphed into an effort to grant preferential treatment to minorities. One contemporary example of this is the demand that university campuses establish "sanctuaries" for black students from which whites would be excluded. Another much more serious example is the directive issued by the Obama administration that required public schools to tolerate misbehavior from black students that would never be accepted from other students.

Max Eden elaborates on this in an article at The Federalist:
Activists, advocates, and academics had been ringing the alarm over the racial disparity in suspension rates. And it is certainly troubling that African-American students are suspended at three times the rate of white students.

A sober mind might assume that [these statistics] might largely reflect tragic realities in our society. As Michael Petrilli, president of the Thomas B. Fordham Institute, has argued, “it cannot surprise us if minority students today misbehave at ‘disproportionate’ rates. African American and Latino children in America are much more likely to face challenges,” such as living in poverty, growing up in a single-parent family, or living in a dangerous neighborhood, that puts “them ‘at risk’ for antisocial behavior.”

But Obama Education Secretary Arne Duncan declared flatly that this is “not caused by differences in children,” but rather “it is adult behavior that needs to change.” Now, implicit bias likely accounts for some share of the disparity. But if you assume that it accounts for all of it, then the solution isn’t to support teachers with better training or professional development. The only solution is to prevent teachers from doing what they’ve been doing. And that’s exactly what the Obama Department of Education did.

Duncan issued “Dear Colleague” guidance in January 2014 telling districts they could be subject to federal investigation for unlawful discrimination if their discipline policy “is neutral on its face—meaning that the policy itself does not mention race—and is administered in an evenhanded manner but has a disparate impact, i.e., a disproportionate and unjustified effect on students of a particular race.” In other words: “We may decide to come after you, even if your policy is entirely fair, if we see differences on a spreadsheet.”

Over the past five years, 27 states have revised their school discipline laws and 53 districts serving 6.35 million students implemented discipline reforms. From 2011-12 to 2013-14, suspensions dropped by nearly 20 percent nationwide. The Obama letter didn’t start this fire, but it certainly fanned the flames. As the Fordham Institute’s Chester Finn noted, the intent of the letter was “to scare the bejesus out of U.S. educators when it comes to disciplining minority students.”
The Obama Education department's directive not only mandated unequal treatment based on race; it also had several predictable negative consequences: 1) It encouraged worse behavior among minority kids, 2) it sent minority students the message that they can't be expected to meet the standards other kids are expected to meet, 3) it made it impossible to discipline white and Asian kids who would reasonably ask why they're being punished for something that black kids are allowed to do, and 4) it went a long way toward fostering increased bitterness and resentment among whites toward blacks.

Eden's piece gives some data:
Judging by press accounts, the results are rather horrific. After the federal government forced Oklahoma City to change its discipline policies, one teacher reported she was “told that referrals would not require suspension unless there was blood.” In Buffalo, a teacher who got kicked in the head by a student said: “We have fights here almost every day…. The kids walk around and say, ‘We can’t get suspended—we don’t care what you say.’” In St. Paul, the county attorney declared that the threefold increase in assaults on teachers constituted a “public health crisis.”

Teacher surveys also indicate these policies are backfiring. Significant majorities of teachers in Oklahoma City, Denver, Tampa, Santa Ana, Jackson, and Baton Rouge report that the discipline reforms either weren’t working or were escalating school violence.

Judging by student surveys, schools have become less safe as these policies went into effect. In Los Angeles, the percent of students who said they felt safe in their schools plummeted from 72 percent to 60 percent after the district banned suspension for non-violent offenses. In Chicago, researchers found a significant deterioration in school order, according to students and teachers.

I dug into school climate surveys in New York City, and found that as Mayor Bill de Blasio’s school discipline reductions were implemented school climate plummeted, especially in schools comprised of 90 percent or higher minority students: 58 percent reported a deterioration in student respect (compared to 19 percent that saw an improvement); 50 percent reported an increase in physical fighting (compared to 14 percent that saw an improvement); and drug use and gang activity increased in about four times as many schools as decreased.
Fortunately, the occasion for Eden's column is the executive order issued by President Trump last week which gives Education Secretary Betsy DeVos the authority to rescind Mr. Duncan's ill-conceived "guidance." For the sake of kids both black and white, she should forthwith make it clear that all students will be held to the same standards of conduct regardless of their race. The idea that black kids can't be expected to conduct themselves appropriately in school is nothing more than racial bigotry masquerading as liberal compassion. There may be no better first step to improving race relations in this country than holding everyone to the same behavioral expectations.

Wednesday, May 10, 2017

Supreme Incoherence

Here's a puzzler: The Supreme Court, which ruled in Obergefell (2015) that gay marriage was henceforth legal in the United States, declined last January to hear a challenge by reality TV polygamist Kody Brown and his wives to Utah's law against polygamy. It'd be interesting to read the Court's rationale for denying certiorari to the plaintiffs in the case, given the logic of legalizing same-sex marriage.

Kody Brown and His Wives

I've argued on a number of occasions on Viewpoint that extending the right to marry to gays cuts away the ground for laws against polygamy, polyamory, or almost any form of family structure one can think of. Here's why:

Marriage has for two thousand years or more been considered the union of one man and one woman, but enlightened folk in the last couple of years have decided that this traditional understanding is "oppressive" and the Supreme Court agreed. Justice Anthony Kennedy opined that opposition to gay marriage was rooted in an irrational animus against gays, and the Court took it upon itself to rule that the gender of the individuals seeking holy matrimony no longer matters.

Very well, but if the gender no longer matters, how can the Court now decide that the number of individuals still matters? If marriage can now unite one man and another or one woman and another, why can it not unite one man and several women, or indeed any combination of people who wish to enter into the relationship? What logical grounds does the Court have for prohibiting such unions?

Perhaps there is a good reason for it, but if so it eludes me. I suspect that the Court realizes the muddle they've created and declined to hear the Utah case because they wanted to avoid the embarrassment of having to acknowledge that their decision in Obergefell was not based on the Constitution but was instead a purely arbitrary capitulation to sociological fashion. If they chose to hear Brown's case and strove to be consistent, they'd have to strike down Utah's statute and any other statutes which ban any variety of "marriage" which involves consenting adults.

Polygamy is not yet fashionable, but when it becomes so, and it probably will, the Court will doubtless find that it no longer has any logical reason to prohibit it, given the precedent set by Obergefell. Perhaps future generations will look back at Obergefell not as the end of an oppressive practice against gays but rather as the beginning of the end of marriage.

Tuesday, May 9, 2017

Intolerance Left and Right

An article at Politico discusses research that shows that intolerance is a human trait that neither conservatives nor liberals have a monopoly on. The research contradicts the assumptions held by both groups that the other is the more intolerant:
So who’s right? Are conservatives more prejudiced than liberals, or vice versa? Research over the years has shown that in industrialized nations, social conservatives and religious fundamentalists possess psychological traits, such as the valuing of conformity and the desire for certainty, that tend to predispose people toward prejudice. Meanwhile, liberals and the nonreligious tend to be more open to new experiences, a trait associated with lower prejudice. So one might expect that, whatever each group’s own ideology, conservatives and Christians should be inherently more discriminatory on the whole.

But more recent psychological research, some of it presented in January at the annual meeting of the Society of Personality and Social Psychology (SPSP), shows that it’s not so simple. These findings confirm that conservatives, liberals, the religious and the nonreligious are each prejudiced against those with opposing views. But surprisingly, each group is about equally prejudiced. While liberals might like to think of themselves as more open-minded, they are no more tolerant of people unlike them than their conservative counterparts are.
There's much more on this at the link, and much more about liberal and conservative attitudes that you might find surprising. One minor quibble I had with the article was that I didn't see anything that indicated how the researchers or the article's writers defined either prejudice or intolerance. For example, it's not made particularly clear whether intolerance is defined as simply disagreeing with the other side or actually trying to shut the other side up.

If it's the former then "prejudice" is unexceptional and anodyne. If, however, it's the latter then it's relevant to note that two places where prejudice today manifests itself as sheer intolerance of heterodox opinions are the media and university campuses. It may be that were these institutions dominated by conservatives they would be just as intolerant and narrow-minded as they are under liberals, but we have no way of knowing.

We do know, though, that in the contemporary world these institutions are dominated largely by liberals who flatter themselves to think that they are paragons of tolerance when in fact they're often as eager to punish deviations from liberal orthodoxy as was any medieval inquisitor. Indeed, totalitarian thought-control has always been, at least since the onset of modernity, a temptation and phenomenon of the left.