Saturday, April 30, 2016

Why Tyrants Ban the Bible

Eric Metaxas has a column at USA Today in which he suggests some answers to a couple of interesting questions: Why do tyrants almost always ban the Bible, and why do so many secular folks fear it? Whether one believes that the Bible is the literal word of God or is convinced that it's just a compilation of the literary and historical musings of people who lived in a long-dead civilization, the questions should have resonance; in fact, they should have special piquancy for those who hold the latter view of the Bible. Why would a book of ancient fables and superstitions be feared by those who seek to exercise mind-control over the people? Why not just treat it like we treat Aesop's Fables?

Anyway, here are some excerpts from what Metaxas says:
Every single year the Bible is the world’s best-selling book. In fact, it’s the number one best-selling book in history. But recently it made another, less-coveted list: the American Library Association’s “top 10 most-challenged books of 2015.” This means the Bible is among the most frequently requested to be removed from public libraries.

But what’s so threatening about it? Why could owning one in Stalin’s Russia get you sent to the Gulag, and why is owning one today in North Korea punishable by death? What makes it scarier to some people than anything by Stephen King?

We could start with the radical notion that all human beings are created by God in His image, and are equal in His eyes. This means every human being should be accorded equal dignity and respect. If the wrong people read that, trouble will be sure to follow. And some real troublemakers have read it.

One of them was George Whitefield, who discovered the Bible as a teenager and began preaching the ideas in it all across England. Then he crossed the Atlantic and preached it up and down the thirteen colonies until 80 percent of Americans had heard him in person. They came to see that all authority comes from God, not from any King, and saw it was their right and duty to resist being governed by a tyrant, which led to something we call the American Revolution.

Another historical troublemaker was the British Parliamentarian William Wilberforce. When he read the Bible, he saw that the African slave trade — which was a great boon to the British economy — was nonetheless evil. He spent decades trying to stop it. Slave traders threatened to have him killed, but in 1807, he won his battle and the slave trade was abolished throughout the British Empire. In 1833, slavery itself was abolished too.

In the 20th century, an Indian lawyer named Mohandas Gandhi picked up some ideas from the Bible about non-violent resistance that influenced his views as he led the Indian people to independence. And who could deny the Bible’s impact on the Rev. Martin Luther King, Jr., who said the Bible led him to choose love and peaceful protest over hatred and violence? He cited the Sermon on the Mount as his inspiration for the Civil Rights movement, and his concept of the "creative suffering," endured by activists who withstood persecution and police brutality, came from his knowledge of Jesus’ trials and tribulations.
It could be added that a book which teaches that no earthly authority is ultimate, that men must obey God's law when it conflicts with man's law, and that tyrants who abuse their power (which they all do) will answer for their evil is not going to find favor with dictators.

But why is it often banned from public libraries in countries which ostensibly have freedom of speech? Perhaps one reason is that the Bible defies the secularist orthodoxy that "the cosmos is all there is, all there ever was, and all there ever will be" to quote Carl Sagan. Any book that says otherwise is simply not to be tolerated, even by those who claim to make a virtue of tolerance. These folks may not be tyrants of the sort who rule North Korea, but they share some aspects of the tyrannical spirit all the same.

Friday, April 29, 2016

The Dark Ages

A lot of high school and even college students are taught that the historical period roughly from the fall of Rome to the 15th century was a time of intellectual stagnation with little or no scientific or technological progress. The ignorance that supposedly descended over Europe during this epoch has caused it to be called the "Dark Ages," a pejorative assigned to the Middle Ages by 18th-century historians who were hostile to the Church and eager to deprecate the period during which it wielded substantial political power.

Lately, however, historians have challenged the view that this epoch was an age of unenlightened ignorance. Rodney Stark has written in several of his books (particularly his How the West Won) of the numerous discoveries and advancements made during the "dark ages" and concludes that they weren't "dark" at all. The notion that they were, he argues, is an ahistorical myth. Indeed, it was during this allegedly benighted era that Europe made the great technological and philosophical leaps that put it well in advance of the rest of the world.

For example, agricultural technology soared during this period. Advances in the design of the plow, the harnessing of horses and oxen, horseshoes, crop rotation, and water and wind mills all made it possible for the average person to be well-fed for the first time in history. Transportation also improved, enabling people and goods to move more freely to markets and elsewhere. Carts, for example, were built with swivel axles, ships were made more capacious and more stable, and horses were bred to serve as draught animals.

Military technology also made advances. The stirrup, pommel saddle, longbow, crossbow, armor, and chain mail eventually made medieval Europeans almost invincible against non-European foes.

Similar stories could be told concerning science, philosophy, music, and art. Thus Stark's view that the medieval era was a time of cultural richness is gaining traction among contemporary historians, who find the evidence for this interpretation of the period too compelling to ignore.

This short video featuring Anthony Esolen provides a nice summary:

Thursday, April 28, 2016

Scylla and Charybdis

One of the mysteries of our current political drama is why so many people who would, one would think, be hostile to Donald Trump and Hillary Clinton nevertheless support them. Trump, for his part, has received a lot of love from people who identify themselves as Christian or conservative or both, yet there's little in Trump's personal history or his current deportment that would engender confidence that he is sympathetic to the concerns of either group, and an awful lot in his history and conduct that indicates the opposite.

Even so, Trump is very popular, and one reason why may be found in this article from the Washington Examiner:
U.S. Immigration and Customs Enforcement has revealed that 124 illegal immigrant criminals released from jail by the Obama administration since 2010 have been subsequently charged with murder.

"The criminal aliens released by ICE in these years — who had already been convicted of thousands of crimes — are responsible for a significant crime spree in American communities, including 124 new homicides. Inexplicably, ICE is choosing to release some criminal aliens multiple times," said the report written by CIS's respected director of policy studies, Jessica M. Vaughan.

What's more, her report said that in 2014, ICE released 30,558 criminal aliens who had been convicted of 92,347 crimes. Only 3 percent have been deported.

Her analysis is the latest shocking review of Obama's open-border immigration policy. And despite the high number of illegal immigrants charged with murder, the list doesn't include those released by over 300 so-called "sanctuary cities" and those ICE declined to take into custody.

Vaughan added that "ICE reported that there are 156 criminal aliens who were released at least twice by ICE since 2013. Between them, these criminals had 1,776 convictions before their first release in 2013, with burglary, larceny, and drug possession listed most frequently."
In a way, Donald Trump is a product of President Obama's open borders immigration policy. Trump has promised more forcefully than any other candidate, certainly more forcefully than either of the Democrat contestants, that he will put a stop to the flood of illegal aliens and criminals pouring into our country. It's that promise, perhaps - along with the fact that his supporters believe that, unlike the "establishment" Republicans who've been running the show for the last eight years and whom the rank and file regard as a bunch of milquetoasts, Trump will actually fight - which has won a lot of people to his candidacy despite his lamentable emotional immaturity and sundry other shortcomings.

For others, contemplating the looming choice between Mr. Trump and Hillary Clinton - who has made herself rich peddling access to the wealthy, both domestic and foreign, who wish to influence policy and legislation for their own benefit, and who behaved at best recklessly, and certainly incompetently, as Secretary of State - Homer's Odysseus comes to mind. The hero Odysseus was forced by circumstances to navigate his ship between the monster Scylla and the whirlpool Charybdis, a perilous and terrifying task whose political counterpart confronts the American voter this November. Americans will likely find themselves having to choose between the Scylla of Hillary and the Charybdis of Trump, who together make up perhaps the most disagreeable pair of presidential candidates in modern times.


Scylla and Charybdis

Wednesday, April 27, 2016

Lost in the Multiverse

Physicist Adam Frank is impressed, as most scientists are, with the degree of fine-tuning scientists are finding in the cosmos. He writes:
As cosmologists poked around Big Bang theory on ever-finer levels of detail, it soon became clear that getting this universe, the one we happily inhabit, seemed to be more and more unlikely. In his article, Scharf gives us the famous example of carbon-12 and its special resonances. If this minor detail of nuclear physics were just a wee bit different, our existence would never be possible. It’s as if the structure of the carbon atom was fine-tuned to allow life.
But this issue of fine-tuning goes way beyond carbon nuclei. It's ubiquitous in cosmology.
Change almost anything associated with the fundamental laws of physics by one part in a zillion and you end up with a sterile universe where life could never have formed. Not only that, but make tiny changes in even the initial conditions of the Big Bang and you end up with a sterile universe. Cosmologically speaking, it’s like we won every lottery ever held. From that vantage point we are special — crazy special.
Indeed, the figure of one part in a zillion hardly begins to capture the incomprehensible precision with which these cosmic constants and forces are set, but lest one conclude that perhaps it's all purposefully engineered, Frank quickly waves the reader away from that unthinkable heresy:
Fine-tuning sticks in the craw of most physicists, and rightfully so. It’s that old Copernican principle again. What set the laws and the initial conditions for the universe to be “just so,” just so we could be here? It smells too much like intelligent design. The whole point of science has been to find natural, rational reasons for why the world looks like it does. “Because a miracle happened,” just doesn’t cut it.
This is a bit too flippant. Intelligent design doesn't say "a miracle happened" as though that were all that's needed to account for our world. ID says simply that natural processes are inadequate by themselves to explain what scientists are finding in their equations. Even so, it's ironic that every naturalistic theory of cosmogenesis does say that the origin of the universe was miraculous, if we define a miracle as an extraordinarily improbable event that does not conform to the known laws of physics.

In any case, how do scientists who wish to avoid the idea of purposeful design manage to do so? Well, they conjure a near infinite number of universes, the multiverse, of which ours is just one:
In response to the dilemma of fine-tuning, some cosmologists turned to the multiverse. Various theories cosmologists and physicists were already pursuing — ideas like inflation and string theory — seemed to point to multiple universes.
Actually, these theories allow for the existence of other universes; they don't require them. But be that as it may, the advantage of positing a multiplicity of different worlds is that the more worlds there are, the more likely it becomes that even a very improbable world exists, just as the more hands of cards you deal, the more likely it is that you'll eventually deal a royal flush.
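
To put a number on the card-dealing analogy, here's a minimal sketch (my arithmetic, not Frank's): if a single deal produces a royal flush with probability p, the chance of at least one royal flush in n independent deals is 1 - (1 - p)^n, which climbs toward certainty as n grows.

    # Chance of at least one "hit" in n independent trials: 1 - (1 - p)**n.
    # p is the royal flush probability, about 1 in 649,740 five-card deals.
    p = 1 / 649_740

    for n in (10_000, 1_000_000, 100_000_000):
        print(f"{n:>11,} deals -> P(at least one royal flush) = {1 - (1 - p) ** n:.4f}")

Multiply the deals and even a vanishingly rare hand becomes expected; multiply the universes and even a universe as improbable as ours becomes expected. That is the multiverse's whole appeal. Frank, though, issues a caveat: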
There is, however, a small problem. Well, maybe it’s not a small problem, because the problem is really a very big bet these cosmologists are taking. The multiverse is a wildly extreme extrapolation of what constitutes reality. Adding an almost infinite number of possible universes to your theory of reality is no small move.

Even more important, as of yet there is not one single, itty-bitty smackeral of evidence that even one other universe exists (emphasis mine)....

Finding evidence of a multiverse would, of course, represent one of the greatest triumphs of science in history. It is a very cool idea and is worth pursuing. In the meantime, however, we need to be mindful of the metaphysics it brings with it. For that reason, the heavy investment in the multiverse may be over-enthusiastic.

The multiverse meme seems to be everywhere these days, and one question to ask is how long can the idea be supported without data (emphasis mine). Recall that relativity was confirmed after just a few years. The first evidence for the expanding universe, as predicted by general relativity, also came just a few years after theorists proposed it. String theory [upon which the multiverse idea is based], in contrast, has been around for 30 years now, and has no physical evidence to support it.
I'm surprised Frank doesn't mention the irony in this. Scientists feel impelled to shun ID because, they aver, it's not scientific to posit intelligences for which there's no physical evidence (set aside the fact that the existence of a finely-tuned universe is itself pretty compelling evidence). Yet in its stead they embrace a theory, the multiverse, for which, as Frank readily admits, there's no physical evidence either, and they somehow consider this more reasonable than embracing ID.

When you're determined to escape the conclusion that the universe is intentionally engineered, you'll embrace any logic and any theory, no matter how bizarre, that allows you to maintain the pretense of having refuted the offending view.

Pretty amusing.

Tuesday, April 26, 2016

Sauce for the Goose

Larry O'Connor at Hot Air questions how President Obama can reconcile his pique at Israeli Prime Minister Benjamin Netanyahu, who came to the U.S. to urge Congress to reject Mr. Obama's deal with Iran, with his own decision to go to London to urge the Brits not to leave the European Union. Even more, he threatened them with diminished prospects of a trade deal if they did leave. Why is it wrong, in President Obama's eyes, for Netanyahu to interject himself in the affairs of this country, but not wrong for Mr. Obama to interject himself in the affairs of Great Britain?

Here's O'Connor:
During the height of the debate over Obama’s disastrous nuclear capitulation to Iran, Israeli Prime Minister Benjamin Netanyahu accepted an invitation to address the US Congress to make a strong case against the deal that could hand nuclear weapons to a terrorist state pledged to annihilating his nation.

Do you remember the White House’s petulant reaction to the speech?

On CNN, Fareed Zakaria asked the president.... "Is it appropriate for a foreign head of government to inject himself into an American debate?”

Obama then employed his patented, passive-aggressive snark. “I’ll let you ask Prime Minister Netanyahu that question, if he grants you an interview,” he replied with a suggestive smirk. An arrogant response meant to convey a snide “I know and you know the answer to that question, but I’ll snidely demur so I can give the appearance of staying above the fray. Oh, and I’ll throw in a little jab suggesting Netanyahu is scared to sit with you and take a question like that even though I clearly am not.”

And then he couldn’t help himself. He said, “I do not recall a similar example.”

By providing that last answer, he validated Zakaria’s question and the premise it was built upon. That Netanyahu had “injected himself, forcefully,” into an internal American affair and it was unprecedented.

He could have said, “Listen Fareed, the US has a long and important relationship with Israel and clearly this Iran deal is going to affect the Israeli people and all the other nations in the region. I welcome the Prime Minister’s input and he has every right to accept Congress’ invitation.”

But he didn’t.
No, he didn't. He in fact took great umbrage at Mr. Netanyahu's impertinence in addressing the U.S. Congress. So, did he apply the same standard to himself when he visited England? Not at all:
No matter what side of the Brexit (British exit from the EU) question you fall on, there’s no doubt that Obama is “injecting himself forcefully into the debate” over British policy, right? And by doing so, he is showing himself to be a world-class hypocrite.
I don't feel comfortable with O'Connor's use of the word "hypocrite" to describe the President of the United States, but sadly, I can't think of any other word that better describes the conjunction of his petulance toward Netanyahu's conduct and his own conduct in England.

Monday, April 25, 2016

Born Yesterday

A post over at Why Evolution Is True, a site at which scientistic materialists and naturalists tend to gather, recently caught my eye. The poster, who went by the name Grania, wrote this about the ethics of the Bible:
A number of people over the years have pointed out that almost anyone can improve upon the Ten Commandments with minimal effort, the original set of ten ... moral laws by which humans were purportedly to live their lives.

What appears in Exodus is so old that its ethics are more concerned with livestock, possessions and outward symbols of worshiping the right god. It’s not particularly concerned with the well-being of children say, or women or pretty much anyone who wasn’t an adult male Jew camping at the bottom of Mount Sinai.

The point is that through no effort of our own and no failing of theirs, we live in a century where we are moral giants compared to our ancestors. We benefit from their failings and their flaws as much as from what wisdom they collated; and now we can do better without even thinking too hard about it.
There are at least two things wrong with what Grania claims.

First, the ethical imperatives laid down in the Old Testament, and elaborated upon in the New, impose upon us the duties of doing justice and of caring for the widow, the orphan, the weak, the stranger, and the poor. I don't think the last two thousand years have seen any improvement on those commands.

The Israelites were enjoined to love their neighbor as they loved themselves, and much of the rest of the Bible is a commentary on how, in practical terms, one might do that. The claim that we are moral giants compared to those who went before us seems to reflect the naivete of the person "born yesterday." It's very hard for anyone conversant with the last 100 years of world history to think that we today are morally superior to the inhabitants of earlier centuries. If we grant that Western civilization has achieved a measure of moral progress, it is only because it has been marinated for two thousand years in the ethical teaching of Scripture, not because we're somehow better moral examples of the human species.

Second, the people of the Book can say with perfect logical consistency that to fail in the duty to love our neighbor is to do grave moral wrong. It is to violate the will of the Creator who, by virtue of His perfect goodness, power and knowledge, is a supreme moral authority and who holds us accountable for our behavior. The modern atheist, however, lacking any objective moral authority, cannot claim that any moral duties exist nor that there exist any moral wrongs. One can choose to love one's neighbor, of course, but if one chooses instead to harm one's neighbor he's violating no duty nor doing anything that's morally wrong. On an atheistic view there simply are no objective moral duties and thus no moral wrongs because there's no objective moral authority and no real accountability.

On atheism human beings are like the citizens of a town in the wild west with no laws and no sheriff. What's right is whatever the guy with the most guns on his side says is right. The atheist might bristle at this claim. He may strenuously deny it, but he has to piggyback on the moral assumptions of theism in order to do so. Atheism itself offers no grounds for supposing that morality is anything more than a cultural or social convention.

Saturday, April 23, 2016

The Climate Controversy

A colleague passed along this short video featuring an expert in the field of climatology that helps put the debate over climate change into perspective.

What do the scientists working in this field agree upon, and what do they disagree upon? Who is driving the contentious aspects of the debate, and what are their motives? I think you'll find this five-minute presentation both interesting and informative:

Friday, April 22, 2016

Another Point in Favor of Substance Dualism

Neurosurgeon Michael Egnor points out that among the things a material brain cannot accomplish by itself is abstract thought. Since human beings certainly are capable of abstract thinking, Egnor concludes that this is evidence for mind/brain dualism. Why does he say that the material brain is incapable of generating abstract thoughts? He makes his case in a short essay at Evolution News, excerpts from which are here:
Wilder Penfield was a pivotal figure in modern neurosurgery. He was an American-born neurosurgeon at the Montreal Neurological Institute who pioneered surgery for epilepsy. He was an accomplished scientist as well as a clinical surgeon, and made seminal contributions to our knowledge of cortical physiology, brain mapping, and intra-operative study of seizures and brain function under local anesthesia with patients awake who could report experiences during brain stimulation.

His surgical specialty was the mapping of seizure foci in the brain of awake (locally anesthetized) patients, using the patient's experience and response to precise brain stimulation to locate and safely excise discrete regions of the cortex that were causing seizures. Penfield revolutionized neurosurgery (every day in the operating room I use instruments he designed) and he revolutionized our understanding of brain function and its relation to the mind.

Penfield began his career as a materialist, convinced that the mind was wholly a product of the brain. He finished his career as an emphatic dualist.

During surgery, Penfield observed that patients had a variable but limited response to brain stimulation. Sometimes the stimulation would cause a seizure or evoke a sensation, a perception, movement of muscles, a memory, or even a vivid emotion. Yet Penfield noticed that brain stimulation never evoked abstract thought. He wrote:
There is no area of gray matter, as far as my experience goes, in which local epileptic discharge brings to pass what could be called "mind-action"... there is no valid evidence that either epileptic discharge or electrical stimulation can activate the mind... If one stops to consider it, this is an arresting fact. The record of consciousness can be set in motion, complicated though it is, by the electrode or by epileptic discharge. An illusion of interpretation can be produced in the same way. But none of the actions we attribute to the mind has been initiated by electrode stimulation or epileptic discharge. If there were a mechanism in the brain that could do what the mind does, one might expect that the mechanism would betray its presence in a convincing manner by some better evidence of epileptic or electrode activations.[italics mine]
The brain was necessary for abstract thought, normally, but it was not sufficient for it. Abstract thought was something other than merely a process of the brain.

Why don't epilepsy patients have "calculus seizures" or "moral ethics" seizures, in which they involuntarily take second derivatives or contemplate mercy? The answer is obvious -- the brain does not generate abstract thought. The brain is normally necessary for abstract thought, but not sufficient for it.

Thus, the mind, as Penfield understood, can be influenced by matter, but is, in its abstract functions, not generated by matter.
There's more at the link. Egnor's argument boils down to this: if the material brain were sufficient to account for all of our cognitive experience, then stimulation that triggers all sorts of other "mental" activity (sensations, memories, emotions) should at least occasionally trigger abstract thought as well. Since it never does, abstract thinking must arise from something other than the material brain.

This is not proof that there's a mind, of course, but it is certainly consistent with the dualist hypothesis that we are a composite of mind and brain, and it's puzzling on the materialist hypothesis that the material brain is solely responsible for all of our mental experience.

Thursday, April 21, 2016

Just Right

Scientists have cataloged dozens of parameters that have to be just right for life to appear on earth. The improbability of a planet having so many of these properties is so high that some scientists have speculated that life, at least complex life, might exist nowhere else in the universe no matter how many other planets are out there. This is the thesis of such books as Rare Earth by Ward and Brownlee and The Privileged Planet by Gonzalez and Richards.

A few such parameters are:
...a galactic habitable zone, a central star and planetary system having the requisite character, the circumstellar habitable zone, a right sized terrestrial planet, the advantage of a gas giant guardian and large satellite, conditions needed to ensure the planet has a magnetosphere and plate tectonics, the chemistry of the lithosphere, atmosphere, and oceans, the role of "evolutionary pumps" such as massive glaciation and rare bolide impacts, and whatever led to the still mysterious Cambrian explosion of animal phyla. The emergence of intelligent life may have required yet other rare events.
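The logic behind that improbability claim is simple compounding: if the requirements are even roughly independent, their probabilities multiply. Here's a minimal sketch with invented placeholder numbers (mine, purely for illustration; nobody knows the true values):

    # Illustrative only: the probabilities below are invented placeholders.
    # The point is how quickly independent requirements compound.
    requirements = {
        "galactic habitable zone": 0.1,
        "suitable central star": 0.1,
        "circumstellar habitable zone": 0.1,
        "right-sized terrestrial planet": 0.1,
        "gas giant guardian": 0.1,
        "large moon": 0.1,
        "magnetosphere and plate tectonics": 0.1,
    }

    joint = 1.0
    for factor, probability in requirements.items():
        joint *= probability

    print(f"joint probability: {joint:.0e}")  # 1e-07 with these placeholders

With just seven independent one-in-ten requirements, only one planet in ten million qualifies, and the full list of parameters is considerably longer.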
New Scientist has a report on some research that shows yet another cosmic coincidence that allows life to exist on earth. It turns out that, as strange as it may seem at first, the orbit of Saturn has to be almost exactly as it is for life to exist on earth:
Earth's comfortable temperatures may be thanks to Saturn's good behaviour. If the ringed giant's orbit had been slightly different, Earth's orbit could have been wildly elongated, like that of a long-period comet.

Our solar system is a tidy sort of place: planetary orbits here tend to be circular and lie in the same plane, unlike the highly eccentric orbits of many exoplanets. Elke Pilat-Lohinger of the University of Vienna, Austria, was interested in the idea that the combined influence of Jupiter and Saturn – the solar system's heavyweights – could have shaped other planets' orbits. She used computer models to study how changing the orbits of these two giant planets might affect the Earth.

Earth's orbit is so nearly circular that its distance from the sun only varies between 147 and 152 million kilometres, or around 2 per cent about the average. Moving Saturn's orbit just 10 percent closer in would disrupt that by creating a resonance – essentially a periodic tug – that would stretch out the Earth's orbit by tens of millions of kilometres. That would result in the Earth spending part of each year outside the habitable zone, the ring around the sun where temperatures are right for liquid water.

Tilting Saturn's orbit would also stretch out Earth's orbit. According to a simple model that did not include other inner planets, the greater the tilt, the more the elongation increased. Adding Venus and Mars to the model stabilised the orbits of all three planets, but the elongation nonetheless rose as Saturn's orbit got more tilted. Pilat-Lohinger says a 20-degree tilt would bring the innermost part of Earth's orbit closer to the sun than Venus.
In other words, our solar system is like a delicately balanced ecosystem, all the parts of which seem to be important in making earth the sort of place where life can arise and be sustained. The odds of such a system existing elsewhere in the universe would seem to be very small.
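As a quick sanity check on the New Scientist figures (the arithmetic is mine, not the article's), the quoted closest and farthest distances imply an orbit that is very nearly a perfect circle:

    # Eccentricity implied by the distances quoted above.
    perihelion = 147e6  # km, Earth's closest approach to the sun
    aphelion = 152e6    # km, Earth's farthest distance from the sun

    average = (aphelion + perihelion) / 2
    eccentricity = (aphelion - perihelion) / (aphelion + perihelion)

    print(f"average distance: {average / 1e6:.1f} million km")  # 149.5
    print(f"eccentricity: {eccentricity:.4f}")                  # 0.0167
    print(f"variation: +/- {eccentricity:.1%} of the average")  # 1.7%

That plus-or-minus 1.7 percent is the article's "around 2 per cent"; the resonance from a displaced Saturn would stretch that nearly-zero eccentricity enough to swing the earth out of the habitable zone for part of each year.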


It might be mentioned in passing that it's not just Saturn's orbit that makes life possible on earth. Scientists have shown that massive outer planets like Saturn, Jupiter, Uranus, and Neptune act as gravitational vacuum sweepers sucking up a lot of debris that would otherwise invade the inner reaches of the solar system and threaten earth with constant collisions. It really is astonishing how many factors must all be just right for life to exist on this one little planet.

Wednesday, April 20, 2016

Police Shootings

If a visitor from Mars were to listen to media chatter they might think that young, unarmed black men are being mowed down in the streets by racist white cops with alarming regularity. The facts, though, are otherwise, as a Pulitzer Prize-winning Washington Post study reveals.

The WaPo's data is displayed on a chart that reveals a number of facts about police shootings that may come as a surprise to some who check out their findings.

For example: In 2015 there were 990 people shot and killed by police. The overwhelming majority of those killed were armed, and whites outnumbered blacks nearly two to one. Four hundred ninety-four of the dead were white and 258 were black (Hispanics and other races made up the balance). Most of the deceased were brandishing a weapon, but 93 were unarmed, 32 of whom were white and 38 black.

Among these unarmed individuals, 14 of the whites and 15 of the blacks who were killed were attacking, or in some way threatening, the police officer. Of those who were unarmed and not attacking the officer, several were shot accidentally or had brandished a device that was mistaken for a weapon, and so on.

In short, there may be a problem with police sometimes using excessive force, but the idea perpetuated by Black Lives Matter and others that African Americans are a deliberate target of racist cops is simply not borne out by the facts.

Tuesday, April 19, 2016

On the Evolution of Language

Geneticist Michael Denton's new book, somewhat awkwardly titled Evolution: Still a Theory in Crisis, is a remarkable piece of philosophy of biology. In the book Denton critiques the classical Darwinian view that biological structures develop in organisms gradually over long periods of time because they serve an adaptive function. This view is called functionalism and Denton argues persuasively that functionalist explanations are simply inadequate to explain a host of structures like the mammalian forelimb, the shapes of leaves, the structure of flowers, and many others.

One more phenomenon that functionalist explanations cannot account for, Denton argues, is the emergence of language in human beings. He recently wrote a synopsis of his argument for Evolution News and Views, of which the following is a part:
In the early 1960s, in one of the landmark advances in 20th-century science, Noam Chomsky showed that all human languages share a deep invariant structure. Despite their very different "surface" grammars, they all share a deep set of syntactic rules and organizing principles. All have rules limiting sentence length and structure and all exhibit the phenomenon of recursion -- the embedding of one sentence in another. Chomsky has postulated that this deep "universal grammar" is innate and is embedded somewhere in the neuronal circuitry of the human brain in a language organ.

Children learn [human] languages so easily, despite a "poverty of stimulus," because they possess innate knowledge of the deep rules and principles of human language and can select, from all the sentences that come to their minds, only those that conform to a "deep structure" encoded in the brain's circuits.

The challenge this poses to Darwinian evolution is apparent. Take the above-mentioned characteristic that all human languages exhibit: recursion. In the sentence, "The man who was wearing a blue hat which he bought from the girl who sat on the wall was six feet tall," the clauses "who was wearing a blue hat," "which he bought from the girl," and "who sat on the wall" are embedded sentences. Special rules allow human speakers to handle and understand such sentences. And these rules, which govern the nature of recursion, are specific and complex. So how did the computational machinery to handle it evolve? David Premack is skeptical:
I challenge the reader to reconstruct the scenario that would confer selective fitness on recursiveness. Language evolved, it is conjectured, at a time when humans or protohumans were hunting mastodons... Would it be a great advantage for one of our ancestors squatting alongside the embers to be able to remark, "Beware of the short beast whose front hoof Bob cracked when, having forgotten his own spear back at camp, he got in a glancing blow with the dull spear he borrowed from Jack"?

Human language is an embarrassment for evolutionary theory because it is vastly more powerful than one can account for in terms of selective fitness. A semantic language with simple mapping rules, of a kind one might suppose that the chimpanzee would have, appears to confer all the advantages one normally associates with discussions of mastodon hunting or the like. For discussions of that kind, syntactical classes, structure-dependent rules, recursion and the rest, are overly powerful devices, absurdly so.
There is considerable controversy over what structures in the brain restrict all human languages to the same deep structure.... Yet however it is derived during development, there is no doubt that a unique deep structure underlies the languages of all members of our species. It is because of the same underlying deep structure that we can speak the language of the San Bushman or an Australian aborigine, and they in turn can speak English.

The fact that all modern humans, despite their long "evolutionary separation" -- some modern races such as the San of the Kalahari and the Australian aborigines have been separated by perhaps 400,000 years of independent evolution -- can learn each other's languages implies that this deep grammar must have remained unchanged since all modern humans (African and non-African) diverged from their last common African ancestor, at least 200,000 years ago. As Chomsky puts it:
What we call "primitive people"... to all intents and purposes are identical to us. There's no cognitively significant genetic difference anyone can tell. If they happened to be here, they would be one of us, and they would speak English... If we were there, we would speak their languages. So far as anyone knows, there is virtually no detectable genetic difference across the species that is language-related.
[I]t is not only the deep structure of language that has remained invariant across all human races. All races share in equal measure all the other higher intellectual abilities: musical, artistic, and mathematical ability, and capacity for abstract thought. These also, therefore, must have been present in our African common ancestors 200,000 or more years ago, and must have remained unchanged, and for some reason latent, since our common divergence. To suggest that language and our higher mental faculties evolved in parallel to reach these same remarkable ends independently in all the diverse lineages of modern humans over 200,000 years ago would be to propose the most striking instance of parallel evolution in the entire history of life, and would be inexplicable in Darwinian terms.
If I understand Denton aright, these capacities, which all humans share, have been present from the origin of the species but no alleged non-human evolutionary precursor has them. Thus, these capacities, as remarkable as they are, must have evolved with extraordinary rapidity, appearing full-blown in the earliest human beings. That's very hard to explain in terms of traditional Darwinian gradualism and functionalism.
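
As an aside, the self-embedding Premack refers to is easy to exhibit mechanically. Here's a minimal sketch (my illustration, not Denton's or Chomsky's) of a toy recursive rule in which a noun phrase may contain a relative clause that itself contains another noun phrase:

    import random

    # Toy recursive grammar (illustrative only): a noun phrase (NP) may
    # embed a relative clause, which itself contains another NP. This
    # self-embedding is the recursion discussed above.
    NOUNS = ["the man", "the girl", "the hat", "the wall"]
    VERBS = ["saw", "bought", "liked"]

    def noun_phrase(depth):
        np = random.choice(NOUNS)
        if depth > 0:
            # Recursive step: embed "that <NP> <verb>" inside the noun phrase.
            np += " that " + noun_phrase(depth - 1) + " " + random.choice(VERBS)
        return np

    print(noun_phrase(2).capitalize() + " was six feet tall.")
    # e.g. "The man that the girl that the wall saw bought was six feet tall."

Even the two-level embedding this toy rule produces strains comprehension, which is part of Premack's point: the machinery needed to parse such structures is far more powerful than talk of mastodon hunting would ever have required.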

It seems the more we learn the more implausible classical naturalistic Darwinism appears to be.

Monday, April 18, 2016

A Legacy in Three Acts

President Obama is at the point in his administration when the occupant of the White House begins to think hard about the legacy he'll leave to history. Mr. Obama may himself be indifferent to such matters, but if he does care about how historians will judge his tenure it must be giving him heartburn.

Consider just three issues into which his administration has pumped a lot of time, energy and political capital:

Health care (Obamacare) - For the third straight year, insurance companies participating in the Obamacare marketplaces are finding it necessary to raise their premiums, arguing that the system is unsustainable if premiums are kept "affordable." If they can't raise premiums, many of the insurers will leave the system. If they do raise premiums, many healthy people will simply not buy insurance, leaving the risk pool full of the people most likely to use coverage, which makes it more expensive for everyone else. In any case, Obamacare is growing increasingly unstable and will probably be greatly changed after Mr. Obama leaves office.

Mr. Obama's credibility has been seriously tarnished by this. He promised on numerous occasions that his plan would make insurance cheaper for families; a family of four, he insisted, would pay, on average, $2,500 less in premiums. The opposite has been the case. Just as bad as the higher premiums has been the rise in deductibles. Many people with insurance are not only paying more for their coverage, but their deductibles are so high that the insurance they're paying for never gets used. Mr. Obama also promised that everyone would be insured under his plan, but that hasn't happened, either. Millions remain uninsured, and their number is actually expected to rise over the coming years.

Moreover, twelve of the original 23 federally-financed Obamacare co-ops have already collapsed, and eight of the 11 remaining co-ops appear likely to fail this year. All in all, Obamacare does not look like an achievement upon which historians will look with favor.

See the link for a more detailed synopsis of the problems the Affordable Care Act is facing.

Education - Joy Pullman at The Federalist concludes that Common Core, despite costing the taxpayers hundreds of millions of dollars, has been an educational failure. It's a failure a lot of teachers saw coming long ago. Pullman writes:
[T]he centerpiece of Mr. Obama's education policy is dead. The big postmortems will roll out in a year or two, but it’s already clear this education monstrosity is eking out its last gasps.

To recap, Common Core is an organizing scheme that aims to control all of American education, from preschool through college. Its core is a set of testing and curriculum blueprints being used as a lever to get all of what kids learn in every subject and at every age into “alignment” with its centrally planned, academically low-quality, and one-size-fits-all mandates.
In fact, Pullman says, all Common Core succeeded in doing was enriching the consultants who lobbied for its implementation.
Last week the Brookings Institution issued the preliminary autopsy in its annual major report on education. It finds that American children are receiving objectively worse academic instruction because of Common Core, in two major respects: In the increase in nonfiction their teachers are assigning, and in a nationwide decline in students taking algebra in eighth grade.

Further, it finds that Common Core has done nothing to help children learn more overall, which was one of its supporters’ major claims: “there also is no evidence that CCSS has made much of a difference during a six-year period of stagnant NAEP [National Assessment of Educational Progress] scores.” NAEP is the nation’s highest-quality set of large-scale tests, used widely by researchers as benchmarks for American kids’ abilities over time. While younger students have made some small gains on NAEP since it began in 1992, high-school graduates since then have not improved one whit. Even though its supporters promised it would, Common Core isn’t helping.

Researcher Ze’ev Wurman looked at several other indicators of student achievement and found none have improved since Common Core went into effect. In fact, SAT and ACT scores are slightly down. Maybe this spring’s new, Common Core-aligned SAT will start covering for that by making the test easier (as it has every time it did a major test change).
Employment - A third unfortunate aspect of Mr. Obama's legacy is his support for raising the minimum wage. Jed Graham of Investors Business Daily gives us some empirical evidence that forcing employers to pay their employees higher wages simply results in fewer employees:
Despite a nationwide increase in low-paying retail jobs (an increase of 400,000 retail jobs since the end of 2014), three states have experienced a contraction. One, North Dakota, has suffered because low worldwide oil prices have reined in the fracking boom there, but the other two states, Massachusetts and Connecticut, have lost retail jobs for a different, thoroughly predictable, reason.

It’s probably no coincidence that Connecticut and Massachusetts, the first two states to approve hikes in their statewide minimum wage to north of $10 an hour, now stand out because of retail-employment contractions.

Massachusetts has lost 2,200 retail jobs since employment in the sector peaked last July, seasonally adjusted Labor Department data show. Retail employment is down 500 from December 2014, before the first step of the state’s wage hike went into effect. In January, two of the four Sam’s Clubs slated for closure by Wal-Mart were in Massachusetts. While the company didn’t specifically blame the state’s rising minimum wage, closures of other Wal-Mart discount centers or Neighborhood Markets came in high-minimum-wage havens including Oakland, the city of Los Angeles and Los Angeles County. Meanwhile, Wal-Mart scrapped plans for two stores in D.C., where the $10.50 minimum wage will rise by $1 this July.

Massachusetts was also home to two of the 40 department stores targeted for closure by Macy’s in January after a weak holiday season.

Before Massachusetts approved an $11 wage, Connecticut held the title for the highest state wage. The state first hiked its minimum from $8.25 to $8.70 at the start of 2014, then became the first to embrace President Obama’s call for a $10.10 wage. The current minimum wage of $9.60 will hit $10.10 at the start of 2017.

Now retail employment in Connecticut is down 1,400 from a peak of 185,000 first hit in April 2014, and the state has fewer retail jobs than it did 30 months ago.
This seems to be a pattern everywhere that the minimum wage is rising. American Apparel, for example, is laying off 500 workers in California.

This is how the market works and why government interference is so often counter-productive. Government intrusion into the economy is like trying to tune the engine of a race car with a hammer. When government deprives employers of the freedom to pay their workers what the workers are worth to the employer, the employer will simply hire fewer workers. Politicians will preen themselves on having raised the minimum wage while more low-skilled job-seekers despair of finding any work at all.

Adopting policies that put people out of work is a very odd way to help the poor, and it's a terrible way to build a legacy.

Saturday, April 16, 2016

Reductio ad Absurdum

The controversy surrounding open access to whatever restroom or locker room one feels comfortable in may strike you as ridiculous, or it may seem to you to be a matter of basic human rights, but let's set aside our preconceptions for a moment. One test of an idea is to see what happens when it's taken to its logical conclusion. If the implications of an idea can be drawn out without devolving into absurdities, then the idea is perhaps sound. If, though, when taken to its logical endpoint, the idea entangles us in absurdities from which we cannot extricate ourselves, then that's a good indicator that the idea is itself absurd.

So, with that short introduction out of the way, watch this video and ask yourself if there's anything wrong with the interviewer's logic. If there is not, ask yourself if there's anything wrong with the starting point of the students being interviewed. After all, something has to be wrong somewhere:
The tactic the interviewer employs in this video is called, in logic, reductio ad absurdum - a demonstration that a person's initial position, when taken to its logical terminus, reduces to an absurdity. In this case the affirmation that one is whatever one sees oneself as clearly entails that if a 5'9" white male sees himself as a 6'5" Chinese woman then he really is a 6'5" Chinese woman. This is, however, a position one can hold only on pain of being thought delusional.

Friday, April 15, 2016

Pay it Forward

Looking to do something that'll make a difference in people's lives? Why not consider Kiva:
Kiva is a worldwide microfinance organization. The way it works is that people from all around the globe lend money to small businesspersons and others in the third world to help them obtain the capital they need to run their businesses successfully. The borrowers then pay the loan back over time, and the lender can relend the money or withdraw it.

It's a great way to help the poor help themselves and a wonderful opportunity to express your gratitude for all that you have by "paying it forward". You can learn more about how Kiva works at the link.

Thursday, April 14, 2016

Nobody's Right If Everybody's Wrong**

There's an interesting contretemps bubbling at Marquette University over one professor's criticism of another instructor, and it seems in this case it's hard to find anyone who has behaved quite as they should have. Here's the story:
In November 2014 an undergraduate approached philosophy instructor and PhD candidate Cheryl Abbate, after a class on John Rawls’ theory of equal liberty. The student said he objected to her suggestions during the class that same-sex marriage isn’t open for debate and that “everyone agrees on this.”

Unknown to Ms. Abbate, the student recorded the exchange on his cell phone. During the conversation, she told him “there are some opinions that are not appropriate, that are harmful, such as racist opinions, sexist opinions” and if someone in the class was homosexual, “don’t you think that that would be offensive to them if you were to raise your hand and challenge this?”

When the student replied that he has a right to argue his opinion, Ms. Abbate responded that “you can have whatever opinions you want but I can tell you right now, in this class homophobic comments, racist comments and sexist comments will not be tolerated. If you don’t like that you are more than free to drop this class.” The student reported the exchange to Marquette professor John McAdams, who teaches political science. Mr. McAdams also writes a blog called the Marquette Warrior, which often criticizes the Milwaukee school for failing to act in accordance with its Catholic mission.
If the facts are as this Wall Street Journal article reports them, then in my opinion both the instructor and the student behaved improperly, but the instructor's offense was much the worse. She teaches a philosophy class, for heaven's sake. To rule opinions out of bounds, provided they're courteously expressed, is professionally inexcusable. College is a place where students should be exposed to all sorts of viewpoints and be free to express their own. To treat people like fragile snowflakes that melt at the slightest touch of an unpleasant opinion is to do students a serious disservice. If Ms. Abbate really did prohibit certain opinions from being voiced in her class, then she needs to receive some lessons from the administration on the value of the free exchange of ideas in a college setting.

The student (perhaps) also acted improperly by secretly recording what seems to have been a private conversation between himself and Ms. Abbate.

But there's more.
Mr. McAdams wrote on his blog that Ms. Abbate was “using a tactic typical among liberals now. Opinions with which they disagree are not merely wrong, and are not to be argued against on their merits, but are deemed ‘offensive’ and need to be shut up.” His blog went viral, and Ms. Abbate received vicious emails. She has since left Marquette.
Needless to say, the students who sent the vicious emails were acting abominably. This is not the way to react to what Ms. Abbate did. Students could certainly have registered their displeasure without being mean-spirited, rude, or disrespectful. By acting this way they tacitly provide justification for Ms. Abbate's fear that allowing dissenting opinions in class would generate an uncomfortable classroom environment.

Mr. McAdams, too, is not without fault in this matter. He may have been correct in what he said, but throughout the almost fifty years I've been teaching, it has always been my conviction that it's unprofessional for one colleague to criticize another to students. I'm not so naive as to think it isn't done - I know all too well that it is - but it's not only an egregious breach of professional ethics, it's often also very juvenile.

But we're not done.
[N]ow Marquette is going after Mr. McAdams. In December 2014, the school sent him a letter suspending his teaching duties and banning him from campus while it reviewed his “conduct” related to the blog post. “You are to remain off campus during this time, and should you need to come to campus, you are to contact me in writing beforehand to explain the purpose of your visit, to obtain my consent and to make appropriate arrangements for that visit,” Dean Richard Holz wrote.

Marquette President Michael Lovell told the tenured professor that he would be suspended without pay and would not be reinstated unless he admitted his conduct was “reckless” and apologized for the unpleasant emails Ms. Abbate received.
This seems to be a gross overreaction. Mr. McAdams was wrong to stoke hostility toward this instructor - although it's not clear that he did so intentionally - and I have no problem with insisting that he accept responsibility in some fashion, but to deprive him of his livelihood for his offense seems draconian. Moreover, his punishment seems to violate the guidelines set forth in Marquette's Faculty Handbook, which says professors may be terminated at the university’s discretion,
only for “serious instances of illegal, immoral, dishonorable, irresponsible, or incompetent conduct.” The handbook says that “in no case” may just cause for dismissal be interpreted “to impair the full and free enjoyment of legitimate personal or academic freedoms of thought, doctrine, discourse, association, advocacy, or action.”
So, Marquette seems to have a mess on its hands, one in which nobody looks particularly good. What do you think?

** Buffalo Springfield 1967, For What It's Worth:

Wednesday, April 13, 2016

Our Readership

From time to time I like to take a moment to thank the many faithful readers of Viewpoint who visit us from all over the world. Most of our readers are North American, of course, but we have dozens of regular readers in most European countries as well as in countries in South America, Asia, Australia, the Middle East, and Africa.

To give an idea of what VP's readership looks like, Blogger, which provides data on visits to the blogs it hosts, shows these stats for just the past week on Viewpoint:
  • United States - 459
  • France - 27
  • United Kingdom - 21
  • China - 19
  • Spain - 15
  • Germany - 8
  • Russia - 8
  • Burkina Faso - 5
  • Portugal - 5
  • Canada - 3
To see the readership data depicted visually, scroll to the bottom of the page and click on the arrow. The red dots on the globe represent recent hits.

With my brother Bill's help we started this blog in 2004 with the intention of reaching just friends and family, but it has obviously grown beyond that, a growth for which I'm deeply grateful. I appreciate every single one of you and am both flattered and honored that so many of you have written to tell me you read VP regularly. Thanks.

Tuesday, April 12, 2016

The Unreasonable Effectiveness of Mathematics

Physicist Sir James Jeans, contemplating the fact that the universe seems so astonishingly conformable to mathematics, once remarked that God must be a mathematician. It would indeed be a breathtaking coincidence had the mathematical architecture of the cosmos just happened to be the way it is by sheer serendipity.

Here's a lovely video that illustrates just one example of how mathematics seems to lie at the foundation of the universe. The video describes how the geometry of nature so often exhibits what's called the Fibonacci sequence:
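
For readers who want to see the pattern for themselves, here's a minimal sketch (my illustration, not the video's). Each Fibonacci number is the sum of the two before it, and the ratio of successive terms converges on the golden ratio, about 1.618, the proportion behind many of the spirals the video displays:

    # Generate Fibonacci numbers and watch the ratio of successive terms
    # converge on the golden ratio, (1 + sqrt(5)) / 2 = 1.6180...
    a, b = 1, 1
    for _ in range(12):
        print(f"{a:>4}   next/current ratio: {b / a:.6f}")
        a, b = b, a + b

Twelve terms in, the ratio already matches the golden ratio to three decimal places.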
In 1959, the physicist and mathematician Eugene Wigner described the uncanny aptness with which mathematical equations describe the physical universe as "the unreasonable effectiveness of mathematics."

Mathphobic students may wince at a statement like this, but it gets worse.

Physicist Max Tegmark has more recently claimed that the universe is not only described by mathematics, but is, in fact, mathematics itself.

To suggest that everything ultimately reduces to a mathematical expression is another way of saying that the universe is information. But if so, information doesn't just hang in mid-air, as it were. Behind the information there must be a mind in which the information resides or from which it arises. In either case, so far from the materialist belief that matter gives rise to everything else, it seems more likely that matter is itself a physical expression of information and that the information expressed by the cosmos is itself the product of mind.

In other words, it just keeps getting harder and harder to agree with the materialists that matter is the fundamental substance that makes up all reality. Materialism just seems so 19th century.

Monday, April 11, 2016

On Meaning

Holocaust survivor and psychiatrist Viktor Frankl once wrote a book titled Man's Search for Meaning in which he asserted that man can't live without believing that there is some purpose or meaning to his life. To waken in the morning and realize that there's no real point to anything one does in the hours that lie ahead, beyond just keeping oneself alive, is psychologically deadening. It can lead to a kind of existential despair.

Each of us, of course, has projects which inject a kind of temporary meaning into our lives and help us to avoid a numbed listlessness, but when we ask what, in the overall scheme of things, those projects amount to, the answer seems to depend on how enduring they are.

Long term projects like raising a family or building a business seem more meaningful than short term projects like mowing the grass or watching a television program. Yet the problem is that if death ends our existence it also erases the meaning or significance of what we do, no matter how important it may seem to us while we're engaged in it.

For some, a relative few, their projects live on after them for a time, but even of many of these it might be asked, what's the point? Napoleon conquered much of Europe and was responsible for the slaughter of hundreds of thousands of men, but he was overthrown, he died in exile, and the monarchy of France was restored. His deeds lived on after his death, but what was the sense of all that death and carnage?

Meaning is a slippery notion; it's hard to define precisely what it is. But if our lives, like the light of a firefly, are here one instant and gone the next, if the earth is doomed to die a casualty of its dying sun, then nothing lasts and nothing really means anything. Unless what we do matters forever, it doesn't really matter at all.

These gloomy thoughts occurred to me as I read about a lecture given by biologist Jerry Coyne. Coyne told his audience that:
The universe and life are pointless....Pointless in the sense that there is no externally imposed purpose or point in the universe. As atheists this is something that is manifestly true to us. We make our own meaning and purpose.
This is perhaps the consensus view among those holding to a naturalistic worldview. It was eloquently articulated by the philosopher Bertrand Russell in his essay A Free Man's Worship, in which he wrote the following words:
Such, in outline, but even more purposeless, more void of meaning is the world which Science presents for our belief. Amid such a world, if anywhere, our ideals henceforward must find a home. That Man is the product of causes which had no prevision of the end they were achieving; that his origin, his growth, his hopes and fears, his loves and his beliefs, are but the outcome of accidental collocations of atoms; that no fire, no heroism, no intensity of thought and feeling, can preserve an individual life beyond the grave; that all the labours of the ages, all the devotion, all the inspiration, all the noonday brightness of human genius, are destined to extinction in the vast death of the solar system, and that the whole temple of Man's achievement must inevitably be buried beneath the debris of a universe in ruins - all these things, if not quite beyond dispute, are yet so nearly certain, that no philosophy which rejects them can hope to stand. Only within the scaffolding of these truths, only on the firm foundation of unyielding despair, can the soul's habitation henceforth be safely built.
It's a bleak view of life, to be sure, but given that extinction awaits us, both individually and corporately, it's hard to dispute it. As the writer Somerset Maugham put it:
If death ends all, if I have neither to hope for good nor to fear evil, I must ask myself what am I here for….Now the answer is plain, but so unpalatable that most will not face it. There is no meaning for life, and [thus] life has no meaning.
The Russian writer Leo Tolstoy said essentially the same thing though with a bit more angst at the prospect of the emptiness and futility of existence:
What will come from what I am doing now, and may do tomorrow? What will come from my whole life? Otherwise expressed—Why should I live? Why should I wish for anything? Why should I do anything? Again, in other words, is there any meaning in my life which will not be destroyed by the inevitable death awaiting me?
If, though, death is not the end of our existence as persons, then perhaps there's meaning in the chaotic horror that is human history. If death is simply the transition between two stages of life, like the metamorphosis of a caterpillar into a butterfly, then maybe there's meaning, not only to history, but to each and every individual life.

If, on the other hand, death really is the end then all we are is dust in the wind, and philosophers and writers from Schopenhauer to Shakespeare are right: Life is just a tale told by an idiot, full of sound and fury and signifying nothing. We're born, we suffer, and we die, and that's all there is to it. Pretty depressing.

Saturday, April 9, 2016

What They Hate

Fareed Zakaria, a CNN commentator, recently did a series titled Why They Hate Us. The final installment might better have been subtitled What They Hate. The "They", of course, refers to Muslim extremists around the world. Zakaria makes a couple of important points. He writes:
The next time you hear of a terror attack -- no matter where it is, no matter what the circumstances -- you will likely think to yourself, "It's Muslims again." And you will probably be right. In 2014, about 30,000 people were killed in terror attacks worldwide. The vast majority of those perpetrating the violence were Muslim but -- and this is important -- so were the victims. Of the some 30,000 dead, the vast, vast majority were Muslims.

That's crucial to understand because it sheds light on the question, "Why do they hate us?" Islamic terrorists don't just hate America or the West. They hate the modern world, and they particularly hate Muslims who are trying to live in the modern world.
This is no doubt true, and it's worth pondering. Why do they hate the modern world? The ostensible answer is that they see modern values and culture as conflicting with Islam. Nor is this attitude just an artifact of Islamic extremism, as Zakaria tacitly admits when he declares:
Let's be clear. While the jihadis are few, there is a larger cancer within the world of Islam -- a cancer of backwardness and extremism and intolerance. Most of the countries that have laws that restrict the free exercise of religion are Muslim majority, while those that have laws against leaving the faith are Muslim majority.
In other words, majority-Muslim countries enforce laws that are antithetical to the freedoms we take for granted in the West. The populace of these lands may not sympathize with the methods of the extremists, they may even themselves be victimized by the extremists, but they share in common with the extremists a view of the way the world should be that is innately hostile to the principles enshrined in our Bill of Rights and our Declaration of Independence, particularly freedom of religion, freedom of speech, freedom of the press, and the principles of human and gender equality.

Up to this point Zakaria has not written anything that could be gainsaid, but while admitting that this is a cancer in Islam, a cancer afflicting a great many Muslims who are not themselves terrorists but who sympathize with the ultimate goals of the terrorists, he is at pains to deny that this affliction is inherent in Islam. Here he embarks upon much more controversial terrain:
When experts try to explain that in the 14th century, Islamic civilization was the world's most advanced, or that the Quran was once read as a liberal and progressive document, they're not trying to deny the realities of backwardness today. What they are saying is that it can change.
It's not at all clear that Islamic civilization was ever the world's most advanced, nor is it clear what metrics should be employed to ascertain such a thing, but never mind. Zakaria has a more important question to ask:
Islam, after all, has been around for 14 centuries. There have been periods of war and of peace. Before 1900, for hundreds of years, Jews fled European pogroms and persecution to live in relative peace and security under the Ottoman Caliphate. That's why there were a million Jews in the Muslim Middle East in 1900. Today, Jews and Christians are fleeing from Iraq and Syria as radical Islamists take control of those lands. It's the same religion then and now. So what is different? (emphasis mine)

It's not theology, it's politics. Radical Islam is the product of the broken politics and stagnant economics of Muslim countries -- they have found in radical religion an ideology that lets them rail against the modern world, an ideology that is now being exported to alienated young Muslims everywhere -- in Europe, and even in some rare cases in the United States.

How can we bring an end to this?

There's really only one way: Help the majority of Muslims fight extremists, reform their faith, and modernize their societies. In doing so, we should listen to those on the front lines, many of whom are fighting and dying in the struggle against jihadis. The hundreds of Muslim reformers I've spoken to say their task is made much harder when Western politicians and pundits condemn Islam entirely, demean their faith, and speak of all Muslims as backward and suspect.
I want very much to agree with everything he has said here, but speaking only for myself there's a huge impediment standing in the way of my assent, and I suspect it stands in the way for a lot of devout Muslims as well. It's this: Among the obligations imposed upon pious Muslims there are two which are perhaps most important: Follow the example of the Prophet Mohammad, and follow the precepts of the Koran. The reason these obligations are a problem is that Mohammad himself practiced brutal war against infidels, and it's hard to read the Koran without getting the sense that it, too, endorses an any-means-necessary strategy for spreading the faith.

If this is true, perhaps the most intractable of the three measures Zakaria commends to us for ending Islamic terror is the second - reforming the faith. Accomplishing this would seem to be as difficult as convincing devout Christians that they must change their views, not only of the authority of the New Testament, but also of the authority of Jesus Christ. Most Christians would properly see such a demand as an attempt to extinguish Christianity altogether and most Muslims would, mutatis mutandis, probably view attempts to reform Islam the same way.

If reformation of the faith is the linchpin of the strategy to end terror, prospects for success don't look particularly promising.

Friday, April 8, 2016

When Does Inconsistency Become Hypocrisy?

One expects politicians and others who dabble in politics to sometimes say one thing and do another. It almost seems to be in their genes. Indeed, none of us is perfect, and most of us act from time to time in ways inconsistent with what we profess. Even so, PayPal and Apple are pushing inconsistency so far toward hypocrisy that it's hard to avoid the conclusion that their behavior is something more than mere inconsistency.

An article by Bradford Richardson in the Washington Times tells us that both PayPal and Apple are seeking to punish North Carolina for passing a law that would prohibit people of one sex from using the restrooms of the other sex. This seems like pretty tame stuff, and I would suspect that most women, especially, would appreciate the added protection from male predators afforded them by the law, but apparently PayPal and Apple see a Very Important Principle being violated by the Tar Heel legislators:
PayPal drew a line in the sand when North Carolina passed a law prohibiting people from using the restrooms of the opposite sex, but critics say that line got washed away on the shores of Malaysia, a nation that consistently ranks among the least LGBT-friendly in the world.

The company canceled its plan to build a global operations center in Charlotte after the passage of HB2, which CEO Daniel Schulman called discrimination against the transgendered. He noted that the move will cost North Carolina 400 well-paying jobs.
Okay, but:
Malaysia’s Penal Code 187 — which punishes homosexual conduct with whippings and up to 20 years in prison — did not stop PayPal from opening a global operations center there in 2011, which the company estimated would employ 500 workers by 2013.
So, a law that protects women from having to share a restroom with someone with male physiological accoutrements is so egregious that PayPal, in high moral dudgeon, is taking its operations center elsewhere. Yet the company was not sufficiently outraged to decline to build a similar center in a country which punishes LGBT people with whippings and lengthy stays in their luxurious prisons. How do they reconcile these two reactions? Moreover:
PayPal’s international headquarters are located in Singapore, where sexual contact between males is punishable by up to two years in prison, and even littering can be punished by flogging. The company has a software development center in Chennai, India, where same-sex marriage is prohibited.
But it's not just PayPal that seems to have a problem with consistency:
Apple is among the other major corporations that have taken to the pulpit to lecture North Carolina for its sins despite doing business with anti-gay foreign regimes. CEO Tim Cook was one of several high-profile tech CEOs who signed a letter to Republican Gov. Pat McCrory calling on him to repeal the legislation.

“We are disappointed in your decision to sign this discriminatory legislation into law,” the letter reads. “The business community, by and large, has constantly communicated to lawmakers at every level that such laws are bad for our employees and bad for business.”

But, as Matt Sharp, legal counsel for the Alliance Defending Freedom, points out, that has not stopped Apple from opening stores in Saudi Arabia, where gay people are regularly executed in public and cross-dressing is also an offense that can, in the “best” case, end with a flogging. Pro-gay and trans advocacy are illegal, as is every religion except Islam.

“We’ve seen the same thing with Apple and some of these other companies that are fine doing business in Saudi Arabia and other countries that are extremely oppressive of the LGBT community,” Mr. Sharp said.
How do these companies respond to the charge of hypocrisy? With gobbledygook:
Deena Fidas, director of the Human Rights Campaign Foundation’s Workplace Equality Program, said PayPal’s expansion into areas with bad human rights and anti-gay records doesn’t call into question what the company is doing in the U.S. because it’s all about the message.

“Businesses are going to be making strategic decisions based on a variety of factors, and the fact that they’re not ceasing operations in other locales doesn’t diminish this very important moment where they’re sending a very clear message,” Ms. Fidas said.

“The reality is that right now, at any given moment, businesses are expanding in locales that have some problematic laws on the books for the LGBT community,” she added. “But what we’ve found is the private sector can be a bastion in an otherwise-unwelcoming climate.”
American Airlines and the NBA are also looking pretty "inconsistent" on this issue:
A spokeswoman for American Airlines, which has its second-biggest hub in Charlotte, called such laws “bad for the economies of the states in which they are enacted.” And the National Basketball Association, which has scheduled next season’s All-Star Game for Charlotte, said it is “deeply concerned that this discriminatory law runs counter to our guiding principles of equality and mutual respect.”

However, both corporations are eager to do business in China — American operates flights to Beijing and Shanghai; the NBA played two exhibition games in China before this season and will play two before this fall in the basketball-crazy country.

In addition, earlier this spring, American Airlines made clear that it plans to use its Miami hub for business in communist Cuba, petitioning for 12 of the 20 available daily flights to the Cuban capital and requesting flights to five other Cuban cities.
Yet China and Cuba are tough terrain for LGBTs:
While neither China nor Cuba criminalizes homosexuality as a form of bourgeois decadence, as each did during the Cold War era, gay rights are severely limited. Both nations have constitutional provisions defining marriage as the union of a man and a woman, a holy grail far beyond what America’s religious conservatives can hope to pass in 2016.

In both countries adoption by gay couples is banned, and anti-discrimination laws operate on the basis of sex, race, religion and other categories but do not protect the transgendered. Cuba’s anti-discrimination laws do cover gays, but only in some fields. Also, both countries are still officially one-party dictatorships that routinely arrest and beat anti-government protesters, limit the activities of dissidents and restrict depictions of homosexuality.

The official China Television Drama Production Industry Association posted new regulations stating that “no television drama shall show abnormal sexual relationships and behaviors, such as incest [and] same-sex relationships.”
We don't want to be too quick to call all this hypocrisy, but it sure looks like these corporations have one set of rules for the U.S. and a much less draconian set of rules for other nations. I'm loath to ask this, but do you think that in the end it may have something to do with profits?

Thursday, April 7, 2016

Where Does Altruism Come From?

Damon Linker argues at The Week that self-sacrifice is inexplicable on naturalism. Naturalism rests heavily upon evolutionary explanations of behavior, but cases like that of Thomas Vander Woude simply don't fit that narrative. Here's Vander Woude's story:
[C]onsider Thomas S. Vander Woude, the subject of an unforgettable 2011 article by the journalist Jeffrey Goldberg. One day in September 2008, Vander Woude's 20-year-old son Josie, who has Down syndrome, fell through a broken septic tank cover in their yard. The tank was eight feet deep and filled with sewage. After trying and failing to rescue his son by pulling on his arm from above, Vander Woude jumped into the tank, held his breath, dove under the surface of the waste, and hoisted his son onto his shoulders. Josie was rescued a few minutes later. By then his 66-year-old father was dead.

This is something that any father, atheist or believer, might do for his son. But only the believer can make sense of the deed.

Pick your favorite non-theistic theory: Rational choice and other economically-based accounts hold that people act to benefit themselves in everything they do. From that standpoint, Vander Woude — like the self-sacrificing soldier or firefighter — was a fool who incomprehensibly placed the good of another ahead of his own.

Other atheistic theories similarly deny the possibility of genuine altruism, reject the possibility of free will, or else, like some forms of evolutionary psychology, posit that when people sacrifice themselves for others (especially, as in the Vander Woude case, for their offspring) they do so in order to strengthen kinship ties, and in so doing maximize the spread of their genes throughout the gene pool.

But of course, as someone with Down syndrome, Vander Woude's son is probably sterile and possesses defective genes that, judged from a purely evolutionary standpoint, deserve to die off anyway. So Vander Woude's sacrifice of himself seems to make him, once again, a fool.

Things are no better in less extreme cases. If Josie were a genius, his father's sacrifice might be partially explicable in evolutionary terms — as an act designed to ensure that his own and his son's genes survive and live on beyond them both. But this egoistic explanation would drain the act of its nobility, which is precisely what needs to be explained.

We feel moved by Vander Woude's sacrifice precisely because it seems selfless — the antithesis of evolutionary self-interestedness.

But why is that? What is it about the story of a man who willingly embraces a revolting, horrifying death in order to save his son that moves us to tears? Why does it seem somehow, like a beautiful painting or piece of music, a fleeting glimpse of perfection in an imperfect world?
It's an interesting question that Linker raises. Even if we can account for what Vander Woude did in evolutionary terms, it's harder to account for why we who read about it think he did something noble and wonderful. Most evolutionists deny that there is such a thing as true altruism, insisting that everything we do has a self-interested motive buried somewhere, but it's hard to see what that motive would be in the case of Thomas Vander Woude.

Linker believes that such acts of radical altruism give us a fleeting glimpse of something transcendent. Whether one agrees with that conclusion or not it certainly seems that altruism, unlike egoism, is very difficult to account for in a naturalistic worldview.

Wednesday, April 6, 2016

Are We Living in the Matrix?

One of the perplexing metaphysical questions posed by philosophers has to do with the ontological nature of reality. Is matter the fundamental substance which makes up all reality or is the fundamental reality composed of mind? Those aren't the only alternatives, of course.

For example, it could be that reality is a composite of both matter and mind. But if, as many thinkers are beginning to suspect, the world is more like an idea than like a machine - if it's true, as many believe, that all matter is somehow an instantiation of information - then it would seem that mind, or mental substance, is the fundamental constituent of reality.

If that's so then maybe the world is, at bottom, something like the Matrix.
Thinking about this spurs us on to ask lots of further questions. Here's perhaps one of the biggest: If we're really living in a matrix-like reality, who or what programmed it?

Tuesday, April 5, 2016

The End of Moral Relativism

The Atlantic's Jonathan Merritt argues that our culture's fling with moral relativism, a fling that's persisted for at least sixty years, is over. Borrowing from a column by The New York Times' David Brooks, Merritt maintains that, at least among Millennials, we're seeing what may be described as a New Absolutism. He may be right, but I think the absolutism he sees having descended upon us like a smog is not really absolutism at all, but rather an emotivist power play.

I'll explain shortly, but first some excerpts from Merritt's column:
In The New York Times last week, David Brooks argued that while American college campuses were “awash in moral relativism” as late as the 1980s, a “shame culture” has now taken its place. The subjective morality of yesterday has been replaced by an ethical code that, if violated, results in unmerciful moral crusades on social media.

A culture of shame cannot be a culture of total relativism. One must have some moral criteria [by] which to decide if someone is worth shaming.

“Some sort of moral system is coming into place,” Brooks says. “Some new criteria now exist, which people use to define correct and incorrect action.”

This system is not a reversion to the values that conservatives may wish for. America’s new moral code is much different than it was prior to the cultural revolution of the 1960s and 70s. Instead of being centered on gender roles, family values, respect for institutions and religious piety, it orbits around values like tolerance and inclusion. (This new code has created a paradoxical moment in which all is tolerated except the intolerant and all included except the exclusive.)

Although this new code is moral, it is not always designated as such. As Brooks said, “Talk of good and bad has to defer to talk about respect and recognition.” No wonder many God-and-family conservatives dislike this new moral code as much as the relativism it replaced.
To be sure, there's a new wave of moral absolutism sweeping academia which sees things like racism, sexism, support for Israel, and reservations about gay marriage and global warming as absolutely wrong, but to simply point this out and then conclude that relativism is dead is to miss the fatal weakness lurking in the moral passion to which Merritt alludes.

That weakness lies in the fact that the moral fervor with which the above positions are often held on campus and in the media today has no ground in any objective moral referent. These positions are based on nothing more than the ardent feelings of those who hold them. As such they may be regarded as absolutes by those convinced of their rightness, but, if so, they are arbitrarily chosen absolutes, which is to say they're not really absolutes at all.

For any moral principle to be absolute it has to be objectively grounded in something which transcends one's own feelings, indeed which transcends humanity altogether. Otherwise, how do we adjudicate between the passionate feelings of one person and the passionate feelings of another? We can't, of course, unless we have some standard to which we can compare those disparate passions. Lacking such a standard, when we say something is wrong all we're really saying is that it offends our personal preferences, to which someone might well ask, "Why should your preferences be the standard of right and wrong for everyone else?"

The new moral absolutism to which Merritt and Brooks advert is not absolutism at all. It's simply a form of narcissistic subjectivism, or egoism, which presumes that anything which transgresses my personal moral preferences is wrong for everyone and anyone to do.

You, let's say, think it's right to help the poor. I, let's say, think we should adopt social Darwinism and let the poor fend for themselves. You insist I'm wrong. I ask why am I wrong? You say because I'm being selfish and greedy. I ask why selfishness and greed are wrong. You say because they hurt people. I ask why it's wrong to hurt people. You reply that I wouldn't want people to hurt me. I respond that that's true but it's not a reason why I should care about hurting others. You give up on me, judging me hopelessly immoral, but what you haven't succeeded in doing is explaining why it's wrong to let the poor suffer. You've simply given expression to your feelings about the matter.

For there to be any objective moral duties there has to be a transcendent moral authority, a God, from which (whom) all moral goodness is derived. Take away God, as our secular society seems eager to do, and all we're left with is emotivism - people insisting that their emotional reactions to events are "right" and contrary reactions are "wrong", but lacking any basis for making such judgments.

This is not to say that if one believes in God one will know what's right. Nor is it to say that even if one knows what's right one will do what's right. What it is to say is that unless there is a God, or something very much like God, there simply is no right or wrong, and certainly no absolute moral duties.

As belief in God disappears, morality becomes as insubstantial as the grin of the Cheshire Cat in Alice in Wonderland - to borrow a reference from this marvelous tale for the second week in a row.

Monday, April 4, 2016

Moral Paralysis

In 2011 I ran the following post on moral relativism, and having talked about that topic recently in my classes I thought it'd be appropriate to run it again, slightly edited:

Denyse O'Leary passes on a story told by a Canadian high school philosophy teacher named Stephen Anderson. Anderson recounts what happened when he tried to show students what can happen to women in a culture with no tradition of treating women as human beings:
I was teaching my senior Philosophy class. We had just finished a unit on Metaphysics and were about to get into Ethics, the philosophy of how we make moral judgments. The school had also just had several social-justice-type assemblies — multiculturalism, women’s rights, anti-violence and gay acceptance. So there was no shortage of reference points from which to begin.

I decided to open by simply displaying, without comment, the photo of Bibi Aisha (see below). Aisha was the Afghani teenager who was forced into an abusive marriage with a Taliban fighter, who abused her and kept her with his animals. When she attempted to flee, her family caught her, hacked off her nose and ears, and left her for dead in the mountains. After crawling to her grandfather’s house, she was saved by a nearby American hospital.

I felt quite sure that my students, seeing the suffering of this poor girl of their own age, would have a clear ethical reaction, from which we could build toward more difficult cases.

The picture is horrific. Aisha’s beautiful eyes stare hauntingly back at you above the mangled hole that was once her nose. Some of my students could not even raise their eyes to look at it. I could see that many were experiencing deep emotions, but I was not prepared for their reaction.

I had expected strong aversion; but that’s not what I got. Instead, they became confused. They seemed not to know what to think. They spoke timorously, afraid to make any moral judgment at all. They were unwilling to criticize any situation originating in a different culture.

They said, “Well, we might not like it, but maybe over there it’s okay.” One student said, “I don’t feel anything at all; I see lots of this kind of stuff.” Another said (with no consciousness of self-contradiction), “It’s just wrong to judge other cultures.”

While we may hope some are capable of bridging the gap between principled morality and this ethically vacuous relativism, it is evident that a good many are not. For them, the overriding message is “never judge, never criticize, never take a position.”
This is a picture of Bibi Aisha. She was deliberately mutilated by her family because she did not want to stay in a marriage to which she did not consent and in which she was treated like livestock.

Anyone who would do this to another human being is evil. Any culture which condones it is degenerate, and anyone who cannot bring themselves to acknowledge this, or to sympathize with her suffering, is a moral dwarf.

The shocking prevalence of moral dwarfism in our culture should not surprise us, however. Once a society jettisons its Judeo-Christian heritage it no longer has any non-subjective basis for making moral judgments. Its moral sense is stunted, warped, and diminished because it's based on nothing more than one's own subjective feelings.

With no objective moral standard by which to judge behavior people lose confidence in their moral judgments. They doubt that their opinions are any more "right" than the opinions of the people who did this to Bibi Aisha, and so they say things like, "If it's right for them then it's right", or "It's wrong for us to judge others", or "If you say it's wrong that's just your opinion."

This is moral paralysis, and it's a legacy of modernity and the secular Enlightenment which, in their embrace of metaphysical naturalism, have pulled the rug out from under all objective moral standards and offered nothing that can take their place beyond a vapid subjectivism.

Saturday, April 2, 2016

Indoctrinating the Young

An article in The Guardian by Nathalia Gjersoe is remarkable for what it suggests about current trends in the philosophy of education. The article points out that students, including secular students, seem intuitively resistant to Darwinian evolution and intuitively inclined toward intelligent design. This is a puzzle and one that Gjersoe and the people she writes about are resolved to correct. She writes:
Evolution is poorly understood by students and, disturbingly, by many of their science teachers. Although it is part of the compulsory science curriculum in most schools in the UK and the USA, more than a third of people in both countries reject the theory of evolution outright or believe that it is guided by a supreme being.

It is critical that the voting public have a clear understanding of evolution. Adaptation by natural selection, the primary mechanism of evolution, underpins a raft of current social concerns such as antibiotic resistance, the impact of climate change and the relationship between genes and environment. So why, despite formal scientific education, does intelligent design remain so intuitively plausible and evolution so intuitively opaque? And what can we do about it?
Unfortunately, Gjersoe's paragraph suggests that she herself may not have a very good grasp of evolutionary theory. Contemporary biologists have grown increasingly skeptical that natural selection is anything more than an ancillary mechanism driving evolutionary change, but let's set that aside. We can set aside, too, the fact that if people believe evolution is guided by a transcendent agent it's not evolution they reject but rather metaphysical naturalism.

In any case, Gjersoe goes on to assert that one reason for the resistance to evolutionary explanations of origins is that people, especially youngsters, intuitively subscribe to a view she calls psychological essentialism, the belief that species don't change. She asserts that this "is one of the primary reasons why the theory of evolution is so widely misunderstood by both children and adults."

I'm not convinced of that. I suspect that most people have trouble with naturalistic (i.e., Darwinian) evolution because it seems so implausible to them that blind, impersonal, unguided forces could produce something as intricate and complex as a human immune system or something as gratuitous and yet marvelous as sexual reproduction. The notion that such phenomena, as well as hundreds, if not millions, of other examples, can be explained by random mutation and the vicissitudes of natural selection strikes many people, including many biologists, as quite nearly miraculous. And if miracles are to be part of our set of beliefs, many are more inclined to think that their source is a personal agent rather than impersonal nature.
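For readers unsure what "chance mutation and natural selection" amounts to mechanically, here is a toy sketch in Python in the spirit of Richard Dawkins' well-known "weasel" program (my own illustration; it appears nowhere in Gjersoe's article). Blind copying errors generate variation, and selection non-randomly keeps the best copy each generation:

import random
import string

TARGET = "METHINKS IT IS LIKE A WEASEL"  # Dawkins' example phrase
ALPHABET = string.ascii_uppercase + " "

def mutate(parent, rate=0.05):
    # Copy the parent with a small chance of a random error at each position.
    return "".join(random.choice(ALPHABET) if random.random() < rate else c
                   for c in parent)

def fitness(candidate):
    # Count the characters that match the target phrase.
    return sum(c == t for c, t in zip(candidate, TARGET))

parent = "".join(random.choice(ALPHABET) for _ in range(len(TARGET)))
generation = 0
while parent != TARGET:
    # Blind variation: a brood of offspring with random copying errors.
    offspring = [mutate(parent) for _ in range(100)]
    # Non-random retention: keep the fittest string, parent included.
    parent = max(offspring + [parent], key=fitness)
    generation += 1

print(f"Matched the target after {generation} generations.")

The program reliably reaches the target phrase in a modest number of generations, which is Dawkins' point about the power of cumulative selection. Critics reply that the simulation smuggles in a distant target and a fitness measure keyed to it - exactly the kind of foresight unguided nature is supposed to lack - so the toy model illustrates the mechanism without settling the dispute sketched above.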

When viewers watch something like the following video, for instance, they're quite naturally stunned to think that the molecular machines discussed in it are some sort of fortuitous accident. When the examples of such amazing complexity and apparent design accumulate to a certain threshold, the fortuitous accident theory becomes literally incredible for a lot of people.
So, what to do about this very powerful inclination to see purposeful design in the world? How can it be stifled and eradicated so that people will be much more open to accepting Darwinian evolution? Gjersoe's answer is to indoctrinate children at ever younger ages:
So how do we override such widespread and tenacious cognitive biases?....

Evolution is typically taught to students at around 14- to 15-years of age as they prepare for their GCSEs. After persistent lobbying by the British Humanist Association, evolution was included in the British national primary curriculum for the first time last year. From September 2015, students will be taught about evolution from Year 6, at around 10-11 years of age.

Might this still be too late? [Deb Kelemen and colleagues at Boston University] chose 5- to 8-year-olds to test because at this age promiscuous teleology and psychological essentialism are still separate and fragmentary. She argues that by 10 years of age they have coalesced into a coherent theoretical framework that then gets in the way of contradictory scientific explanations and may remain the default, gut reaction, even in adults....If part of the reason that intelligent design is so popular is because it seems intuitively correct, might the solution be to disrupt those intuitions very early on? Should we reconsider the national curriculum (yet again) and start teaching evolution even earlier?
It's ironic that we are often told that parents and teachers should not instill their values in their children but should let them come to their own conclusions about right and wrong, religion, politics, and so on. Yet articles like this celebrate doing exactly that when it comes to explaining how living things came to be what they are. It seems that whether it's a good thing to instill one's values and beliefs in the young just depends on what those values and beliefs are.