Thursday, March 24, 2016

Stop Pretending

Almost as regularly as if it were a law of nature, whenever someone points out that there is something deeply wrong with a religion which produces so many evil, murderous people, enlightened folk on the left reply by reminding us that other religions (i.e. Christianity) have also produced evil, murderous people. After all, they declaim as they activate their historical dredges, what about the Crusades 900-1000 years ago and the Inquisition 600-700 years ago?

Matt Walsh has had quite enough of this silliness and has penned an amusing riposte to this foolish "religious equivalence" argument which I urge readers to check out in its entirety. Here are some especially pungent passages:
....Speaking of amounting to nothing, I vented my frustration on Twitter, because what else can I do? I said I’m sick of hearing about the great benefits of multiculturalism. Europe is drowning in a tidal wave of unassimilated Muslims who are actively hostile to Europe’s culture, or what’s left of it. And for their trouble, Europeans are being gang raped in the street at night and blown to pieces during their morning commute.

Diversity is a strength, they tell me, but I have seen no evidence to support this doctrine. Diversity of thought might be a strength, but even then it is only a strength if the thought is rational and directed towards truth. The nonsensical thoughts of relativistic nincompoops are not valuable or helpful.

Similarly, racial and cultural diversity does not enrich us if we lose our identity in the process. When you throw a bunch of people with diametrically opposed beliefs and values and priorities into a food processor and hit frappe, you end up with a smoothie that tastes an awful lot like the collapse of western civilization and the rise of barbarians.

I didn’t say all of that on Twitter, what with the character limit, but that was my point. I also said I’m sick of the religious equivalences. Every time Muslims kill a bunch of people, secular liberals start in with their standard attempts to prove Christians are just as bad. It seems nothing will ever convince them that perhaps Islam is somewhat unique in its capacity for violence and atrocities.

So, naturally, my protests against equivalences were met with equivalences. A bunch of leftists told me Christianity and Islam are equal because of, among other things, a series of 11th century military campaigns. And so on and so on and so on. But it got even more absurd. Leftists are so desperate to draw parallels between Christianity and Islam that one of them attempted to mitigate the Brussels attack by reminding me about the Christians who took over a nature reserve in Oregon a few weeks ago.

Me: “Muslims keep blowing things up.”

Liberals: “Yeah but Christians trespassed on a wildlife refuge in Oregon!”

God help us.

I know this is probably a futile effort, but, in light of this umpteenth Islamic terrorist attack this year, I want to concentrate on this Christianity vs. Islam comparison for a moment. I know I probably will never convince leftists to stop idolizing multiculturalism and diversity, but perhaps I can convince them to stop pretending Islam and Christianity are on the same playing field.

OK, I know I will not convince them of that either, but allow me to waste my breath anyway.

Let’s start with the fact that we knew the terrorists in Brussels were Muslim without waiting for anyone to confirm it. We always know it without being told. Leftists know it too, but they haven’t stopped to ask themselves: if Islam is a peaceful religion, why are Muslims literally the only people in the world setting bombs off in subway stations and airports and theaters and embassies and restaurants? Spin this any way you like, but right now the global terrorism market is a Muslim monopoly. We are certain a terrorist attack was carried out by Muslims the moment the bomb explodes. Shouldn’t that tell you something?

There is no Christian terrorism epidemic. That’s why nobody stops and says, “Wait, maybe the Brussels airport is covered in blood and debris because a white Presbyterian was trying to make a point about Jesus!” Nobody says that because there is literally zero chance of that being the case. Zero chance. It’s not possible. On the extraordinarily rare occasion that a Christian launches a lethal attack against a civilian target in the name of his faith, it’s almost always against something like an abortion clinic. And even then, it almost never happens. The last one was months ago, and the dude was psychotic, not religious.

Still, if you count him, that makes about one attack every decade or so, usually with no casualties, carried out against a facility that executes children. Compare that with the endless stream of massive assaults waged in the name of Islam against entirely random and innocent civilians sitting in restaurants or waiting in lines at airports or subway stations. You can’t compare them. There is no comparison to be made.

Indeed, the fact that abortion clinics don’t have to be fortified and surrounded by 40 armed guards every hour of the day shows just how incredibly effective Christianity is at preventing its adherents from resorting to violence in its name. So, yes, Christianity can lecture other religions about violence. Christianity is much better at standing against violence. Christianity is much more effective at advancing peace in the world. Christianity is just a better religion. It’s better. In every way. It’s better.

And it’s better not only because far fewer acts of evil are performed in its name, but because many more acts of love and mercy are performed in its name. No other religion sends people out to every decaying and forgotten corner and crevice of this Earth to heal the sick, serve the needy, and minister to the hopeless. No other religion runs nearly as many hospitals, homeless shelters, soup kitchens, etc.

If you find a group of foreigners digging a well in Guatemala or handing out mosquito nets in Uganda, they’re probably Christians. On the other hand, if you find a group of foreigners planting explosives in a subway station, they’re probably Muslims.

Christians themselves are flawed, but the faith has had, to put it mildly, an unmistakably positive influence on the world. Right now, as we speak, there are millions upon millions of people across the planet who would not be eating, taking medicine, or sleeping in a warm bed without the concerted efforts of Christians acting at the behest of Scripture. And as a reward, some of them can look forward to being crucified, literally. By Muslims, of course.

Christianity built Western civilization. Christianity advanced the doctrine of Natural Law, which serves as the basis for all of our liberties. Christianity defeated slavery and won the fight for civil rights. Christianity had a hand, and sometimes was the only hand, in most every good and decent thing about this world.

Christians are not perfect, but Christianity is. And the more Christian the world is, the better the world is. The less, the worse. That’s how it’s worked for 2,000 years. History has demonstrably proven Christianity to be an objectively necessary and indispensable force for good over and over and over again.

Equivalence? You cannot begin to find one. There are bad Christians and good Muslims, but if Christianity ceased to exist, millions of people would die. If Islam ceased to exist, millions of people wouldn’t. Draw whatever conclusions you want from there, but you cannot conclude that the two are equal.

But what are Christian fundamentalists? Women who wear long skirts and give birth to more than two kids? Men who go to church and read their Bibles? Unmarried couples who save sex for marriage? Pro-lifers who pray outside of abortion clinics? Bakers who won’t make cakes for gay weddings? Nuns? Pastors? Missionaries?

Liberals are fond of saying “fundamentalism” is the problem generally, as if living by your convictions is wrong regardless of the nature of your convictions. Such an idiotic notion can be expected from moral relativists who believe nothing to be fundamentally true, therefore anyone who adheres to any fundamental doctrine, no matter the doctrine, is dangerous.

On the contrary, Christian fundamentalism is a great blessing to society. It makes people peaceful, disciplined, humble, and kind. A Christian fundamentalist opens a pregnancy center to help women. A Muslim fundamentalist drags a woman into the town square and stones her to death. Equivalence? Stop it.

Obscure nut-jobs like the Westboro Baptists are not Christian fundamentalists. They are apostates. They’ve fabricated their own fundamentals and sprinkled a little Jesus on top of the fake religion they made up. Christian fundamentalists are only a problem when they are fundamentally dedicated to the fundamentals of their own heresies. But even these Christian heretics aren’t often found shooting up Paris to advance their cause.
After considering a few similarities between liberalism and Islamism, Walsh closes with this:
But of the theist religions, Islam is the only one routinely massacring civilians across the world. The only one. Stop claiming otherwise. Stop equivocating.
It's often said that the murderous Muslims are only a small percentage of Muslims, but there are said to be 1.6 billion Muslims in the world. Even if only 0.1% of them are jihadists, that's still some 1.6 million radical killers, and that doesn't count the millions more who are sympathetic to them. That's quite a lot of people who would like to see you dead.
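The arithmetic behind that figure, spelled out as a quick check (my own working, not part of the original post):

$$0.001 \times 1.6 \times 10^{9} = 1.6 \times 10^{6}$$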

It's also often said that these people are not true Muslims and are not following the teachings of the Koran. On the contrary, there's ample warrant in the Koran for everything they do, but more importantly they are following the example of the founder of their religion. Whatever Mohammad may have recorded in print, he himself was a violent warlord who condoned beheading one's foes, sex slavery, and warfare against non-Muslims.

People sometimes ask why more Muslims don't condemn the sort of terrorism we've seen in Paris and Brussels and from ISIS. One reason, perhaps, is that they realize they can't condemn it without distancing themselves from the example of their Prophet, and this, they see, would be an unforgivable betrayal of their faith.

Wednesday, March 23, 2016

Quantum Spookiness

The universe is a very strange place, stranger than we can imagine. One of the strangest things about it is something Albert Einstein once referred to as "spooky action at a distance." In quantum mechanics there's a phenomenon called quantum entanglement. No one knows how it works, no one really understands it, but every time it's been tested it's been shown to exist, and it's absolutely bizarre.

Here's the nutshell version: Two subatomic particles, e.g. electrons, can be produced from the disintegration of another particle. These daughter particles then travel away from each other at enormous velocities, but they somehow remain connected such that if a property of one of them is measured or changed, the same property in the other one is instantly fixed as well, even though any signal sent from one to the other would have to travel at infinite speed to have that effect. This, though, is impossible, so how does the second electron know what's happened to the first? No one knows the answer to this, which is why Einstein, who could never accept the idea of entanglement, called the phenomenon "spooky."

Here's an excellent 15-minute video featuring physicist Brian Greene explaining this quantum weirdness:
An article at Nature discusses a recent test that pretty much clinches the theory that particles widely separated from each other, even at opposite ends of the universe, remain in some mysterious way connected, so that what happens to one is instantaneously registered by the other (a simple sketch of what such tests measure follows the excerpt):
It’s a bad day both for Albert Einstein and for hackers. The most rigorous test of quantum theory ever carried out has confirmed that the ‘spooky action at a distance’ that the German physicist famously hated — in which manipulating one object instantaneously seems to affect another, far away one — is an inherent part of the quantum world.

The experiment, performed in the Netherlands, could be the final nail in the coffin for models of the atomic world that are more intuitive than standard quantum mechanics, say some physicists. It could also enable quantum engineers to develop a new suite of ultrasecure cryptographic devices.

“From a fundamental point of view, this is truly history-making,” says Nicolas Gisin, a quantum physicist at the University of Geneva in Switzerland. In quantum mechanics, objects can be in multiple states simultaneously: for example, an atom can be in two places, or spin in opposite directions, at once. Measuring an object forces it to snap into a well-defined state. Furthermore, the properties of different objects can become ‘entangled’, meaning that their states are linked: when a property of one such object is measured, the properties of all its entangled twins become set, too.

This idea galled Einstein because it seemed that this ghostly influence would be transmitted instantaneously between even vastly separated but entangled particles — implying that it could contravene the universal rule that nothing can travel faster than the speed of light. He proposed that quantum particles do have set properties before they are measured, called hidden variables. And even though those variables cannot be accessed, he suggested that they pre-program entangled particles to behave in correlated ways.
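For readers curious about what a test like this actually measures, here's a minimal sketch, my own illustration rather than anything taken from the Nature article, comparing the quantum prediction for a pair of entangled particles with the ceiling any local "hidden variables" account of the kind Einstein favored must respect (the CHSH form of Bell's inequality). The helper E and the angle choices are standard textbook values, not details of the experiment itself:

# A minimal illustrative sketch (not from the article): the CHSH quantity S
# for entangled spin-1/2 particles. Any local hidden-variable model requires
# S <= 2; quantum mechanics predicts values up to 2*sqrt(2), about 2.83.
import math

def E(a, b):
    # Quantum correlation for singlet-state spin measurements along
    # directions a and b (angles in radians): E = -cos(a - b).
    return -math.cos(a - b)

# Measurement angles that maximize the quantum violation.
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = abs(E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2))

print(f"Quantum prediction: S = {S:.3f}")   # ~2.828
print("Local hidden-variable bound: S <= 2")

Experiments of the sort described in the excerpt find values of S above 2, which is why physicists say the local hidden-variable picture Einstein preferred has been ruled out.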
The recent experiments cited in the Nature article are said to show that Einstein was wrong and that entanglement exists. The universe is indeed a very strange place.

Tuesday, March 22, 2016

Faith vs Fact

Biologist Austin Hughes has a lengthy critique at The New Atlantis of fellow biologist Jerry Coyne's recently published screed against religion titled Faith vs. Fact. Hughes gives a couple of paragraphs in his review to two challenges Coyne directs at believers and adroitly dispatches both of them. Here's Hughes:
Coyne issues the following challenge to his readers: “Over the years, I’ve repeatedly challenged people to give me a single verified fact about reality that came from scripture or revelation alone and then was confirmed only later by science or empirical observation.” I can think of one example, which comes from the work of St. Thomas Aquinas (whose writings Coyne badly misrepresents elsewhere in his book). Based on his exposure to Aristotle and Aristotle’s Arab commentators, Aquinas argued that it is impossible to know by reason whether or not the universe had a beginning. But he argued that Christians can conclude that the universe did have a beginning on the basis of revelation (in Genesis).

In most of the period of modern science, the assumption that the universe is eternal was quietly accepted by virtually all physicists and astronomers, until the Belgian Catholic priest and physicist Georges Lemaître proposed the Big Bang theory in the 1920s. Coyne does not mention Lemaître, though he does mention the data that finally confirmed the Big Bang in the 1960s. But, if the Big Bang theory is correct, our universe did indeed have a beginning, as Aquinas argued on the basis of revelation.

Coyne pairs the above challenge with an earlier challenge from new atheist writer Christopher Hitchens: “Name me an ethical statement made or an action performed by a believer that could not have been made or performed by a non-believer.” I agree that there is no a priori reason why atheists could not perform the kinds of heroic actions of self-sacrifice on behalf of the poor and marginalized that St. Vincent de Paul or St. Damien of Molokai are known for. It’s just that atheists so very rarely do. They have little to compare to the lives of the saints as a storehouse of examples of moral greatness.
Actually, I think there's a better answer to this second challenge, to wit: "Cruelty is objectively wrong." True, a non-believer can speak the words, but if atheism is correct the assertion would be false. On atheism there simply are no objective rights and wrongs.

In any case, Coyne's book, like much of the New Atheist genre, is filled with misunderstandings, self-contradictions, non-sequiturs, and philosophical simple-mindedness as both Hughes and philosopher Ed Feser have documented. Here are a couple of examples of Coyne's infelicities that occurred to me as I read the book last summer:

Throughout his book Coyne adopts a tendentious definition of faith. He insists that faith is believing something despite the lack of evidence, or even in the face of counter-evidence. He observes that such faith is considered a vice in science but esteemed as a virtue in religion, and indeed it would be a scientific vice were the definition correct, but it's not. Faith, whether in science or religion, is manifestly not belief despite the lack of evidence. It's belief despite the lack of proof or certainty, which is a quite different matter. Coyne's straw man definition is a description of "blind" faith, but it's not an accurate description of the faith of millions of thoughtful believers.

He goes on to make the astonishing claim that there's no evidence whatsoever of the existence of supernatural entities or powers. I say this is an astonishing claim because it requires one to willfully avert one's eyes from the voluminous positive evidence afforded by both cosmology and biology - the fine-tuning of cosmic parameters and constants, the evidence of a beginning to the universe, the irreducible complexity of many cellular bio-machines and systems, and on and on. The claim also requires that the one who affirms it be wholly unacquainted with the numerous powerful metaphysical arguments for God's existence.

To be sure, it is perhaps the case that none of this evidence amounts to a certainty that God exists. Perhaps, though it's doubtful, naturalists will someday show that there are satisfactory naturalistic explanations for all of these aforementioned facts and arguments, but to think that these are not evidence is to belie a fundamental confusion in one's mind between evidence and proof. It is also to confuse plausibility with certainty. Ideally, both philosophers and scientists believe what they conclude to be the most plausible, or probably true, hypothesis. They don't wait for certainty, like Godot's friends waiting for him to show up, before making their epistemic commitments. One may suppose that the evidence is not sufficiently powerful to compel one to accept the conclusion that there is a God, especially if one does not want there to be a God in the first place, but what one cannot say is that there is no evidence that would justify the conclusion that God exists if one were open to that possibility.

The accumulated labor of both scientists and philosophers has taken that option off the table.

Monday, March 21, 2016

Determinism, Compatibilism and Libertarianism

In class discussions of free will and determinism, a number of students have asked if there isn't a middle way. One student even dug a post out of the archives that I did on just such a via media back in 2008 (12/24). The post starts out by addressing the notion of a kind of compromise position between libertarian free will and determinism, usually referred to as "compatibilism," and ends up summarizing the discussions we've had in class on these different philosophical positions. Here it is:

Barry Arrington at Uncommon Descent offers a succinct rebuttal of compatibilism, i.e. the view that our choices are fully determined and yet at the same time free. As Arrington points out, this certainly sounds like a contradiction.

The compatibilist defines freedom, however, as the lack of coercion, so as long as nothing or no one is compelling your behavior, your choice is completely free, even though at the moment you make your decision there's in fact only one possible choice you could make. Your choice is determined by the influence of your past experiences, your environment and your genetic make-up. The feeling you have that you could have chosen something other than what you did choose is simply an illusion, a trick played on you by your brain.

Compatibilism, however, doesn't solve the controversy between determinism and libertarianism (the belief that we have free will). It simply uses a philosophical sleight-of-hand to define it away. As long as it is the case that at any given moment there's just one possible future, then our choices are determined by factors beyond our control, and if they're determined it's very difficult to see how we could be responsible for them. Whether we are being compelled by external forces to make a particular choice or not, we are still being compelled by internal factors that make our choice inevitable. Moreover, these internal factors are themselves the product of genetic and/or environmental influences.

The temptation for the materialist is to simply accept determinism, but not only does this view strip us of any moral responsibility, it seems to be based on a circularity: The determinist says that our choices are the inevitable products of our strongest motives, but if questioned as to how we can identify our strongest motives he would simply invite us to examine the choices we make. Since our strongest motives determine our actions, our strongest motives are whichever motives we act upon. But, if so, the claim that we always act upon our strongest motives reduces to the tautology that we always act upon the motives we act upon. This is certainly true, but it's not very edifying.

On the other hand, it's also difficult to pin down exactly what a free choice is. It can't be a choice that's completely uncaused because then it wouldn't be a consequence of our character and in what sense would we be responsible for it? But if the choice is a product of our character, and our character is the result of our past experiences, environment, and our genetic make-up, then ultimately our choice is determined by factors over which we have no control, and we're back to determinism.

It seems to me that if materialism is true, and we are nothing more than material, physical beings whose choices are simply the product of chemical reactions occurring in the brain, then determinism must be true as well. If so, moral responsibility and human dignity are illusions, and no punishment or reward could ever be justified on grounds of desert.

This all seems completely counter-intuitive to most people so they hold on to libertarianism, even if they can't explain what a free choice is, but libertarianism is incompatible with materialism. Only if we have a non-physical, immaterial mind that somehow functions in human volition can there be free will and thus moral responsibility and human dignity.

Saturday, March 19, 2016

The Inexplicable Conservative Adulation of Trump

One of the most astonishing phenomena in modern politics has been the support for Donald Trump among people who have spent their careers posing as the arbiters and gatekeepers of conservative ideological orthodoxy. The reason this is such a remarkable development is that throughout his life Trump has aligned himself with people and causes that are the antithesis of conservatism. Even now he eschews conservative principle, taking both sides of some issues and reversing himself on some positions six times before breakfast.

John Zeigler at Mediaite has a strongly worded column on the paradox of prominent conservatives like Rush Limbaugh gushing over Trump, who gives every appearance of being an erstwhile liberal masquerading in this election cycle as a conservative.

Here are some excerpts from Zeigler's essay which should, of course, be read in its entirety:
To fully understand how and why the “conservative” (I use that term loosely) media willingly enabled this hostile takeover of the Republican Party, you primarily need to comprehend what a fraud the entire industry is. In short, the vast majority of “conservative” media is simply just a business cynically disguised as a cause....

It is my view that ... the “conservative” media had their most influence on Trump eventually getting the nomination [when he first announced his candidacy]. Trump’s campaign was like a rocket ship where the most perilous moments are during liftoff. If the conservative base had not accepted him as a serious or credible candidate, then he would have quickly crashed and burned because, without traction, the media oxygen which would fuel his flight would have immediately evaporated....

[T]his made it much easier for “conservative” media icons like Limbaugh, Hannity, Bill O’Reilly, and even Levin to “play with fire” during the dull summer months of 2015 and seriously entertain the concept of a Trump candidacy. It cannot be overstated, given Trump’s very liberal history and lack of credentials, just how impossible it would have been for them to have even considered this reaction to his candidacy if Trump did not bring them celebrity/ratings during a down period....

[A]s the fire they created began to spread and gain strength, it became impossible for anyone to control, even if they wanted to. In short, once the average Trump supporter bought into the bogus “conservative” media narrative about him which absolved all his many past sins, he was basically unstoppable. After all, if he really wasn’t one of “us,” or was somehow bad for the “cause,” surely Rush, or Sean, or Bill, or Mark, or Matt would tell them that. Right?!

Once his liftoff was given clearance from conservative heavyweights, the Trump phenomenon went into orbit with his total domination of the news media. Desperately thirsting for the ratings he brought them, the cable news networks (even the liberals at CNN and MSNBC) shattered all semblances of journalistic standards by allowing him to appear for unprecedented amounts of time, usually unfettered. This sucked up all the oxygen from all the other candidates in the far-too-crowded field. It also allowed Trump to laughably claim he was self-funding his campaign when in reality he was hardly needing to spend a dime because of all the free airtime he was given (this is just one of dozens of key lies Trump has told, which the “conservative” media has conveniently ignored).

This created a self-perpetuating symbiotic relationship between his free airtime and his poll numbers. The more potential Trump voters saw him being taken seriously in the media, the more they gained the courage to tell pollsters that they supported him. The more he rose in the polls, the better excuse the networks had for having Trump on. They pretended that it was because he was now the “GOP Frontrunner,” when it was really just because he was much better for their ratings than interviewing any of the serious candidates....

[T]he conservative media doesn’t want to see their cash cow killed, especially since they have basically already budgeted for the record revenue a Trump vs. Hillary campaign would likely create....

Hannity effectively became an arm of the Trump campaign. Limbaugh continued to defend him. Drudge waged a vicious overt war against Rubio and Ted Cruz and made sure (much like he did with Rev. Wright in his 2008 bid to protect Obama during the primaries) that no negative Trump stories could get any real traction....

It should be noted that the nature of Trump’s fervent fans added even more fuel to why the “conservative” media went into the tank for Trump. They are not only the most passionate, but they are also the least rational. You cannot reason with them and they will simply tune you out if they don’t like what you are saying about their hero.

Of course, the “conservative” media doesn’t really care if Trump beats Hillary. They “win” either way. They get great ratings through November and “worst case” they end up with Hillary to provide them with easy content for at least the next four years. Pathetically, that’s all they really care about....

With all the talk of dismantling the “evil establishment” in this election cycle, for conservatives to ever win a presidential election again, the REAL “establishment” which needs to be shattered is that of the elite “conservative” media. While the Republican establishment is weak and incompetent, at least most of them are not overtly working for the other side.
I could understand supporting Trump were there no good alternatives for conservative voters in the Republican race, but there were, and still are. Scott Walker was a great candidate, Marco Rubio and Carly Fiorina were also outstanding, and Ted Cruz is excellent. Why, then, the almost complete blackout for the last six months of these candidates by people like Rush Limbaugh and Sean Hannity and their total focus on Donald Trump? Why do individuals like Ben Carson and Jerry Falwell, Jr., men who stand for civility and morality in politics and public life, endorse a man who has coarsened our public discourse and who openly boasts of adulterous relationships? Why do Bill O'Reilly, Matt Drudge, Ann Coulter, Breitbart.com and others fawn over a man who until the day before yesterday held political positions they disdain?

Jim Geraghty, in commenting on Zeigler's piece, says this:
Finally, while the major conservative media entities named in John Zeigler’s essay would probably vehemently deny that they let clubby groupthink, a desire for ratings and other bad influences alter their judgment, there is this remaining unexplained about-face from some of the biggest names in the conservative movement. From about 2009 to early 2015, to be praised in conservative media, you had to be indisputably conservative. Even a longtime record of voting conservatively didn’t protect you if you were seen as flinching in a tough fight. Mike Castle? Unacceptable! Mitch McConnell? Sellout! John Boehner? Worthless! Thad Cochran? Everything that’s wrong with the Senate! Lindsey Graham, Mitt Romney, Paul Ryan? Useless squishes!

Then in mid-2015, along comes Trump, with his long history of donating to the Democrats, support for Planned Parenthood, affirmative action, gun control, and a national health-care system, even friendship with Al Sharpton . . . and some of the biggest names on television and radio are perfectly fine with him.
Either these prominent conservatives are, in fact, (1) a bunch of unprincipled hucksters who simply exploit the conservatism of their audience for their own ratings and profit, or (2) they know something about Trump that no one else does and which they're not sharing with us, or (3) they're just stupid. I don't believe (2) or (3), and I don't want to believe (1). Nevertheless, unless there's another possibility that I'm missing, I don't see how (1) can be avoided.

Friday, March 18, 2016

Are We a Simulation? (Pt. II)

By way of concluding Wednesday's post on the possibility that you, I, and our entire universe actually exist in a computer simulation developed by some superior intellect in another world, we note that Robert Kuhn points out that the simulation hypothesis has great difficulty with the phenomenon of human consciousness:
A prime assumption of all simulation theories is that consciousness — the inner sense of awareness, like the sound of Gershwin or the smell of garlic — can be simulated; in other words, that a replication of the complete physical states of the brain will yield, ipso facto, the complete mental states of the mind. (This direct correspondence usually assumes, unknowingly, the veracity of what's known in philosophy of mind as "identity theory," one among many competing theories seeking to solve the intractable "mind-body problem".)

Such a brain-only mechanism to account for consciousness, required for whole-world simulations and promulgated by physicalists, is to me not obvious (Physicalism is the belief that everything in the universe is ultimately explicable in terms of the laws of physics. Physicalism is, for most purposes, synonymous with naturalism).
Kuhn is raising the question of how, for example, the sensation of seeing blue could be simulated. Until there is a plausible physical explanation of consciousness, which there is not at this point, it seems unlikely that conscious beings are nothing more than a simulation.

There's more of interest at the original article, including how physicist Paul Davies uses the simulation argument to refute the multiverse hypothesis. Kuhn closes his piece with this:
I find five premises to the simulation argument: (i) Other intelligent civilizations exist; (ii) their technologies grow exponentially; (iii) they do not all go extinct; (iv) there is no universal ban or barrier for running simulations; and (v) consciousness can be simulated.

If these five premises are true, I agree, humanity is likely living in a simulation. The logic seems sound, which means that if you don't accept (or don't want to accept) the conclusion, then you must reject at least one of the premises. Which to reject?
Personally, I find premise (i) problematic, premise (ii) possible, but questionable (it's just as likely that technological growth reaches a ceiling or collapses altogether), and premise (v) highly doubtful.

Thursday, March 17, 2016

On St. Patrick's Day

The following is a post I've run on previous St. Patrick's Days and thought I'd run again this year because, I say in all modesty, it's pretty interesting:

Millions of Americans, many of them descendants of Irish immigrants, celebrate their Irish heritage by observing St. Patrick's Day today. We are indebted to Thomas Cahill and his best-selling book How The Irish Saved Civilization for explaining to us why Patrick's is a life worth commemorating. As improbable as his title may sound, Cahill weaves a fascinating and compelling tale of how the Irish in general, and Patrick and his spiritual heirs in particular, served as a tenuous but crucial cultural bridge from the classical world to the medieval age and, by so doing, made Western civilization possible.

Born a Roman citizen in 390 A.D., Patrick had been kidnapped as a boy of sixteen from his home on the coast of Britain and taken by Irish barbarians to Ireland. There he languished in slavery until he was able to escape six years later. Upon his homecoming he became a Christian, studied for the priesthood, and eventually returned to Ireland where he would spend the rest of his life laboring to persuade the Irish to accept the Gospel and to abolish slavery. Patrick was the first person in history, in fact, to speak out unequivocally against slavery and, according to Cahill, the last person to do so until the 17th century.

Meanwhile, Roman control of Europe had begun to collapse. Rome was sacked by Alaric in 410 A.D. and barbarians were sweeping across the continent, forcing the Romans back to Italy, and plunging Europe into the Dark Ages. Throughout the continent, unwashed, illiterate hordes descended on the once grand Roman cities, looting artifacts and burning books. Learning ground to a halt and the literary heritage of the classical world was burned or moldered into dust. Almost all of it, Cahill claims, would surely have been lost if not for the Irish.

Having been converted to Christianity through the labors of Patrick, the Irish took with gusto to reading, writing and learning. They delighted in letters and bookmaking and painstakingly created indescribably beautiful Biblical manuscripts such as the Book of Kells which is on display today in the library of Trinity College in Dublin. Aware that the great works of the past were disappearing, they applied themselves assiduously to the daunting task of copying all surviving Western literature - everything they could lay their hands on.

For a century after the fall of Rome, Irish monks sequestered themselves in cold, damp, cramped mud huts called scriptoria, so remote and isolated from the world that they were seldom threatened by the marauding pagans. Here these men spent their entire adult lives reproducing the old manuscripts and preserving literacy and learning for the time when people would be once again ready to receive them.

These scribes and their successors served as the conduits through which the Graeco-Roman and Judeo-Christian cultures were transmitted to the benighted tribes of Europe, newly settled amid the rubble and ruin of the civilization they had recently overwhelmed. Around the late 6th century, three generations after Patrick, Irish missionaries with names like Columcille, Aidan, and Columbanus began to venture out from their monasteries and refuges, clutching their precious books to their hearts, sailing to England and the continent, founding their own monasteries and schools among the barbarians and teaching them how to read, write and make books of their own.

Absent the willingness of these courageous men to endure deprivations and hardships of every kind for the sake of the Gospel and learning, Cahill argues, the world that came after them would have been completely different. It would likely have been a world without books. Europe almost certainly would have been illiterate, and it would probably have been unable to resist the Muslim incursions that arrived a few centuries later.

The Europeans, starved for knowledge, soaked up everything the Irish missionaries could give them. From such seeds as these modern Western civilization germinated. From the Greeks the descendants of the Goths and Vandals learned philosophy, from the Romans they learned about law, from the Bible they learned of the worth of the individual who, created and loved by God, is therefore significant and not merely a brutish aggregation of matter.

From the Bible, too, they learned that the universe was created by a rational Mind and was thus not capricious, random, or chaotic. It would yield its secrets to rational investigation. Out of these assumptions, once their implications were finally and fully developed, grew historically unprecedented views of the value of the individual and the flowering of modern science.

Our cultural heritage is thus, in a very important sense, a legacy from the Irish. A legacy from Patrick. It is worth pondering on this St. Patrick's Day what the world would be like today had it not been for those early Irish scribes and missionaries thirteen centuries ago.

Buiochas le Dia ar son na nGaeil (Thank God for the Irish), and I hope you have a great St. Patrick's Day.

Wednesday, March 16, 2016

Are We a Simulation? (Pt. I)

Here's a post from last summer (8/22) that's relevant to some of the discussions that have come up in a couple of my classes recently:

Robert Kuhn, host and writer of the public television program "Closer to Truth," has an excellent column on the theory that our universe is actually a computer simulation developed by a higher intelligence in some other universe. Kuhn writes:
I began bemused. The notion that humanity might be living in an artificial reality — a simulated universe — seemed sophomoric, at best science fiction.

But speaking with scientists and philosophers on "Closer to Truth," I realized that the notion that everything humans see and know is a gigantic computer game of sorts, the creation of supersmart hackers existing somewhere else, is not a joke. Exploring a "whole-world simulation," I discovered, is a deep probe of reality.

Philosopher Nick Bostrom, director of the Future of Humanity Institute at Oxford University, describes a fake universe as a "richly detailed software simulation of people, including their historical predecessors, by a very technologically advanced civilization."

It's like the movie "The Matrix," Bostrom said, except that "instead of having brains in vats that are fed by sensory inputs from a simulator, the brains themselves would also be part of the simulation. It would be one big computer program simulating everything, including human brains down to neurons and synapses."

Bostrom is not saying that humanity is living in such a simulation. Rather, his "Simulation Argument" seeks to show that one of three possible scenarios must be true (assuming there are other intelligent civilizations):
  • All civilizations become extinct before becoming technologically mature;
  • All technologically mature civilizations lose interest in creating simulations;
  • Humanity is literally living in a computer simulation.
His point is that all cosmic civilizations either disappear (e.g., destroy themselves) before becoming technologically capable, or all decide not to generate whole-world simulations (e.g., decide such creations are not ethical, or get bored with them). The operative word is "all" — because if even one civilization anywhere in the cosmos could generate such simulations, then simulated worlds would multiply rapidly and almost certainly humanity would be in one.

As technology visionary Ray Kurzweil put it, "maybe our whole universe is a science experiment of some junior high school student in another universe." (Given how things are going, he jokes, she may not get a good grade.)

Kurzweil's worldview is based on the profound implications of what happens over time when computing power grows exponentially. To Kurzweil, a precise simulation is not meaningfully different from real reality. Corroborating the evidence that this universe runs on a computer, he says, is that "physical laws are sets of computational processes" and "information is constantly changing, being manipulated, running on some computational substrate." And that would mean, he concluded, "the universe is a computer." Kurzweil said he considers himself to be a "pattern of information."

"I'm a patternist," he said. "I think patterns, which means that information is the fundamental reality."
Information, of course, is the product of minds; thus, if information is the fundamental reality in our world, there must be a mind that has generated it. Many people agree with this and argue that the information which comprises this world is produced by the mind of God, but scientists, at least naturalistic scientists, argue that God is a metaphysical concept which lies outside the purview of science. Instead they advert to the existence of computer hackers in other universes, which is also a metaphysical posit, but since it's not God, it's presumably okay to speculate about them.

At any rate, Kuhn goes on:
Would the simulation argument relate to theism, the existence of God? Not necessarily.

Bostrom said, "the simulation hypothesis is not an alternative to theism or atheism. It could be a version of either — it's independent of whether God exists." While the simulation argument is "not an attempt to refute theism," he said, it would "imply a weaker form of a creation hypothesis," because the creator-simulators "would have some of the attributes we traditionally associate with God in the sense that they would have created our world."

They would be superintelligent, but they "wouldn't need unlimited or infinite minds." They could "intervene in the world, our experiential world, by manipulating the simulation. So they would have some of the capabilities of omnipotence in the sense that they could change anything they wanted about our world."

So even if this universe looks like it was created, neither scientists nor philosophers nor theologians could easily distinguish between the traditional creator God and hyper-advanced creator-simulators.

But that leads to the old regress game and the question of who created the (weaker) creator-simulators. At some point, the chain of causation must end — although even this, some would dispute.
In other words, the universe displays indications of having been intelligently designed rather than having been an enormously improbable accident. This poses vexing problems for naturalists who feel constrained to account for the design without invoking you-know-who. So they theorize about a multiverse of an infinite number of worlds or speculate about extra-cosmic computer programmers who've created a world that looks real but is in fact just a computer simulation.

These extraordinary hypotheses are taken seriously by some philosophers and scientists, but if someone were to suggest that maybe this universe really is the only universe, that maybe it's real and not an illusory simulation foisted on us by some pimply extra-terrestrial, and that maybe it's instead the product of a single intelligent transcendent mind, he would suffer the ridicule and scorn of those who'd sooner believe that the universe is a science project of a seventh grader in some other more technologically advanced universe. I wonder which is the more implausible hypothesis.

I'll conclude with a couple more thoughts on this in Part II tomorrow.

Tuesday, March 15, 2016

The Ides of March

Today is March 15th. It was on this date in 44 B.C. that the Roman dictator Julius Caesar was assassinated by some sixty conspirators, including a group of Roman senators. The murder of Caesar changed world history and the U.K. Telegraph has a fascinating account by Dominic Selwood of this event.

Caesar was a complex character, as Selwood tells us, and like many great men he was admirable in some ways and repulsive in others:
He was a military colossus, original thinker, compelling writer, magnetic orator, dynamic reformer, and magnanimous politician. Yet he was also manipulative, narcissistic, egotistical, sexually predatory, shockingly savage in war even by Roman standards, and monomaniacally obsessed with acquiring absolute power for himself.
La Mort de César (ca. 1859–1867) by Jean-Léon Gérôme
Here's the lede of Selwood's article:
Spurinna was a haruspex. His calling was vital, if a little unusual, requiring him to see the future in the warm entrails of sacrificial animals.

At the great festival of Lupercalia on the 15th of February 44 B.C., he was a worried man. While priests were running around the Palatine Hill hitting women with thongs to make them fertile, Spurinna was chewing over a terrible omen.

"Spurinna knew it was a terrible sign: a sure portent of death."

The bull that Julius Caesar, Dictator of Rome, had sacrificed earlier that day had no heart. Spurinna knew it was a terrible sign: a sure portent of death.

The following day, the haruspex oversaw another sacrifice in the hope it would give cause for optimism, but it was just as bad: the animal had a malformed liver. There was nothing for it but to tell Caesar.

In grave tones, Spurinna warned the dictator that his life would be in danger for a period of 30 days, which would expire on the 15th of March. Caesar dismissed the concerns. Although in his scramble for political power he had been made the chief priest of Rome (Pontifex Maximus), he was a campaign soldier by trade, and not bothered by the divinatory handwringing of seers like Spurinna.

As the 30 days passed, nothing whatsoever happened. Yet when the 15th of March dawned, Caesar’s wife awoke distressed after dreaming she held his bloodied body. Fearing for his life, she begged him not to leave the house. His own dreams had also been unsettling. He had been flying through the air, and had shaken hands with Jupiter. But he pushed any concerns aside. The day was an important annual celebration in Rome’s religious calendar, and he had called a special meeting of the Senate.

His first appointment of the day was a quick sacrifice at a friend’s house. Spurinna the seer was also there. Caesar joked that his prophecies must be off as nothing had happened. Spurinna muttered that the day was not yet over.

The sacrifices proceeded, but the animals’ innards were blemished and the day was plainly inauspicious. Caesar knew when to call it a day, and agreed to postpone the meeting of the Senate and to go home.

Later that morning, his fellow military politician and protégé Decimus called round, urging him to come to the Senate in case his absence was seen as mocking or insulting. Persuaded by his friend, soldier to soldier, Caesar agreed to go in person to announce the meeting would be postponed.

Shortly after, a slave arrived at Caesar’s house to warn him of the plot against his life. But he was too late: Caesar had left. A short while later, a man named Artemidorus of Cnidus pushed through the jostling crowds and handed Caesar a roll setting out details of the plot. But the crowds were so thick he had no chance to read it.

The main Senate House was being rebuilt on Caesar’s orders, so the meeting was instead at the Curia behind the porticoed gardens attached to the great Theatre of Pompey. Another round of animal sacrifices before the start of the session was unfavourable, and Caesar waited outside, troubled. Again Decimus spoke with him. Unaware of his friend’s treachery, Caesar allowed himself to be led towards the chamber by the hand. Decked out in his triumphant general’s reddish-purple toga embroidered in gold, Julius Caesar, Dictator of Rome, entered the Senate’s meeting room, and ascended his golden throne.
Go to the link for Selwood's account of the denouement.

Selwood attributes several myths surrounding this assassination to Shakespeare and his play Julius Caesar. One interesting myth has to do with the line in the play given by a soothsayer who shouts to Caesar the words, "Beware the Ides of March!" This line has ever since come to be a portent of disaster, but what are the "Ides of March"?
In Rome’s impossibly complicated calendar, every month had an Ides....In the mists of time, the early Romans began each month at the new moon. They called that day the Kalends (Kalendae). Two weeks later came the full moon, which they named the Ides (Idus). Midway between the two was the half-moon, which they referred to as the Nones (Nonae). For some inexplicable reason, they then chose to refer to every other day in the month in terms of its relationship to the next one of these coming up. So they would say, “five days before the Kalends of March,” or “three days before the Nones of June”.

The Kalends was always the 1st of the month. Over time, the others came to fall on set days. In March, May, July, and October, the Nones was the 7th and the Ides was the 15th. For the remaining months, the Nones was the 5th and the Ides was the 13th. Therefore the 4th of July was IIII Nones July (i.e. four days before the Nones - the calculation is inclusive, so both the 4th and the 7th are counted).

Although every month had an Ides in the middle, the date chosen by Caesar’s murderers was nevertheless significant. Traditionally, the Roman year started on the 1st of March, meaning the Ides was the first full moon of the year. It was a major celebration, and the festival of Anna Perenna, the goddess of the cycle of the year. Her special gift was to reward people with long life. Caesar’s assassins clearly thought they were giving long life to Rome (and their own political careers) by removing the dictator they believed was blighting it all.
It has often amazed me when reading Roman history that they could accomplish such great feats of engineering and architecture with their exceedingly cumbersome system of numeration. It's almost equally amazing that they could be such good historians with such a clunky calendar. At any rate, there's more at the link, including a discussion of the consequences of the murder for subsequent history. It makes for good reading on this, the Ides of March.
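To make the scheme concrete, here's a minimal sketch, my own rather than anything in Selwood's article, that names a day of the month the Roman way using the rules quoted above. The function roman_day and the month set are hypothetical helpers, and days after the Ides, which the Romans counted toward the next month's Kalends, are left out:

# A small illustrative sketch of the Roman day-naming scheme described above.
# The Nones fall on the 7th and the Ides on the 15th in March, May, July, and
# October; otherwise the Nones are the 5th and the Ides the 13th. Counting is
# inclusive, so the 4th of July is "4 days before the Nones" (IIII Nones).
LATE_NONES_MONTHS = {"March", "May", "July", "October"}

def roman_day(month, day):
    nones = 7 if month in LATE_NONES_MONTHS else 5
    ides = nones + 8
    if day == 1:
        return f"Kalends of {month}"
    if day == nones:
        return f"Nones of {month}"
    if day == ides:
        return f"Ides of {month}"
    if day < nones:
        return f"{nones - day + 1} days before the Nones of {month}"
    if day < ides:
        return f"{ides - day + 1} days before the Ides of {month}"
    # Days after the Ides were counted toward the Kalends of the next month,
    # which requires knowing the month's length; omitted in this sketch.
    return f"after the Ides of {month}"

print(roman_day("July", 4))    # 4 days before the Nones of July
print(roman_day("March", 15))  # Ides of March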

Monday, March 14, 2016

Unintended Consequences

Employment news out of Seattle is discouraging but not surprising. In June 2014 Seattle voted to raise the minimum wage for all employees in the city to $15, to be reached in increments.
Starting last April, it raised the minimum from $9.32 (the state minimum wage) to $10 for certain businesses, $11 for others. Increases to $12, $12.50 and $13 an hour began taking effect for most employers this Jan. 1.
There's still a way to go before the wage hits the target of $15 per hour in 2017, but it's not too early to get an indication of the effect the hike has had on jobs. An American Enterprise Institute study shows that,
[B]etween April and December last year Seattle saw the biggest employment drop in any nine-month period since 2009 — a full year into the Great Recession. The city unemployment rate rose a full percentage point.

Before the minimum-wage hikes began, Seattle employment tracked the rest of the nation — slowly rising from the 2008-09 bottom. But it started to plunge last spring, as the new law began to kick in.

Furthermore, Seattle’s loss of 10,000 jobs in just the three months of September, October and November was a record for any three-month period dating back to 1990.

Meanwhile, employment outside the city limits — which had long tracked the rate in Seattle proper — was soaring by 57,000 and set a new record high that November.
In what may come as a surprise to the members of the Seattle city council and similar bodies around the country, when government raises the cost of doing business, employers lay off, or choose not to hire, marginal employees. In a vivid illustration of the Law of Unintended Consequences at work, the attempt by government to increase the income of minimum wage employees actually makes it harder for those employees to get hired and, if hired, to keep their jobs.

Now thousands of Seattle residents are out of work because their leaders thought they'd help them by raising their pay rate. Given a choice, which would minimum wage employees prefer: to have a job making $7.50 an hour, or to have no job but know that if they did have one they'd be making twice as much?

The late Ronald Reagan could have had Seattle in mind when, in his first Inaugural Address, he declared that "government is not the solution to our problem; government is the problem."

Saturday, March 12, 2016

Snow Geese

A friend and faithful reader of VP wrote recently, reminding me that I haven't posted any bird photos in a while and asking to see some. It just so happens that I took my wife, daughter, and her fiance today to one of the major migratory stopover spots on the east coast to see a waterfowl called the snow goose. These birds breed in the tundra and migrate in huge numbers along the Atlantic coast and Mississippi river valley.



By the end of the 19th century hunting had taken a serious toll on the population of these birds and was banned in 1916 to allow the population to recover. The ban resulted in skyrocketing numbers of geese and hunting was restored in 1975. The birds today number in the millions. When they're swirling around after taking off, or as they land, it's like standing inside a snow globe. The numbers during our visit today were not as impressive as what's seen in these photos since we're now past the peak in the snow goose migration, but there were still several thousand geese and tundra swans on the water this morning. At its peak, which is usually about the first week in March, there may be 100,000 or more snow geese on the lake.



I got these photos off the web, but they were taken at the site we visited today. It's called Middle Creek Wildlife Management Area and it sits athwart the Lancaster/Lebanon county line in south-central Pennsylvania. An interesting anecdote about Middle Creek: It was built in the 1970s for hunting and conservation and is managed by the Pennsylvania Game Commission. It has recently been in the news because the Game Commission wishes to raise hunting license fees to obtain the revenue to maintain the facility, but the legislature has been reluctant to grant the increase. The Game Commission has said that they can't keep Middle Creek open if they don't get more money and will be shutting it down in a year or so.

It would be a great shame if this happened, and I doubt very much that the legislators will let it come to that, but it's where matters stand as of now.

Friday, March 11, 2016

Science Guy Misunderstands Philosophy

Bill Nye (The "Science Guy") has blithely wandered onto some thin metaphysical ice. A video of him responding to a young philosophy major's question about the importance of philosophy has a lot of philosophers shaking their heads.
I addressed this very topic in a post last summer titled Does Science Need Philosophy? (8/21/15) and thought it might be appropriate to run it again in response to Nye's video:

It seems to be something of a trend lately for materialists, particularly materialist scientists, to denigrate philosophy. Cosmologists Stephen Hawking and Lawrence Krauss are two recent examples. Hawking even went so far as to pronounce philosophy dead in his book The Grand Design.

I wonder if one of the subconscious reasons for their disdain for philosophy is that these scientists and others are writing books claiming that science pretty much makes belief in God untenable, but they're finding that philosophers who critique their arguments are showing them to be embarrassingly unsophisticated. The animus against philosophy may derive from personal chagrin suffered at the hands of philosopher-critics.

Be that as it may, Hawking and Krauss, for all their brilliance, are astonishingly unaware of the philosophical faux pas that pervade their own writing.

Krauss, for example, made the claim in his book A Universe from Nothing that the universe emerged spontaneously out of a mix of energy and the laws of physics which he calls "nothing." Thus God is not necessary to account for the universe. Of course, this is a semantic sleight-of-hand since if the cosmos was produced by energy and physical laws then there was not "nothing," there was "something," and we're confronted with the mystery of how this energy and these laws came about.

Hawking declared philosophy "dead" in the early pages of his book and then spent a good part of the rest of the book philosophizing about realist and anti-realist views of the universe and the existence of a multiverse.

It's ironic that physicists like Hawking and Krauss would be so willing to deprecate philosophy since their own discipline is infused with it. Every time physicists talk about the multiverse or the nature of time or space or their own naturalistic assumptions about reality, they're doing metaphysics. When they talk about knowledge, cause and effect, the principle of sufficient reason, the principle of uniformity, or the problem of exactly what constitutes the scientific enterprise (the demarcation problem), they're doing philosophy. Whenever they discuss the ethics required of scientists in conducting and reporting their researches or express awe at the beauty of their equations, they're doing philosophy.

The entire discipline of science presupposes a host of philosophical assumptions like the trustworthiness of our senses and of our reason, the orderliness of the universe, the existence of a world outside our minds, etc. Yet these thinkers seem to be oblivious to the foundational role philosophy plays in their own discipline. Indeed, science would be impossible apart from axiomatic philosophical beliefs such as those listed above.

Science tells us the way the physical world is, but as soon as the scientist starts to draw conclusions about what it all means he or she is doing philosophy. It's inescapable. There's a bit of a joke at Uncommon Descent about this. It goes like this:
Scientist: "Why does philosophy matter?"
Philosopher: "I don't know, why does science matter?"
Scientist: "Well, because scie...."
Philosopher: "Annnnnnnd you are doing philosophy."
There's more on how science is inextricably infused with philosophy here.

Thursday, March 10, 2016

Reverting to the Dark Ages

Science has unmoored itself from its heritage in Christian metaphysics and adopted a naturalistic worldview, yet it was in that Christian metaphysics that science was conceived, nourished, cultivated, and brought to maturity. Now it has declared its independence, thinking it can stand on its own, no longer needing the support of the superstitions of its youth. Perhaps science need not rely on the assumptions bequeathed to it by its religio-cultural heritage; perhaps scientists can dispense with Christian moral assumptions and belief in objective truth with no ill effect. But articles like this one by Melanie Phillips leave one less than convinced.

After lamenting that science is plagued by shoddy research and faulty conclusions, Phillips writes:
Richard Horton, editor-in-chief of The Lancet, has written bleakly: “The case against science is straightforward: much of the scientific literature, perhaps half, may simply be untrue.”

One reason is that cash-strapped universities, competing for money and talent, exert huge pressure on academics to publish more and more to meet the box-ticking criteria set by grant-funding bodies. Corners are being cut and mistakes being made....

The problem lies with research itself. The cornerstone of scientific authority rests on the notion that replicating an experiment will produce the same result. If replication fails, the research is deemed flawed. But failure to replicate is widespread. In 2012, the OECD spent $59 billion on biomedical research, nearly double the 2000 figure. Yet an official at America’s National Institutes of Health has said researchers would find it hard to reproduce at least three-quarters of all published biomedical findings.

A 2005 study by John Ioannidis, an epidemiologist at Stanford University, said the majority of published research findings were probably false. At most, no more than about one in four findings from early-phase clinical trials would be true; epidemiological studies might have only a one in five chance of being true. “Empirical evidence on expert opinion”, he wrote, “shows that it is extremely unreliable”.
So why has this state of affairs come to pass?
Underlying much of this disarray is surely the pressure to conform to an idea, whether political, commercial or ideological. Ideological fads produce financial and professional incentives to conform and punishment for dissent, whether loss of grant-funding or lack of advancement. As Professor Ioannidis observed: “For many current scientific fields, claimed research findings may often be simply accurate measures of the prevailing bias.”

Underlying this loss of scientific bearings is a closed intellectual circle. Scientists pose as secular priests. They alone, they claim, hold the keys to the universe. Those who aren’t scientists merely express uneducated opinion. The resulting absence of openness and transparency is proving the scientists’ undoing. In the words of Richard Horton, “science has taken a turn towards darkness”.

But science defines modernity. It is our gold standard of truth and reason. This is the darkness of the West too.
To put this differently, when generations of scientists are invested in a materialistic naturalism that places no moral constraints on their work and that calls into question the very idea of objective truth, and when they face the professional and ideological pressures imposed by the grant and tenure process, as well as the pressure to conform to the prevailing consensus among their peers, the quality of scientific work will slowly degrade. Science, disconnected from the only metaphysics which can provide a moral anchor, is easily thrust into the service of whatever the prevailing ideology may be, just as happened to science in the communist Soviet Union in the first half of the twentieth century.

Ideas have consequences.

Wednesday, March 9, 2016

Bush Lied

Ten years ago it was all one heard: "Bush Lied, People Died." "The Iraq war was a war to steal Iraq's oil." I argued on VP at the time that both claims were manifestly false, and that they cast doubt on the integrity and/or good sense of anyone who made them.

The "lie" allegation was manifestly absurd since every intelligence service in the world thought that Saddam had weapons of mass destruction (WMD) or was working to get them. Not only that, but he acted as though he had them, and had a history of using WMD (chemical weapons) on both his own people and on the Iranians. Bush may have been understandably mistaken that Saddam had WMD, but being mistaken is not lying.

The "oil" allegation was even more absurd. If we wanted to steal oil we could have just taken it from any of a host of countries which would have offered far less cost and risk than invading Iraq posed. Moreover, our subsequent refusal to take Iraqi oil proved that the charge was baseless.

Now Donald Trump has resurrected the old canards and used them to libel George W. Bush all over again. The fact that it's Trump making the charges gives us considerable reason to doubt their accuracy a priori, but nevertheless, the charge is so egregious that it needs to be answered. Judith Miller, a former journalist at the New York Times, does just that in this video. Miller was a key reporter of these events at the time, and she was no Bush supporter. Even so, she evidently cares that a generation of Americans who never knew Bush, but who do know Trump, hear the facts from someone who knows the truth:

Sixty Years of No Warming

I'm not sure what to make of this article by Tony Heller, but if he's right then NOAA (National Oceanic and Atmospheric Administration) has been behaving a bit irresponsibly with their data, and, contrary to the terrifying prognostications of folks like Al Gore, the specter of runaway global warming is a chimera.

The basic claim of Heller's piece is this:
In their “hottest year ever” press briefing, NOAA included a graph, which stated that they have a 58 year long radiosonde temperature record, but they only showed the last 37 years in the graph. The reason for the selective presentation of data is that the earlier data showed as much pre-1979 cooling as post-1979 warming.
In other words, Heller is claiming that NOAA's own data actually show that there's been no net global warming for 60 years. Given all the alarums raised by climate scientists and all the radical, and very costly, policy proposals promoted by the world's politicians, this is quite a shocking claim. I urge interested readers to visit the site and peruse the graphs and supporting materials he provides.

Maybe someone who knows more about climate science can explain why his argument is wrong (or right), but for now it's very hard to accept the confident assertion, made by so many of our politicians from President Obama on down, that global warming is in fact "settled science."

Tuesday, March 8, 2016

What Went Wrong?

A Jordanian journalist, writer, and political analyst named Jihad Al-Mansi wrote a piece in the Jordanian daily Al-Ghad in which he places the blame for the tragic backwardness of Arab societies - a backwardness which places them at the bottom of global rankings in science, culture, human and women's rights, and the war on corruption - squarely on the shoulders of his fellow Arabs. He adds that the Arab world lags behind the rest of the quickly advancing world which "has overtaken us by centuries, perhaps millennia."

An interesting aspect of his essay is that, while Arab backwardness is often attributed by Western liberals to Western colonialism and imperialism, and by Arabs to insidious Jewish plots, Al-Mansi will have none of that. He calls on his fellow Arabs to wake up, take responsibility for their situation, and stop blaming others for their problems. Moreover, he urges contemporary Arabs to invest their financial and human resources in advancing future generations, because it is no longer possible to do much to improve the situation of the current generation.

Memri provides some excerpts from Al-Mansi's article:
The world is developing, in the philosophical, scientific, social, creative, educational, and cultural sense; it is on the verge of breaking free of backward gender-driven thinking...

This is taking place in countries far from our Arab region. There, they are developing scientifically and culturally, competing for the top position in all human indices. At the same time, we, in this region of the world, remain at the bottom of these indices – and some of our countries are absent from them altogether.

The Nobel laureates in peace, medicine, chemistry, physics, economics, and literature include people from all [countries] – but we Arabs are rarely among them, and for the most part sit in the audience [during the awards ceremonies] or watch them on TV...

Our only way of consoling ourselves is to reminisce and to recall [great Muslims of the past]. We do so in disregard of the fact that most of these people, in whom we take pride for human and cultural reasons, were not Arab, and most of them were stoned [to death] or imprisoned, and some had their books burned or were accused of heresy...

Our problem does not end at [our failure to win] a Nobel Prize. It is manifested much more in the fact that we hold no respectable position on any index or metric concerning freedom of thought, human rights, media, gender, environment, water, or war on corruption; our countries often come last in every field.

When we participate in the Olympic Games, our countries promote the motto 'honor for [merely] participating.' When we want to try for an Olympic medal, our solution is to grant citizenship to [foreign] athletes to do so. We are not among those on the winner's podium – and if we are, our representation is miniscule. We celebrate every gold medal won by a Comoro Islander as if he had liberated Jerusalem. Kenya, Guinea, or Sierra Leone have medaled 10 times and aim for more – while we and our 22 countries rejoice at [winning] just one. This is despite the fact that the income of some of our countries, and maybe all of them, surpasses that of Kenya, Sierra Leone, and others. But [our] billions in income are squandered on purchasing [sporting] clubs, as we refrain from investing in [our own] human, ideological, and athletic resources.

We are regressing, instead of progressing, in all fields: We fail in sports; we have no presence in the arts; politically, we execute the agendas of the superpowers and major enterprises, like pawns that move when expected and remain silent when demanded to do so. Economically, we are not welfare states; ideologically, we are influenced, not influencers; with regard to humanity, we reject the other rather than accept him. We accuse anyone who disagrees with us of being an infidel, and think that we're always right and the world is conspiring against us, never asking ourselves the logical question: Why would the world do this, when we are of no consequence in global, cultural, and human enterprise? We avoid the real answer, and cannot acknowledge that it is we who conspire against ourselves, killing each other and shedding each other's blood on pretexts based on a legacy that is 1,500 years old, more or less, [pretexts] that are intended to sow ethnic and religious conflicts among the streams and sects...

Gentlemen, our car is in reverse, and is not moving forward – as the world has overtaken us by centuries, perhaps millennia. We have missed the boat for this generation, and it is beyond rectifying. Will we wake up and invest our financial and human resources to help the coming generations? Will we?
Jihad Al-Mansi
Bernard Lewis, the great scholar of Islam, wrote a book titled What Went Wrong in which he pondered how a culture that at one time gave every indication of incipient greatness nevertheless fell into backwardness and stagnation. Why is it, Lewis asked, that Arab countries have not produced any great cultural achievements since the Middle Ages? His answer is that power fell into the hands of Islamic clerics, and, as Al-Mansi indicates above, any thought that wandered beyond clearly prescribed theological boundaries was harshly punished.

In such a climate it's very hard to produce great art, science, literature, or technology and thus these offspring of human genius were killed in the crib, as it were, throughout the Islamic world, and creativity, independent thought, and innovation were stifled. Indeed, they still are throughout much of the Islamic world today.

The key to progress, or at least one crucial key, is religious freedom. As long as Islamic fundamentalists insist on establishing theocratic regimes which punish unorthodox ideas the Arab world will continue to produce nothing of value to humanity beyond what more technologically advanced nations can extract from the earth under their feet.

Ironically, there is a lesson in this for the West. We live in a time when ideological "clerics" seek to impose a strait-jacket of orthodoxy on all political and social thought, especially in our universities. Independence and creativity are smothered by strict, if unwritten, rules enforcing ideological conformity. Political correctness is imposed; "trigger warnings" and "safe spaces," where students won't have to suffer being challenged by uncomfortable ideas, are demanded; and "microaggressions" and other "deviant" ideas or political behavior bring swift punishment upon offenders. Faculty who hold heterodox opinions on Darwinism, global climate change, or gay marriage are treated like heretics, and their careers are not infrequently burned at the stake by our contemporary ideological inquisitors.

Yet, it's still possible to dissent from the shibboleths and dogma of our Western academic version of the Islamic ayatollahs, but only because they have yet to consolidate their grip on the rest of our society. They're working assiduously, however, to rectify that.

Monday, March 7, 2016

Can the Universe Be Infinitely Old?

One thorny problem for any naturalist metaphysics is that the consensus among scientists is that the universe came into being at some point in the past. If that's true then, for reasons discussed in a recent post, it's strong evidence for the existence of a creative, intelligent, transcendent, eternal, and personal first cause, i.e. either God or something very much like God.

If such a cause exists, of course, then naturalism is false, so naturalists, understandably chary about accepting the conclusion that their metaphysics is false, sometimes take refuge in the argument that the universe is infinitely old, past eternal, or beginning-less. The post linked to above offered scientific reasons for rejecting this argument, but there are philosophical reasons as well.

One of these is that an actually infinite set of any physical entities, whether they be moments, or atoms, or whatever, is probably impossible. Philosopher William Lane Craig explains in the following short video some of the paradoxes that arise in an infinite series of entities and why such a series is highly implausible:
Moreover, even if the universe were in fact infinitely old it still could never have arrived at the present moment.

Kirk Durston in an article at Evolution News and Views explains why:
The evidence from science points to a beginning for the universe. Some atheists, understanding the possible theological implications of a beginning, prefer to set aside science and assert that the past is infinite either in terms of the number of years this universe has existed, or in terms of a fantasized infinite series of universes in a multiverse....

In the real world, an infinite past means that if you were to set the current year as t = 0 and count back into the past, there would never be an end to your counting, for there is no year in the past that was the "beginning." No matter how long you counted, you would still have an infinite number of years ahead of you to count and, if you were to look back at the set of years you have already counted, it would always be finite.
In other words, if the universe is infinitely old, then you could never count back from the present moment to a starting point. Even if you counted forever, you would never reach a first moment of the universe. This means, however, that neither could you count forward from the infinite past to the present moment. If the universe extends infinitely into the past and contains an infinity of past moments, then no matter how many of those moments tick by, the present moment would never arrive.

Put differently, in order for a series of moments to arrive at the present there has to be a starting point, but if the past is eternal then the necessary starting point keeps receding further and further into the past and in fact does not exist at all. If there's no initial moment then there's no second moment, and if no second then no third, and so on, and if all this is so, then there is no present moment either. But obviously there is a present moment, so it would seem that the assumption of a past eternal universe is false.
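To make the structure of the counting argument a bit more explicit, here is a minimal formal sketch of my own (an illustration of the point, not Durston's or Craig's notation). Label the present year $t = 0$ and the past years by the negative integers:

$$\text{Past} = \{-1, -2, -3, \dots\}, \qquad |\text{Past}| = \aleph_0.$$

Counting backward from the present, after any finite number $n$ of steps you have reached only year $-n$, and the years still uncounted, $\{-(n+1), -(n+2), \dots\}$, remain infinite in number, so the backward count is never finished. Symmetrically, a forward count that is supposed to end at $t = 0$ would already have had to traverse infinitely many years before arriving at any given year $-n$, so there is no stage at which it begins and no finite number of steps by which it reaches the present.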

Durston goes into more detail than this, but the implication is clear. If the universe is not past eternal then it had a beginning. And if it had a beginning it had a cause. And any cause of the universe must have the properties listed above, all of which is to say that a finite universe is strong evidence that theism is true.

Saturday, March 5, 2016

Did Libet Prove Determinism?

This post is from the archive but is relevant to a topic my classes are currently discussing, or soon will be discussing, so I thought it'd be useful to post it again:

Students of psychology, philosophy and other disciplines which touch upon the operations of the mind and the question of free will may have heard mention of the experiments of Benjamin Libet, a University of California at San Francisco neurobiologist who conducted some remarkable research into the brain and human consciousness in the last decades of the 20th century.

One of Libet's most famous discoveries was that the brain "decides" on a particular choice milliseconds before we ourselves are conscious of deciding. The brain creates an electrochemical "Readiness Potential" (RP) that precedes by milliseconds the conscious decision to do something. This has been seized upon by materialists who use it as proof that our decisions are not really chosen by us but are rather the unconscious product of our brain's neurochemistry. The decision is made before we're even aware of what's going on, they claim, and this fact undermines the notion that we have free will as this video explains:
Michael Egnor, at ENV, points out, however, the remarkable fact that, so far from supporting determinism, Libet himself believed in free will, his research supported that belief, and, what's more, his research also reinforced, in Libet's own words, classical religious views of sin.

Libet discovered that the decision to do X is indeed pre-conscious, but he also found that the decision to do X can be consciously vetoed by us and that no RP precedes that veto. In other words, the decision of the brain to act in a particular way may be determined by unconscious factors, but we retain the ability to consciously (freely) choose not to follow through with that decision. Our freedom lies in our ability to refuse any or all of the choices our brain presents to us. Or, we might say, free will is really "free won't."

Egnor's article is a fascinating piece if you're interested in the question of free will and Libet's contribution to our understanding of it.

Clinics Closing at Record Pace

This news will delight some readers and disturb others:
Abortion access in the U.S. has been vanishing at the fastest annual pace on record, propelled by Republican state lawmakers’ push to legislate the industry out of existence. Since 2011, at least 162 abortion providers have shut or stopped offering the procedure, while just 21 opened.
The attempt to impute this trend to nefarious Republicans, though they'd be happy to take credit for it, seems misguided. The article makes clear that states like California which are controlled by abortion-friendly Democrats are also seeing dozens of clinics closing their doors. In any case, the article continues:
At no time since before 1973, when the U.S. Supreme Court legalized abortion, has a woman’s ability to terminate a pregnancy been more dependent on her zip code or financial resources to travel. The drop-off in providers—more than one every two weeks—occurred in 35 states, in both small towns and big cities that are home to more than 30 million women of reproductive age....

Typically defined by medical researchers as facilities that perform 400 or more abortions per year, the ranks peaked in the late 1980s at 705, according to the Guttmacher Institute, a New York-based reproductive-health research organization. By 2011, the most recent year for which Guttmacher has data, that number had fallen to 553.

State regulations that make it too expensive or logistically impossible for facilities to remain in business drove more than a quarter of the closings. Industry consolidation, changing demographics, and declining demand were also behind the drop, along with doctor retirements and crackdowns on unfit providers....

That just 21 new clinics opened in five years underscores the difficulty the industry has faced in replenishing the ranks of health-care providers willing and financially able to operate in such a fraught field. The impact of that challenge is likely to be long-lived: Even rarer than the building of a new clinic is the reopening of one that has shut.
One thing that perhaps everyone can agree upon is this: Clinics closing because of diminished demand is a good thing. Whether one is pro-life or pro-choice there's widespread agreement that every child should be a wanted child, and if there is reduced demand for abortion that would suggest that more women are deciding that they want their children.

It would be interesting to know exactly what the reasons are for the lower demand for the services of abortion clinics. Is it, in fact, because of a greater desire on the part of young mothers to have children, is it simply that more couples are practicing contraception, or is it because more women are finding abortion to be morally problematic? Perhaps it's all three.

Thursday, March 3, 2016

Fleebaggers vs. Taxpayers

Some readers may remember the political turmoil that beset the state of Wisconsin when Governor Scott Walker pushed a number of reforms five years ago that were designed, among other things, to weaken the grip public employees unions had on the state's taxpayers. The legislative effort was labelled Act 10, and in an effort to prevent its passage, many Democrat legislators fled the state so the legislature would be denied a quorum and couldn't vote on the bill. The fleeing legislators came to be called "Fleebaggers." There was tremendous pressure brought to bear on Walker and the Republican legislature - death threats, protesters filling the capitol building, nation-wide criticism in the media - but they remained firm and Act 10 passed.

The MacIver Institute has run the numbers and found that, in the ensuing five years, contrary to all of the predictions of doom, an amazing $5 billion has been saved by the state of Wisconsin as a direct result of Act 10:
Five years ago, Gov. Walker and the Republican legislature started their odyssey that resulted in the signing of Act 10, a milestone law that has saved Wisconsin taxpayers $5.24 billion, according to a new analysis by the MacIver Institute.

The analysis found that Wisconsin saved $3.36 billion by requiring [that] government employees contribute a reasonable amount to their own retirement. The analysis also estimates local units of governments saved an additional $404.8 million total by taking common sense steps like opening their employees' health insurance to competitive bidding. Milwaukee Public Schools saved $1.3 billion in long-term pension liabilities, and Neenah saved $97 million in long-term pension liabilities in addition to other savings.

Five years after Gov. Walker introduced it, Act 10 is still the gift that keeps on giving. The MacIver Institute analysis found that the Medford School District recently realized an 11 percent decrease in the cost of its health insurance business by opening it to competitive bidding....Similarly, the Appleton Area School District switched health insurance providers last October and local taxpayers will see up to $3 million in savings in the first year alone.
There's more at the link. Here's a chart that accompanied the article, showing a breakdown of the money saved by Wisconsin through the courage and wisdom of its legislature and governor:



The lesson here seems simple enough. Competition, low taxes, and reasonable pension reform are economic panaceas for state governments, but they're anathema to legislatures controlled by liberals who are beholden to public employees unions. So, while states like California, Illinois, and New York continue policies which nudge them ever closer to insolvency, states like Wisconsin are following a wiser course. The question is, why can't those other states, and our federal government, for that matter, see the wisdom of what Wisconsin has done?

Wednesday, March 2, 2016

Three Simple Rules for Beating Poverty

Ron Haskins, a Senior Fellow at the Brookings Institution, offers some advice to anyone who truly wishes to rise up out of poverty into the American middle class:
Policy aimed at promoting economic opportunity for poor children must be framed within three stark realities. First, many poor children come from families that do not give them the kind of support that middle-class children get from their families. Second, as a result, these children enter kindergarten far behind their more advantaged peers and, on average, never catch up and even fall further behind. Third, in addition to the education deficit, poor children are more likely to make bad decisions that lead them to drop out of school, become teen parents, join gangs and break the law.

In addition to the thousands of local and national programs that aim to help young people avoid these life-altering problems, we should figure out more ways to convince young people that their decisions will greatly influence whether they avoid poverty and enter the middle class. Let politicians, schoolteachers and administrators, community leaders, ministers and parents drill into children the message that in a free society, they enter adulthood with three major responsibilities: at least finish high school, get a full-time job, and wait until age 21 to get married and have children.

Our research shows that of American adults who followed these three simple rules, only about 2 percent are in poverty and nearly 75 percent have joined the middle class (defined as earning around $55,000 or more per year). There are surely influences other than these principles at play, but following them guides a young adult away from poverty and toward the middle class.
There's much more worth reading in Haskins' essay, and readers interested in the plight of the poor are urged to check it out. Here are a few suggestions, in addition to the three mentioned above, that Haskins is perhaps alluding to when he mentions other influences but doesn't make explicit:
  1. Get married before you have children.
  2. Stay away from drugs, alcohol and pornography.
  3. Strive to be the best employee at your workplace.
  4. Never stop learning.
  5. Limit your time on social media.
Sound too preachy? Consider #1, about which Haskins offers some statistics:
Today, more than 40 percent of American children, including more than 70 percent of black children and 50 percent of Hispanic children, are born outside marriage. This unprecedented rate of non-marital births, combined with the nation’s high divorce rate, means that around half of children will spend part of their childhood—and for a considerable number of these, all of their childhood — in a single-parent family.

As hard as single parents try to give their children a healthy home environment, children in female-headed families are four or more times as likely as children from married-couple families to live in poverty. In turn, poverty is associated with a wide range of negative outcomes in children, including school dropout and out-of-wedlock births.
Sure, it's harder for some than it is for others, given the circumstances of their lives, to rise into the middle class, but someone who wants to do it can certainly make it much less difficult by following Haskins' advice.

Tuesday, March 1, 2016

Fundamental Reality

This is a post I've run before but am reposting since it's relevant to some topics my students and I have been discussing in class:

For most of the 19th and 20th centuries it was the consensus view among scientists and philosophers that reality, the universe, was fundamentally material. The belief was that everything was reducible to matter and energy and that if there was any immaterial substance, it was a property of matter. Thus, in this materialist view, there was no such thing as mind or soul that existed independently of matter. Mind, if it existed, emerged from matter.

All this began to change in the 20th century with the development of quantum physics, and as that century came to a close and the new century began, a number of experiments were done which led physicists to believe that, in fact, mind is fundamental and that the material world is an emergent property of mind.

Rather than seeing the universe as a machine, as thinkers had done ever since Isaac Newton in the 17th century, the universe was now being viewed, in the words of Sir James Jeans, more like "a grand idea."

The following video gives a fairly good description of two experiments in physics which have led many (not all) scientists to agree with Jeans. The video moves quickly so you might wish to replay parts of it.

There's resistance to accepting the notion that the universe is a product of mind because such a view both refutes the materialism upon which atheism rests and fits nicely into a theistic view of the world (see the quote from physicist Alain Aspect below).

Nevertheless, this is the view accepted by a growing number of quantum physicists. Here are a few quotes to illustrate this:

  • “As a man who has devoted his whole life to the most clear headed science, to the study of matter, I can tell you as a result of my research about atoms this much: There is no matter as such. All matter originates and exists only by virtue of a force which brings the particle of an atom to vibration and holds this most minute solar system of the atom together. We must assume behind this force the existence of a conscious and intelligent mind. This mind is the matrix of all matter.” Max Planck (1944)
  • “Consciousness cannot be accounted for in physical terms. For consciousness is absolutely fundamental. It cannot be accounted for in terms of anything else.” Erwin Schroedinger.
  • “It will remain remarkable, in whatever way our future concepts may develop, that the very study of the external world led to the scientific conclusion that the content of the consciousness is the ultimate universal reality” Eugene Wigner 1961, Nobel Prize winner in 1963
  • "If materialism cannot accommodate consciousness and other mind-related aspects of reality, then we must abandon a purely materialist understanding of nature in general, extending to biology, evolutionary theory, and cosmology. Since minds are features of biological systems that have developed through evolution, the standard materialist version of evolutionary biology is fundamentally incomplete. And the cosmological history that led to the origin of life and the coming into existence of the conditions for evolution cannot be a merely materialist history." Philosopher Thomas Nagel
  • "What is more, recent experiments are bringing to light that the experimenter’s free will and consciousness should be considered axioms (founding principles) of standard quantum physics theory. So for instance, in experiments involving 'entanglement' (the phenomenon Einstein called 'spooky action at a distance'), to conclude that quantum correlations of two particles are nonlocal (i.e. cannot be explained by signals traveling at velocity less than or equal to the speed of light), it is crucial to assume that the experimenter can make free choices, and is not constrained in what orientation he/she sets the measuring devices...To understand these implications it is crucial to be aware that quantum physics is not only a description of the material and visible world around us, but also speaks about non-material influences coming from outside the space-time." Antoine Suarez, 2013
  • "Why do people cling with such ferocity to belief in a mind-independent reality? It is surely because if there is no such reality, then ultimately (as far as we can know) mind alone exists. And if mind is not a product of real matter, but rather is the creator of the “illusion” of material reality (which has, in fact, despite the materialists, been known to be the case, since the discovery of quantum mechanics in 1925), then a theistic view of our existence becomes the only rational alternative to solipsism (solipsism is the philosophical idea that only one’s own mind is sure to exist)." Alain Aspect, 2007
So far from mind being a product of a more fundamental material reality, these thinkers have concluded that matter actually is a phenomenon created by mind. Thus, in their view, the fundamental reality in the universe, and perhaps beyond, is mind.