Thursday, March 31, 2016

Falling Seas and Red Queens

Science operates by making testable predictions. If those predictions don't come to pass, the theory that generates them begins to teeter. Too many unfulfilled predictions and the theory is considered falsified and discarded. Proponents of the theory of global warming have made at least two predictions which, according to some scientific observers, have not been borne out.

The first prediction, made back in the 1990s, was that global temperatures would continue to rise sharply and catastrophically if nothing were done to limit greenhouse gases. Since then nothing much has been done to limit those emissions, but nevertheless the average global temperature hasn't shown any significant change in about 17 years.

The second prediction was that sea levels would rise throughout this century. Well, the century isn't over, of course, but so far it doesn't look good for that prediction either. According to a pair of charts at Real Science, sea levels at Atlantic City, NJ and Manhattan, NY have both been dropping since at least 2009.

This doesn't mean that global warming isn't going to be a long-term trend - it may be - but it does show that the science on this is far from "settled". It also shows that the calls from some quarters to punish, prosecute, and persecute those who are skeptical of the claims of the global warming proponents - by firing them from their jobs or even throwing them in prison - are not only the sort of thing one expects to hear in fascist dictatorships but are also based on an inexcusable attempt to make science serve a left-wing ideology.

These people, who would shut down dissent by force and intimidation, who demand that people be punished for disagreeing with the conclusions that global warming proponents infer from the data, remind one of the fatuous and cruel Red Queen in Alice in Wonderland demanding that anyone who displeases her have their heads cut off:

Wednesday, March 30, 2016

Mind and Materialism

Raymond Tallis at The New Atlantis discusses the devastating assault on philosophical materialism that began in the 1970s when American philosopher Thomas Nagel explored the question, "What is it like to be a bat?"

Nagel argued that there is something it is like to be a bat, whereas there is nothing it is like to be a stone. Bats, and people, have conscious experience that purely material objects do not have, and it is this conscious experience that is the defining feature of minds.

This experience, Tallis observes, is not a fact about the physical realm:
This difference between a person’s experience and a pebble’s non-experience cannot be captured by the sum total of the objective knowledge we can have about the physical makeup of human beings and pebbles. Conscious experience, subjective as it is to the individual organism, lies beyond the reach of such knowledge. I could know everything there is to know about a bat and still not know what it is like to be a bat — to have a bat’s experiences and live a bat’s life in a bat’s world.

This claim has been argued over at great length by myriad philosophers, who have mobilized a series of thought experiments to investigate it. Among the most famous is one involving a fictional super-scientist named Mary, who studies the world from a room containing only the colors black and white, but has complete knowledge of the mechanics of optics, electromagnetic radiation, and the functioning of the human visual system.

When Mary is finally released from the room she begins to see colors for the first time. She now knows not only how different wavelengths of light affect the visual system, but also the direct experience of what it is like to see colors. Therefore, felt experiences and sensations are more than the physical processes that underlie them.
Nagel goes on to make the claim, a claim that has put him in the bad graces of his fellow naturalists, that naturalism simply lacks the resources to account for conscious experience. Tallis writes:
But none of the main features of minds — which Nagel identifies as consciousness, cognition, and [moral] value — can be accommodated by this worldview’s [naturalism's] identification of the mind with physical events in the brain, or by its assumption that human beings are no more than animal organisms whose behavior is fully explicable by evolutionary processes.
One might wonder why naturalistic materialists are so reluctant to acknowledge that there's more to us than just physical matter. What difference does it make if an essential aspect of our being is mental? What does it matter if we're not just matter but also a mind? Indeed, what does it matter if we are fundamentally mind?

Perhaps the answer is that given by philosopher J.P. Moreland. Moreland makes an argument in his book Consciousness and the Existence of God that naturalism entails the view that everything that exists is reducible to matter and energy, that is, there are no immaterial substances. Thus, the existence of human consciousness must be explicable in terms of material substance or naturalism is likely to be false. Moreland also argues that there is no good naturalistic explanation for consciousness and that, indeed, the existence of consciousness is strong evidence for the existence of God.

Nagel, an atheist, doesn't go as far as Moreland in believing that the phenomena of conscious experience point to the existence of God, but he comes close, arguing that there must be some mental, telic principle in the universe that somehow imbues the world with consciousness. There is nothing about matter, even the matter which constitutes the brain, that can account for conscious experiences like the sensations of color or a toothache. There's nothing about a chemical reaction or the firing of nerve fibers that can conceivably account for what we experience when we see red, hear middle C, taste sweetness, or feel pain. Nor is there anything about matter that can account for the existence of moral value.

If it turns out that naturalism remains unable to rise to the challenge presented by consciousness then naturalism, and materialism, will forfeit their hegemony among philosophers, a hegemony that has already been seriously eroded.

Read the rest of Tallis' article at the link. It's very good.

Tuesday, March 29, 2016

Robot Consciousness (Pt. II)

As a follow-up to yesterday's post on Sophia, the robot featured in an article on robots and consciousness, it might be instructive to focus on a remark in the article by a CEO of a robotics firm. He's quoted as saying:
...fear will melt away, as people start interacting with robots. "A lot of the concerns people have about robots taking away all the jobs or wrecking the economy or rising up and killing us all, I think those fears are really overblown," he said.
Perhaps so, but on the other hand, there does seem to be some justification for concern about the economic consequences of robots in the workplace, especially those workplaces which employ low-skilled, easily replaced workers:
The CEO of Carl's Jr. and Hardee's has visited the fully automated restaurant Eatsa — and it's given him some ideas on how to deal with rising minimum wages.

"I want to try it," CEO Andy Puzder told Business Insider of his automated restaurant plans. "We could have a restaurant that's focused on all-natural products and is much like an Eatsa, where you order on a kiosk, you pay with a credit or debit card, your order pops up, and you never see a person."

Puzder's interest in an employee-free restaurant, which he says would be possible only if the company found time as Hardee's works on its northeastern expansion, has been driven by rising minimum wages across the US.

"With government driving up the cost of labor, it's driving down the number of jobs," he says. "You're going to see automation not just in airports and grocery stores, but in restaurants."

Puzder has been an outspoken advocate against raising the minimum wage, writing two op-eds for The Wall Street Journal on how a higher minimum wage would lead to reduced employment opportunities. "This is the problem with Bernie Sanders, and Hillary Clinton, and progressives who push very hard to raise the minimum wage," says Puzder. "Does it really help if Sally makes $3 more an hour if Suzie has no job?"

As a result, he and others in the fast-food business are investing big in automation. "If you're making labor more expensive, and automation less expensive — this is not rocket science," says Puzder.

For the time being, Puzder doesn't think that it's likely that any machine could take over the more nuanced kitchen work of Carl's Jr. and Hardee's. But for more rote tasks like grilling a burger or taking an order, technology may be even more precise than human employees. "They're always polite, they always upsell, they never take a vacation, they never show up late, there's never a slip-and-fall, or an age, sex, or race discrimination case," says Puzder of swapping employees for machines.
And then there's another disturbing story in the LA Times:
White House economists released a forecast that calculated more precisely whom Atlas and other forms of automation are going to put out of work. Most occupations that pay less than $20 an hour are likely to be, in the words of the report, “automated into obsolescence.”
Much of this displacement is projected to occur over the next two decades:
Powerhouse consultancies like McKinsey & Co. forecast that 45% of today's workplace activities could be done by robots, AI or some other already demonstrated technology. Some professors argue that we could see 50% unemployment in 30 years.

Human workers of all stripes pound the table claiming desperately that they're irreplaceable. Bus drivers. Bartenders. Financial advisors. Speechwriters. Firefighters. Umpires. Even doctors and surgeons. Meanwhile, corporations and investors are spending billions — at least $8.5 billion last year on AI, and $1.8 billion on robots — toward making all those jobs replaceable. Why? Simply put, robots and computers don't need healthcare, pensions, vacation days or even salaries.
Nor does a workforce full of robots require a Human Resources department, and nor are robots a source of friction and tension in the workplace. They don't take phony sick days, they don't require emotional and psychological maintenance or performance reviews, and they don't incur any of the other expensive burdens of human employees. A robot like Sophia, who appears human, would surely be an attractive investment for an employer groaning under the burden of rising minimum wages for minimally skilled workers who demand a wage out of all proportion to their value to their employer.

Those folks "Fighting for Fifteen" might soon be wondering why there aren't any jobs for them at all.

Monday, March 28, 2016

Robot Consciousness (Pt. I)

After a classroom discussion of some specific characteristics of consciousness that materialism has a difficult time explaining, a student forwarded me the link to this fascinating video:
The accompanying article quotes the developer of this amazing robot as saying that, "Our goal is that she will be as conscious, creative and capable as any human. We are designing these robots to serve in health care, therapy, education and customer service applications."

I'll believe that first sentence when I see it. Consider just one attribute of consciousness - understanding. A robot, a computer housed in a very artfully and intelligently designed mannequin, does not understand what it's doing. It's like a sophisticated version of philosopher John Searle's Chinese Room. Searle invites us to imagine a small room inside of which sits a man who understands not a word of Chinese. Slips of paper bearing questions written in Chinese characters, completely unintelligible to him, are passed to him through a slot. He has a rule book that tells him which characters to write down in response to the ones he receives, and he passes his reply, which he understands no better than the question, out another slot. To those outside, the room appears to understand Chinese, but the man inside understands nothing. This is essentially what a computer does and what any robot will do. They won't understand what they're doing and will lack that essential aspect of human consciousness.
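To make the point concrete, here is a minimal sketch - my own illustration, not anything from the article, with an invented "rule book" and invented phrases - of the purely syntactic symbol shuffling Searle has in mind. The program produces sensible-looking replies while understanding neither the questions nor the answers:

# A toy "Chinese Room": input symbols are mapped to output symbols by rule,
# with no understanding of either. Illustrative only; the phrases are made up.
RULE_BOOK = {
    "你好吗？": "我很好，谢谢。",      # "How are you?" -> "I'm fine, thanks."
    "你叫什么名字？": "我没有名字。",  # "What is your name?" -> "I have no name."
}

def chinese_room(slip):
    # Return whatever the rule book dictates; the "room" understands nothing.
    return RULE_BOOK.get(slip, "对不起，我不明白。")  # "Sorry, I don't understand."

print(chinese_room("你好吗？"))  # a fluent-looking reply, produced mindlessly

No matter how large the rule book grows, nothing in that loop ever means anything to the machine.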

Not only do computers not understand the information they process, there's also a host of other features of human consciousness which computerized robots will have a very difficult time achieving. Here are a few things that any human being can do that machines, no matter how intelligently designed, cannot: appreciate humor or beauty, feel gratitude or moral duty, experience disappointment, regret, guilt, boredom, resentment, or curiosity. Nor can machines doubt, hold a belief, desire, wish, worry, have ideas, assign meaning to what they do, or experience sensations like color, sound, fragrance, sweetness, etc. All of these are the hallmarks of consciousness, and the difficulties involved in replicating them in a machine make the article's next paragraph seem extremely optimistic:
Hanson said that one day robots will be indistinguishable from humans. Robots walk, play, teach, help and form real relationships with people, he said. "The artificial intelligence will evolve to the point where they will truly be our friends," he said. "Not in ways that dehumanize us, but in ways that rehumanize us, that decrease the trend of the distance between people and instead connect us with people as well as with robots."
An interesting aside to the above is that should such robots ever be developed they would be the creations of highly intelligent engineers, a fact which should give us pause. If consciousness, in our experience, can only be created by highly intelligent software designers, why do so many folks think that our consciousness is simply the product of random collisions of subatomic particles acting blindly over the span of a billion years or so with no goal in mind? To think that that's how our consciousness came to be requires a prodigious act of faith, indeed, blind faith, in the power of luck.

Sunday, March 27, 2016

Miracles, Multiverses, and Easter

The Christian world today celebrates what much of the rest of the Western world finds literally incredible: the revivification, 2000 years ago, of a man who had been dead for several days. Modernity finds such an account simply unbelievable. It would be a miracle if such a thing happened, moderns tell us, and in a scientific age everyone knows that miracles don't happen.

If pressed to explain how, exactly, science has made belief in miracles obsolete and how the modern person knows that miracles don't happen, the skeptic will often fall back on an argument first articulated by the Scottish philosopher David Hume (d. 1776). Hume wrote that miracles are a violation of the laws of nature, and since firm and unalterable experience tells us that there has never been a violation of the laws of nature, it follows that any report of a miracle is most likely to be false. Thus, since we should always believe what is most probable, and since any natural explanation of an alleged miracle is more probable than that a law of nature was broken, we are never justified in believing that a miracle occurred.
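Hume's maxim is often given a probabilistic gloss along the following lines (a standard Bayesian reconstruction, not Hume's own notation). Writing M for "a miracle occurred" and T for "we have testimony that it did," Bayes' theorem gives the odds:

\[
\frac{P(M \mid T)}{P(\neg M \mid T)} \;=\; \frac{P(T \mid M)}{P(T \mid \neg M)} \cdot \frac{P(M)}{P(\neg M)}
\]

On this reading, the testimony makes the miracle more likely than not only if the report would be far harder to explain on the assumption that no miracle occurred than on the assumption that one did - hard enough to overcome the vanishingly small prior odds Hume assigns to M. That is his claim that the falsehood of the testimony would have to be "more miraculous" than the miracle itself.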

It has often been pointed out that Hume's argument suffers from the circularity of basing the claim that reports of miracles are not reliable upon the belief that there's never been a reliable report of one. However, we can only conclude that there's never been a reliable report of one if we know a priori that all historical reports are false, and we can only know that if we know that miracles are impossible. But we can only know they're impossible if we know that all reports of miracles are unreliable.

But set that dizzying circularity aside. Set aside, too, the fact that one can say that miracles don't happen only if one can say with certainty that there is no God.

Let's look instead at the claim that miracles are prohibitively improbable because they violate the laws of nature.

A law of nature is simply a description of how nature operates whenever we observe it. The laws are often statistical. For example, if molecules of hot water are added to a pot of cold water, the molecules will tend eventually to distribute themselves evenly throughout the container so that the water reaches a uniform temperature. It would be extraordinarily improbable, though not impossible, nor a violation of any law, for the hot molecules on one occasion to segregate themselves all on one side of the pot.
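To put a rough number on "extraordinarily improbable" (a back-of-the-envelope figure of my own, assuming each molecule independently has an equal chance of ending up in either half of the pot), the probability that all N molecules find themselves on one particular side at a given moment is

\[
P = \left(\tfrac{1}{2}\right)^{N},
\]

and for even a droplet containing on the order of N = 10^{21} molecules that works out to roughly one chance in 10^{3 \times 10^{20}} - not zero, and not forbidden by any law, just unimaginably unlikely.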

Similarly, miracles may not violate the natural order at all. Rather they may be highly improbable phenomena that would never be expected to happen in the regular course of events except for the intervention of Divine will. Like the segregation of warm water into hot and cold portions, the reversal of the process of bodily decomposition is astronomically improbable, but it's not impossible, and if it happened it wouldn't be a violation of any law.

The ironic thing about the skeptics' attitude toward the miracle of the resurrection of Christ is that they refuse to admit that there's good evidence for it because a miracle runs counter to their experience and understanding of the world. Yet they have no trouble believing other things that also run counter to their experience.

For example, modern skeptics have no trouble believing that living things arose from non-living chemicals, that the information-rich properties of life emerged by random chaos and chance, or that our extraordinarily improbable, highly-precise universe exists by fortuitous accident. They ground their belief in these things on their belief that there could be an infinite number of different universes, none of which is observable, and in an infinite number of worlds even highly improbable events are bound to happen.

Richard Dawkins, for example, rules out miracles because they are highly improbable, and then in the very next breath tells us that the naturalistic origin of life, which is at least as improbable, is almost inevitable, given the vastness of time and space.

Unlimited time and/or the existence of an infinite number of worlds make the improbable inevitable, he and others argue. There's no evidence of other worlds, unfortunately, but part of the faith commitment of the modern skeptic is to hold that these innumerable worlds must exist. The skeptic clings to this conviction because if these things aren't so then life and the universe we inhabit must have a personal, rather than a scientific, explanation and that admission would deal a considerable metaphysical shock to his psyche.

Nevertheless, if infinite time and infinite worlds can be invoked to explain life and the cosmos, why can't they be invoked to explain "miracles" as well? If there is a near-infinite series of universes, as has been proposed in order to avoid the problem of cosmic fine-tuning, then surely in all the zillions of universes of the multiverse landscape there has to be at least one in which a man capable of working miracles is born and himself rises from the dead. We just happen to be in the world in which it happens. Why should the multiverse hypothesis be able to explain the spectacularly improbable fine-tuning of the cosmos and the otherwise impossible origin of life but not a man rising from the dead?

For the person who relies on the multiverse explanation to account for the incomprehensible precision of the cosmic parameters and constants and for the origin of life from mere chemicals, the resurrection of a dead man should be no problem. Given enough worlds and enough time it's a cinch to happen.
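The logic being borrowed here can be stated in a single line (a standard bit of probability, assuming the worlds are independent): if an event has any nonzero probability p of occurring in a given world, then across N worlds the probability that it occurs at least once is

\[
P(\text{at least once}) = 1 - (1 - p)^{N} \;\to\; 1 \quad \text{as } N \to \infty,
\]

which is precisely the move made when the multiverse is invoked to tame the improbability of fine-tuning or of the origin of life.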

No one who's willing to believe in a multiverse should be a skeptic about miracles. Indeed, no one who's willing to believe in the multiverse can think that anything at all is improbable. Given the multiverse everything that is not logically impossible must be inevitable.

Of course, the skeptic's real problem is not the claim that a man rose from the dead but rather the claim that God deliberately raised this particular man from the dead. That's what they find repugnant, but they can't admit that, because in order to justify their rejection of the miracle of the Resurrection they'd have to be able to prove that there is no God, or that God's existence is at least improbable, and that sort of proof is beyond anyone's ability to accomplish.

If, though, one is willing to assume the existence of an infinite number of universes in order to explain the properties of our universe, why would he have trouble accepting that there's a Mind out there that's responsible for raising Jesus from the dead? After all, there's a lot more evidence for the latter than there is for the former.

Saturday, March 26, 2016

Clash of Civilizations

It's been said that a conservative is a liberal who's been mugged. A mugging tends to concentrate the mind and often causes us to dispense with idealistic and naive notions about others. Something like this seems to be happening on a large scale in Europe, where terrorist attacks perpetrated by members of the religion of peace are causing a lot of people to abandon some naive attitudes about Muslim immigrants, and Islam in general, and to demand that, well, something be done. Here are some excerpts from a recent New York Times piece:
Nigel Farage, a leader of the populist, conservative U.K. Independence Party, said: “I think we’ve reached a point where we have to admit to ourselves, in Britain and France and much of the rest of Europe, that mass immigration and multicultural division has for now been a failure.”

The attacks will also put more strain on the deal brokered last week by Chancellor Angela Merkel of Germany with the Turkish government to restrict the migrant flow into Europe, in return for more liberal visa arrangements for travel into Europe by Turkish nationals. That deal was already being criticized as a security threat to Europe and had been questioned on humanitarian and legal grounds.

“In the public eye everything gets connected: the mass abuse in Cologne on New Year’s Eve and the attacks today,” said Rem Korteweg, a security analyst at the Center for European Reform in London, referring to the sexual abuse and robberies in Cologne on New Year’s Eve that were linked to migrants. “However different, in the public mind and for the euroskeptic populace, they’re all the same thing.”

But in Europe, the insecurity around migration and terrorism has challenged key beliefs and principles of the European Union. The Schengen area of visa-free travel across 26 countries has already broken down under the pressure of the migrant flow, with many worried that the zone may never be fully resurrected because of terrorism. Analysts say that attacks like those in Paris and Brussels make it far more likely that European governments will insist on stricter passport, visa and luggage checks at their borders.
This liberal Belgian, posting on Twitter, is probably not atypical:
I’m all for integration and tolerance, but something is rotten to the core when it comes to Muslim culture within Europe. Djihadis, fundamentalists, whatever you want to call them are either too plentiful or have too much influence. Whether that was our fault due to not giving them the tools to integrate or theirs for refusing to take advantage of those tools is besides the point.

I am done defending this culture. I am done playing devil’s advocate… Perhaps it’s time we showed the world again that when we stand as one force, we will not bend. It’s time to show that dogs without bark can still bite.
Europeans have convinced themselves for at least a generation that all cultures are equally "valid" and all religions are aiming at the same ends. They've persuaded themselves that they can admit into their national home millions of immigrants whose values are diametrically opposed to their own and that despite the cultural clash everyone will be sure to work synergistically to bring about the Age of Aquarius. Now they're discovering through bitter experience that things don't seem to work that way in the real world - that these people, or at least a substantial fraction of them (see video), passionately hate Western civilization and are eager to kill Europeans and their children - and this blow to the Europeans' worldview is disorienting.
If Shapiro is correct, that's 500,000 Muslims in America who think terror against civilians is sometimes justified, and almost 700 million Muslims worldwide who want to see sharia law imposed everywhere.

There's been no word yet from President Obama as to whether he's having second thoughts after Paris and Brussels about the wisdom of admitting hundreds of thousands of Muslim refugees into this country, although I doubt that he is. He's not the sort of man who entertains doubts about himself. Surely, though, there are ways of helping these wretched refugees without putting ourselves in the situation Europe now finds itself in. Just as many compassionate Americans help the poor and homeless in countless ways without giving them the keys to their houses, so it seems that as a nation we can do likewise.

In any case, I'm going to step out on a limb and make a prediction: If the president goes ahead with plans to allow unvetted refugees to pour into this country, and we suffer another Islamist-inspired terror attack this summer, Trump will win in November in a landslide.

Friday, March 25, 2016

A Good Friday Meditation

Some time ago we did a post based on a remark made by a woman named Tanya at another blog. I thought that as we approach Good Friday it might be worth running the post again, slightly edited.

Tanya's comment was provoked by an atheist at the other blog who had issued a mild rebuke to his fellow non-believers for their attempts to use the occasion of Christian holidays to deride Christian belief. In so doing, he exemplified the sort of attitude toward those with whom he disagrees that one might wish all people, atheists and Christians alike, would adopt. Unfortunately, Tanya spoiled the mellow, can't-we-all-just-get-along mood by manifesting a petulant asperity toward, and an unfortunate ignorance of, the traditional Christian understanding of the atonement.

She wrote:
I've lived my life in a more holy way than most Christians I know. If it turns out I'm wrong, and some pissy little whiner god wants to send me away just because I didn't worship him, even though I lived a clean, decent life, he can bite me. I wouldn't want to live in that kind of "heaven" anyway. So sorry.
Tanya evidently thinks that "heaven" is, or should be, all about living a "clean, decent life." Perhaps the following tale will illustrate the shallowness of her misconception:
Once upon a time there was a handsome prince who was deeply in love with a young woman. We'll call her Tanya. The prince wanted Tanya to come and live with him in the wonderful city his father, the king, had built, but Tanya wasn't interested in either the prince or the city. The city was beautiful and wondrous, to be sure, but the inhabitants weren't particularly fun to be around, and she wanted to stay out in the countryside where the wild things grow. Even though the prince wooed Tanya with every gift he could think of, it was all to no avail. She wasn't smitten at all by the "pissy little whiner" prince. She obeyed the laws of the kingdom and paid her taxes and was convinced that that should be good enough to satisfy the king's demands.

Out beyond the countryside, however, dwelt dreadful, orc-like creatures who hated the king and wanted nothing more than to be rid of him and his heirs. One day they learned of the prince's love for Tanya and set upon a plan. They snuck into her village, kidnapped Tanya, and sent a note to the king telling him that they would be willing to exchange her for the prince, but if their offer was refused they would kill Tanya.

The king, distraught beyond words, related the horrible news to the prince.

Despite all the rejections the prince had experienced from Tanya, he still loved her deeply, and his heart broke at the thought of her peril. With tears he resolved that he would do the Orcs' bidding. The father wept bitterly because the prince was his only son, but he knew that his son's love for Tanya would not allow him to let her suffer the torment to which the ugly people would surely subject her. The prince asked only that his father try his best to persuade Tanya to live safely in the beautiful city once she was ransomed.

And so the day came for the exchange, and the prince rode bravely and proudly astride his mount out of the beautiful city to meet the ugly creatures. As he crossed an expansive meadow toward the camp of his mortal enemy, he stopped to make sure they released Tanya. He waited until she was out of the camp, fleeing toward the safety of the king's city, oblivious in her near-panic that it was the prince himself she was running past as she hurried toward the city walls. He could easily have turned back now that Tanya was safe, but he had given his word that he would do the exchange, and the ugly people knew he would never go back on his word.

The prince continued stoically and resolutely into their midst, giving himself for Tanya as he had promised. Surrounding him, they pulled him from his steed, stripped him of his princely raiment, and tortured him for three days in the most excruciating manner. Not once did any sound louder than a moan pass his lips. His courage and determination to endure whatever agonies to which he was subjected were strengthened by the assurance that he was doing it for Tanya and that because of his sacrifice she was safe.

Finally, wearying of their sport, they cut off his head and threw his body onto a garbage heap.

Meanwhile, the grief-stricken king, his heart melting like ice within his breast, called Tanya into his court. He told her nothing of what his son had done, his pride in the prince not permitting him to use his son's heroic sacrifice as a bribe. Even so, he pleaded with Tanya, as he had promised the prince he would, to remain with him within the walls of the wondrous and beautiful city where she'd be safe forevermore.

Tanya considered the offer, but decided that she liked life on the outside far too much, even if it was risky, and besides, she really didn't want to be in too close proximity to the prince. "By the way," she wondered to herself, "where is that pissy little whiner son of his anyway?"
Have a meaningful Good Friday. You, too, Tanya.

Thursday, March 24, 2016

Stop Pretending

Almost as regularly as if it were a law of nature, whenever someone points out that there is something deeply wrong with a religion which produces so many evil, murderous people, enlightened folk on the left reply by reminding us that other religions (i.e. Christianity) have also produced evil, murderous people. After all, they declaim as they activate their historical dredges, what about the Crusades 900-1000 years ago and the Inquisition 600-700 years ago?

Matt Walsh has had quite enough of this silliness and has penned an amusing riposte to this foolish "religious equivalence" argument which I urge readers to check out in its entirety. Here are some especially pungent passages:
....Speaking of amounting to nothing, I vented my frustration on Twitter, because what else can I do? I said I’m sick of hearing about the great benefits of multiculturalism. Europe is drowning in a tidal wave of unassimilated Muslims who are actively hostile to Europe’s culture, or what’s left of it. And for their trouble, Europeans are being gang raped in the street at night and blown to pieces during their morning commute.

Diversity is a strength, they tell me, but I have seen no evidence to support this doctrine. Diversity of thought might be a strength, but even then it is only a strength if the thought is rational and directed towards truth. The nonsensical thoughts of relativistic nincompoops are not valuable or helpful.

Similarly, racial and cultural diversity does not enrich us if we lose our identity in the process. When you throw a bunch of people with diametrically opposed beliefs and values and priorities into a food processor and hit frappe, you end up with a smoothie that tastes an awful lot like the collapse of western civilization and the rise of barbarians.

I didn’t say all of that on Twitter, what with the character limit, but that was my point. I also said I’m sick of the religious equivalences. Every time Muslims kill a bunch of people, secular liberals start in with their standard attempts to prove Christians are just as bad. It seems nothing will ever convince them that perhaps Islam is somewhat unique in its capacity for violence and atrocities.

So, naturally, my protests against equivalences were met with equivalences. A bunch of leftists told me Christianity and Islam are equal because of, among other things, a series of 11th century military campaigns. And so on and so on and so on. But it got even more absurd. Leftists are so desperate to draw parallels between Christianity and Islam that one of them attempted to mitigate the Brussels attack by reminding me about the Christians who took over a nature reserve in Oregon a few weeks ago.

Me: “Muslims keep blowing things up.”

Liberals: “Yeah but Christians trespassed on a wildlife refuge in Oregon!”

God help us.

I know this is probably a futile effort, but, in light of this umpteenth Islamic terrorist attack this year, I want to concentrate on this Christianity vs. Islam comparison for a moment. I know I probably will never convince leftists to stop idolizing multiculturalism and diversity, but perhaps I can convince them to stop pretending Islam and Christianity are on the same playing field.

OK, I know I will not convince them of that either, but allow me to waste my breath anyway.

Let’s start with the fact that we knew the terrorists in Brussels were Muslim without waiting for anyone to confirm it. We always know it without being told. Leftists know it too, but they haven’t stopped to ask themselves, if Islam is a peaceful religion, why are Muslims literally the only people in the world setting bombs off in subway stations and airports and theaters and embassies and restaurants. Spin this anyway you like, but right now the global terrorism market is a Muslim monopoly. We are certain a terrorist attack was carried out by Muslims the moment the bomb explodes. Shouldn’t that tell you something?

There is no Christian terrorism epidemic. That’s why nobody stops and says, “Wait, maybe the Brussels airport is covered in blood and debris because a white Presbyterian was trying to make a point about Jesus!” Nobody says that because there is literally zero chance of that being the case. Zero chance. It’s not possible. On the extraordinarily rare occasion that a Christian launches a lethal attack against a civilian target in the name of his faith, it’s almost always against something like an abortion clinic. And even then, it almost never happens. The last one was months ago, and the dude was psychotic, not religious.

Still, if you count him, that makes about one attack every decade or so, usually with no casualties, carried out against a facility that executes children. Compare that with the endless stream of massive assaults waged in the name of Islam against entirely random and innocent civilians sitting in restaurants or waiting in lines at airports or subway stations. You can’t compare them. There is no comparison to be made.

Indeed, the fact that abortion clinics don’t have to be fortified and surrounded by 40 armed guards every hour of the day shows just how incredibly effective Christianity is at preventing its adherents from resorting to violence in its name. So, yes, Christianity can lecture other religions about violence. Christianity is much better at standing against violence. Christianity is much more effective at advancing peace in the world. Christianity is just a better religion. It’s better. In every way. It’s better.

And it’s better not only because far fewer acts of evil are performed in its name, but because many more acts of love and mercy are performed in its name. No other religion sends people out to every decaying and forgotten corner and crevice of this Earth to heal the sick, serve the needy, and minister to the hopeless. No other religion runs nearly as many hospitals, homeless shelters, soup kitchens, etc.

If you find a group of foreigners digging a well in Guatemala or handing out mosquito nets in Uganda, they’re probably Christians. On the other hand, if you find a group of foreigners planting explosives in a subway station, they’re probably Muslims.

Christians themselves are flawed, but the faith has had, to put it mildly, an unmistakably positive influence on the world. Right now, as we speak, there are millions upon millions of people across the planet who would not be eating, taking medicine, or sleeping in a warm bed without the concerted efforts of Christians acting at the behest of Scripture. And as a reward, some of them can look forward to being crucified, literally. By Muslims, of course.

Christianity built Western civilization. Christianity advanced the doctrine of Natural Law, which serves as the basis for all of our liberties. Christianity defeated slavery and won the fight for civil rights. Christianity had a hand, and sometimes was the only hand, in most every good and decent thing about this world.

Christians are not perfect, but Christianity is. And the more Christian the world is, the better the world is. The less, the worse. That’s how it’s worked for 2,000 years. History has demonstrably proven Christianity to be an objectively necessary and indispensable force for good over and over and over again.

Equivalence? You cannot begin to find one. There are bad Christians and good Muslims, but if Christianity ceased to exist, millions of people would die. If Islam ceased to exist, millions of people wouldn’t. Draw whatever conclusions you want from there, but you cannot conclude that the two are equal.

But what are Christian fundamentalists? Women who wear long skirts and give birth to more than two kids? Men who go to church and read their Bibles? Unmarried couples who save sex for marriage? Pro-lifers who pray outside of abortion clinics? Bakers who won’t make cakes for gay weddings? Nuns? Pastors? Missionaries?

Liberals are fond of saying “fundamentalism” is the problem generally, as if living by your convictions is wrong regardless of the nature of your convictions. Such an idiotic notion can be expected from moral relativists who believe nothing to be fundamentally true, therefore anyone who adheres to any fundamental doctrine, no matter the doctrine, is dangerous.

On the contrary, Christian fundamentalism is a great blessing to society. It makes people peaceful, disciplined, humble, and kind. A Christian fundamentalist opens a pregnancy center to help women. A Muslim fundamentalist drags a woman into the town square and stones her to death. Equivalence? Stop it.

Obscure nut-jobs like the Westboro Baptists are not Christian fundamentalists. They are apostates. They’ve fabricated their own fundamentals and sprinkled a little Jesus on top of the fake religion they made up. Christian fundamentalists are only a problem when they are fundamentally dedicated to the fundamentals of their own heresies. But even these Christian heretics aren’t often found shooting up Paris to advance their cause.
After considering a few similarities between liberalism and Islamism, Walsh closes with this:
But of the theist religions, Islam is the only one routinely massacring civilians across the world. The only one. Stop claiming otherwise. Stop equivocating.
It's often said that the murderous Muslims are only a small percentage of Muslims, but there are said to be 1.6 billion Muslims in the world. Even if only 0.1% of them are jihadists, that's still 1.6 million radical killers, and that doesn't count the millions more who are sympathetic to them. That's quite a lot of people who would like to see you dead.

It's also often said that these people are not true Muslims and are not following the teachings of the Koran. On the contrary, there's ample warrant in the Koran for everything they do, but more importantly, they are following the example of the founder of their religion. Whatever Mohammad may have recorded in print, he himself was a violent warlord who condoned beheading one's foes, sex slavery, and warfare against non-Muslims.

People sometimes ask why more Muslims don't condemn the sort of terrorism we've seen in Paris and Brussels and from ISIS. One reason, perhaps, is that they realize they can't condemn it without distancing themselves from the example of their Prophet, and this, they see, would be an unforgivable betrayal of their faith.

Wednesday, March 23, 2016

Quantum Spookiness

The universe is a very strange place, stranger than we can imagine. One of the strangest things about it is something Albert Einstein once referred to as "spooky action at a distance." In quantum mechanics there's a phenomenon called quantum entanglement. No one knows how it works, no one really understands it, but every time it's been tested it's been shown to exist, and it's absolutely bizarre.

Here's the nutshell version: Two subatomic particles, e.g. electrons, can be produced from the disintegration of another particle. These daughter particles then travel at enormous velocities away from each other, yet they somehow remain connected: when a property of one of them is measured, the corresponding property of the other is instantly fixed as well, even though any signal passing between the two would have to travel faster than light to accomplish this. That, as far as we know, is impossible, so how does the second electron "know" what has happened to the first? No one knows the answer, which is why Einstein, who could never accept the idea of entanglement, called the phenomenon "spooky."

Here's an excellent 15 minute video featuring physicist Brian Greene explaining this quantum weirdness:
An article at Nature discusses a recent test that pretty much clinches the theory that somehow particles that are widely separated from each other, even at opposite ends of the universe, are still in some mysterious way connected so that they can communicate instantaneously with each other:
It’s a bad day both for Albert Einstein and for hackers. The most rigorous test of quantum theory ever carried out has confirmed that the ‘spooky action at a distance’ that the German physicist famously hated — in which manipulating one object instantaneously seems to affect another, far away one — is an inherent part of the quantum world.

The experiment, performed in the Netherlands, could be the final nail in the coffin for models of the atomic world that are more intuitive than standard quantum mechanics, say some physicists. It could also enable quantum engineers to develop a new suite of ultrasecure cryptographic devices.

“From a fundamental point of view, this is truly history-making,” says Nicolas Gisin, a quantum physicist at the University of Geneva in Switzerland. In quantum mechanics, objects can be in multiple states simultaneously: for example, an atom can be in two places, or spin in opposite directions, at once. Measuring an object forces it to snap into a well-defined state. Furthermore, the properties of different objects can become ‘entangled’, meaning that their states are linked: when a property of one such object is measured, the properties of all its entangled twins become set, too.

This idea galled Einstein because it seemed that this ghostly influence would be transmitted instantaneously between even vastly separated but entangled particles — implying that it could contravene the universal rule that nothing can travel faster than the speed of light. He proposed that quantum particles do have set properties before they are measured, called hidden variables. And even though those variables cannot be accessed, he suggested that they pre-program entangled particles to behave in correlated ways.
The recent experiments cited in the Nature article are said to show that Einstein was wrong and that entanglement exists. The universe is indeed a very strange place.
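For the curious, here is a small sketch - my own illustration, not code from the Delft experiment - of the CHSH quantity that Bell tests of this kind measure. For entangled spin-1/2 particles in the singlet state, quantum mechanics predicts a correlation of E(a,b) = -cos(a-b) between detectors set at angles a and b, which pushes the combined quantity S to 2√2 ≈ 2.83, beyond the bound of 2 that any local hidden-variable theory of the sort Einstein favored can reach:

import math

def E(a, b):
    # Quantum-mechanical correlation for spin measurements at detector
    # angles a and b on a pair of particles in the entangled singlet state.
    return -math.cos(a - b)

# Standard CHSH detector settings (in radians)
a1, a2 = 0.0, math.pi / 2
b1, b2 = math.pi / 4, 3 * math.pi / 4

S = E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

print(abs(S))      # about 2.828, i.e. 2 * sqrt(2)
print(abs(S) > 2)  # True: exceeds the limit any local hidden-variable theory allows

Experiments like the one described above measure S directly, find a value above 2, and thereby rule out the kind of hidden variables Einstein proposed.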

Tuesday, March 22, 2016

Faith vs Fact

Biologist Austin Hughes has a lengthy critique at The New Atlantis of fellow biologist Jerry Coyne's recently published screed against religion titled Faith vs. Fact. Hughes gives a couple of paragraphs in his review to two challenges Coyne directs at believers and adroitly dispatches both of them. Here's Hughes:
Coyne issues the following challenge to his readers: “Over the years, I’ve repeatedly challenged people to give me a single verified fact about reality that came from scripture or revelation alone and then was confirmed only later by science or empirical observation.” I can think of one example, which comes from the work of St. Thomas Aquinas (whose writings Coyne badly misrepresents elsewhere in his book). Based on his exposure to Aristotle and Aristotle’s Arab commentators, Aquinas argued that it is impossible to know by reason whether or not the universe had a beginning. But he argued that Christians can conclude that the universe did have a beginning on the basis of revelation (in Genesis).

In most of the period of modern science, the assumption that the universe is eternal was quietly accepted by virtually all physicists and astronomers, until the Belgian Catholic priest and physicist Georges Lemaître proposed the Big Bang theory in the 1920s. Coyne does not mention Lemaître, though he does mention the data that finally confirmed the Big Bang in the 1960s. But, if the Big Bang theory is correct, our universe did indeed have a beginning, as Aquinas argued on the basis of revelation.

Coyne pairs the above challenge with an earlier challenge from new atheist writer Christopher Hitchens: “Name me an ethical statement made or an action performed by a believer that could not have been made or performed by a non-believer.” I agree that there is no a priori reason why atheists could not perform the kinds of heroic actions of self-sacrifice on behalf of the poor and marginalized that St. Vincent de Paul or St. Damien of Molokai are known for. It’s just that atheists so very rarely do. They have little to compare to the lives of the saints as a storehouse of examples of moral greatness.
Actually, I think there's a better answer to this second challenge, to wit: "Cruelty is objectively wrong." True, a non-believer can speak the words, but if atheism is correct, the assertion would be false. On atheism there simply are no objective rights and wrongs.

In any case, Coyne's book, like much of the New Atheist genre, is filled with misunderstandings, self-contradictions, non-sequiturs, and philosophical simple-mindedness as both Hughes and philosopher Ed Feser have documented. Here are a couple of examples of Coyne's infelicities that occurred to me as I read the book last summer:

Throughout his book Coyne adopts a tendentious definition of faith. He insists that faith is believing something despite the lack of evidence, or even in the face of counter-evidence. He observes that such faith is considered a vice in science but esteemed as a virtue in religion, and indeed it would be a scientific vice were the definition correct, but it's not. Faith, whether in science or religion, is manifestly not belief despite the lack of evidence. It's belief despite the lack of proof or certainty, which is a quite different matter. Coyne's straw man definition is a description of "blind" faith, but it's not an accurate description of the faith of millions of thoughtful believers.

He goes on to make the astonishing claim that there's no evidence whatsoever of the existence of supernatural entities or powers. I say this is an astonishing claim because it requires the one who makes it to willfully avert his eyes from the voluminous positive evidence afforded by both cosmology and biology - the fine-tuning of cosmic parameters and constants, the evidence of a beginning to the universe, the irreducible complexity of many cellular bio-machines and systems, and on and on. The claim also requires that the one who affirms it be wholly unacquainted with the numerous powerful metaphysical arguments for God's existence.

To be sure, it is perhaps the case that none of this evidence amounts to a certainty that God exists. Perhaps, though it's doubtful, naturalists will someday show that there are satisfactory naturalistic explanations for all of these aforementioned facts and arguments, but to think that these are not evidence is to belie a fundamental confusion in one's mind between evidence and proof. It is also to confuse plausibility with certainty. Ideally, both philosophers and scientists believe what they conclude to be the most plausible, or probably true, hypothesis. They don't wait for certainty, like Godot's friends waiting for him to show up, before making their epistemic commitments. One may suppose that the evidence is not sufficiently powerful to compel one to accept the conclusion that there is a God, especially if one does not want there to be a God in the first place, but what one cannot say is that there is no evidence that would justify the conclusion that God exists if one were open to that possibility.

The accumulated labor of both scientists and philosophers has taken that option off the table.

Monday, March 21, 2016

Determinism, Compatibilism and Libertarianism

In class discussions of free will and determinism, a number of students have asked if there isn't a middle way. One student even dug a post out of the archives that I did on just such a via media back in 2008 (12/24). The post starts out by addressing the notion of a kind of compromise position between libertarian free will and determinism, usually referred to as "compatibilism," and ends up summarizing the discussions we've had in class on these different philosophical positions. Here it is:

Barry Arrington at Uncommon Descent offers a succinct rebuttal of compatibilism, i.e. the view that our choices are fully determined and yet at the same time free. As Arrington points out, this certainly sounds like a contradiction.

The compatibilist, however, defines freedom as the lack of coercion, so as long as nothing and no one is compelling your behavior, your choice is completely free, even though at the moment you make your decision there's in fact only one possible choice you could make. Your choice is determined by the influence of your past experiences, your environment, and your genetic make-up. The feeling you have that you could have chosen something other than what you did choose is simply an illusion, a trick played on us by our brains.

Compatibilism, however, doesn't solve the controversy between determinism and libertarianism (the belief that we have free will). It simply uses a philosophical sleight-of-hand to define it away. As long as it is the case that at any given moment there's just one possible future, our choices are determined by factors beyond our control, and if they're determined it's very difficult to see how we could be responsible for them. Whether we are being compelled by external forces to make a particular choice or not, we are still being compelled by internal factors that make our choice inevitable. Moreover, these internal factors are themselves the product of genetic and/or environmental influences.

The temptation for the materialist is to simply accept determinism, but not only does this view strip us of any moral responsibility, it seems to be based on a circularity: The determinist says that our choices are the inevitable products of our strongest motives, but if questioned as to how we can identify our strongest motives he would simply invite us to examine the choices we make. Since our strongest motives determine our actions our strongest motives are whichever motives we act upon. But, if so, the claim that we always act upon our strongest motives reduces to the tautology that we always act upon the motives we act upon. This is certainly true, but it's not very edifying.

On the other hand, it's also difficult to pin down exactly what a free choice is. It can't be a choice that's completely uncaused because then it wouldn't be a consequence of our character and in what sense would we be responsible for it? But if the choice is a product of our character, and our character is the result of our past experiences, environment, and our genetic make-up, then ultimately our choice is determined by factors over which we have no control, and we're back to determinism.

It seems to me that if materialism is true, and all we are is a material, physical being, and all of our choices are simply the product of chemical reactions occurring in the brain, then determinism must be true as well. If so, moral responsibility and human dignity are illusions, and no punishment or reward could ever be justified on grounds of desert.

This all seems completely counter-intuitive to most people so they hold on to libertarianism, even if they can't explain what a free choice is, but libertarianism is incompatible with materialism. Only if we have a non-physical, immaterial mind that somehow functions in human volition can there be free will and thus moral responsibility and human dignity.

Saturday, March 19, 2016

The Inexplicable Conservative Adulation of Trump

One of the most astonishing phenomena in modern politics has been the support for Donald Trump among people who have spent their careers posing as the arbiters and gatekeepers of conservative ideological orthodoxy. The reason this is such a remarkable development is that throughout his life Trump has aligned himself with people and causes that are the antithesis of conservatism. Even now he eschews conservative principle, taking both sides of some issues and reversing himself on some positions six times before breakfast.

John Zeigler at Mediaite has a strongly worded column on the paradox of prominent conservatives like Rush Limbaugh gushing over Trump, who gives every appearance of being an erstwhile liberal masquerading in this election cycle as a conservative.

Here are some excerpts from Zeigler's essay which should, of course, be read in its entirety:
To fully understand how and why the “conservative” (I use that term loosely) media willingly enabled this hostile takeover of the Republican Party, you primarily need to comprehend what a fraud the entire industry is. In short, the vast majority of “conservative” media is simply just a business cynically disguised as a cause....

It is my view that ... the “conservative” media had their most influence on Trump eventually getting the nomination [when he first announced his candidacy]. Trump’s campaign was like a rocket ship where the most perilous moments are during liftoff. If the conservative base had not accepted him [as] a serious or credible candidate, then he would have quickly crashed and burned because, without traction, the media oxygen which would fuel his flight would have immediately evaporated....

[T]his made it much easier for “conservative” media icons like Limbaugh, Hannity, Bill O’Reilly, and even Levin to “play with fire” during the dull summer months of 2015 and seriously entertain the concept of a Trump candidacy. It can not be overstated that, given Trump’s very liberal history and lack of credentials, just how impossible this would have been for them to have even considered this reaction to his candidacy if Trump did not bring them celebrity/ratings during a down period....

[A]s the fire they created began to spread and gain strength, it became impossible for anyone to control, even if they wanted to. In short, once the average Trump supporter bought into the bogus “conservative” media narrative about him which absolved all his many past sins, he was basically unstoppable. After all, if he really wasn’t one of “us,” or was somehow bad for the “cause,” surely Rush, or Sean, or Bill, or Mark, or Matt would tell them that. Right?!

Once his liftoff was given clearance from conservative heavyweights, the Trump phenomenon went into orbit with his total domination of the news media. Desperately thirsting for the ratings he brought them, the cable news networks (even the liberals at CNN and MSNBC) shattered all semblances of journalistic standards by allowing him to appear for unprecedented amounts of time, usually unfettered. This sucked up all the oxygen from all the other candidates in the far-too-crowded field. It also allowed Trump to laughably claim he was self-funding his campaign when in reality he was hardly needing to spend a dime because of all the free airtime he was given (this is just one of dozens of key lies Trump has told, which the “conservative” media has conveniently ignored).

This created a self-perpetuating symbiotic relationship between his free airtime and his poll numbers. The more potential Trump voters saw him being taken seriously in the media, the more they gained the courage to tell pollsters that they supported him. The more he rose in the polls, the better excuse the networks had for having Trump on. They pretended that it was because he was now the “GOP Frontrunner,” when it was really just because he was much better for their ratings than interviewing any of the serious candidates....

[T]he conservative media doesn’t want to see their cash cow killed, especially since they have basically already budgeted for the record revenue a Trump vs. Hillary campaign would likely create....

Hannity effectively became an arm of the Trump campaign. Limbaugh continued to defend him. Drudge waged a vicious overt war against Rubio and Ted Cruz and made sure (much like he did with Rev. Wright in his 2008 bid to protect Obama during the primaries) that no negative Trump stories could get any real traction....

It should be noted that the nature of Trump’s fervent fans added even more fuel to why the “conservative” media went into the tank for Trump. They are not only the most passionate, but they are also the least rational. You cannot reason with them and they will simply tune you out if they don’t like what you are saying about their hero.

Of course, the “conservative” media doesn’t really care if Trump beats Hillary. They “win” either way. They get great ratings through November and “worst case” they end up with Hillary to provide them with easy content for at least the next four years. Pathetically, that’s all they really care about....

With all the talk of dismantling the “evil establishment” in this election cycle, for conservatives to ever win a presidential election again, the REAL “establishment” which needs to be shattered is that of the elite “conservative” media. While the Republican establishment is weak and incompetent, at least most of them are not overtly working for the other side.
I could understand supporting Trump were there no good alternatives for conservative voters in the Republican race, but there were, and still are. Scott Walker was a great candidate, Marco Rubio and Carly Fiorina were also outstanding, and Ted Cruz is excellent. Why, then, have people like Rush Limbaugh and Sean Hannity all but blacked out these candidates for the last six months while focusing almost entirely on Donald Trump? Why do individuals like Ben Carson and Jerry Falwell, Jr., men who stand for civility and morality in politics and public life, endorse a man who has coarsened our public discourse and who openly boasts of adulterous relationships? Why do Bill O'Reilly, Matt Drudge, Ann Coulter, Breitbart.com and others fawn over a man who until the day before yesterday held political positions they disdain?

Jim Geraghty, in commenting on Zeigler's piece, says this:
Finally, while the major conservative media entities named in John Zeigler’s essay would probably vehemently deny that they let clubby groupthink, a desire for ratings and other bad influences alter their judgment, there is this remaining unexplained about-face from some of the biggest names in the conservative movement. From about 2009 to early 2015, to be praised in conservative media, you had to be indisputably conservative. Even a longtime record of voting conservatively didn’t protect you if you were seen as flinching in a tough fight. Mike Castle? Unacceptable! Mitch McConnell? Sellout! John Boehner? Worthless! Thad Cochran? Everything that’s wrong with the Senate! Lindsey Graham, Mitt Romney, Paul Ryan? Useless squishes!

Then in mid-2015, along comes Trump, with his long history of donating to the Democrats, support for Planned Parenthood, affirmative action, gun control, and a national health-care system, even friendship with Al Sharpton . . . and some of the biggest names on television and radio are perfectly fine with him.
Either these prominent conservatives are, in fact, (1) a bunch of unprincipled hucksters who simply exploit the conservatism of their audience for their own ratings and profit, or (2) they know something about Trump that no one else does and aren't sharing it with us, or (3) they're just stupid. I don't believe (2) or (3), and I don't want to believe (1). Nevertheless, unless there's another possibility that I'm missing, I don't see how (1) can be avoided.

Friday, March 18, 2016

Are We a Simulation? (Pt. II)

By way of concluding Wednesday's post on the possibility that you, I, and our entire universe actually exist in a computer simulation developed by some superior intellect in another world, we note that Robert Kuhn points out that the simulation hypothesis has great difficulty with the phenomenon of human consciousness:
A prime assumption of all simulation theories is that consciousness — the inner sense of awareness, like the sound of Gershwin or the smell of garlic — can be simulated; in other words, that a replication of the complete physical states of the brain will yield, ipso facto, the complete mental states of the mind. (This direct correspondence usually assumes, unknowingly, the veracity of what's known in philosophy of mind as "identity theory," one among many competing theories seeking to solve the intractable "mind-body problem".)

Such a brain-only mechanism to account for consciousness, required for whole-world simulations and promulgated by physicalists, is to me not obvious (Physicalism is the belief that everything in the universe is ultimately explicable in terms of the laws of physics. Physicalism is, for most purposes, synonymous with naturalism).
Kuhn is raising the question here of how, for example, the sensation of seeing blue could be simulated. Until there is a plausible physical explanation of consciousness, which there is not at this point, it seems unlikely that conscious beings are nothing more than a simulation.

There's more of interest at the original article, including how physicist Paul Davies uses the simulation argument to refute the multiverse hypothesis. Kuhn closes his piece with this:
I find five premises to the simulation argument: (i) Other intelligent civilizations exist; (ii) their technologies grow exponentially; (iii) they do not all go extinct; (iv) there is no universal ban or barrier for running simulations; and (v) consciousness can be simulated.

If these five premises are true, I agree, humanity is likely living in a simulation. The logic seems sound, which means that if you don't accept (or don't want to accept) the conclusion, then you must reject at least one of the premises. Which to reject?
Personally, I find premise (i) problematic, premise (ii) possible, but questionable (it's just as likely that technological growth reaches a ceiling or collapses altogether), and premise (v) highly doubtful.

Thursday, March 17, 2016

On St. Patrick's Day

The following is a post I've run on previous St. Patrick's Days and thought I'd run again this year because, I say in all modesty, it's pretty interesting:

Millions of Americans, many of them descendants of Irish immigrants, celebrate their Irish heritage by observing St. Patrick's Day today. We are indebted to Thomas Cahill and his best-selling book How The Irish Saved Civilization for explaining to us why Patrick's is a life worth commemorating. As improbable as his title may sound, Cahill weaves a fascinating and compelling tale of how the Irish in general, and Patrick and his spiritual heirs in particular, served as a tenuous but crucial cultural bridge from the classical world to the medieval age and, by so doing, made Western civilization possible.

Born a Roman citizen in 390 A.D., Patrick had been kidnapped as a boy of sixteen from his home on the coast of Britain and taken by Irish barbarians to Ireland. There he languished in slavery until he was able to escape six years later. Upon his homecoming he became a Christian, studied for the priesthood, and eventually returned to Ireland where he would spend the rest of his life laboring to persuade the Irish to accept the Gospel and to abolish slavery. Patrick was the first person in history, in fact, to speak out unequivocally against slavery and, according to Cahill, the last person to do so until the 17th century.

Meanwhile, Roman control of Europe had begun to collapse. Rome was sacked by Alaric in 410 A.D. and barbarians were sweeping across the continent, forcing the Romans back to Italy, and plunging Europe into the Dark Ages. Throughout the continent, unwashed, illiterate hordes descended on the once grand Roman cities, looting artifacts and burning books. Learning ground to a halt and the literary heritage of the classical world was burned or moldered into dust. Almost all of it, Cahill claims, would surely have been lost if not for the Irish.

Having been converted to Christianity through the labors of Patrick, the Irish took with gusto to reading, writing and learning. They delighted in letters and bookmaking and painstakingly created indescribably beautiful Biblical manuscripts such as the Book of Kells which is on display today in the library of Trinity College in Dublin. Aware that the great works of the past were disappearing, they applied themselves assiduously to the daunting task of copying all surviving Western literature - everything they could lay their hands on.

For a century after the fall of Rome, Irish monks sequestered themselves in cold, damp, cramped mud huts called scriptoria, so remote and isolated from the world that they were seldom threatened by the marauding pagans. Here these men spent their entire adult lives reproducing the old manuscripts and preserving literacy and learning for the time when people would be once again ready to receive them.

These scribes and their successors served as the conduits through which the Graeco-Roman and Judeo-Christian cultures were transmitted to the benighted tribes of Europe, newly settled amid the rubble and ruin of the civilization they had recently overwhelmed. Around the late 6th century, three generations after Patrick, Irish missionaries with names like Columcille, Aidan, and Columbanus began to venture out from their monasteries and refuges, clutching their precious books to their hearts, sailing to England and the continent, founding their own monasteries and schools among the barbarians and teaching them how to read, write and make books of their own.

Absent the willingness of these courageous men to endure deprivations and hardships of every kind for the sake of the Gospel and learning, Cahill argues, the world that came after them would have been completely different. It would likely have been a world without books. Europe almost certainly would have been illiterate, and it would probably have been unable to resist the Muslim incursions that arrived a few centuries later.

The Europeans, starved for knowledge, soaked up everything the Irish missionaries could give them. From such seeds as these modern Western civilization germinated. From the Greeks the descendants of the Goths and Vandals learned philosophy; from the Romans they learned about law; from the Bible they learned of the worth of the individual who, created and loved by God, is therefore significant and not merely a brutish aggregation of matter.

From the Bible, too, they learned that the universe was created by a rational Mind and was thus not capricious, random, or chaotic. It would yield its secrets to rational investigation. Out of these assumptions, once their implications were finally and fully developed, grew historically unprecedented views of the value of the individual and the flowering of modern science.

Our cultural heritage is thus, in a very important sense, a legacy from the Irish. A legacy from Patrick. It is worth pondering on this St. Patrick's Day what the world would be like today had it not been for those early Irish scribes and missionaries thirteen centuries ago.

Buiochas le Dia ar son na nGaeil (Thank God for the Irish), and I hope you have a great St. Patrick's Day.

Wednesday, March 16, 2016

Are We a Simulation? (Pt. I)

Here's a post from last summer (8/22) that's relevant to some of the discussions that have come up in a couple of my classes recently:

Robert Kuhn, host and writer of the public television program "Closer to Truth," has an excellent column on the theory that our universe is actually a computer simulation developed by a higher intelligence in some other universe. Kuhn writes:
I began bemused. The notion that humanity might be living in an artificial reality — a simulated universe — seemed sophomoric, at best science fiction.

But speaking with scientists and philosophers on "Closer to Truth," I realized that the notion that everything humans see and know is a gigantic computer game of sorts, the creation of supersmart hackers existing somewhere else, is not a joke. Exploring a "whole-world simulation," I discovered, is a deep probe of reality.

Philosopher Nick Bostrom, director of the Future of Humanity Institute at Oxford University, describes a fake universe as a "richly detailed software simulation of people, including their historical predecessors, by a very technologically advanced civilization."

It's like the movie "The Matrix," Bostrom said, except that "instead of having brains in vats that are fed by sensory inputs from a simulator, the brains themselves would also be part of the simulation. It would be one big computer program simulating everything, including human brains down to neurons and synapses."

Bostrom is not saying that humanity is living in such a simulation. Rather, his "Simulation Argument" seeks to show that one of three possible scenarios must be true (assuming there are other intelligent civilizations):
  • All civilizations become extinct before becoming technologically mature;
  • All technologically mature civilizations lose interest in creating simulations;
  • Humanity is literally living in a computer simulation.
His point is that all cosmic civilizations either disappear (e.g., destroy themselves) before becoming technologically capable, or all decide not to generate whole-world simulations (e.g., decide such creations are not ethical, or get bored with them). The operative word is "all" — because if even one civilization anywhere in the cosmos could generate such simulations, then simulated worlds would multiply rapidly and almost certainly humanity would be in one.

As technology visionary Ray Kurzweil put it, "maybe our whole universe is a science experiment of some junior high school student in another universe." (Given how things are going, he jokes, she may not get a good grade.)

Kurzweil's worldview is based on the profound implications of what happens over time when computing power grows exponentially. To Kurzweil, a precise simulation is not meaningfully different from real reality. Corroborating the evidence that this universe runs on a computer, he says, is that "physical laws are sets of computational processes" and "information is constantly changing, being manipulated, running on some computational substrate." And that would mean, he concluded, "the universe is a computer." Kurzweil said he considers himself to be a "pattern of information."

"I'm a patternist," he said. "I think patterns, which means that information is the fundamental reality."
Information, of course, is the product of minds; thus, if information is the fundamental reality in our world, there must be a mind that has generated it. Many people agree with this and argue that the information which comprises this world is produced by the mind of God, but scientists, at least naturalistic scientists, argue that God is a metaphysical concept which lies outside the purview of science. Instead they advert to the existence of computer hackers in other universes, which is also a metaphysical posit, but since it's not God, it's presumably okay to speculate about them.

At any rate, Kuhn goes on:
Would the simulation argument relate to theism, the existence of God? Not necessarily.

Bostrom said, "the simulation hypothesis is not an alternative to theism or atheism. It could be a version of either — it's independent of whether God exists." While the simulation argument is "not an attempt to refute theism," he said, it would "imply a weaker form of a creation hypothesis," because the creator-simulators "would have some of the attributes we traditionally associate with God in the sense that they would have created our world."

They would be superintelligent, but they "wouldn't need unlimited or infinite minds." They could "intervene in the world, our experiential world, by manipulating the simulation. So they would have some of the capabilities of omnipotence in the sense that they could change anything they wanted about our world."

So even if this universe looks like it was created, neither scientists nor philosophers nor theologians could easily distinguish between the traditional creator God and hyper-advanced creator-simulators.

But that leads to the old regress game and the question of who created the (weaker) creator-simulators. At some point, the chain of causation must end — although even this, some would dispute.
In other words, the universe displays indications of having been intelligently designed rather than having been an enormously improbable accident. This poses vexing problems for naturalists who feel constrained to account for the design without invoking you-know-who. So they theorize about a multiverse of an infinite number of worlds or speculate about extra-cosmic computer programmers who've created a world that looks real but is in fact just a computer simulation.

These extraordinary hypotheses are taken seriously by some philosophers and scientists, but if someone were to suggest that maybe this universe really is the only universe, that maybe it's real and not an illusory simulation foisted on us by some pimply extra-terrestrial, and that maybe it's instead the product of a single intelligent transcendent mind, he would suffer the ridicule and scorn of those who'd sooner believe that the universe is a science project of a seventh grader in some other more technologically advanced universe. I wonder which is the more implausible hypothesis.

I'll conclude with a couple more thoughts on this in Part II tomorrow.

Tuesday, March 15, 2016

The Ides of March

Today is March 15th. It was on this date in 44 B.C. that the Roman dictator Julius Caesar was assassinated by some sixty conspirators, including a group of Roman senators. The murder of Caesar changed world history, and the U.K. Telegraph has a fascinating account by Dominic Selwood of this event.

Caesar was a complex character, as Selwood tells us, and like many great men he was admirable in some ways and repulsive in others:
He was a military colossus, original thinker, compelling writer, magnetic orator, dynamic reformer, and magnanimous politician. Yet he was also manipulative, narcissistic, egotistical, sexually predatory, shockingly savage in war even by Roman standards, and monomaniacally obsessed with acquiring absolute power for himself.
La Mort de César (ca. 1859–1867) by Jean-Léon Gérôme
Here's the lede of Selwood's article:
Spurinna was a haruspex. His calling was vital, if a little unusual, requiring him to see the future in the warm entrails of sacrificial animals.

At the great festival of Lupercalia on the 15th of February 44 B.C., he was a worried man. While priests were running around the Palatine Hill hitting women with thongs to make them fertile, Spurinna was chewing over a terrible omen.

"Spurinna knew it was a terrible sign: a sure portent of death."

The bull that Julius Caesar, Dictator of Rome, had sacrificed earlier that day had no heart. Spurinna knew it was a terrible sign: a sure portent of death.

The following day, the haruspex oversaw another sacrifice in the hope it would give cause for optimism, but it was just as bad: the animal had a malformed liver. There was nothing for it but to tell Caesar.

In grave tones, Spurinna warned the dictator that his life would be in danger for a period of 30 days, which would expire on the 15th of March. Caesar dismissed the concerns. Although in his scramble for political power he had been made the chief priest of Rome (Pontifex Maximus), he was a campaign soldier by trade, and not bothered by the divinatory handwringing of seers like Spurinna.

As the 30 days passed, nothing whatsoever happened. Yet when the 15th of March dawned, Caesar’s wife awoke distressed after dreaming she held his bloodied body. Fearing for his life, she begged him not to leave the house. His dreams, too, had also been unsettling. He had been flying through the air, and shaken hands with Jupiter. But he pushed any concerns aside. The day was an important annual celebration in Rome’s religious calendar, and he had called a special meeting of the Senate.

His first appointment of the day was a quick sacrifice at a friend’s house. Spurinna the seer was also there. Caesar joked that his prophecies must be off as nothing had happened. Spurinna muttered that the day was not yet over.

The sacrifices proceeded, but the animals’ innards were blemished and the day was plainly inauspicious. Caesar knew when to call it a day, and agreed to postpone the meeting of the Senate and to go home.

Later that morning, his fellow military politician and protégé Decimus called round, urging him to come to the Senate in case his absence was seen as mocking or insulting. Persuaded by his friend, soldier to soldier, Caesar agreed to go in person to announce the meeting would be postponed.

Shortly after, a slave arrived at Caesar’s house to warn him of the plot against his life. But he was too late: Caesar had left. A short while later, a man named Artemidorus of Cnidus pushed through the jostling crowds and handed Caesar a roll setting out details of the plot. But the crowds were so thick he had no chance to read it.

The main Senate House was being rebuilt on Caesar’s orders, so the meeting was instead at the Curia behind the porticoed gardens attached to the great Theatre of Pompey. Another round of animal sacrifices before the start of the session was unfavourable, and Caesar waited outside, troubled. Again Decimus spoke with him. Unaware of his friend’s treachery, Caesar allowed himself to be led towards the chamber by the hand. Decked out in his triumphant general’s reddish-purple toga embroidered in gold, Julius Caesar, Dictator of Rome, entered the Senate’s meeting room, and ascended his golden throne.
Go to the link for Selwood's account of the denouement.

Selwood attributes several myths surrounding the assassination to Shakespeare and his play Julius Caesar. One has to do with the line given to a soothsayer, who shouts to Caesar, "Beware the Ides of March!" The phrase has ever since been treated as a portent of disaster, but what exactly are the "Ides of March"?
In Rome’s impossibly complicated calendar, every month had an Ides....In the mists of time, the early Romans began each month at the new moon. They called that day the Kalends (Kalendae). Two weeks later came the full moon, which they named the Ides (Idus). Midway between the two was the half-moon, which they referred to as the Nones (Nonae). For some inexplicable reason, they then chose to refer to every other day in the month in terms of its relationship to the next one of these coming up. So they would say, “five days before the Kalends of March,” or “three days before the Nones of June”.

The Kalends was always the 1st of the month. Over time, the others came to fall on set days. In March, May, July, and October, the Nones was the 7th and the Ides was the 15th. For the remaining months, the Nones was the 5th and the Ides was the 13th. Therefore the 4th of July was IIII Nones July (i.e. four days before the Nones - the calculation is inclusive, so both the 4th and the 7th are counted).

Although every month had an Ides in the middle, the date chosen by Caesar’s murderers was nevertheless significant. Traditionally, the Roman year started on the 1st of March, meaning the Ides was the first full moon of the year. It was a major celebration, and the festival of Anna Perenna, the goddess of the cycle of the year. Her special gift was to reward people with long life. Caesar’s assassins clearly thought they were giving long life to Rome (and their own political careers) by removing the dictator they believed was blighting it all.
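For readers who want to see the rule worked out, here is a minimal sketch in Python (the function name and structure are my own illustration, not anything from Selwood's article) of the naming scheme the quoted passage describes: Kalends on the 1st; Nones on the 7th of March, May, July, and October and on the 5th otherwise; the Ides eight days after the Nones; and every other day counted inclusively toward the next of those markers. Days after the Ides, which the Romans counted toward the Kalends of the following month, are deliberately left out so the sketch stays within what the passage covers.

```python
# A minimal sketch of the Roman day-naming rule described above.
# Handles the Kalends, Nones, and Ides and the inclusive "days before"
# count up to the Ides; later days (counted toward the next month's
# Kalends) are omitted.

MARCH_STYLE_MONTHS = {"March", "May", "July", "October"}  # Nones on the 7th, Ides on the 15th

def roman_day_name(month: str, day: int) -> str:
    nones = 7 if month in MARCH_STYLE_MONTHS else 5
    ides = nones + 8  # the Ides always falls eight days after the Nones

    if day == 1:
        return f"Kalends of {month}"
    if day == nones:
        return f"Nones of {month}"
    if day == ides:
        return f"Ides of {month}"
    if day < nones:
        # Inclusive count: the day itself and the Nones are both counted.
        return f"{nones - day + 1} days before the Nones of {month}"
    if day < ides:
        return f"{ides - day + 1} days before the Ides of {month}"
    raise ValueError("days after the Ides were counted toward the next month's Kalends")

print(roman_day_name("July", 4))    # 4 days before the Nones of July
print(roman_day_name("March", 15))  # Ides of March
```

Run on the 4th of July it gives the "four days before the Nones" of the example above, counting the 4th, 5th, 6th, and 7th inclusively.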
It has often amazed me when reading Roman history that they could accomplish such great feats of engineering and architecture with their exceedingly cumbersome system of numeration. It's almost as amazing that they could be such good historians with such a clunky calendar. At any rate, there's more at the link, including a discussion of the consequences of the murder for subsequent history. It makes for good reading on this, the Ides of March.

Monday, March 14, 2016

Unintended Consequences

Employment news out of Seattle is discouraging but not surprising. In June 2014 Seattle raised the minimum wage for all employees in the city to $15, the new rate to be reached in increments.
Starting last April, it raised the minimum from $9.32 (the state minimum wage) to $10 for certain businesses, $11 for others. Increases to $12, $12.50 and $13 an hour began taking effect for most employers this Jan. 1.
There's still a way to go before the wage hits the target of $15 per hour in 2017, but it's not too early to get an indication of the effect the hike has had on jobs. An American Enterprise Institute study shows that,
[B]etween April and December last year Seattle saw the biggest employment drop in any nine-month period since 2009 — a full year into the Great Recession. The city unemployment rate rose a full percentage point.

Before the minimum-wage hikes began, Seattle employment tracked the rest of the nation — slowly rising from the 2008-09 bottom. But it started to plunge last spring, as the new law began to kick in.

Furthermore, Seattle’s loss of 10,000 jobs in just the three months of September, October and November was a record for any three-month period dating back to 1990.

Meanwhile, employment outside the city limits — which had long tracked the rate in Seattle proper — was soaring by 57,000 and set a new record high that November.
In what may come as a surprise to the members of the Seattle city council and similar bodies around the country, when government raises the cost of doing business, employers lay off, or choose not to hire, marginal employees. In a vivid illustration of the Law of Unintended Consequences at work, the attempt by government to increase the income of minimum wage employees actually makes it harder for those employees to get hired and, if hired, to keep their jobs.

Now thousands of Seattle residents are out of work because their leaders thought they'd help them by raising their pay rate. Given a choice, which would minimum wage employees prefer: to have a job making $7.50 an hour, or to have no job but know that if they did have one they'd be making twice as much?

The late Ronald Reagan could have had Seattle in mind when, in his first Inaugural Address, he declared that "government is not the solution to our problem; government is the problem."

Saturday, March 12, 2016

Snow Geese

A friend and faithful reader of VP wrote recently, reminding me that I haven't posted any bird photos in a while and asking to see some. It just so happens that today I took my wife, my daughter, and her fiancé to one of the major migratory stopover spots on the East Coast to see a waterfowl called the snow goose. These birds breed in the tundra and migrate in huge numbers along the Atlantic coast and the Mississippi river valley.



By the end of the 19th century hunting had taken a serious toll on the population of these birds, and it was banned in 1916 to allow the population to recover. The ban resulted in skyrocketing numbers of geese, and hunting was restored in 1975. The birds today number in the millions. When they're swirling around after taking off, or as they land, it's like standing inside a snow globe. The numbers during our visit today were not as impressive as what's seen in these photos since we're now past the peak of the snow goose migration, but there were still several thousand geese and tundra swans on the water this morning. At its peak, which is usually about the first week in March, there may be 100,000 or more snow geese on the lake.



I got these photos off the web, but they were taken at the site we visited today. It's called Middle Creek Wildlife Management Area and it sits athwart the Lancaster/Lebanon county line in south-central Pennsylvania. An interesting anecdote about Middle Creek: It was built in the 1970s for hunting and conservation and is managed by the Pennsylvania Game Commission. It has recently been in the news because the Game Commission wishes to raise hunting license fees to obtain the revenue to maintain the facility, but the legislature has been reluctant to grant the increase. The Game Commission has said that they can't keep Middle Creek open if they don't get more money and will be shutting it down in a year or so.

It would be a great shame if this happened, and I doubt very much that the legislators will let it come to that, but it's where matters stand as of now.

Friday, March 11, 2016

Science Guy Misunderstands Philosophy

Bill Nye (The "Science Guy") has blithely wandered onto some thin metaphysical ice. A video of him responding to a young philosophy major's question about the importance of philosophy has a lot of philosophers shaking their heads.
I addressed this very topic in a post last summer titled Does Science Need Philosophy? (8/21/15) and thought it might be appropriate to run it again in response to Nye's video:

It seems to be something of a trend lately for materialists, particularly materialist scientists, to denigrate philosophy. Cosmologists Stephen Hawking and Lawrence Krauss are two recent examples. Hawking even went so far as to pronounce philosophy dead in his book The Grand Design.

I wonder if one of the subconscious reasons for their disdain for philosophy is that these scientists and others are writing books claiming that science pretty much makes belief in God untenable, but they're finding that philosophers who critique their arguments are showing them to be embarrassingly unsophisticated. The animus against philosophy may derive from personal chagrin suffered at the hands of philosopher-critics.

Be that as it may, Hawking and Krauss, for all their brilliance, are astonishingly unaware of the philosophical faux pas that pervade their own writing.

Krauss, for example, made the claim in his book A Universe from Nothing that the universe emerged spontaneously out of a mix of energy and the laws of physics which he calls "nothing." Thus God is not necessary to account for the universe. Of course, this is a semantic sleight-of-hand since if the cosmos was produced by energy and physical laws then there was not "nothing," there was "something," and we're confronted with the mystery of how this energy and these laws came about.

Hawking declared philosophy "dead" in the early pages of his book and then spent a good part of the rest of the book philosophizing about realist and anti-realist views of the universe and the existence of a multiverse.

It's ironic that physicists like Hawking and Krauss would be so willing to deprecate philosophy since their own discipline is infused with it. Every time physicists talk about the multiverse or the nature of time or space or their own naturalistic assumptions about reality, they're doing metaphysics. When they talk about knowledge, cause and effect, the principle of sufficient reason, the principle of uniformity, or the problem of exactly what constitutes the scientific enterprise (the demarcation problem), they're doing philosophy. Whenever they discuss the ethics required of scientists in conducting and reporting their researches or express awe at the beauty of their equations, they're doing philosophy.

The entire discipline of science presupposes a host of philosophical assumptions like the trustworthiness of our senses and of our reason, the orderliness of the universe, the existence of a world outside our minds, etc. Yet these thinkers seem to be oblivious to the foundational role philosophy plays in their own discipline. Indeed, science would be impossible apart from axiomatic philosophical beliefs such as those listed above.

Science tells us the way the physical world is, but as soon as the scientist starts to draw conclusions about what it all means he or she is doing philosophy. It's inescapable. There's a bit of a joke at Uncommon Descent about this. It goes like so:
Scientist: "Why does philosophy matter?"
Philosopher: "I don't know, why does science matter?"
Scientist: "Well, because scie...."
Philosopher: "Annnnnnnd you are doing philosophy."
There's more on how science is inextricably infused by philosophy here.

Thursday, March 10, 2016

Reverting to the Dark Ages

Science has unmoored itself from its heritage in Christian metaphysics and adopted a naturalistic worldview, but it was within that heritage that science was conceived, nourished, cultivated, and brought to maturity. Now it has declared its independence, thinking it can stand on its own, no longer needing the support of the superstitions of its youth. Perhaps science need not rely on the assumptions bequeathed to it by its religio-cultural heritage; perhaps scientists can dispense with Christian moral assumptions and belief in objective truth with no ill effect; but articles like this one by Melanie Phillips leave one less than convinced.

After lamenting that science is plagued by shoddy research and faulty conclusions, Phillips writes:
Richard Horton, editor-in-chief of The Lancet, has written bleakly: “The case against science is straightforward: much of the scientific literature, perhaps half, may simply be untrue.”

One reason is that cash-strapped universities, competing for money and talent, exert huge pressure on academics to publish more and more to meet the box-ticking criteria set by grant-funding bodies. Corners are being cut and mistakes being made....

The problem lies with research itself. The cornerstone of scientific authority rests on the notion that replicating an experiment will produce the same result. If replication fails, the research is deemed flawed. But failure to replicate is widespread. In 2012, the OECD spent $59 billion on biomedical research, nearly double the 2000 figure. Yet an official at America’s National Institutes of Health has said researchers would find it hard to reproduce at least three-quarters of all published biomedical findings.

A 2005 study by John Ioannidis, an epidemiologist at Stanford University, said the majority of published research findings were probably false. At most, no more than about one in four findings from early-phase clinical trials would be true; epidemiological studies might have only a one in five chance of being true. “Empirical evidence on expert opinion”, he wrote, “shows that it is extremely unreliable”.
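To see how a claim like Ioannidis's can be true, it helps to look at the arithmetic behind it. His 2005 paper models the chance that a statistically significant finding is actually true (the positive predictive value) in terms of the pre-study odds that a tested hypothesis is true, the study's statistical power, and the significance threshold. The sketch below uses a simplified version of that formula, ignoring bias and multiple competing teams; the numbers plugged in are purely illustrative, not taken from the paper or from Phillips's article.

```python
# Simplified positive-predictive-value calculation in the spirit of
# Ioannidis's 2005 paper "Why Most Published Research Findings Are False"
# (bias and multiple competing teams are ignored). The inputs used below
# are illustrative only.

def positive_predictive_value(pre_study_odds: float, power: float, alpha: float = 0.05) -> float:
    """Probability that a statistically significant finding is actually true.

    pre_study_odds -- odds R that a hypothesis tested in the field is true
    power          -- probability of detecting a real effect (1 - beta)
    alpha          -- significance threshold (false-positive rate)
    """
    true_positives = power * pre_study_odds   # real effects that reach significance
    false_positives = alpha * 1.0             # per unit odds of untrue hypotheses
    return true_positives / (true_positives + false_positives)

# A well-powered trial testing a fairly plausible hypothesis:
print(round(positive_predictive_value(pre_study_odds=0.5, power=0.8), 2))   # ~0.89

# An underpowered, exploratory study in a field where few tested hypotheses are true:
print(round(positive_predictive_value(pre_study_odds=0.05, power=0.2), 2))  # ~0.17
```

With generous assumptions most significant findings come out true; with low pre-study odds and low power, figures in the neighborhood of the "one in four" and "one in five" quoted above fall out naturally.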
So why has this state of affairs come to pass?
Underlying much of this disarray is surely the pressure to conform to an idea, whether political, commercial or ideological. Ideological fads produce financial and professional incentives to conform and punishment for dissent, whether loss of grant-funding or lack of advancement. As Professor Ioannidis observed: “For many current scientific fields, claimed research findings may often be simply accurate measures of the prevailing bias.”

Underlying this loss of scientific bearings is a closed intellectual circle. Scientists pose as secular priests. They alone, they claim, hold the keys to the universe. Those who aren’t scientists merely express uneducated opinion. The resulting absence of openness and transparency is proving the scientists’ undoing. In the words of Richard Horton, “science has taken a turn towards darkness”.

But science defines modernity. It is our gold standard of truth and reason. This is the darkness of the West too.
To put this differently, when generations of scientists are invested in a materialistic naturalism that places no moral constraints on their work and that calls into question the very idea of objective truth, the temptation to succumb to the professional and ideological pressures imposed by the grant and tenure process, and indeed to the pressure to conform to the prevailing consensus among one's peers, becomes very hard to resist, and the quality of scientific work will slowly degrade. Science, disconnected from the only metaphysics which can provide a moral anchor, is easily thrust into the service of whatever the prevailing ideology may be, just as happened to science in the communist Soviet Union in the first half of the twentieth century.

Ideas have consequences.