Monday, January 6, 2014

Closing of the Scientific Mind

David Gelernter is a professor of computer science at Yale who has written a very provocative essay for Commentary Magazine on the dehumanization of man that has been wrought by modern scientism. Gelernter doesn't use the word "scientism," but when he talks of the modern disregard among scientists for any way of knowing other than the empirical methodology associated with the hard sciences, he's talking about scientism. He writes:
The huge cultural authority science has acquired over the past century imposes large duties on every scientist. Scientists have acquired the power to impress and intimidate every time they open their mouths, and it is their responsibility to keep this power in mind no matter what they say or do. Too many have forgotten their obligation to approach with due respect the scholarly, artistic, religious, humanistic work that has always been mankind’s main spiritual support. Scientists are (on average) no more likely to understand this work than the man in the street is to understand quantum physics. But science used to know enough to approach cautiously and admire from outside, and to build its own work on a deep belief in human dignity. No longer.
The modern view that man is just a flesh and bone machine leads ineluctably to the belief that the human machine can be improved the same way we upgrade our computers: Today science and the “philosophy of mind”—its thoughtful assistant, which is sometimes smarter than the boss—are threatening Western culture with the exact opposite of humanism. Call it roboticism. Man is the measure of all things, Protagoras said. Today we add, and computers are the measure of all men.

Many scientists are proud of having booted man off his throne at the center of the universe and reduced him to just one more creature—an especially annoying one—in the great intergalactic zoo. That is their right. But when scientists use this locker-room braggadocio to belittle the human viewpoint, to belittle human life and values and virtues and civilization and moral, spiritual, and religious discoveries, which is all we human beings possess or ever will, they have outrun their own empiricism. They are abusing their cultural standing. Science has become an international bully.

Nowhere is its bullying more outrageous than in its assault on the phenomenon known as subjectivity.

Your subjective, conscious experience is just as real as the tree outside your window or the photons striking your retina—even though you alone feel it. Many philosophers and scientists today tend to dismiss the subjective and focus wholly on an objective, third-person reality—a reality that would be just the same if men had no minds. They treat subjective reality as a footnote, or they ignore it, or they announce that, actually, it doesn’t even exist.

If scientists were rat-catchers, it wouldn’t matter. But right now, their views are threatening all sorts of intellectual and spiritual fields. The present problem originated at the intersection of artificial intelligence and philosophy of mind—in the question of what consciousness and mental states are all about, how they work, and what it would mean for a robot to have them. It has roots that stretch back to the behaviorism of the early 20th century, but the advent of computing lit the fuse of an intellectual crisis that blasted off in the 1960s and has been gaining altitude ever since.
Gelernter elaborates on these themes in the rest of the article, which I heartily commend to your reading. Along the way he talks about the shameful treatment dished out to Thomas Nagel for having the temerity to question the ability of Darwinism to account for human consciousness, cognition, and values in his book Mind and Cosmos. He also critiques the "transhumanism" of Ray Kurzweil and his acolytes, as well as various philosophical attempts (eliminativism, functionalism, cognitivism) to banish consciousness from human ontology. His discussion of the "zombie" problem in philosophy of mind is particularly interesting.

He spends time discussing cognitivism (or computationalism), which is the view that mind is to brain what software is to computer. He concludes by pointing out that the analogy is deeply flawed:
But the master analogy—between mind and software, brain and computer—is fatally flawed. It falls apart once you mull these simple facts:
  1. You can transfer a program easily from one computer to another, but you can’t transfer a mind, ever, from one brain to another.
  2. You can run an endless series of different programs on any one computer, but only one “program” runs, or ever can run, on any one human brain.
  3. Software is transparent. I can read off the precise state of the entire program at any time. Minds are opaque—there is no way I can know what you are thinking unless you tell me.
  4. Computers can be erased; minds cannot.
  5. Computers can be made to operate precisely as we choose; minds cannot.
There are more. Come up with them yourself. It’s easy.

There is a still deeper problem with computationalism. Mainstream computationalists treat the mind as if its purpose were merely to act and not to be. But the mind is for doing and being. Computers are machines, and idle machines are wasted. That is not true of your mind. Your mind might be wholly quiet, doing (“computing”) nothing; yet you might be feeling miserable or exalted, or awestruck by the beauty of the object in front of you, or inspired or resolute—and such moments might be the center of your mental life. Or you might merely be conscious.
It is indeed easy to come up with qualities of the mind/brain complex that are disanalogous to anything found in the software/computer complex. For example, human minds can hold beliefs and doubts; they can understand; they can appreciate beauty and humor; they can experience frustration, boredom, disappointment, chagrin, interest, and so on; they can reflect on their own opinions; and much else. Computational machines can do none of these things, which suggests to a lot of philosophers that minds are something entirely different from computers and that the software/computer analogy is grossly inapt.

Check out Gelernter's essay. It's a little long, but it does a nifty job of explaining why the heyday of modern materialism may be drawing to a close. Materialism has simply failed to offer a plausible explanation for our subjective, conscious experience, and the explanations it has offered seem so far-fetched that the credibility of any metaphysical view that relies on them, as materialism does, is severely damaged.