Saturday, July 21, 2018

Computers Don't Think

There's lots of talk about computers soon being able to "think" like human beings and maybe even bringing about an AI apocalypse. Neurosurgeon Michael Egnor strongly dissents from this view, however.

He grants that humans can use computers to do despicable things, but he maintains that computers themselves will never be able to think.

Egnor writes:
A cornerstone of the development of artificial intelligence is the pervasive assumption that machines can, or will, think. Watson, a question-answering computer, beats the best Jeopardy players, and anyone who plays chess has had the humiliation of being beaten by a chess engine....Does this mean that computers can think as well as (or better than) humans think? No, it does not.

Computers are not “smart” in any way. Machines are utterly incapable of thought.

The assertion that computation is thought, hence thought is computation, is called computer functionalism. It is the theory that the human mind is to the brain as software is to hardware. The mind is what the brain does; the brain “runs” the mind, as a computer runs a program.

However, careful examination of natural intelligence (the human mind) and artificial intelligence (computation) shows that this is a profound misunderstanding.
Citing the 19th-century German philosopher Franz Brentano, Egnor observes that computers lack a fundamental and critical characteristic of all thoughts: they lack "aboutness," or what philosophers call intentionality. Here's what he means:
All thoughts are about something, whereas no material object is inherently “about” anything. This property of aboutness is called intentionality, and intentionality is the hallmark of the mind.

Every thought that I have shares the property of aboutness—I think about my vacation, or about politics, or about my family. But no material object is, in itself, “about” anything. A mountain or a rock or a pen lacks aboutness—they are just objects. Only a mind has intentionality, and intentionality is the hallmark of the mind.

Another word for intentionality is meaning. All thoughts inherently mean something. A truly meaningless thought is an oxymoron. The meaning may be trivial or confusing, but every thought entails meaning of some sort. Every thought is about something, and that something is the meaning of the thought.
Computation, however, is an algorithmic process. It's the matching of an input to an output. There's no meaning to what the computer does. Whatever meaning we ascribe to the process is, in fact, imposed by our minds; it doesn't arise from within the machine.
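To make this concrete, here is a minimal sketch (the word list and the mapping are invented purely for illustration): a toy program that "translates" English words into French by nothing more than table lookup. The symbol shuffling runs flawlessly, yet any meaning in the output is supplied by the people who built and read the table, not by the machine.

    # Illustrative only: a lookup table maps input symbols to output symbols.
    # The program never knows that "chien" means "dog"; that connection
    # exists only in the minds of the humans who wrote and read the table.
    lookup = {
        "dog": "chien",
        "cat": "chat",
        "mountain": "montagne",
    }

    def translate(word: str) -> str:
        # Return the mapped output for an input symbol, or a fallback marker.
        return lookup.get(word, "<unknown>")

    if __name__ == "__main__":
        for w in ["dog", "mountain", "pen"]:
            print(w, "->", translate(w))

On the view being argued here, this is all any computation amounts to: rule-following over symbols, with the "aboutness" imposed from outside.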

What computers do, then, is represent the thoughts of the person designing, programming, and/or using them:
Computation represents thought in the same way that a pen and paper can be used to represent thought, but computation does not generate thought and cannot produce thought.
Only minds can think. Machines cannot.

When a materialist thinks about her materialism, she's essentially disproving her fundamental belief that the material brain is all that's necessary to account for her thoughts. How can electrochemical reactions along material neurons be about something? Electrochemical signals crossing a synapse are not about anything; they have no meaning in themselves. The meaning must come from something else.