Tuesday, July 30, 2019

Gödel, Escher, Bach

Philosopher Walter Myers notes that August will mark the 40th anniversary of the publication of Douglas Hofstadter's Gödel, Escher, Bach: An Eternal Golden Braid. In the preface to the 1999 edition, Hofstadter clarifies his purpose in writing the book. Myers writes:
The three luminaries [mathematician Kurt Gödel, artist M.C. Escher, composer Johann Sebastian Bach] are not the central figures of the book. The book was intended to ask the fundamental question of how the animate can emerge from the inanimate, or more specifically, how does consciousness arise from inanimate, physical material?

As philosopher and cognitive scientist David Chalmers has eloquently asked, “How does the water of the brain turn into the wine of consciousness?”

Hofstadter believes he has the answer: the conscious “self” of the human mind emerges from a system of specific, hierarchical patterns of sufficient complexity within the physical substrate of the brain. The self is a phenomenon that rides on top of this complexity to a large degree but is not entirely determined by its underlying physical layers.
In other words, Hofstadter argued that human consciousness is what philosophers call an emergent property. Just as wetness emerges when hydrogen and oxygen combine in a certain way, so, too, does consciousness emerge whenever brain matter reaches a certain level of complexity.

Myers explains that Hofstadter believes this happens both in humans and in the artificial intelligence of computers, although he has no theory as to how it does so. Nevertheless, his conviction is that if computers could be designed to model the neural networks of the brain, consciousness would arise.

The models he suggests are very complicated, and, as Myers points out, we're a long way away from generating an artificial analogue to consciousness. Computers still lack the capacity, for example, to understand what they're doing.

Not only do computers not understand in the sense that humans understand a concept or idea, but there is also a host of cognitive capacities and experiences of which humans are capable that computers would have to achieve in order to be conscious.

Computers would have to be capable, for example, of holding beliefs, of having doubts, regrets, hopes, resentments, frustrations, worries, desires and intentions.

They would have to somehow be programmed to actually experience gratitude, boredom, curiosity, interest, guilt, pleasure, pain, flavor, color, fragrance and warmth; that is, not merely to detect some sort of stimulus but to genuinely experience these phenomena.

Are those working in the field of AI confident that within the foreseeable future they'll build a machine capable of appreciating beauty, humor, meaning and significance? Will machines ever be able to distinguish between moral good and evil, right and wrong, or apprehend abstract ideas like universals or mathematics (as opposed to just doing computations)?

Unlike machines, human beings have a sense of self, and they have memories which seem to be rooted in the past, whether recent or remote. Indeed, they have a sense of past, present and future. Will the machines of the future be capable of any of this?

To be conscious in the human sense a machine would have to be able to do all of this; it would have to be able to feel. The robot Sonny from the movie I, Robot notwithstanding, machines don't feel. A computer can be programmed to say "I love you," and it can be programmed to act as if it does love you, but do AI proponents believe that they'll ever be able to design a computer that actually feels love for you?

Another problem arises in reading Myers' account of Hofstadter's ideas. The complexity of the neuronal systems that give rise to consciousness in human beings is so profound that one wonders how it could ever be accounted for in terms of blind, random evolutionary processes like genetic mutation and natural selection. How did an undirected, random reshuffling and mutation of genes over millions of years produce an organ capable of doing all of the things mentioned above?

As Myers observes, human consciousness is unique among animals. "There is," he writes, "quite simply, no mechanical explanation of how the human mind has emerged from brawling chimpanzees over the course of millions of years of evolution."

The common response that, "Well, regardless of whether we can explain how such a prodigy could've happened, it must have done so because, after all, here we are," is really an admission that there's no answer at all.