Section 2.3: I, Robot

Mind as Software

McGraw-Hill © 2013 McGraw-Hill Companies. All Rights Reserved.


Functionalism

According to functionalism, mental states are functional states.

To perform a function is to take a certain input and produce a certain output.

When two things perform the same function, they are said to have the same “causal role.”
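The idea that one causal role can be filled by very different mechanisms can be illustrated with a small sketch. This example is not from the text; the function names and the doubling task are invented purely for illustration:

```python
# A minimal sketch (illustrative only): two systems with different
# internals that map the same inputs to the same outputs.
# On a functionalist view, they occupy the same causal role.

def doubler_loop(n):
    """Doubles n by repeated addition."""
    total = 0
    for _ in range(n):
        total += 2
    return total

def doubler_shift(n):
    """Doubles n by a bit shift -- a different mechanism, same function."""
    return n << 1

# Same input/output behavior, hence (on this view) the same functional state.
print(doubler_loop(21) == doubler_shift(21))  # prints True
```

This is the point behind "multiple realizability": what matters to the functional state is the input-output mapping, not the mechanism that produces it.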


Functionalism vs. Behaviorism

For behaviorism, mental states are neither causes nor effects.

The only causes behaviorism recognizes are physical stimuli and the only effects, physical responses.

For functionalism, mental states are both causes and effects.


Artificial Intelligence

The goal of artificial intelligence is to create a machine that can think for itself, one that has a mind of its own.

According to strong AI, there’s nothing more to having a mind than running the right kind of program.

Strong AI claims that the mind is to the brain as the software of a computer is to its hardware.


Thought Experiment: Lewis’s Pained Madman

“There might be a strange man who sometimes feels pain, just as we do, but whose pain differs greatly from ours in its causes and effects.”

This possibility suggests that being in a particular functional state is not a necessary condition for being in a mental state.


Functionalism and Feeling

1. If functionalism were true, it would be impossible for someone to be in pain and function differently than we do when we are in pain.

2. But, as Lewis’s pained madman shows, that’s not impossible.

3. So, functionalism is false; being in a certain functional state is not a necessary condition for being in a mental state.


Thought Experiment: Block’s Chinese Nation

“Suppose we convert the government of China to functionalism, and we convince its officials that it would enormously enhance their international prestige to realize a human mind for an hour.”

Suppose the people of China run a mind program. Would there now be another mind on Earth?

This is known as the “absent qualia objection” to functionalism.


Block’s Argument

1. If functionalism were true, then anything that had the right sort of functional organization would have a mind.

2. But as Block’s Chinese nation shows, something can have the right sort of functional organization and not have a mind.

3. So functionalism is false; having the right sort of functional organization is not a sufficient condition for having a mind.


Thought Experiment: Putnam’s Inverted Spectrum

“Imagine your spectrum becomes inverted at a particular time in your life and you remember what it was like before that.”

Imagine further that you learn to function as before.

This possibility suggests that being in a particular functional state is not sufficient for being in a particular mental state.


Putnam’s Argument

1. If functionalism were true, it would be impossible for people with the same functional organization to have different mental states.

2. But, as Putnam’s inverted spectrum shows, it is possible for people with the same functional organization to have different mental states.

3. So functionalism is false; having a certain functional organization is not a sufficient condition for being in a certain mental state.


Thought Probe: Pseudonormal Vision

Pseudonormal vision may occur when the sensations of green and red are reversed.

People with pseudonormal vision would be functionally indistinguishable from normal people.

Does the possibility of pseudonormal vision support the claim that functionalism can’t account for conscious experiences?


Thought Experiment: The Turing Test

Suppose an interrogator is allowed to conduct a conversation with both a human and a computer and, after a specified period of time, cannot tell which is which.

Turing claims that any computer that passed such a test would have to be able to think.
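The protocol of the imitation game can be sketched in a few lines. This is a toy rendering, not Turing's own formulation; the canned respondents are invented for illustration:

```python
def run_imitation_game(questions, human_reply, machine_reply):
    """Pose each question to both hidden respondents and log the answers.

    A judge reading only the transcript must guess which column came
    from the machine; the machine passes if the guess is no better
    than chance.
    """
    return [(q, human_reply(q), machine_reply(q)) for q in questions]

# Hypothetical canned respondents, purely for illustration.
human = lambda q: "I'd rather not say."
machine = lambda q: "I'd rather not say."

transcript = run_imitation_game(["Do you dream?", "What is 7 x 8?"],
                                human, machine)

# With identical answers, the two columns are indistinguishable.
print(all(h == m for _, h, m in transcript))  # prints True
```

The test is deliberately behavioral: the judge sees only the transcript, never the mechanism that produced it.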


Thought Experiment: Searle’s Chinese Room

Suppose that a person in a room with a rulebook and a set of Chinese symbols were able to use them to answer questions put to him in Chinese.

The person could do this without understanding Chinese.

This possibility shows that performing a particular function is not sufficient for understanding meaning.


Syntax and Semantics

How a symbol can be combined with other symbols to form a sentence is determined by its syntax.

What a symbol means is determined by its semantics.

Searle’s point: syntax does not equal semantics; one can put together syntactically correct strings of symbols without knowing what they mean.
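The purely syntactic character of the room can be made vivid with a sketch. This is not Searle's actual rulebook, only an invented miniature; the point is that the program matches symbol shapes and never represents what they mean:

```python
# A minimal sketch of rule-following without understanding:
# the lookup matches input shapes to output shapes, nothing more.

RULEBOOK = {
    "你好吗？": "我很好。",      # "How are you?" -> "I am fine."
    "你会说中文吗？": "会。",    # "Do you speak Chinese?" -> "Yes, I do."
}

def chinese_room(symbols):
    """Return the reply the rulebook pairs with this symbol string.

    The function manipulates the characters as uninterpreted shapes;
    no meaning is represented anywhere in the program.
    """
    return RULEBOOK.get(symbols, "对不起。")  # default reply: "Sorry."

print(chinese_room("你好吗？"))  # prints 我很好。
```

The replies are syntactically well-formed Chinese, yet the program (like the man in the room) has no access to their semantics.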


Searle’s Argument

1. If a computer could understand a language solely in virtue of running a program, then the man in the room would understand Chinese.

2. But the man in the room doesn’t understand Chinese.

3. So computers can’t understand a language solely in virtue of running a program.


Replies to the Chinese Room

Systems reply: the man in the room doesn’t understand Chinese, but the whole system does.

Robot reply: the man in the room doesn’t understand Chinese, but if the room were put in a robot, the robot would.

Brain simulator reply: the man in the room doesn’t understand Chinese, but if the program simulated nerve firings, the system would.

Combination reply: even if each of the above replies is inadequate, taken together they would create a system that understands Chinese.


Searle’s Chinese Gym

Connectionist machines attempt to mimic the architecture of the human brain by connecting many simple processors in parallel.

Because there is no central processor, some believe such machines don’t fall prey to the Chinese Room Argument.

Searle counters by postulating a gym full of people carrying out the same operations as the nodes in a connectionist machine.

In such a situation, the system as a whole would not understand what the symbols mean.
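The division of labor in the gym can be sketched as a toy connectionist layer. The weights and activation function here are invented for illustration; the point is that each "person" performs only one local arithmetic job and no node has a global view:

```python
import math

def node(inputs, weights, bias):
    """One gym member's entire job: a weighted sum and a squashing step."""
    s = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-s))  # sigmoid activation

def layer(inputs, weight_rows, biases):
    """A row of gym members working in parallel.

    Each node sees only its own weights and the shared inputs;
    none of them represents what the overall pattern means.
    """
    return [node(inputs, w, b) for w, b in zip(weight_rows, biases)]

# Hypothetical weights, chosen only to show two nodes disagreeing.
out = layer([1.0, 0.0], [[2.0, -1.0], [-1.0, 2.0]], [0.0, 0.0])
```

Searle's claim is that replacing each processor with a person changes nothing relevant: the arithmetic is the same, and no understanding appears at any node or in the whole.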


Thought Probe: Total Turing Test

To pass the Total Turing Test, the computer being tested would have to be able to do everything that a normal human being does, including walking, riding a bicycle, swimming, dancing, playing a musical instrument, and so on.

Is passing the Total Turing Test either necessary or sufficient for being intelligent and thus having a mind?


Intentionality

Intentionality is the property of being of or about something.

Mental states can have intentionality because they can be of or about something. For example, the belief that the Yankees will win the pennant is about the Yankees, the pennant, and the proposition that the Yankees will win the pennant.


Intentionality and the Chinese Room

An adequate theory of the mind should explain how it is possible to think about things.

Searle claims that the Chinese room shows that functionalism cannot account for intentionality.


Machines and the Chinese Room

Searle does not take the Chinese room to show that machines can’t think, for we are machines and we think!

What it shows is that there is more to thinking than running a computer program.


Thought Experiment: Block’s Conversational Jukebox

Suppose that all of the intelligent conversations that could be had in an hour are stored as a list on a tape. Suppose further that a computer carries on a conversation by searching the list.

The computer would seem to be intelligent, but that would be an illusion.

This possibility shows that performing a certain function is not sufficient for being in a mental state.
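The jukebox's mechanism is just prefix lookup over stored conversations. This sketch is an invented miniature of Block's idea, not his construction; the sample conversations are made up for illustration:

```python
# Sketch of the jukebox: whole sensible conversations are pre-stored,
# and the machine "converses" by extending whichever one matches so far.

CONVERSATIONS = [
    ["Hello.", "Hi there.", "How are you?", "Fine, thanks."],
    ["Hello.", "Hi there.", "What's your name?", "Call me Blockhead."],
]

def jukebox_reply(history):
    """Find a stored conversation whose prefix matches the history
    so far, and return its next line. No generation, only lookup."""
    for conv in CONVERSATIONS:
        if conv[:len(history)] == history and len(conv) > len(history):
            return conv[len(history)]
    return None  # off-script: the illusion of intelligence collapses

print(jukebox_reply(["Hello.", "Hi there.", "How are you?"]))  # prints Fine, thanks.
```

The output function is conversationally flawless within its list, which is why Block takes the jukebox to show that the right input-output behavior alone is not enough for intelligence.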


Thought Probe: Devout Robots

Suppose a robot that passed the Turing test asked to be baptized.

Should it be? Should it be given the same rights that we have?