Section 2: The Mind-Body Problem


Functionalism


Identity Theory argues that mental states are brain states. We have already seen good reasons to doubt that this identity holds. Another option, of course, is that mental states are states of some non-physical substance (the substance dualism Descartes holds). This route is not popular.

What's left? One way to argue the point is to suggest that mental states are functional states. According to this theory, mental states are defined in terms of their function (what they do). The function of a state like "being in pain", for instance, might be to alert the body to make repairs to a part of itself. Notice, however, that functionalism doesn't say anything about the materials that underlie the mental states. In other words, functionalism is open to the possibility that two things could have exactly the same mental states yet be composed of radically different materials.

The obvious example here is A.I. If it were possible to "download" all of your present mental states into a robot, and have those mental states re-run inside the robot, then in effect you are in two places at once, since the robot and you are in the same mental states and hence the same functional states (even though one of you is carbon-based and the other silicon-based).

What's the difference between functionalism and logical behaviorism? The two are similar, but there are key differences between them.

Logical Behaviorism:

1. Mental states are defined in terms of their causal role (the inputs and corresponding outputs)
2. Inputs are physical stimuli; outputs are bodily movements

Functionalism:

1. Mental states are defined in terms of their causal role (the inputs and corresponding outputs)
2*. Inputs and outputs can be mental states

Obviously the two differ in terms of 2 and 2*. While the difference may seem small, it is actually substantial. Logical behaviorists, recall, essentially discard the inner life of the "machine": all that matters is "what goes in" and "what comes out". A functionalist takes the inner life more seriously. For example, a functionalist might argue that it is by coming to believe that your lover is cheating on you that you become jealous. On this view, the function of a mental state is not only to cause behavior but also to cause other mental states. This is a large difference: functionalism, unlike logical behaviorism, takes mental causation seriously.
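The contrast can be sketched in code. This is only a toy illustration, not anything either theory literally proposes; all the state names and functions ("believes lover is cheating", `behaviorist_agent`, etc.) are invented for the example.

```python
def behaviorist_agent(stimulus):
    """Logical behaviorism: a direct mapping from physical stimulus
    (input) to bodily movement (output), with no inner states."""
    table = {"sees lipstick on collar": "shouts"}
    return table.get(stimulus, "does nothing")

def functionalist_agent(stimulus, mental_states):
    """Functionalism: a stimulus may cause a mental state, which in
    turn causes another mental state, which finally causes behavior."""
    if stimulus == "sees lipstick on collar":
        mental_states.add("believes lover is cheating")  # input -> mental state
    if "believes lover is cheating" in mental_states:
        mental_states.add("is jealous")                  # mental state -> mental state
    if "is jealous" in mental_states:
        return "shouts"                                  # mental state -> behavior
    return "does nothing"

states = set()
print(behaviorist_agent("sees lipstick on collar"))            # shouts
print(functionalist_agent("sees lipstick on collar", states))  # shouts
print(sorted(states))  # the inner causal chain the behaviorist discards
```

Both agents produce the same behavior from the same input, but only the functionalist agent has an inner causal story: the belief causes the jealousy, which causes the shouting.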

Artificial Intelligence (A.I.)

Recall that according to a functionalist, to be in a mental state is to be in a functional state. Mental states are defined by their functions, or what they do.

If functionalism is entirely correct, then current efforts to create computers with minds like ours are on the right track. To think this is to suppose that the theory of "strong A.I." is true.

Strong A.I.: having a mind is simply the ability to perform certain functions, where the functions are determined by the program being run. So there is nothing more to having a mind than running the right kind of program.

According to strong A.I., your mind is really just a program. Think of your brain as hardware, and your mind as software. Of course, the claim that mental states are simply functional states (not identical with the brain, as in identity theory) leads us to think that it could be possible to transfer my mind to another piece of hardware, perhaps composed of something entirely different from organic material. As long as the hardware continued to run the same program (perform the same functions), it would be the same mind. So mind-transference would be logically possible according to a functionalist, and nothing would be lost.
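The hardware/software analogy can be sketched as follows. This is an illustrative toy, not a serious model of a mind; the names (`mind_program`, `CarbonBrain`, `SiliconChip`) are invented for the example. The point is that the "program" never mentions what runs it, so the same program can run on two different substrates.

```python
def mind_program(memory, stimulus):
    """The 'software': a function from current state plus input to
    new state plus behavior. It is silent about its hardware."""
    new_memory = memory + [stimulus]
    return new_memory, f"responds to {stimulus}"

class CarbonBrain:
    """One 'hardware' realization of the program."""
    def __init__(self):
        self.memory = []
    def run(self, stimulus):
        self.memory, behavior = mind_program(self.memory, stimulus)
        return behavior

class SiliconChip:
    """A different 'hardware' realization of the very same program."""
    def __init__(self, memory=None):
        self.memory = list(memory or [])  # transfer: copy the old state in
    def run(self, stimulus):
        self.memory, behavior = mind_program(self.memory, stimulus)
        return behavior

brain = CarbonBrain()
brain.run("a question")
chip = SiliconChip(memory=brain.memory)  # "download" the mind
print(chip.run("a joke") == brain.run("a joke"))  # True: same function, same mind
```

On the functionalist picture sketched here, since the two realizations run the same program over the same states, they count as the same mind, which is just the mind-transference claim in the paragraph above.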