The Second Self: Computers and the Human Spirit by Sherry Turkle
ISBN 0671468480 (out of print, limited availability) -- Turkle watched MIT students dive into computer programming from the vantage point of the Sociology department. What struck her was that such otherwise bright students found no more powerful metaphor for themselves than that of the machine. Abstract arguments aside, she found this really sad.
I'm not sure why it should be sad (said the programmer). A metaphor is a context for thought and understanding which has relevant similarities to a target conceptual domain. A "good" metaphor is easily understandable to the thinker and has similarities to the target domain which are relevant to some pertinent task.
Thus, a metaphor is good (or bad) only within the context of a particular task. Perhaps the power of a metaphor is the range of tasks it's good for. In any case, a programmer's mind can think of just about anything as a universal simulation that we all happen to be a part of. Once that's done, the idea of "machine" suddenly becomes relevant to any situation whatsoever. Therefore "machine" is as powerful a metaphor as is ever needed for any task, so long as you get to define the specs of the machine.
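To make the "define the specs" point concrete, here's a toy Python sketch (my own illustration, nothing from Turkle's book; the names state and step are invented): any system at all can be cast as a state plus a transition rule that you write yourself.

 # Toy sketch only: "anything as a machine" = a state plus a transition
 # rule that you get to specify.  The rule chosen here is arbitrary.
 def step(state):
     """One tick of the 'machine' - whatever rule you care to define."""
     return {"tick": state["tick"] + 1, "value": state["value"] * 2}

 state = {"tick": 0, "value": 1}
 for _ in range(3):
     state = step(state)
 print(state)  # {'tick': 3, 'value': 8}

The point is only that "machine" fits trivially once you're allowed to write the rule; whether that fit illuminates anything is what the rest of this page argues about.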
Obviously, many metaphor/task pairs require fewer levels of indirection than those involving machines. Indirection is a thought process that strong programmers have no problem with, while it confounds many others. Art is indirection, but you can't convince an artist of that. Physics and math are really two sides of the same coin, but you can't convince either camp of it.
There is much in mathematics that is not a metaphor for any task in physics...
I agree with the programmer commenting above - it's a telling sign of authorial bias that the sociologist feels it's sad that programmers use a machine metaphor for the mind. It shows that the author a) does not understand the metaphor, b) injects her own (common) opinion of machines as simplistic clockwork devices, and c) ignores the great mass of evidence that people usually behave in simple patterns that are perhaps less complex than most everyday machines. Even the opponents of strong AI concede that it will soon be possible to have machines that mimic human behavior to a great degree, simply through brute-force methods.
I strongly agree with the sociologist. Metaphors are not just composed of metaphiers and metaphrands (respectively, the machine and the mind in the example above). They're also composed of paraphiers and paraphrands: the things which lie beside the metaphor, associated with it and implied by it.
In the machine metaphor, some of the paraphiers are 'no emotions', 'no feelings', and 'no rights'. That is, machines have no emotions, feelings, or rights, so by analogy humans have none of those things either. Or at the least, these things are of absolutely no concern to us. To say that a human being (or the mind) is like a machine is to debase it, to strip it of everything that makes it valuable and human. To think of oneself that way is extremely sad.
To recap my points: I feel that Turkle is injecting her own opinions into the book, making it less a study than an editorial; if strong AI is possible, then minds can be machines, and so the metaphor is sufficiently powerful to be useful and acceptable; and Turkle seems to favor a non-reductionist stance, in which consciousness is somehow not a natural property of a sufficiently complex system [the brain].
And in fact, it isn't. Consciousness is not a natural property of the brain. If it were, then simple disorders like schizophrenia wouldn't cause the diminution and annihilation of consciousness. Nor would common phenomena like hypnosis fundamentally alter or suppress consciousness. Then there are the somnambulist states. And then there's the fact that it takes a lot of extremely complex behaviour on the part of parents to raise children with consciousness. To believe that "consciousness is a natural property of a sufficiently complex brain" requires one to be ignorant of both consciousness and the brain.
And again, does the above author mean any complex system, or specifically the brain? If it's only the brain, then why leave the original words, with their provably absurd meaning, intact while removing the proof that their meaning is absurd?
People should understand what consciousness is before fighting over it, and why consciousness is central to the mind before dragging in people's unconscious behaviours.
In TheOriginOfConsciousnessInTheBreakdownOfTheBicameralMind, JulianJaynes spends an entire chapter providing an excellent breakdown of what consciousness is not. It's not experience, it's not behaviour, it's not reason, and it's not even thought. All these things are necessary for consciousness but none are sufficient. He then spends another entire chapter describing what consciousness is, and then lastly proposes the "consciousness is metaphor" definition of consciousness.
This page is the inverse of the statement: "Don't anthropomorphize computers. They hate that."
In order to decide whether Turkle is right that machine is a weak metaphor for person, you'd have to look at her actual cases. I think most of them aren't programmers, and of the programmers, many did indeed have "mechanistic" pictures of machines and, by analogy, of people.
On the other hand, it's my recollection that Turkle didn't understand the metaphor well enough and stood too far outside it, as if it were a clear-cut pathology like the boy who decorated his room with gears and talked in a monotone.
Exactly my point. Thank you.
You know the Zen saying: before Zen, trees are trees, mountains are mountains. To the newcomer to Zen, trees are the Buddha, mountains are fluctuations of mind, rainbows, etc. After enlightenment, trees are trees, mountains are mountains. (Abbreviated in that 1960s song: "first there is a mountain, then there is no mountain, then there is.")
Same for the word machine. When you've gotten to the point that absolutely everything could be a simulation inside a computer, you're halfway there. Then I recommend the koan, "Machine compared to what?" At that point the meaning comes back: simplistic, repetitious, but easily understood or controlled - stuff like that.
Also, to the extent that there are lots of important and interesting phenomena that we don't know how to simulate (intelligence being one!), machine is a bad analogy for person, because we don't know how to draw the analogy! In that sense programmability can be a siren call: in principle you could understand anything in terms of a computer metaphor, but in practice you run up against the rocks when actually trying it leads nowhere. But I could! Smack! But I could! Smack!... Futilely retrying things that ought to work is one definition of neurosis. Sometimes sticking to one viewpoint is a way of avoiding understanding. --SteveWitham
Agreed. But Turkle has not shown that the rest of humanity does not also use an equally ineffective metaphor - just that she does not like the one chosen by the people she interviewed.
She doesn't need to. Most psychologists have a good understanding of what consciousness is already, so she wouldn't need to prove that a bad metaphor of consciousness is bad. Consciousness is the ability to make metaphors about one's behaviour. See JulianJaynes' book (which provides two solid chapters summarizing current research on consciousness).
Are you referring to the book published in 1976? Who wrote the chapters on current research? From all that I've read, Jaynes is hardly accepted by the majority of psychologists as authoritative. And you haven't addressed the issue - Turkle does not show that the rest of humanity, or even college students, are not using an equally ineffective metaphor.
She doesn't need to. Demonstrating that a position is incorrect, nonsensical or absurd doesn't necessitate providing an alternative. The argument is simply irrelevant and I dismiss it out of hand. But that's not even what Turkle is trying to say.
Turkle is trying to say that the 'machine' metaphor is dehumanizing and harmful to human dignity. And so, showing that other people have a superior, less dehumanizing conception of consciousness than 'machine' is trivial. Most people's conception of consciousness is as a little homunculus in their brain controlling actions. Now, that metaphor may be useless and circular, but it's certainly not undignified.
As for Jaynes' position in psychology, psychology is a very screwed up field. You know that's so when Freud is still respected but deMause is denigrated. Don't think that psychology is unique in this. Physics is as well, and even mathematics to a lesser extent. Then there's economics which is royally screwed up. So I'll pick my own authorities, thank you very much.
Personally I like Dennett's metaphors - "Multiple Drafts" and "Center of Narrative Gravity" among them. He also makes good use of the "computer" metaphor, suitably restricted in scope - the "Serial Virtual Machine" transforming inherently parallel processes into what introspectively appears to be a continuous "stream" of consciousness.
There's a good point in Dennett - of course serial computers are an apt metaphor for what we perceive our minds to be: serial computers were modeled after Turing's analysis of his own thought processes, and he reconstructed them as serial and goal-directed. But careful investigation shows that our minds cannot be operating in this fashion.
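For what it's worth, here is a toy Python sketch of that parallel-to-serial idea (my own illustration, not Dennett's model; draft_process and the fragments are invented): several concurrent processes write fragments to a shared queue, and a single serial pass reads them off as one ordered "stream".

 # Toy illustration only: parallel "draft" processes feed a queue, and a
 # single serial pass over the queue yields the one linear "stream" we
 # introspect.  Names and fragments are invented for the example.
 import queue
 import threading
 import time

 drafts = queue.Queue()

 def draft_process(name, fragments):
     """One of many concurrent processes, each revising its own draft."""
     for fragment in fragments:
         time.sleep(0.01)              # each process works at its own pace
         drafts.put((name, fragment))

 workers = [
     threading.Thread(target=draft_process, args=("vision", ["red", "round"])),
     threading.Thread(target=draft_process, args=("memory", ["apple?", "yes, an apple"])),
 ]
 for w in workers:
     w.start()
 for w in workers:
     w.join()

 # The "serial virtual machine": whatever interleaving the parallel work
 # produced is read off as a single ordered narrative.
 stream = [drafts.get() for _ in range(drafts.qsize())]
 print(stream)

The only point of the toy is that the serial order is assembled after the fact from whatever the parallel work happened to produce; nobody is claiming the brain runs threads.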
There probably are metaphors of consciousness that do more for human dignity. I seem to recall Dennett touches on this point as well: it probably matters more that we "really" understand the human mind (which involves finding appropriate metaphors at all levels of explanation, as well as good, solid theoretical arguments and empirical investigations) than that we "sugarcoat" our knowledge of mind with false-to-fact metaphors that promote social responsibility or other desirable values.
This shows pretty well the basic premise in psychology: ThereIsOnlyOneRightAnswer?, and only I know what it is.
Anyway, what metaphor do you think is appropriate for a human being? Can you think of something that isn't as deterministic as a Windows box full of shareware?