Consciousness Considered Harmful

This page is being refactored.

Any mess I make is my personal mess until it gets cleaned up. One way or the other, let's clean it. I'm not trying to crack the whip on you, but the page stopped updating weeks ago and I didn't think anyone would object to erasing it. Instead of refactoring under deadline, why don't you make a copy on your hard disk, refactor as the spirit moves you, and re-post a much tightened page when it's ready? Meantime, we delete the page. No one else cares, so you have much liberty in this. Sounds good?

No, sounds bad. This page is also useful just as it is for the insights it provides into both you and Richard, both prolific contributors. Additionally, the correct approach to refactoring is to leave the unrefactored text intact with a note that refactoring is pending, rather than deletion pending refactoring, which has been characterized as a thinly veiled form of DisagreeByDeleting. See for example the well-refactored page DeleteByDelayingRefactoring for more details. --AndyPierce

As I explained before, there are two common assumptions people make that are challenged on this page. One is that science deals with "objectively observable" things only, as opposed to reportables. This should probably be moved to its own page, ObservableVsReportable (done now). The other is the assumption that you can split the mind into separate rational versus irrational parts. Those are the main things I care about. Challenging popular misconceptions is valuable to me.

Discussion of Janov's definition of consciousness should be moved to WhatIsConsciousness (now done). I deleted the digression about developing a model of human emotion (may create a page about HumanEmotion? at some point). The discussion of empathy should be condensed because it's a bit roundabout. Needless to say, the back and forth should go. -- rk

I have archived this page at http://sunir.org/c2/ConsciousnessConsideredHarmful.txt so it can never be deleted--at least not for two years. Until then, have a good day.


Well, it sure gave Hamlet a run for his money.

But in the modern sense, if human consciousness as a concept is constrained to the self-referential symbolic analog processing described by Julian Jaynes, confined largely to the dominant hemisphere and thus yielding reams and reams of explanation, analysis, dry introspection and general yammering about technology, then it could be said that consciousness commits the same evil as empty-calorie foods. It usurps the space normally allocated for life.

But that's not all consciousness is. Consciousness makes empathy possible. It also makes projection and reversal possible. Without consciousness, the entire EmpathyProjectionReversal metric is meaningless.

Consciousness is also required for certain emotions. Things like pride, embarrassment, hubris and shame. I know there's some research on this subject because I've read an article explicitly talking about 'conscious emotions'. I wish there were a database of human emotions.

I think the attack on consciousness derives from a general popular attack on rationality. The Myth of Irrationality talks about the popular conception of the mind. There are a couple of chapters of the book at (http://www.btinternet.com/~neuronaut/webtwo_features_irrationality.htm).

Not very far into that page it says, "The special aspects of the human mind - self-awareness, memory, higher emotion and imagination - are skills we learn rather than faculties that unfold within us like so many budding flowers." This assertion is most certainly not universally accepted in the cognitive sciences. A perusal of texts such as TheSocietyOfMind?, TheLanguageInstinct and HowTheMindWorks makes clear that there is a strong case to be made for these faculties indeed being innate to the human brain. This chapter, at least, contains a great deal of, well, passionate appeals to emotion, and little rational reference to current research. It's not clear where the author is going with his argument, but at least in this example his appeals seem calculated to arouse strong reaction rather than dispassionate contemplation. -- StevenNewton

And OriginOfConsciousness, as well as many pages at the above mentioned site, make it clear that such a case is already lost. See HumanConsciousness for more details.

The notion that some language module exists in the brain due to biological evolution is ahistorical and probably even nonsensical. If language is so complicated, then it's doubtful that biological evolution could have come up with it when social evolution could not. Julian Jaynes' account of how language slowly and incrementally developed over millennia of social evolution is much more credible. And it probably fits the anthropological evidence better.

Ultimately, the author above doesn't need to present what's universally accepted. He just needs to know what he's talking about. One point he raises is that when linguists present a bunch of sentences to show how complex language supposedly is, they merely prove the inadequacy of the tools linguists use. There's a Counter-Chomsky page at http://perso.club-internet.fr/tmason/WebPages/LangTeach/CounterChomsky.htm -- RichardKulisz (I like Chomsky for his political writings so I don't care if he's wrong about linguistics.)

Ah, it's clear now. John McCrone?, "neuronaut" and author of The Myth of Irrationality cited above, is a journalist and a bit of a scientific crank. As Mr. Spock would say, "faaaascinating".

And you're an intellectual snob. Not that it matters but McCrone? has published in journals and his views are supported by many respected researchers. The French for example don't believe in the Chomskian Universal Grammar.

[That last statement sounds clueless to me. How many French people are you talking about?]


Consciousness crucial to empathy

If consciousness is crucial to empathy, and babies do not have consciousness, then how can they have empathy?

Infants may start life with the ability to recognize a few facial expressions and some hard-wiring to respond to them emotionally (perhaps by mimicking them). But that's not how things remain.

For instance, it's impossible for an infant to have an empathic reaction to emotions that require consciousness. But some may question the existence of conscious emotions, so we need a broader point.

It is inconceivable for an infant to have empathic reactions for all possible human emotions hardwired in from birth. IOW, it's not possible that for each emotion, the value "this is a good emotion" or "this is a bad emotion" is hardwired. This is because empathy isn't simple mimicry, so it isn't a simple function. And as we shall see, it isn't a hardwired function.

First a little explanation of empathy. Empathy is your feeling as if you were the other guy, and the "as if" is key. To empathize is to put yourself in the other person's place and feel something about that, but that doesn't mean you'll feel the same thing the other person does. In fact, that idea is nonsense on inspection: the closest we could come to "having the same feeling" would be having a feeling for which we share a single name.

Now an example: suppose you have a friend who likes fighting and he gets into a fight. Should you wince at the pounding he'll get, or be happy he'll get the adrenaline rush he craves? Probably a little of both. But if another friend of yours, who absolutely hates fights, got cornered into one, then your empathic response would be unambiguous. A person's empathic response depends crucially on their knowledge and beliefs about the desires of the target of their empathy.

Now another example: say your friend is kissed for the first time by the woman he's head over heels in love with. The fact that he loves this woman and that you know this (or suppose it, whatever) modulates your empathic reaction. The raw physical situation "your friend was kissed by a woman" determines exactly zero of your empathic reaction. Whether she's a beautiful or ugly woman doesn't matter compared to whether he considers her to be beautiful. That's a mental component of an empathic reaction which can't be hardwired. Even the simple knowledge of whether your friend is gay or straight can't be hardwired. If you're absurdly happy that your gay friend got kissed by a beautiful woman, then that is not an empathic reaction. If you love fights and your friend who is absolutely terrified of them has gotten into one, then being exhilarated (or any kind of positive emotion or feeling) is not an empathic reaction. Empathy functions on an intellectual model of someone's personality. If it didn't, it would be an absurdly impoverished function, not even worthy of being given a name.

The question of whether an empathic reaction is accurate, in the sense that it is based on the supposed desires and feelings of the target, is deeply relevant to empathy. The question of whether those supposed desires and feelings of the target are themselves accurate is irrelevant to empathy. Accuracy in this entirely different sense is called sensitivity. "Overall" accuracy requires both high empathy and high sensitivity. Some psychopaths are extremely sensitive without having any empathy. The reverse is also possible. (I think that autistic people are still empathic but have no sensitivity; they also have fewer emotions themselves.)

So the answer to the initial question is that babies do not have empathy. They have proto-empathy.


So if someone is delighted at some outcome and I'm revolted by it, even if I know their feeling, it would be more accurate to say I don't empathize with them, rather than to say that I empathize inaccurately with them. Here, we've perhaps brought empathy and caring into conflict because while I might care for someone who, say, mutilates his body, I would never share delight in that. Not only that, I believe my true response would be more valuable to all concerned.

People who mutilate their bodies usually do it because they hate themselves. As for people who modify their bodies, that's another matter. It's very difficult for empathy to conflict with caring. One way is if the "caring" is paternalistic. Or if it's infatuation instead of love. But those are usually excluded from the concept of 'caring' after a little reflection.


Conscious emotions: love? Especially as opposed to infatuation and lust.


Preconditions for empathy

Reading emotions from people's faces is crucial to empathy. In a situation where one can't see a person's face, and that person is a complete stranger, there's just nothing for empathy to latch on to.

Your face-to-face experience with others serves as your "database of emotion" and should allow you to empathize even across a faceless bit stream, unless you're too angry to do that. (By the way, let's get some faces posted pretty soon. I have a feeling that would make a quality difference in the exchange, but I won't expand or give details on Why just yet.)

I disagree on both counts. A general model of emotions isn't sufficient. I have to have knowledge of the specific desires of that person. I have to be able to see that person as a person and not just an automaton. That works in reverse too: I can avoid empathic reactions by refusing to see someone as a person. I've only consciously done this once (for excellent reasons btw) but I'm sure it's a skill like any other. That nixes the idea that empathy can't be turned off, btw.

Secondly, the precise way that a person safeguards their individuality is by mentally trashing people they don't agree with (remember ob + audire?). You can do this the hostile way or the paternalistic way but neither method is enjoyable for the target. The only way to avoid, say, hostility leaking into your words is to not respond until you've sufficiently trashed the other person in your own mind. But whether or not it shows to others, instant and universal empathy is impossible.


I think the attack on consciousness derives from a general popular attack on rationality.

The "attack" on consciousness is not against consciousness per se, but against a hyperabundance of rationality and too little sensation. You can be highly aware of the concept of empathy and still feel nothing toward the guy across the web from you, as you verbally thrash and bash. That's harmful in my view, and in multiple ways.

There's no such thing as a hyperabundance of rationality. Reason, after all, is simply a means to an end. When it goes astray, it's rarely if ever because it was a bad tool to use, but generally because the end was ill-chosen. People tend to forget this. Up top, where it says that consciousness is useless if all it is good for is such and such, it actually includes explanation in the list. Have we honestly decided that trying to understand things is a waste of time? -- JoshuaGrosse

No, we haven't decided that trying to understand things is a waste of time. And explanation is useful, too. But there are excesses and misapplications of both. For example, if someone you love wants a hug and all you do is give them an explanation of the benefits of hugging, you're doing the kind of "harm" I'm talking about. Hug them for god's sake and shut up. :-) If you had an accurate model of human emotion based on such a database, you'd know to hug them and shut up. :)

Agreed. So using reason to give somebody something completely different from what they want is a misapplication. So the knowledge of why hugging is helpful doesn't actually help you hug people. So what? Does this make it bad? It seems to me this page is conflating "is harmful" with "can't do everything by itself".

Not agreed. You don't need "an accurate model of human emotion" to know how to feel any more than you need an accurate model of human excretion to know how to urinate. Of course, if you've suffered organ damage, you might need informational support in constructing a replacement system. If that's not the case, such a system is a distraction at best. Harm is in the context of application. Explaining why you mistreat people so that you can go on mistreating them co-opts the otherwise benign activity of explanation for an evil cause. Take a look at some of the evidence.

Wait. Are we discussing whether or not consciousness can be considered harmful, or are we conducting a prosecution of Kulisz for using his rhetoric for evil? The latter is a case I'd rather not be in on, and I think ought not to be mixed together with some other argument.

We're discussing whether intellectualism to the exclusion of human feeling is harmful, approximately, with examples drawn from nearby when convenient. Other examples are welcome.

Ok, so intellectualism to the exclusion of feeling is bad. As if everyone here didn't know that. What the implication here seems to be, though, is that you can exclude feeling by including too much bonus rationality, as if you were trying to stock a shelf with them. I don't buy that at all: that the desire to explain, analyze, and understand things is harmful. I can see how it can exacerbate problems caused by independent evils like lack of empathy, but blaming it for such things strikes me as scapegoating in the worst possible way, and I don't think that it can ever be considered dangerous in and of itself.

No, we're agreed (above) that the harm comes from using intellectual activity to insulate ourselves from empathy with others by usurping part of what's needed for empathy in the intellectual process. Perhaps the analogy of stocking shelves is broken. Closer to the hardware, engaging in one brain activity has the ability to suppress other brain activities, so it's not just a matter of resource shortage. It's more that you can use intellectual gyration to build a dam against feeling, and you can keep yourself chronically distracted with intellectual fluff so as to avoid even the insight that that's what's happening. Dangerous? I didn't quite say that. I said harmful, and I think I'll stick with that term.

To me, dangerous means things like surgical tools that should be employed carefully, while harmful means things that should be avoided. Maybe the words have different connotations for you, but I have a hard time picturing sentiments like the one up top, that consciousness usurps the space normally allocated for a life, as mere caution stickers.

To be "considered harmful", especially in the popular idiom here, something need not be the sole cause of disaster. "Goto" is an effect, not a cause, if you choose to analyze it that way. Just because some jerk programmer isn't allowed to use "goto" doesn't mean he's not going to make a mess of his code, and just because he is allowed to use it doesn't mean he will make a mess. You get my drift. Cause and effect are models, just as are requirements and designs. In reality, the influence travels in both directions. Having a gun in hand is considered harmful in some contexts (most, if you ask for my opinion) but "guns don't kill, etc. etc.". I hope this clarifies my intended meaning. Sometimes you have a choice to shut up and hug someone. Sometimes you just don't. I hope some hug-eligible people pass your way soon, but if not then write again and we'll see about admitting furry animals in their stead.

So, what you're saying is that consciousness can be used for good or evil? I agree with that, but I think this is an odd way to express it. The same can be said for any other powerful tool, yet I don't see any pages like WritingConsideredHarmful?, ProgrammingConsideredHarmful?, or WikiConsideredHarmful?. Code with GOTOs on the other hand is generally held to be worse on average than code without, and guns were designed to kill people or at least damage objects.

There's a barrier in communication here. If I had your email address, I'd use it. Let me state as clearly as I can what this page is about, in bullet form if you like.

-- WaldenMathews

If you'll permit me to butt in here, a common pattern of dissociation in the recent past was the Id, Ego and Superego. That's three categories for only two hemispheres, and from Jaynes we know that Ego and Superego were in different hemispheres. So either the Id (base feelings) was in the same hemisphere as the Ego or it was with the Superego. Since the Superego is as intellectual as the Ego, feelings have always been tied to intellectual faculties.

People nowadays are better integrated, and whether that's because the dominant hemisphere took over even more of the other's capabilities or because they communicate differently doesn't matter all that much. You could argue that three millennia ago, people lost "true feeling" in exchange for sanity and functionality. But they also gained empathy. Even as recently as a few centuries ago, people were very different from what we are now. More emotional, more dissociated, more mood swings. But they were also far less sane than we are. You claim that cerebration "overshot" survival but you're very wrong on that. The Victorians still practiced widespread infanticide!

The "feelings" of bicamerals and other psychotics are brilliant, crystal clear and overwhelming, but they do not have all the feelings that we have. The Victorians may have been "moody" but they did not have the depth of feeling that we do. It isn't a black and white, rational versus irrational, or intellectual versus feeling, matter. To decide that consciousness isn't worth it, you'll need a complete account of what was gained, lost and transformed in the process. And prejudgements don't help. -- RichardKulisz


CategoryOffTopic

