
8 Self-deception: the normal
and the pathological
In the previous chapters, I have argued that neuroscience (and allied
fields) can shed light on some of the perennial questions of moral
theory and moral psychology: the nature of self-control and the
degree to which agents should be held responsible for their actions. In
this chapter, I explore another puzzle in moral psychology: the nature
and existence of self-deception.
Self-deception is a topic of perennial fascination to novelists
and everyone else interested in human psychology. It is fascinating
because it is at once puzzling and commonplace. The puzzle it poses
arises when we observe people apparently sincerely making claims
that seem obviously false, and against which they apparently possess
sufficient evidence. The man whose wife suddenly has many mys-
terious meetings, starts to receive unexplained gifts and is reportedly
seen in a bar on the other side of town with a strange man has every
reason to suspect her of infidelity. If he refrains from asking her
questions, or is satisfied with the flimsiest of explanations, and fails
to doubt her continued faithfulness, he is self-deceived. Self-decep-
tion is, apparently, common in the interpersonal sphere, but it is also
a political phenomenon. Western supporters of Soviet communism
were often, and perhaps rightly, accused of self-deception, when they
denied the repression characteristic of the regime.
We say that someone is self-deceived, typically, when they
possess sufficient evidence for a claim and yet continue, apparently
sincerely, to assert the opposite. Generally, self-deception seems to
be emotionally motivated: we do not deceive ourselves about just
anything, but only about things that are important to us and which
we are strongly motivated to believe. The man who deceives himself
about his wife’s faithfulness might not be able to contemplate a
single life; the woman who deceives herself about Soviet communism
may have her narrative identity closely entwined with her
political allegiances.
theories of self-deception
We often say that the self-deceived person really or ‘‘at some level’’
knows the truth. The formerly self-deceived themselves sometimes
make this kind of claim, saying they ‘‘really knew all along’’ the
truth concerning which they deceived themselves. Many theories of
self-deception take this apparent duality of belief at face value, and
therefore devote themselves to explaining how ordinary, sane, indi-
viduals are capable of contradictory beliefs. There is no puzzle,
everyone acknowledges, with believing things that are mutually
contradictory, when the conflict between them is not obvious. All of
us probably have inconsistent beliefs in this sense: if we thought
about each of our beliefs for long enough, and traced their entail-
ments far enough, we could eventually locate a clash. But the self-
deceived agent apparently believes two contradictory statements
under the same description (or at least very similar descriptions). The
husband in our example might believe both that my wife is faithful
and my wife is having an affair, which is a bald contradiction, or
perhaps, slightly less baldly, my wife is faithful and all the evidence
suggests my wife is having an affair.
Some philosophers think that not only do the self-deceived
believe inconsistent propositions, they are self-deceived because they
have deliberately brought about their believing contradictory pro-
positions. The best example here is the existential philosopher Jean-
Paul Sartre. Sartre (1956) argued that self-deceivers have to know the
truth, in order to set about concealing it from themselves. Just as a
liar must know the truth in order to deliberately and effectively
deceive others, so the self-deceiver ‘‘must know the truth very
exactly in order to conceal it more carefully'' (Sartre 1956: 89). Other
thinkers who, like Sartre, take contradictory beliefs to be
characteristic of self-deception also model it on interpersonal deception. Both
kinds of lying – to others and to oneself – are supposed to be inten-
tional activities. On what we might call the traditional conception of
self-deception – defended by thinkers as diverse, and as separated
from one another in time, as Bishop Joseph Butler (1970) in the
eighteenth century and Donald Davidson (1986) in the late twentieth
century – self-deception is typically characterized by both these
features: contradictory beliefs and intentionality of deception.
The contradictory belief requirement and the intentionality
requirement are both extremely puzzling. How is it possible for
someone to believe two blatantly contradictory propositions at one
and the same time? How can anyone succeed in lying to him or
herself; doesn’t successful deception require that the deceived agent
not know the truth? Defenders of the traditional conception of self-
deception do not, of course, think that we succeed in lying to
ourselves in precisely the same manner in which we might lie to
another. Instead, they take self-deception to be an activity engaged
in with some kind of reduced awareness. Moreover, they do not
assert that the self-deceived believe their claims in precisely the
same way that we generally believe our normal beliefs. Instead,
they typically hold that the contradictory beliefs are somehow
isolated from one another. Perhaps, for instance, one of the beliefs
is held unconsciously. If the husband’s belief that his wife is having
an affair is unconsciously held, we may be able to explain how he is
able to sincerely proclaim her faithfulness. We might also be able
to explain the rationalizations in which he engages to sustain
this belief: they are motivated, we might think, by mechanisms
that are designed to defend consciousness against the unconscious
belief.
More recently, however, philosophers have begun to advance
deflationary accounts of self-deception. These philosophers point out
that the traditional conception is quite demanding: it requires the
existence of a great deal of mental machinery. It can be correct only
if the mind is capable of being partitioned, in some way, so that
contradictory beliefs are isolated from one another; moreover, typical
traditional accounts also require that both beliefs, the consciously
avowed and the consciously disavowed, are capable of motivating
behavior (the behavior of engaging in rationalization, for instance).
Given that the traditional conception is demanding, we ought to
prefer a less demanding theory if there is one available that explains
the data at least as well. These philosophers thus invoke Occam’s
razor, the methodological principle that the simplest theory that
explains the data is the theory most likely to be true, in defence of a
deflationary account.
Deflationary accounts of self-deception have been advanced by
several philosophers (Barnes 1997; Mele 1997; 2001). These
accounts are deflationary inasmuch as they attempt to explain self-
deception without postulating any of the extravagant mental
machinery required by the traditional conception. They dispense
with the extra machinery by dispensing with the requirements that
necessitate it, both the intentionality requirement and the contra-
dictory belief requirement. By dispensing with these requirements,
deflationary accounts avoid the puzzles they provoke: we need not
explain how agents can successfully lie to themselves, or how they
can have blatantly contradictory beliefs. Of course, we still need to be
able to explain the behavior of those we are disposed to call self-
deceived. How are we to do that?
Deflationists argue, roughly, that the kinds of states we call
self-deception can be explained in terms of motivationally biased
belief acquisition mechanisms. We can therefore explain self-decep-
tion invoking only mechanisms whose existence has been indepen-
dently documented by psychologists, particularly psychologists in
the heuristics and biases tradition (Kahneman et al. 1982). Heuristics
and biases typically work by systematically leading us to weigh
some kinds of evidence more heavily than other kinds, in ways that
might be adaptive in general, but which can sometimes mislead us
badly. Thus, people typically give excessive weight to evidence that
happens to be vivid for them, will tend to look for evidence in favour
of a hypothesis rather than evidence which disconfirms it, are more
impressed by their more recent experiences than earlier experiences,
and so on. Deflationists argue, and cite experimental evidence to
show, that these biases can be activated especially strongly when the
person is appropriately motivated. Thus, when someone has reason
to prefer that a proposition is true, the stage is set for the activation of
these biasing mechanisms. For instance, the anxious coward will test
the hypothesis that they are brave, and therefore look for confirming
evidence of that hypothesis (setting in motion the confirmation
bias); as a result evidence which supports this hypothesis will be
rendered especially vivid for them, while evidence against it will be
relatively pallid.
If this is correct, then self-deception is not intentional: it is the
product of biased reasoning, but there is no reason to think the agent
is always aware of their bias (neither in general, nor of the way it
works in particular cases). Nor is there any reason to think that the
agent must have contradictory beliefs. Because the agent is motiva-
tionally biased, they acquire a belief despite the fact that the evidence
available to them supports the contrary belief: they cannot see how
the evidence tends precisely because of their bias.
Deflationists claim that their less extravagant theory explains
self-deception at least as well as the traditional conception. We have,
they argue, no need to invoke elaborate mental machinery, because
there is no reason to believe that the intentionality or contradictory
belief requirements are ever satisfied. Mele (2001), the most influ-
ential of the deflationists, argues that his theory, or something like it,
is therefore to be preferred unless and until someone can produce an
actual case of self-deception in which the agent has contradictory
beliefs, or in which they have intentionally deceived themselves.1 In
what follows, I shall attempt to meet Mele’s challenge: I shall show
that there are cases of self-deception in which the self-deceived per-
son has contradictory beliefs. The evidence comes from the study of
delusions.
anosognosia and self-deception
Anosognosia refers to denial of illness by sufferers. It comes in
many forms, including denial of cortical (i.e., caused by brain lesion)
deafness, of cortical blindness (Anton’s syndrome) or of dyslexia
(Bisiach et al. 1986). Here I shall focus on anosognosia for hemiplegia:
denial of partial paralysis (hereafter ‘‘anosognosia’’ shall refer only to
this form of the syndrome). As a result of a stroke or brain injury,
sufferers experience greater or lesser paralysis of one side of their
body (usually the left side), especially the hand and arm. However,
they continue to insist that their arm is fine. Anosognosia is usually
accompanied by unilateral neglect: a failure to attend, respond or
orient to information on one side (again usually the left side) of the
patient, often including that side of the patient’s own body (personal
neglect). Anosognosia and neglect usually resolve over a period of
a few days or weeks. However, both have been known to persist
for years.
It is worth recounting some clinical descriptions of anosogno-
sia, in order to give a flavor of this puzzling condition. Asked to move
their left arm or hand, patients frequently refuse, on grounds which
seem transparent rationalizations: I have arthritis and it hurts to
move my arm (Ramachandran 1996); the doctor told me I should rest
it (Venneri and Shanks 2004); I’m tired, or I’m not accustomed to
taking orders (Ramachandran and Blakeslee 1998); left hands are
always weaker (Bisiach et al. 1986). Sometimes, the patients go so far
as to claim that they have complied with the request: I am pointing;
I can clearly see my arm or I am clapping (Ramachandran 1996); all
the while their paralyzed arm remains at their side.
It is tempting to see anosognosia as an extreme case of self-
deception. It looks for all the world as if the excuses given by patients
for failing to move their arms are rationalizations, designed to protect
them from an extremely painful truth: that they are partially paral-
yzed. However, most neurologists deny that anosognosia should be
understood along these lines. They point out that it has some fea-
tures which seem puzzling on the psychological defence view.
In particular, a motivational explanation of anosognosia fails to
explain its asymmetry: it is rare that a patient denies paralysis on the
right side of the body. Anosognosia is usually the product of right
hemisphere damage (most commonly damage to the inferior parietal
cortex) that causes denial of paralysis on the left (contralateral to
the lesion) side of the body. Most neurologists therefore argue that
it must be understood as a neurological, and not a psychological,
phenomenon (Bisiach and Geminiani 1991).
Clearly, they have an important point: any account of ano-
sognosia must explain the observed asymmetry. Anosognosia is
indeed a neurological phenomenon, brought about as a result of brain
injury. Most other kinds of paralysis or disease, whether caused by
brain injury or not, do not give rise to it. However, it may still be
the case that anosognosia is simultaneously a neurological and a
psychological phenomenon. Perhaps, that is, neurological damage
and motivation are jointly necessary conditions for the occurrence of
anosognosia.
V.S. Ramachandran is one prominent neuroscientist who
interprets anosognosia along these lines. Ramachandran (1996;
Ramachandran and Blakeslee 1998) suggests that the observed
asymmetry can be explained as a product of hemispherical speciali-
zation. The left hemisphere, he argues, has the task of imposing a
coherent narrative framework upon the great mass of information
with which each of us is constantly bombarded. If we are not to be
paralyzed by doubt, we need a consistent and coherent set of beliefs
that makes sense of most of the evidence available to us. In order to
preserve the integrity of this belief system, the left hemisphere
ignores or distorts small anomalies. Since any decision is usually
better than being paralyzed by doubts, ignoring anomalies is gen-
erally adaptive. However, there is a risk that the agent will slip into
fantasy if the left hemisphere is allowed to confabulate unchecked.
The role of keeping the left hemisphere honest is delegated to the
right hemisphere. It plays devil’s advocate, monitoring anomalies,
and forcing the more glaring to the agent’s attention.
There is a great deal of independent support for Ramachandran’s
hemispherical specialization hypothesis. In particular, evidence
from cerebral commissurotomy (‘‘split-brain’’) patients is often
understood as supporting this view. On the basis mainly of this
evidence, Gazzaniga (1985; 1992) has suggested that the left hemi-
sphere contains an ‘‘interpreter,’’ a module which has the task
of making sense of the agent’s activities using whatever sources of
information are available to it. When it is cut off from the source
of the true motivation of the behavior, the left hemisphere con-
fabulates an explanation. Many researchers have followed or adapted
Gazzaniga’s suggestion, because it seems to explain so many observed
phenomena.
For our purposes, the hemispherical specialization hypothesis
is attractive because it neatly explains the asymmetry characteristic
of anosognosia. When the right hemisphere is damaged, the left
hemisphere is free to confabulate unchecked. It defends the agent
against unpleasant information by the simple expedient of ignoring
it; it is able to pursue this strategy with much more dramatic effect
than is normal because the anomaly detector in the right hemisphere
is damaged. But when the right hemisphere is intact, denial of illness
is much more difficult. On the other hand, when damage is to the left
hemisphere, patients tend to be more pessimistic than when damage
is to the right (Heilman et al. 1998). Ramachandran suggests that this
pessimism is the product of the disabling of the protective left
hemisphere confabulation mechanisms.
I do not aim to defend the details of Ramachandran’s account of
anosognosia here. However, I suggest that it is likely that the best
account of the syndrome will, like Ramachandran’s, explain it as
simultaneously a neurological and a psychological phenomenon.
Only a combination of neurological and psychological mechanisms
can account for all the observed data. Non-motivational theories of
anosognosia cannot do the job alone, as I shall now show.
Some theorists suggest that anosognosia is the product of an
impairment which makes the disease difficult for the patient to
detect (Levine et al. 1991). A syndrome like neglect is, for its subject,
relatively difficult to discern; absence of visual information is not
phenomenally available in any immediate way. Somewhat similarly,
anosognosia for hemiplegia may be difficult to detect, because the
patient may have an impairment that reduces the amount and
quality of relevant information about limb movement. There are
several possible impairments that could play the explanatory role
here. Patients may experience proprioceptive deficits, they may
experience an impairment in feedback mechanisms reporting limb
movement (Levine et al. 1991), or they may experience impairments
in ‘‘feedforward’’ mechanisms, which compare limb movements to
an internally generated model predicting the movement (Heilman
et al. 1998).
These somatosensory explanations of anosognosia face a com-
mon problem: the mechanisms they propose seem far too weak to
explain the phenomenon. Suppose it is true that anosognosics lack
one source of normally reliable information about their limbs, or
even that they take themselves to continue to receive information
that their limb is working normally via a usually reliable channel;
why do they nevertheless override all the information they receive
from other reliable sources, ranging from doctors and close relatives
to their own eyes? After all, as Marcel et al. (2004) point out, the
impairments produced by hemiplegia are not subtle: it is not just
that patients fail to move their arms when they want to. They also
fail to lift objects, to get out of bed, to walk. It is extremely difficult
to see how lack of feedback, or some other somatosensory deficit,
could explain the failure of the patient to detect these gross
abnormalities.
More promising, at first sight, are theories that explain diffi-
culty of discovery as the product not of somatosensory deficits, but of
cognitive or psychological problems. On these views, anosognosia
might be the product of confusion, of (another) delusion, or of neglect
itself. In fact, however, these explanations do not suffice. It is true
that some patients are highly delusional (Venneri and Shanks 2004)