Behavioral and Brain Sciences

Open Peer Commentary

Self-deception: A paradox revisited

Albert Bandura

Department of Psychology, Stanford University, Stanford, CA 94305. bandura@psych.stanford.edu www.stanford.edu/dept/psychology/abandura

Abstract

A major challenge to von Hippel & Trivers's evolutionary analysis of self-deception is the paradox that one cannot deceive oneself into believing something while simultaneously knowing it to be false. The authors use biased information seeking and processing as evidence that individuals unknowingly convince themselves of the truth of their falsehood. Acting in ways that keep one uninformed about unwanted information is self-deception. Acting in selectively biasing and misinforming ways is self-bias.

(Online publication February 03 2011)

Related Articles

    The evolution and psychology of self-deception William von Hippel and Robert Trivers School of Psychology, University of Queensland, St Lucia, QLD 4072, Australia. billvh@psy.uq.edu.au http://www.psy.uq.edu.au/directory/index.html?id=1159; Department of Anthropology, Rutgers University, New Brunswick, NJ 08901. trivers@rci.rutgers.edu http://anthro.rutgers.edu/index.php?option=com_content&task=view&id=102&Itemid=136

    Von Hippel & Trivers (VH&T) present the case that self-deception evolved because it enables individuals to be good deceivers of others. By convincing themselves that their fabrication is true, people can enjoy the benefits of misleading others without the intrapsychic and social costs. The authors review a large body of evidence on biased information processing on the assumption that this is the means by which individuals unknowingly convince themselves of the truth of their falsehood.

    A major challenge to a functional analysis of self-deception is the problematic nature of the phenomenon itself. One cannot deceive oneself into believing something while simultaneously knowing it to be false. Hence, literal self-deception cannot exist (Bok 1980; Champlin 1977; Haight 1980). Attempts to resolve the paradox of how one can be a deceiver fooling oneself have met with little success (Bandura 1986). These efforts usually involve creating split selves and rendering one of them unconscious.

    Awareness is not an all-or-none phenomenon. There are gradations of partial awareness. Hence, self-splitting can produce a conscious self, various partially unconscious selves, and a deeply unconscious self. In this line of theorizing, the unconscious is not inert. It seeks expression in the intrapsychic conflict. The deceiving self has to be aware of what the deceived self believes in order to know how to concoct the self-deception. Different levels of awareness are sometimes proposed as another possible solution to the paradox. It is said that “deep down” people really know what they believe. Reuniting the conscious and unconscious split selves only reinstates the paradox of how one can be the deceiver and the one deceived at the same time.

    VH&T propose another variant of the split-self solution to the paradox. They posit a dissociated mental dualism in which the deceived mind is disconnected from the unconscious mind that knows the truth. Given the richly interconnected neuronal structure of the brain, this Cartesian physical dualism is at odds with what is known physiologically about the brain. The dissociated dualism removes not only the paradox, but also any commerce between the two minds. While the conscious mind is biasing information processing to bolster the self-deception, there is no mention of what the veridical unconscious mind is doing. If it is dissociated, how can it affect the conscious mind?

    A multitude of neuronal systems is involved in the processing of input information. However, when it comes to action, there is only one body. The diverse systems have to generate a coherent action. There is diversity in information processing but unity of agency in action (Bandura 2008; Korsgaard 1989). Contradictory minds cannot simultaneously be doing their own thing behaviorally. The authors do not explain how the disconnected conflicting minds can produce a coherent action.

    There are other epistemological issues, including the verifiability of the central thesis, that need to be addressed. The article presents an excellent review of biased information processing, but it leaves a lot to be desired in conceptual specification. How does one know what the unconscious mind knows? How does one assess the unconscious knowledge? By what criteria does one substantiate truth? Why is the unconscious mind veridical in self-enhancement but self-deceptive in other spheres of functioning? How does one gauge the benefits of self-deception, whether in the short term or the long term? Given the evolutionary billing of the article under discussion, what were the ancestral selection pressures that favored self-deception? Claiming that a given behavior has functional value does not necessarily mean it is genetically programmed.

    People are selective in their information seeking, often misconstrue events, lead themselves astray by their biases and misbeliefs, and act on deficient knowledge. However, biased information seeking and processing are not necessarily self-deception. VH&T cite, as an example of self-deception, conservatives favoring Fox News and liberals favoring MSNBC. By their selective exposure, they reinforce their political bias. But that does not mean that they are lying to themselves. The same is true for some of the other forms of biased information processing that are misconstrued as self-deception.

    In genuine self-deception, people avoid doing things that they have an inkling might reveal what they do not want to know. Suspecting something is not the same as knowing it to be true, however. As long as one does not find out the truth, what one believes is not known to be false. Keeping oneself uninformed about an unwanted truth is the main vehicle of genuine self-deception. By not pursuing courses of action that would reveal the truth, individuals render the knowable unknown (Haight 1980). Acting in ways that keep one uninformed about unwanted information is self-deception. Acting in selectively biasing and misinforming ways is a process of self-bias. These are different phenomena. The truth is not harbored in a dissociated unconscious mind. It exists in the information available in the avoided reality. The disconnected unconscious mind cannot know the truth if the evidential basis for it is avoided.

    VH&T emphasize the benefits of self-deception but ignore the social costs to both the self-deceiver and the deceived. Being misled is costly to others. Therefore, the deceived do not take kindly to being led astray. Human relationships are long-term affairs. There are limits to how often one can mislead others. After a while, the victims quit caring about whether the deception was intentional or carried out unknowingly. If done repeatedly, the short-term gains of misleading others come with the cost of discrediting one's trustworthiness. Undermining one's ability to exercise social influence does not have adaptive value.

    Human behavior is regulated by self-sanctions, not just by social ones. Unless self-deceivers are devoid of moral standards, they have to live with themselves as well as with the social reactions of others to deceptive conduct. The maintenance of positive self-regard while behaving harmfully is a strong motivator for self-exoneration. Self-deception serves as a means of disengaging moral self-sanctions from detrimental conduct. The social cognitive theory of moral agency specifies a variety of psychosocial mechanisms by which individuals convince themselves that they are doing good while inflicting harm on others (Bandura 1999). By these means, otherwise considerate people can behave inhumanely without forfeiting their sense of self-worth.

    References