Unbelieving the unbelievable: Some people won’t reject proven false information



Thought piece

That human beings are, in fact, more gullible than they are suspicious should probably “be counted among the first and most common notions that are innate in us.”

The results suggest that both true and false information are initially represented as true and that people are not easily able to alter this method of representation. Results are discussed in terms of contemporary research on attribution, lie detection, hypothesis testing, and attitude change.

Baruch Spinoza was a Dutch philosopher of Sephardi Portuguese origin. The breadth and importance of Spinoza’s work was not fully realized until many years after his death. http://bit.ly/1ENbqI4

Extracts: Journal of Personality and Social Psychology (American Psychological Association), October 1990, Vol. 59, No. 4, 601-613. http://bit.ly/1EN93VH

Libraries of the mind

Virtually all current and classical theories of mental representation presume that once the truth value of a proposition is assessed, the mental representation of that proposition is somehow altered or “tagged” to indicate its truth value; otherwise, people would have to reassess the validity of their knowledge each time they deployed it. Spinoza and Descartes seem to have agreed with this assumption, but disagreed about the precise nature of the tagging system itself.

A familiar metaphor may serve to illustrate the key elements of their division. Imagine a library of a few million volumes, of which only a small number are fiction. There are (at least) two reasonable methods by which one could tag the spines of books so that fiction could be distinguished from nonfiction at a glance. A first method would be to paste a red tag on each volume of fiction and a blue tag on each volume of nonfiction; a second method would be to paste a tag on each volume of fiction and to leave the nonfiction untagged. Although each method would allow a librarian to discriminate easily between the two types of book, the second method has both a unique advantage and a unique disadvantage. The red-blue system requires that every volume in the library be tagged, and thus demands a great deal more time and effort than does the tagged-untagged system (which requires only the tagging of a few volumes). On the other hand, the efficiency of the tagged-untagged system comes at the cost of accuracy: for example, when a new, untagged volume of fiction arrives on the library shelves, it may be mistaken for nonfiction before it is read.

In a sense, Descartes considered the mind to be a library of ideas that used something akin to the red-blue tag system. A new book (new information) appeared in the library (was represented in the mind), was read (assessed), and was then tagged (rerepresented) as either fiction (false) or nonfiction (true). Because new books (unassessed ideas) lacked a tag, they could not be identified as fiction or nonfiction until they had been read. Such unread books were “merely represented” in the library. Spinoza felt that the mind was more like a library that used a tagged-untagged system. Books were first represented and then their contents assessed, but, because of the particular tagging system used, a new book that appeared without a tag looked exactly like (and thus was treated as) a work of nonfiction.

In Spinoza’s library, a book’s spine always announced its contents, though sometimes erroneously. No book could be “merely represented” in the library because the absence of a tag was just as informative about the content of the book as was the presence of a tag. Analogously, ideas whose truth had been ascertained through a rational assessment process were represented in the mind in precisely the same way as were ideas that had simply been comprehended; only ideas that were judged to be false were given a special tag (cf. Wegner, Coulton, & Wenzlaff, 1985; see also Clark & Chase, 1972, 1974; Gough, 1965, 1966). It is difficult to know which of these models best describes the human mind.

Although the Cartesian and Spinozan systems are mechanically distinct, they produce the same conclusions under ideal conditions. For example, if one stood outside a library window and challenged the librarian’s knowledge of famous books (“Can you tell me about Civilization and Its Discontents without reading it?”), the librarian’s response (“That book is nonfiction”) would not enable one to determine whether the library used a Cartesian or Spinozan tagging system. In other words, if the Spinozan and Cartesian procedures (shown in Figure 1) were allowed to run to completion, undisturbed, they would produce identical products, and thus these products would not be informative about the nature of the systems that produced them.

Nonetheless, if one could sneak a new book (e.g., War of the Worlds) onto the library’s shelves and somehow prevent the librarian from assessing its contents and tagging its spine, then the librarian’s response to an inquiry about that book would reveal a great deal about the library’s tagging system. If the library used the red-blue Cartesian system, then the librarian would shout through the window, “I don’t know what sort of book this is. Come back tomorrow after it’s been read and tagged.” If, however, the library used the tagged-untagged Spinozan system, then the librarian would mistakenly yell, “That one is nonfiction too!”
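The two tagging systems, and the effect of interrupting assessment, can be sketched as a toy data structure. This is a minimal illustration only; the class and method names are my own, not the paper's, and the "interruption" is simply an assessment that never runs.

```python
# Toy model of the library metaphor. Class names (CartesianLibrary,
# SpinozanLibrary) are illustrative inventions, not terms from the paper.

class CartesianLibrary:
    """Red-blue system: every assessed book gets an explicit tag."""
    def __init__(self):
        self.tags = {}  # title -> "fiction" or "nonfiction"

    def assess(self, title, is_fiction):
        self.tags[title] = "fiction" if is_fiction else "nonfiction"

    def query(self, title):
        # An unassessed (untagged) book simply cannot be classified.
        return self.tags.get(title, "unknown")


class SpinozanLibrary:
    """Tagged-untagged system: only fiction is tagged; no tag *means* nonfiction."""
    def __init__(self):
        self.fiction_tags = set()

    def assess(self, title, is_fiction):
        # The recoding step -- the one an interruption would prevent.
        if is_fiction:
            self.fiction_tags.add(title)

    def query(self, title):
        # An unassessed book is indistinguishable from nonfiction.
        return "fiction" if title in self.fiction_tags else "nonfiction"


# Sneak an unassessed novel onto both shelves and ask about it:
cartesian, spinozan = CartesianLibrary(), SpinozanLibrary()
print(cartesian.query("War of the Worlds"))  # "unknown": come back tomorrow
print(spinozan.query("War of the Worlds"))   # "nonfiction": mistaken for true

# Once assessment completes undisturbed, the two systems agree:
cartesian.assess("War of the Worlds", is_fiction=True)
spinozan.assess("War of the Worlds", is_fiction=True)
print(cartesian.query("War of the Worlds"))  # "fiction"
print(spinozan.query("War of the Worlds"))   # "fiction"
```

Note that the error is asymmetric by construction: an interrupted Spinozan library can mistake fiction for nonfiction, but never the reverse, which is the pattern the studies below report.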


No one can enter the library of the mind, and thus one may only deduce its holdings from the librarian’s reports. Although one must always be cautious when inferring the nature of mental representation from behavioral responses (Braitenberg, 1984; Lloyd, 1989, pp. 3-11), the results of Study 1 are consistent with the idea that people initially represent false information as true. It is not clear, however, whether people are compelled to do so by the nature of the cognitive mechanisms they use or whether they choose to do so because of the nature of the tasks they attempt.

Societies place a premium on candor, and it seems likely that the majority of information that people encounter, assess, and remember is, in fact, true. (Indeed, the Spinozan tagging system is advantageous mainly to the extent that a library contains more nonfiction than fiction.) Thus, it may be that people generalize from ordinary experience and consciously assume that all ideas are true unless otherwise noted. In other words, the initial belief in the truthfulness of information may be a flexible, heuristic assumption. If people choose, but are not compelled, to represent as true the propositions offered them, then this heuristic assumption should be modifiable.

For example, if people find themselves in situations in which they expect to receive false information (e.g., a propagandist speech), then they should be able to alter the default value assigned to incoming information. Individuals in such a position should adopt a “skeptic’s set”; that is, they should choose initially to represent ideas as false, and then to recode a select few of them as true. Interruption should cause such skeptics to mistake true propositions for false ones, but not vice versa, precisely the opposite of the effect seen in Study 1.

In the language of our metaphor, libraries whose holdings are primarily fiction may wish to modify the typical Spinozan procedure: They may wish to paste colored tags on their nonfiction and leave their fiction untagged. Workers in such a library should mistake untagged books for works of fiction.
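This "skeptic's set" amounts to flipping which category goes untagged. A minimal sketch, again with invented names, shows that the reversed default produces the reversed error:

```python
# Hypothetical "skeptic's set" library: nonfiction is tagged, fiction is the
# untagged default. Class name is illustrative, not from the paper.

class SkepticLibrary:
    """Only nonfiction (true items) is tagged; no tag *means* fiction."""
    def __init__(self):
        self.nonfiction_tags = set()

    def assess(self, title, is_fiction):
        # Recoding step: only true items get explicitly tagged.
        if not is_fiction:
            self.nonfiction_tags.add(title)

    def query(self, title):
        # An unassessed book is now indistinguishable from fiction.
        return "nonfiction" if title in self.nonfiction_tags else "fiction"


lib = SkepticLibrary()
lib.assess("Civilization and Its Discontents", is_fiction=False)
print(lib.query("Civilization and Its Discontents"))  # "nonfiction"

# An unassessed (interrupted) work of nonfiction is now mistaken for fiction,
# the mirror image of the Spinozan error:
print(lib.query("Principia"))  # "fiction"
```

Whether minds can actually perform this reversal is exactly what the strong Spinozan hypothesis, discussed next, denies.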

The Spinozan position denies the possibility of a “skeptical reversal” of this sort. In its strongest form, the hypothesis states that people must initially represent ideas as true and only later rerepresent some of them as false, and they must conduct business this way regardless of the temporary circumstances in which they find themselves. If people can take changing contexts into account and consciously alter the default value assigned to incoming ideas, then this piece of the Spinozan position is plainly wrong.

Experiment summaries

Spinoza suggested that all information is accepted during comprehension and that false information is then unaccepted. Subjects were presented with true and false linguistic propositions and, on some trials, their processing of that information was interrupted.

As Spinoza’s model predicted, interruption increased the likelihood that subjects would consider false propositions true but not vice versa (Study 1).

This was so even when the proposition was iconic and when its veracity was revealed before its comprehension (Study 2).

In fact, merely comprehending a false proposition increased the likelihood that subjects would later consider it true (Study 3).


General conclusions following a range of experiments

One of the dilemmas of mental life is that people need to know of things that are untrue, and yet need to know that these things are untrue. In the course of a single day, everyone is exposed to a variety of deceptive communications, ill-conceived opinions, and erroneous facts, many of which they must comprehend, remember, and yet somehow manage not to believe.

To forget that the moon is made of green cheese is to lose a precious piece of one’s childhood, but to act as though one believes this assertion is to forego the prospect of meaningful adult relationships. A ubiquitous paradox for natural thinking systems is that they must possess, but must not deploy, a wide range of false information. In theory, this would seem a rather simple task: Mental systems could keep false information from guiding their decisions simply by assessing each piece of information they encounter, and then coding that information as true or as false in the first place.

Such was Descartes’s view of the human mind. Spinozan systems, however, do not conduct business in this way. Rather, they easily accept all information before it is assessed, and then laboriously recode the information that is subsequently found to be false.

On occasion, of course, such attempts to recode false information will fail, and when this happens, a Spinozan system will find itself believing what it should not. This method of initially representing ideas as true may be economical and it may be adaptive, but any system that uses it will err on the side of belief more often than doubt.





