Belief Perseverance:
An Understated Variable in Human Affairs?
Keywords: Belief Perseverance, Conceptual Conservatism, Resistance to Conceptual Change, Open-Mindedness, Human Irrationality, Nazi Holocaust, Moliere's Tartuffe, History of Ideas
Source: Nissani, Moti. Conceptual Conservatism: An Understated Variable in Human Affairs? Social Science Journal, vol. 31, pp. 307-318 (1994)
"When faced with choice between changing one's mind and proving there is no need to do so, almost everyone gets busy on the proof."-- John Kenneth Galbraith
To properly perceive reality, human beings are occasionally called upon to do much more than acquire new information. They must open-mindedly weigh information and, if need be, discard old beliefs and adopt new ones. In social settings, intellectual growth presupposes, in part, a series of conceptual shifts. Unfortunately, experimental psychology conclusively shows that human beings are disinclined to let go of strongly held beliefs.
For our purposes here, belief perseverance can be defined as the cognitive tendency of human beings to cling to existing beliefs even after these beliefs have suffered decisive refutations. By and large, people are reluctant to let go of strongly-held beliefs; they tend to ignore, deny, misperceive, or explain away contradictory evidence, and consequently they retain beliefs long after these beliefs have been decisively discredited. Such behavior is often attributed to greed, selfishness, emotional instability, and a variety of other factors, but here I shall emphasize the purely cognitive aspect of the failure to assimilate new ideas.
A wealth of experimental evidence shows that "beliefs are remarkably resilient in the face of empirical challenges that seem logically devastating."1 Thus, intuitive misconceptions of science students are surprisingly resistant to change.2 The conceptual difficulties in gaining expertise in such diverse fields as mechanics, political science, and chess have often been documented.3 Evidence of belief perseverance is likewise encountered in numerous problem-solving contexts. When asked to reappraise probability estimates in light of new information, subjects display a marked tendency to give insufficient weight to the new evidence (see the worked example following this paragraph).4 In the four-card problem, individuals often choose logically incorrect strategies of confirming hypotheses instead of strategies which would permit their rejection.5 In some problem-solving situations, subjects continue to deploy the same general rule they have used in previous tasks, even when an easier and more efficient way of approaching the new task is available.6 Similarly, subjects who perceive boxes as containers are less likely to arrive at a solution which calls for the use of such boxes as platforms.7
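To make the notion of "insufficient weight" concrete, the sketch below works through a probability-revision task of the bookbag-and-poker-chips type that Edwards studied. The specific proportions and sample are illustrative assumptions, not Edwards' own stimuli, and the remark about typical subjects is only a rough characterization of the conservatism he reported.

```python
# Illustrative sketch (assumed numbers): two bags of poker chips, Bag A with 70%
# red chips and Bag B with 30% red. One bag is chosen at random (prior 0.5 each),
# and 12 chips drawn with replacement come up 8 red and 4 blue.

p_red_a, p_red_b = 0.7, 0.3
reds, blues = 8, 4

# Likelihood of the observed sample under each hypothesis; the binomial
# coefficient is the same for both bags, so it cancels in the ratio.
like_a = p_red_a ** reds * (1 - p_red_a) ** blues
like_b = p_red_b ** reds * (1 - p_red_b) ** blues

# With equal priors, Bayes' theorem reduces to normalizing the likelihoods.
posterior_a = like_a / (like_a + like_b)

print(f"Normative posterior probability that the bag is A: {posterior_a:.2f}")  # ~0.97
# Subjects in such tasks typically revise their estimates far less than this,
# stopping much closer to the 0.5 starting point -- that is, they give
# insufficient weight to the new evidence.
```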
Karmiloff-Smith and Inhelder8 observed the actions of young children in a block-balancing task. Of the seven types of blocks used, two types could be balanced at their geometric center, one could not be balanced at all, and four could only be balanced off center. For any block which could only be balanced off center, the subjects thus faced a choice between three competing hypotheses: this block could be balanced at its geometric center, it could not be balanced, or it could be balanced at a point other than its geometric center. Under these conditions, observations of the children's actions suggested that they often began by adopting the first hypothesis, then the second, and that they moved to the third only gradually and reluctantly.
Belief perseverance has also been observed in members of a religious cult after the failure of a prophecy which was, until then, central to their belief system.9 More recently, Ross and colleagues10 employed the debriefing paradigm to study the perseverance of social theories. In one typical experiment, subjects were provided with case histories which led them to believe in either a positive or a negative correlation between a firefighter's stated preference for taking risks and his/her occupational performance. Significant levels of theory perseverance were recorded even after subjects were apprised of the fictitious nature of the initial case studies from which their theory was derived.
Recent experiments bring into particularly sharp relief people's reluctance to let go of strongly-held beliefs, even after these beliefs suffered decisive refutations.11 In these experiments, all subjects held a Ph.D. in a natural science. All were employed by two major research universities as postdoctoral fellows and professors. These subjects possessed, therefore, exceptional measurement skills and an above-average inclination to treasure their exceptions. They were led to believe that they were engaged in a professional evaluation of a self-discovery instructional manual. Before they could provide the needed evaluations, they were told, they needed to go through the manual in the same manner that prospective students later would.
At a certain critical point in this manual, these scientists were given a plausible but false formula which led them to believe that spheres are 50% more voluminous than they really are. Immediately after, they generated experimental evidence which dramatically discredited this formula. While the formula was leading them, for instance, to expect that a given sphere would contain three quarts of water, direct measurements yielded only two. They were then asked whether--in light of the experimental evidence they had just generated--the formula they were given was valid. To the experimenters' astonishment, not one scientist flatly rejected the formula. They were also given a paper-and-pencil question about the volume of a sphere of the same dimensions as the sphere for which they had obtained the discrediting evidence. All subjects chose the discredited value.
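To see the size of the discrepancy these subjects were asked to explain away, recall that the true volume of a sphere is V = (4/3)πr³. Since the text above does not reproduce the spurious formula itself, the version below is only a hypothetical stand-in with the stated property of overstating volume by 50%:

\[
V_{\text{false}} = \tfrac{3}{2}\cdot\tfrac{4}{3}\pi r^{3} = 2\pi r^{3},
\qquad
\frac{V_{\text{false}}}{V_{\text{true}}} = \frac{2\pi r^{3}}{\tfrac{4}{3}\pi r^{3}} = \frac{3}{2},
\]

which is exactly the ratio of the predicted three quarts to the measured two.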
In view of these unexpected results, an additional segment was added to the instructional manual; this time with the goal of making the discrepancy between the false formula and observations so absurd as to enable all scientists to break away from the formula. Even though this segment tried in every possible way (short of telling these scientists that they had been deceived) to help them break away from the spurious belief, over 90% still failed. Surveys suggest that these results are counterintuitive--neither natural scientists nor psychologists were able to predict these subjects' striking reluctance to let go of an established belief.
In addition to confirming the near universality of belief perseverance, these findings suggest that human irrationality is often attributable to the psychological difficulty of replacing one belief with another:
The...outcome -- all subjects clung in practice to an observationally absurd formula and none rejected it outright even on the verbal level--is surprising. Even when we deal with ideologically neutral conceptions of reality, when these conceptions have been recently acquired, when they came to us from unfamiliar sources, when they were assimilated for spurious reasons, when their abandonment entails little tangible risks or costs, and when they are sharply contradicted by subsequent events, we are, at least for a time, disinclined to doubt such conceptions on the verbal level and unlikely to let go of them in practice.12
It is tempting to place these and similar counterintuitive observations within one or another competing conceptual framework. Historians13 have long noted the decisive role of belief perseverance in science. The process of conceptual change in science, according to this view, resembles Gestalt perceptual shifts (e.g., the difficulty encountered in seeing the hag as a young lady), adjustments to wearing inverted goggles,14 or the identification of anomalous objects15 (e.g., a red six of spades). The most difficult mental act, according to one science historian, is to rearrange a familiar bundle of data, to look at it differently, and to escape from the prevailing doctrine.16 According to this view, then, the difficulty of switching from one belief to another--in science, experimental psychology, and elsewhere--is traceable to the difficulty of rearranging one's perceptual or cognitive field.
The theory of cognitive dissonance provides another framework for understanding belief perseverance.17 Thus, the discrepancy encountered by the subject-scientists described earlier can be presumed to have caused an acute cognitive incompatibility between, on the one hand, their view of themselves as competent observers and, on the other hand, their inability to reconcile the formula they had been given with their own measurements of a sphere's volume. These incompatible cognitions might have, in turn, aroused dissonance. To eliminate this unpleasant state of tension and its perceived unpleasant consequences,18 dissonance theorists might argue, the subjects explained away their observations, preferring instead to cling to the discredited formula.
Another interesting effort to understand the process of belief revision is provided by coherence theory.19 This theory assumes that conservatism is the key feature in the process of belief revision: we only change our views when we have good reasons to do so and when such a change increases the overall coherence of our beliefs. Thus, in revising our beliefs we seek to maximize coherence and minimize change. According to coherence theory, a belief acquires justification simply by being believed. The behavior of subjects in the debriefing paradigm20 is, according to this view, perfectly sensible and normatively correct, since subjects here follow the reasonable principle of conservatism: One is justified in continuing to accept something in the absence of a special reason not to do so.
Based on his experiences with bereaved people, another theorist21 argues that whenever people abandon a strongly-held belief, they must work through a process similar to the working out of grief. According to this view, in many seemingly diverse situations, change requires overcoming an impulse to restore the past. "The impulse to defend the predictability of life is a fundamental and universal principle of human psychology." Human beings possess "a deep-rooted and insistent need for continuity." Once we understand the anxieties associated with the loss of cherished ideas, the tenacity with which people cling to old beliefs becomes understandable. We often argue, for instance, about the need for social change, and we tend to explain conservatism as ignorance, cowardice, or protection of privilege. This is true in some cases, but often our resistance to change is traceable merely to a universal conservative impulse which is more pervasive and profound than simple prejudice or class interest. This conservative impulse is perhaps most clearly visible in clinical psychology, which underscores the difficulty of restructuring mistaken, dysfunctional, and painful viewpoints and habits.
Others might wish to place belief perseverance in the extensively researched context of conformity or obedience to authority.22 Still other students of the history of ideas insist that it is the greed and selfishness of established power figures, and not the cognitive difficulty of switching from one belief to another, that explain the unfavorable reception of new ideas.23
Given this abundance of theoretical viewpoints, it may be best to focus on the empirical, and widespread, tendency of human beings to cling to discredited beliefs, deferring the placement of this tendency in a theoretical context to a later date. So, leaving theories aside, in the remainder of this paper I shall draw on a few illustrations from diverse fields to show that belief perseverance throws some unexpected light on history, culture, and society.
Before embarking, I should like to underscore one point. This paper is not aimed at showing that the episodes below are traceable to belief perseverance alone. My goal is far more modest. I shall only try to show, first, that among the constellation of causes which probably contributed to the observed behavior in each case, belief perseverance has a respectable place; and second, that the handful of theoreticians who insist on the key importance of belief perseverance in human affairs may very well be correct.
THE NAZI HOLOCAUST
The Holocaust literature is vast. Here I shall briefly try to reinterpret a key component in the account given by Zygmunt Bauman,24 one of the Holocaust's most insightful students. Prof. Bauman rightly insists, in my view, on the essential similarity of the Nazi Holocaust to such horrors as the Stalinist genocide and the atomic destruction of Nagasaki. I think he may be right in saying that, consequently, modern society is not a moralizing force. I find another key assertion less probable, namely, that the Holocaust is traceable to the tendency of modern culture and organizations to neutralize the disruptive "impact of moral behavior."25 Let me illustrate my reservations by means of one concrete example.
Nazi propagandists were aware of the tension between the stereotype of the Jew which they wished to foster in their subjects, and the image of the "Jew next door" which many of their subjects had. "There seemed to be amazingly little correlation," Bauman remarks,26 "between personal and abstract images; as if it was not in the human habit to experience the logical contradiction between the two as a cognitive dissonance, or--more generally--as a psychological problem; as if in spite of the apparently identical referent of personal and abstract images, they were not generally considered as notions belonging to the same class, as representations to be compared, checked against each other, and ultimately reconciled or rejected." As late as 1943, Bauman goes on to remark, "Himmler complained in front of his henchmen that even devoted party members, who had shown no particular compunction concerning the annihilation of the Jewish race as a whole, had their own, private special Jews whom they wished to exempt and protect." Party members agree, Himmler said, "that the Jewish people is to be exterminated...and each one has his own decent Jew. Of course the others are swine, but this one is a first-class Jew."
To this seemingly strange conduct, Bauman offers the following interpretation:27
It seems that what keeps personal images and abstract stereotypes apart and wards off the clash that every logician would deem inevitable is the moral saturation of the first and the morally-neutral, purely intellectual, character of the second. That proximity-cum-responsibility context within which personal images are formed surrounds them with a thick moral wall virtually impenetrable to "merely abstract" arguments. Persuasive or insidious the intellectual stereotype may be, yet its zone of application stops abruptly where the sphere of personal intercourse begins. "The other" as an abstract category simply does not communicate with "the other" I know. The second belongs within the realm of morality, while the first is cast firmly outside.
The second resides in the semantic universe of good and evil, which stubbornly refuses to be subordinated to the discourse of efficiency and rational choice.
This ingenious interpretation may very well be correct. But readers of these lines may already have guessed my point in recounting Prof. Bauman's hypothesis: Bauman's causal sequence can be either replaced or supplemented by a far simpler cognitive interpretation. For years, party members had been subjected to an intensive, highly skilled propaganda campaign.28 In the words of Robert Wistrich: "The willingness of Germans to participate in the mass murder of Jews had been conditioned by a totalitarian propaganda apparatus which for years had portrayed the Jews as a vermin to be mercilessly eradicated."29 Party members therefore believed in the worthlessness of Jews. Their living, breathing, Jewish acquaintances presented them with strong evidence against that belief. Logic indeed demanded that they discard their stereotypic view. Belief perseverance, however, suggests otherwise. Just as our scientist-subjects were unable to discard the formula of the sphere in the face of overwhelming evidence against it, just as Ross and Anderson's30 subjects were unable to discard false beliefs after the evidential basis in their favor had been discredited, so were the majority of party members unable to discard their false image of Jews in the face of contradictory evidence. Once party members were led to believe in an evil and ludicrous ideology, the fates of their nation and of its victims were sealed, for, as we have seen, it takes much more than decisive refutations to alter beliefs.
"To do evil," says one Gulag veteran, "a human being must first of all believe that what he is doing is good, or else that it's a well-considered act in conformity with natural law.... The imagination and spiritual strength of Shakespeare's evildoers stopped short at a dozen corpses because they had no ideology. Ideology...gives the evildoer the necessary steadfastness and determination."31 If I am right, the Third Reich only needed to inculcate its subjects with an ideology and safeguard them from too many decisive refutations. A few refutations were of small account, since most individuals are perfectly capable of brushing such refutations aside.
MOLIERE'S TARTUFFE
In this classic play, a confidence man who calls himself Tartuffe endears himself to Orgon, a middle-aged, wealthy Parisian. In short order, Tartuffe becomes the key figure in Orgon's life. Despite Tartuffe's poverty, patent hypocrisy, and dubious past, he becomes Orgon's trusted friend and idol. Although Orgon's grown children, maid, brother-in-law, and young second wife see through Tartuffe's chicanery, Orgon ignores their advice, preferring instead to listen to Tartuffe's phony diatribes and to the sympathetic views of the "holy" man offered by Madame Pernelle, Orgon's mother. Even when presented with evidence that Tartuffe tried to seduce his wife, Orgon not only disbelieves his wife and son but disowns his son for relating the incident. Orgon ascribes his family's conduct to jealousy, unkindness, and ingratitude, and makes up his mind to break his daughter's former engagement and marry her to Tartuffe, very much against her will. As if all this were not enough, he hands Tartuffe a legal deed to all his wealth. Only when Orgon sees Tartuffe actually trying to seduce his wife, and hears Tartuffe condescendingly describe him as a gullible fool, is he cured of his malady. Orgon's mother, Madame Pernelle, who did not witness this scene of attempted seduction, hangs on to her belief in Tartuffe even after her son tells her what he has just seen and heard. She only wakes up when Tartuffe attempts to dispossess her son of his home and belongings.
What can we make of this charming plot? To begin with, everyday experience tells us that something like that can actually take place: that it is not the mere brainchild of an imaginative playwright. The experiments cited earlier reinforce this impression--as we have seen, human beings are not very good at letting go of discredited beliefs.
It is far more difficult to interpret Orgon's and Madame Pernelle's irrational conduct than to resolve the question of this play's verisimilitude to real life. Of the many attempts to throw light on Orgon's conduct, I shall briefly present here the one given by Moliere's ablest translator, Prof. Richard Wilbur:32
Tartuffe's attitude toward Orgon is perfectly simple: he regards his benefactor as a dupe, and proposes to swindle him as badly as he can. Orgon's attitude toward Tartuffe is more complex and far less conscious. It consists, in part, of an unnatural fondness or "crush"... It also involves, in the strict sense of the word, idolatry: Orgon's febrile religious emotions are all related to Tartuffe and appear to terminate in him. Finally, and least consciously, Orgon cherishes Tartuffe because, with the sanction of the latter's austere precepts, he can tyrannize over his family and punish them for possessing what he feels himself to be losing: youth, gaiety, strong natural desires.... Orgon is thus both Tartuffe's victim and his unconscious exploiter; once we apprehend this, we can better understand Orgon's stubborn refusal to see Tartuffe for the fraud that he is....he must see Tartuffe paw at his wife, and hear Tartuffe speak contemptuously of him, before he is willing to part with the sponsor of his spiteful piety.
Orgon's conduct might indeed be traceable to these three motives. But it also seems reasonable to suppose that Orgon suffers from a rather advanced case of that near-universal human malady, belief perseverance. The parallels between Orgon's conduct, on the one hand, and that of the subject-scientists described earlier, on the other, suggest belief perseverance as an equally probable determinant of his conduct.
A TRUE TALE OF A SMALL AMERICAN TOWN
Belief perseverance throws much light on some ordinary experiences. Most people resist reasonable evidence showing that someone they believed to be, say, an unprincipled bait-and-switch salesperson is in fact honest. Here I shall recount only one true tale from small-town America. The setting is a town in Wyoming. The main character is the town's physician, John Huntington Story. Ever since his arrival in town in 1958, Dr. Story seemed a model of respectability. Besides his educational achievements, he was a devoted Baptist, a dedicated husband, and a seemingly good father. Yet, for over twenty years this "good" man reportedly raped the town's women and girls. Over the years, more than 100 patients, ranging in age from 10 to 68, were raped in Story's clinic. In 1985, Dr. Story was sentenced to fifteen years in prison. Yet, as late as 1989, many townspeople still maintained that Dr. Story had been maligned and unjustly convicted.33
To students of belief perseverance, these townspeople's conduct is anything but mysterious. No doubt fear, obedience to authority, conformity, the desire to avoid scandal, puritanism, and ignorance about sexual matters played a part, but one should not ignore the simplest and most straightforward contributory factor: the cognitive difficulty of switching from one belief to another.
CULTURAL INNOVATIONS
The plight of the cultural innovator has often been explained in terms of belief perseverance. Imaginative painters, writers, or composers often go unrecognized in their lifetime, again showing that people are not very good at shifting from one way of viewing reality, truth, or beauty to another. Here I should like to briefly illustrate this well-known and sad fate of cultural innovators through the history of mathematics and natural science. Max Planck, for instance, said: "The new scientific truth does not triumph by convincing its opponents and making them see the light, but rather because its opponents eventually die, and a new generation grows up that is familiar with it."34 Indeed, with a few possible exceptions (especially in the case of people who are already well-known by the time they make their landmark discovery), the history of science and mathematics seems to bear out this pessimistic view. According to Bertrand Russell,35 the mathematician Frege "published his first work in 1879...but, in spite of the epoch-making nature of his discoveries, he remained wholly without recognition until...1903." Similarly, John Speke, who had "actually got to the source of the White Nile in Uganda...had to wait twenty years or more before his discovery was acknowledged."36 Samuel Heinrich Schwabe observed the sun every day for 17 years, from 1825 to 1843. "By 1843, he was able to announce that sunspots did not appear utterly at random; there was a cycle.... Schwabe's announcement was ignored (he was a pharmacist after all) until the well-known scientist Alexander von Humboldt mentioned the cycle in 1852 in his book." Or take another astronomical example: Edward Walter Maunder published his discovery of gaps in sunspot cycles in 1894 and 1922, but his work was only rediscovered and appreciated in the 1970s.37
The next example is taken from the history of medicine. Childbed fever once claimed countless lives in maternity wards. After many false starts, Ignaz Semmelweis discovered a simple preventive measure. "If you do not wish to kill your patients," he told his fellow gynecologists, "you must disinfect your hands before handling a patient. You cannot, in particular, dissect a cadaver or examine a sick patient and then proceed to deliver a baby with soiled hands." Now, one could scarcely imagine a more conclusive proof than the one proffered by Semmelweis, a greater urgency, a smaller sacrifice or inconvenience, or a better-educated audience than the one to which his pleas were directed. Yet Semmelweis and his petitions were ignored for years and years, young women kept dying in childbirth, and Semmelweis himself ended his life in an asylum.38
The most economical and straightforward explanation of this curious feature of science again seems to be afforded by belief perseverance. Indeed, this feature of science lends further support to the contention that belief perseverance plays an important role in everyday life, culture, and history. After all, natural scientists are trained to be objective and flexible, to anchor their beliefs in external reality, to detach their egos from their theories, and to think it possible that they are mistaken. If the history of science consists of endless tales of the innovative individual's struggle against her own, and then against her colleagues', belief perseverance,39 then it goes without saying that similar forces are likely to be found elsewhere.
CONCLUSION
Efforts at psychological reductionism go at least as far back as Freud, and they continue apace.40 They have never succeeded in explaining social realities in terms of a single cause, but they have occasionally managed to throw some new light on old problems. This paper suggests that one critically important but often overlooked psychological attribute--the tendency of human beings to unreasonably cling to discredited beliefs--may provide a partial explanation for the conduct of both individuals and collectives.
Over many generations, a few social scientists, natural scientists, humanists, businesspeople, statesmen, and artists have noticed the decisive importance of belief perseverance in human affairs. For instance, Francis Bacon says: "The human understanding when it has once adopted an opinion draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects."41 A twentieth-century philosopher holds that this inertia "deserves to rank among the fundamental 'laws' of nature."42 For some strange reason, however, social scientists have so far failed to sufficiently incorporate this fundamental human failing into their account of the human condition. Given this failing's ubiquity and decisive importance, its nature, causes, and cures ought to be explored from a variety of disciplinary angles. To echo philosopher Gilbert Harman, my chief aim in this paper has been to show "that there is a subject here, change in view, a subject worthy of serious systematic study."43
Acknowledgments: I thank the anonymous reviewers for their thoughtful comments.
NOTES
1. Lee Ross and Craig A. Anderson, "Shortcomings in the Attribution Process: On the Origins and Maintenance of Erroneous Social Assessments," in Judgment Under Uncertainty: Heuristics and Biases, edited by D. Kahneman et al. (Cambridge: Cambridge University Press, 1982), p. 144.
2. R. Driver and J. Easley, "Pupils and Paradigms: A Review of Literature Related to Concept Development in Adolescent Science Students," Studies in Science Education, 5 (1978): 61-84. See also M. Nissani, "A Hands-on Instructional Approach to the Conceptual Shift Aspect of Scientific Discovery," Journal of College Science Teaching, 19 (1989): 105-107.
3. Reviewed in M. T. H. Chi, R. Glaser, and M. J. Farr, eds. The Nature of Expertise (Hillsdale, NJ: Lawrence Erlbaum, 1988).
4. W. Edwards, "Conservatism in Human Information Processing," in: Formal Representation of Human Judgment, edited by B. Kleinmuntz (New York: Wiley, 1968), pp. 17-52.
5. P. C. Wason, "Response to Affirmative and Negative Binary Statements." British Journal of Psychology, 12 (1961): 129-140.
6. A. S. Luchins, Rigidity of Behavior (Eugene: University of Oregon Press, 1959).
7. K. Duncker, "On Problem Solving." Psychological Monographs, 58 (1945): 1-113.
8. A. Karmiloff-Smith and B. Inhelder, "If You Want to Get Ahead, Get a Theory," Cognition, 3 (1975): 195-212.
9. Leon Festinger et al., When Prophecy Fails (Minneapolis: University of Minnesota Press, 1956).
10. Reviewed in: C. A. Anderson, "Abstract and Concrete Data in the Conservatism of Social Theories: When Weak Data Lead to Unshakeable Beliefs," Journal of Experimental Social Psychology, 19 (1983): 93-108. See also A. J. Sanford, Cognition and Cognitive Psychology (London: Weidenfeld and Nicolson, 1985).
11. M. Nissani and D. M. Hoefler-Nissani, "Experimental Studies of Belief-Dependence of Observations and of Resistance to Conceptual Change," Cognition and Instruction, 9 (1992): 97-111.
12. M. Nissani, "An Experimental Paradigm for the Study of Belief Perseverance and Change," Psychological Reports, 65 (1989): 19-24.
13. For instance, J. B. Conant, Science and Common Sense (New Haven: Yale University Press, 1951); T. S. Kuhn, The Structure of Scientific Revolutions, revised edition (Chicago: University of Chicago Press, 1970).
14. G. M. Stratton, "Vision Without Inversion of the Retinal Image," Psychological Review, 4 (1897): 341-360; 463-481.
15. J. S. Bruner and L. Postman, "On the Perception of Incongruity: a Paradigm," Journal of Personality, 18 (1949): 206-223.
16. H. Butterfield, cited in W. I. B. Beveridge, The Art of Scientific Investigation (New York: Norton, 1950), p. 102.
17. Leon Festinger, A Theory of Cognitive Dissonance (Stanford: Stanford University Press, 1957); Leon Festinger and J. Merrill Carlsmith, "Cognitive Consequences of Forced Compliance," Journal of Abnormal and Social Psychology, 58 (1959): 203-210. For a version of this theory which appears to be most applicable in the present context, see C. M. Steele and T. J. Liu, "Dissonance Processes as Self-Affirmation," Journal of Personality and Social Psychology, 45 (1983): 5-19.
18. Joel Cooper and Russell H. Fazio, "A New Look at Dissonance Theory." Advances in Experimental Social Psychology, 17 (1984): 229-266.
19. Gilbert Harman, Change in View (Cambridge, Mass: MIT Press, 1986).
20. Ross and Anderson, "Shortcomings."
21. Peter Marris, Loss and Change, revised edition (London: Routledge, 1986). Quotes are taken from p. 2.
22. Solomon E. Asch, "Studies of Independence and Conformity: I. A Minority of One Against a Unanimous Majority," Psychological Monographs, 70, no. 9 (1956). Stanley Milgram, Obedience to Authority (New York: Harper & Row, 1974). See also M. Nissani, "A Cognitive Reinterpretation of Stanley Milgram's Observations on Obedience to Authority," American Psychologist, 45 (1990): 1384-1385.
23. Francis Bacon cited in W. I. B. Beveridge, The Art, p. 107. See also Raymond A. Lyttleton, "The Gold Effect," in Lying Truths, compiled by Ronald Duncan and Miranda Weston-Smith (Oxford: Pergamon, 1979), pp. 181-198.
24. Zygmunt Bauman, Modernity and the Holocaust (Oxford: Polity Press, 1989). Zygmunt Bauman, "The Social Manipulation of Morality: Moralizing Actors, Adiaphorizing Action," Theory, Culture & Society, 8 (1991): 137-151.
25. Bauman, "The Social Manipulation," p. 142.
26. Bauman, "Modernity," p. 187.
27. Bauman, "Modernity," pp. 187-188.
28. Alan Bullock, Hitler (New York: Harper & Row, 1962).
29. Cited in Ivar Oxaal, "Sociology, History and the Holocaust," Theory, Culture & Society, 8 (1991), p. 164.
30. Ross and Anderson, "Shortcomings."
31. Alexander I. Solzhenitsyn, The Gulag Archipelago, Vol. I (New York: Harper & Row, 1974), pp. 173-174.
32. Moliere, Tartuffe (translated and introduced by Richard Wilbur, New York: Harcourt, 1965), pp. 160-162.
33. Jack Olsen, Doc: The Rape of the Town of Lovell (New York: Atheneum, 1989).
34. Hans Eysenck, Rebel with a Cause (London: W. H. Allen, 1990), p. 67.
35. Bertrand Russell, A History of Western Philosophy (New York: Simon and Schuster, 1945), p. 830.
36. Alan Moorehead, The Blue Nile (New York: Harper & Row, 1962), p. 281.
37. Isaac Asimov, Asimov's New Guide to Science (New York: Basic Books, 1984), pp. 102-103.
38. Paul De Kruif, Men Against Death (New York: Harcourt, 1932). See also M. Nissani, "Psychological, Historical, and Ethical Reflections on the Mendelian Paradox," Perspectives in Biology and Medicine, in press.
39. Kuhn, The Structure.
40. For instance, G. R. Boynton, "Ideas and Action: A Cognitive Model of the Senate Agricultural Committee," Political Behavior, 12 (1990): 181-213; Stanton K. Tefft, "Cognitive Perspectives on Risk Assessment and War Traps: An Alternative to Functional Theory," Journal of Political and Military Sociology, 18 (1990): 57-77.
41. Cited in Richard Nisbett and Lee Ross, Human Inference (Englewood Cliffs, NJ: Prentice-Hall, 1980), p. 167.
42. F. C. S. Schiller, cited in Beveridge, The Art, p. 106.
43. Harman, Change in View, p. 116.