Keywords:  Belief Perseverance, Conceptual Conservatism, Resistance to Conceptual Change, Open-Mindedness, Human Irrationality, Belief-Dependence of Observations

Source: Nissani, M. & Hoefler-Nissani, D. M. Experimental studies of belief-dependence of observations and of resistance to conceptual change. Cognition and Instruction 9: 97-111 (1992).

WHEN THEORY FAILS

(An Experimental Investigation of Belief Perseverance)

(Or: EXPERIMENTAL STUDIES OF BELIEF-DEPENDENCE OF OBSERVATIONS AND OF RESISTANCE TO CONCEPTUAL CHANGE)

ABSTRACT: Natural scientists were invited to evaluate a rediscovery-based written manual for teaching high school science and math. The first session re-familiarized participants with the concepts this experiment presupposed, reinforced the legitimacy of the instructional setup, and fostered tolerance for unconventional mathematical formulas. This session also involved the use of a cylinder for a hands-on confirmation that the two ways of measuring volume of geometrical solids--theoretical (through length measurements and the use of a formula) and experimental (through capacity measurements)--yield similar values. In the second individual session, an artificial clash was created: participants were given an incorrect theoretical formula which led them to believe that spheres are 50% larger than they are. They were then asked to compare expectations created by this formula to their own capacity measurements of two actual 10- and 20-cm spheres. The discrepancies between theoretical and experimental volumes frequently led to doubt, discomfort, adjustment of measurements, and ad hoc explanations. They rarely led to the abandonment of belief in the false formula. Based on these experimental results, several stages in the process of conceptual change are proposed, including discomfort, ad hoc explanations, adjustment of observations and measurements to fit expectations, doubt, vacillation, and--finally--conceptual shift.

A great deal of anecdotal and experimental evidence suggests that, to some limited extent, we see what we believe (Strike & Posner, 1985). Indeed, in some fields researchers often go out of their way to protect their observations and measurements from the biasing influence of preconceived notions. For instance, a standard procedure in both clinical medicine and psychology--that of the double-blind protocol--is designed precisely to bypass the experimenter's and patient's tendencies to make reality fit the Procrustean bed of their beliefs.

The reaction of the scientific community to Dalton's atomic theory prompted philosopher of science Thomas Kuhn (1974, pp. 134-135) to go even farther:

Here and there the very numerical data of chemistry began to shift. When Dalton first searched the chemical literature for data to support his physical theory, he found some records of reactions that fitted, but he can scarcely have avoided finding others that did not. . . . it is hard to make nature fit a paradigm. That is why . . . measurements undertaken without a paradigm so seldom lead to any conclusions at all. Chemists could not, therefore, simply accept Dalton's theory on the evidence, for much of that was still negative. Instead, even after accepting the theory, they had still to beat nature into line, a process which, in the event, took almost another generation. When it was done, even the percentage composition of well-known compounds was different. The data themselves had changed. That is the last of the senses in which we may want to say that after a revolution scientists work in a different world. (p. 634)

By now, Kuhn's basic point is taken for granted by most philosophers of science, cognitive psychologists, and perception theorists. The old notion that knowledge arises directly from experience has given way to a more complex view of the nature of human perception and its correspondence to the outside world. This interactive conception, however, raises a host of questions. For instance, can our tendency to see what we believe be subjected to a quantitative experimental analysis? To what extent is this tendency subject to individual variations? Are natural scientists--individuals who are often specially trained in measurements--less prone to make reality fit their preconceived notions than untrained lay people?

A question of particular theoretical import concerns the degree to which we make reality conform to our theories (see Thagard 1988, p. 151 for a discussion). Following Hanson (1958), some philosophers of science believe that our observations are theory-laden. Thus, if subscribers to two competing theories literally see a different world, then the contention that such theories are incommensurable would seem compelling enough. But one can also imagine an intermediate position between theory-relative observations and naive empiricism: "Observation is inferential, so that any given observation might be influenced by theory, but the inferential processes in observation are not so loose as to allow us to make any observation we want." (Thagard, 1988, p. 151). Experimental evidence presented in this paper throws some interesting light on these questions.

The empirical studies described in this paper also explore a related set of questions. As long ago as 1620, Francis Bacon commented:

The human understanding when it has once adopted an opinion draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects and despises, or else by some distinction sets aside and rejects, in order that by this great and pernicious predetermination the authority of its former conclusion may remain inviolate. (cited in Nisbett & Ross, 1980, p. 167)

The psychological literature lends credence to Bacon's pessimism: "Beliefs tend to sustain themselves even despite the total discrediting of the evidence that produced the beliefs initially" (Nisbett & Ross, 1980, p. 192; see also Festinger, Riecken, & Schachter, 1956; Hardyck & Braden, 1962; Ross & Anderson, 1982; Karmiloff-Smith & Inhelder, 1975; Milgram, 1984; Nissani, 1989). Ordinary human affairs, educational theory and practice (West & Pines, 1985), political history (Kull, 1988), and the history of science (Kuhn, 1974) also suggest that Bacon's claim rested on a shrewd insight into the human condition. In educational research, for instance, it has been repeatedly shown that intuitive misconceptions of science students are surprisingly resistant to change (Driver & Easley, 1978). Besides presenting observations on belief-dependence of observations, this paper presents striking evidence for the human tendency to cling to discredited beliefs.

This paper is primarily concerned with methodological questions and the presentation of empirical data, not with their interpretation. In particular, this paper will not attempt to assign relative weights to the cognitive, social (conformity and obedience), and motivational determinants of belief-dependence of observations and of the tendency to cling to discredited beliefs. In our view, a great deal of experimentation and collection of empirical data will be required before these issues can be meaningfully resolved. A tentative effort to explore the rather complex inter-relationships between resistance to conceptual change and obedience to authority has appeared elsewhere (Nissani, 1990). For similar reasons, this paper will refrain from presenting the data in terms of a particular conceptual framework (e.g., Cooper & Fazio, 1984; Eagly & Chaiken, 1984; Thagard, 1988).

METHOD

Subjects

The key experimental group in this study consisted of 19 natural scientists. Their names and phone numbers were either obtained from the faculty and staff directories of two major research universities or through referrals. Prospective participants were contacted by phone and invited to serve as consultants, for a fee, in the evaluation of a rediscovery-based educational approach. At the initial stages, the only prerequisites for participation were consent and possession of a Ph.D. in a natural science. At a later stage, in order to maintain a reasonable ratio between those who did and those who did not know the volume formula of the sphere (see below), only those who did not happen to remember this formula were asked to take part. (In this and later parts of the experiment, knowledge of the formula was defined as its correct and immediate recall.) Of 87 individuals contacted, 39 declined to take part and 10 were disqualified because they did not hold a Ph.D. degree in a natural science. Of the remaining 38, all 13 who did not know the volume formula of the sphere were invited to take part. In contrast, only the first 6 of the 25 scientists who knew the formula were asked to take part.

The group comprised 16 men and 3 women. Their mean age was 38. On average, they had received the Ph.D. degree 9 years before participating in this study. Their doctorates were in the following fields: biochemistry (3), cell biology (1), chemical engineering (1), chemistry (1), environmental science (2), human genetics (2), limnology (1), organic chemistry (2), pharmacology (4), plant physiology (1), and zoology (1). Nine scientists were affiliated with one major research university, nine with another, and one was employed by the Federal Government. Their academic ranks ranged from postdoctoral fellow to full professor. The native language of 13 was English. Although the other six spoke fluent English, their mother tongues were Bengali, Chinese, French, German, Marathi, and Tamil. On average, each had 9.4 years of research experience and had published 12.3 single- or senior-authored papers in refereed journals, 3 book chapters, and 0.16 books.

Three additional groups took part. The first group consisted of three graduate students and one college senior recruited by phone or through advertisements on university bulletin boards. The 4 mathematically proficient men who constituted this group took part in exactly the same experiment as the 19 scientists who comprised the experimental group. The second group comprised 15 scientists whose credentials were roughly equal to those of the 19 in the experimental group. A third group consisted of 28 undergraduates: 15 taking a physical science class for non-science majors and 13 freshmen enrolled in an honors program.

Procedure

Experimental Design. Participants were recruited to evaluate the efficacy of a self-contained instructional manual. Before they could provide the needed appraisal, they were told, they needed to acquire first-hand experience with the manual's content by studying it and following the instructions it provided. All 19 were therefore asked to go through this sample of our educational material in the same manner as our prospective students would. Each participant attended two separate sessions lasting approximately 1.5 and 2.5 hours. All sessions had a standardized content and format.

An experimenter was present throughout both learning sessions, carefully recording unusual occurrences and verbalizations. The learning sessions themselves took place in a college professor's office (9 scientists) or in the living room of a middle-class home (10 scientists).

During the first session, the instructional manual gave special attention to the derivation of the area formula of the circle. For reasons that would become apparent later, the radius and conventional formula (πr²) were ignored. Instead, the manual reinforced the exclusive use of the diameter and the alternative--and in practical settings more convenient--formula .785D².
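
A one-line check (ours, not the manual's) confirms the equivalence of the two area formulas; substituting r = D/2 into the conventional formula yields the diameter-based constant:

$$\pi r^{2} = \pi\left(\tfrac{D}{2}\right)^{2} = \tfrac{\pi}{4}D^{2} \approx .785D^{2}.$$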

The manual also dealt with two alternative approaches to measuring the volume of geometrical solids: the theoretical method, which determines volume by taking length measurements of a given solid and plugging the results directly into the appropriate mathematical formula, and the experimental method, which relies on capacity measurements--filling the solid with liquid, transferring this liquid to a watertight box, and determining its volume through width, length, and height measurements. Participants were then asked to employ both approaches to determine the volume of an actual cylinder and then to assess the two approaches' comparative efficacy.
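
The two approaches can be summarized in a short computational sketch. The code below is our illustration, not material from the manual: the measurement values are invented, and the definition of percent discrepancy (relative to the experimental volume) is an assumption, since the paper does not spell out the denominator.

```python
def theoretical_volume_cylinder(diameter_cm, height_cm):
    # Theoretical method: length measurements plugged into the manual's
    # diameter-based formula, V = .785 * D**2 * h (equivalent to pi * r**2 * h).
    return 0.785 * diameter_cm ** 2 * height_cm

def experimental_volume(width_cm, length_cm, water_height_cm):
    # Experimental method: the solid's liquid contents are poured into a
    # watertight box and the volume is computed from width, length, and height.
    return width_cm * length_cm * water_height_cm

def percent_discrepancy(v_theoretical, v_experimental):
    # Discrepancy between the two determinations, in percent (taken here
    # relative to the experimental value -- an assumption on our part).
    return abs(v_theoretical - v_experimental) / v_experimental * 100.0

# Invented cylinder measurements, for illustration only:
v_t = theoretical_volume_cylinder(diameter_cm=8.0, height_cm=12.0)           # ~602.9 cm^3
v_e = experimental_volume(width_cm=9.0, length_cm=8.4, water_height_cm=8.0)  # 604.8 cm^3
print(round(percent_discrepancy(v_t, v_e), 1))  # ~0.3, the small kind of gap observed for the cylinder
```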

At the beginning of the second session, all participants were told that, besides the agreed upon consultation fee ($10-$15 an hour), they were still eligible for a supplementary bonus ($30) if they answered correctly all self-contained questions in the remainder of the exercise. A 4-minute-long preliminary survey purportedly assessed recall of instructional material from the first session, but was chiefly concerned with finding out whether participants knew the volume formula of the sphere just before the exercise began. This survey was followed by a 45-minute-long instructional segment which led participants to believe in an incorrect and unconventional volume formula for the sphere.

The conventional formula for the sphere--the only one with which all the natural scientists we contacted were familiar--is (4/3)πr³. Although it can be readily shown that this formula is roughly equivalent to .52D³, this equivalence is not immediately obvious. In this segment of the exercise, participants were led to believe that the correct formula is .785D³. As will be seen later, all participants in this experiment--including the 6 participants who walked into the exercise knowing the conventional theoretical volume formula for the sphere--accepted the validity of this wrong formula and used it to answer preliminary paper-and-pencil questions involving volumes of spheres.
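
The size of the deception can be made explicit with a short check of our own: substituting r = D/2 into the conventional formula yields the .52 constant, and the ratio of the false constant to the true one shows why the false formula inflates the volume of every sphere by roughly 50%:

$$\tfrac{4}{3}\pi r^{3} = \tfrac{4}{3}\pi\left(\tfrac{D}{2}\right)^{3} = \tfrac{\pi}{6}D^{3} \approx .52D^{3}, \qquad \frac{.785}{.52} \approx 1.5.$$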

The manual then drew participants' attention to a small plastic sphere on their desk. It asked them to measure this sphere's internal diameter and to use this value, together with the volume formula of the sphere, to determine the sphere's theoretical volume. They were then asked to determine its experimental volume by filling it with water (through a single small opening), transferring this water to a box, and measuring the box's internal width, length, and height. Participants were then instructed to compare the sphere's theoretical and experimental volumes, to calculate the discrepancy between the two in percent, and to evaluate its significance. This was followed by 8 review questions, of which 4 involved the volume of spheres. One of these 4 questions concerned a sphere with a diameter similar to that of the sphere they had been working with earlier. The space provided for each of the 8 questions included a special "comments" section in which participants could explain their answers or express doubts.

Short of outright telling participants that the formula was false, the next segment of the manual was aimed at inducing them to discover the deception on their own. No time limits were imposed during this segment. It involved the use of a far larger sphere (20-cm diameter), so that in absolute terms the discrepancy was much greater than before. The anecdotal, friendly instructional style was replaced by a rigid, businesslike set of instructions. In the preceding segment, participants were asked to determine the diameter of the sphere, a process which led most of them to underestimate its length, thereby producing a closer fit between the theoretical and experimental volumes; in this segment, they were merely asked to confirm that its diameter was 20 cm. After uncovering the discrepancy, they were asked specifically to compare it to the much smaller discrepancy they had observed for the cylinder (a value which had been inserted earlier by hand in each participant's manual). They were then asked to assess the significance of the discrepancy they observed. This segment concluded with nine paper-and-pencil questions. The instructions preceding these questions made it clear that an answer pointing in the right direction was better than an incorrect answer, and they urged participants, whenever necessary, to use the "Comments" section following each question to explain their answer(s).

Two of the nine questions concerned the volume of spheres. One of these 2 questions asked participants to determine the volume of a sphere with a 20-cm diameter. This experiment concluded with a written retrospective questionnaire and a taped oral debriefing. These retrospections made a minor contribution to the empirical portion of our study (factual data on belief dependence of observation and on conceptual shift) and helped us gain some tentative qualitative insights into the underlying cognitive processes.

However, the questionnaire and the open-ended oral interview were chiefly aimed at debriefing participants. They provided participants with the correct formula of the sphere and assured them that, apart from this necessary deception, all the information provided in the manual was correct. More importantly, they attempted to alleviate any feelings of discomfort and unease by creating a friendly, accepting, and non-threatening atmosphere. In particular, participants were assured that their behavior was typical under the circumstances created by this experiment, even for people with their extensive scientific background.

There is little doubt that participants emerged from this study with their self-esteem as scientists, critical thinkers, and human beings unscathed. In fact, most viewed their experience in positive terms, thereby reinforcing our early impression of the suitability of our educational manual in ordinary instructional settings (and not only in research settings). To show this, we shall briefly describe here the written responses of the 13 scientists who constituted our main experimental group.

Only one participant felt that her participation had no educational value whatsoever to her as a scientist and a human being. The other 12 took a more positive view of the experience they had just gone through. Instead of recounting their comments (some of which were highly favorable), we shall reproduce here only a few questions from a retrospective survey and a summary of the answers given by all 13 scientists:

--As a result of participating in this exercise, will you treat your beliefs about the world--regardless of their source--with a greater degree of caution and skepticism? Yes: 5; Maybe: 3; No: 5.

On a scale of 1 to 10, where 1 = "not at all" or "untrue," and 10 = "a great deal," or "very true," how would you rank the statements:

--"As a result of taking part in this exercise, I have a better understanding of the way my mind works." Mean: 5.8

--"I am glad I took part in this exercise." Mean: 7.6

--"My participation in this exercise has the potential of improving my overall performance as a research scientist." Mean: 4.6.

RESULTS

This section will focus first on the performance of the 13 scientists who did not know the volume formula of the sphere. A later section will focus on the performance of the six scientists who were familiar with this formula.

Discomfort and Doubt

The willingness of all 19 scientists to use the wrong formula to answer preliminary volume questions without expressing reservations or doubts, as well as their subsequent retrospective reflections, suggest that they believed the incorrect volume formula when they were first introduced to it.

Upon first discovering the discrepancy, the 13 scientists who did not know the correct volume formula of the sphere found themselves in an awkward position. Everything they had done so far had led them to believe in the legitimacy, competence, and veracity of mathematical theory and the instructional set-up. For the cylinder, they had observed a reasonably small discrepancy between the theoretical and experimental volumes (Mean=3.2% on the first try; Mean=3.1% on the second try for those who chose to measure it again). For the small sphere, however, they had uncovered a surprisingly large discrepancy (see below for quantitative details). Their first unmistakable response was one of puzzlement, doubt, and discomfort. Among other things, this was apparent from the changed demeanor of some of the participants, from questions they directed at the experimenter (who professed ignorance and politely reminded them of the self-discovery nature of the exercise), and from their own post-experimental reflections.

Belief Dependence of Measurements

While working with the cylinder and the small sphere, all 13 participants who did not know the volume formula of the sphere were specifically encouraged to review the discrepancy between theoretical and experimental volumes and, if they felt this was necessary, to re-measure and re-calculate these volumes. The difference between the mean discrepancies for the two successive measurements of the first sphere, 45.3% and 30.5%, was significant, t(df=2, one-tailed, N=13)=2.7, p<.01. Three considerations suggest that these results are traceable to participants' efforts to create a closer fit between the two volume derivations of the small sphere by adjusting their measurements.

First, for the cylinder, two successive measurements by the same individuals yielded nearly identical means of 3.2% and 3.1%.

Second, 15 scientists with qualifications similar to those of the 13 participants who did not know the formula were asked to measure the capacity and the diameter of the exact same small sphere used in the instructional setting. No reference to the volume formula of the sphere was made, and they were specifically discouraged from using any such formula or from developing any other expectations about the two values. Using the wrong formula to obtain the theoretical volume, one gets for this expectation-free group a 44.4% mean discrepancy between theoretical and experimental volumes. This discrepancy is significantly larger than the 30.5% mean discrepancy obtained with the key experimental group of 13 scientists, t(df=26, one-tailed)=2.6, p<.01.
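
For readers who wish to see how such a between-group comparison can be computed, the sketch below gives a standard pooled-variance two-sample t statistic from summary values. It is our illustration, not the authors' analysis script; the group standard deviations in the example are placeholders, since the paper reports the two means (44.4% and 30.5%) and df=26 but not the standard deviations of these particular distributions.

```python
import math

def pooled_two_sample_t(mean1, sd1, n1, mean2, sd2, n2):
    # Pooled-variance two-sample t statistic and its degrees of freedom.
    df = n1 + n2 - 2
    pooled_var = ((n1 - 1) * sd1 ** 2 + (n2 - 1) * sd2 ** 2) / df
    t = (mean1 - mean2) / math.sqrt(pooled_var * (1.0 / n1 + 1.0 / n2))
    return t, df

# Means as reported in the paper; the standard deviations are placeholders.
t, df = pooled_two_sample_t(mean1=44.4, sd1=12.0, n1=15,
                            mean2=30.5, sd2=14.0, n2=13)
print(round(t, 2), df)  # df = 15 + 13 - 2 = 26, matching the reported test
```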

Third, in a retrospective questionnaire, participants were asked to answer (on a scale in which 1 stood for "not at all" and 10 for "a great deal") the question: "To what extent did you adjust your measurements to fit your expectations?" The mean response for all 13 scientists was 3.9. The written and oral retrospections of some of these scientists were consistent with this self-reported tendency. For instance, one scientist who obtained a 3.5% discrepancy for the cylinder, a 58% discrepancy for the first sphere which he erased and adjusted to 16%, and a 67% discrepancy for the second sphere which he adjusted to 18%, and who ranked the belief dependence of his measurements as 2.5, wrote: "It is very difficult at this stage to remain loyal to yourself and stay with your experimental results. Very good exercise!" A scientist who obtained a 3.5% discrepancy for the cylinder, adjusted a 29% discrepancy to 14% for the first sphere, obtained a 13% discrepancy for the second, and ranked the belief dependence of his measurements as 2 wrote: "While I got a refresher course in spheres etc., what I really came away with is how I try to force the real world into the theoretical, and am uncomfortable when it doesn't fit."

In the experimental setting described here, belief dependence of observations seems to have its limits: even though expectations clearly influenced the measurements of some scientists in our experimental group, they did not determine them. Rather, most participants seem to have struck a compromise between their expectations and measurements, a compromise which seems to have been swayed more strongly by objective reality than by preconceptions. Thus, the mean discrepancy for the second measurements of the small sphere was 30.5%; for the larger sphere, 37.5%. Both are significantly different from participants' expectations, t(df=2, two-tailed, N=13)=5.9, p<.0005 (expectations which were most likely based on the discrepancy they observed for the cylinder). Similarly, participants were not consistent in their adjustments of measurements; it was not unusual for the same individual to obtain a realistic discrepancy for one sphere and an unrealistic discrepancy for the other. One participant, for instance, obtained a 55% discrepancy for the first sphere and 3.9% for the second; another reported 5.5% for the first and 64% for the second.

Scientists in our sample also differed in their susceptibility to measurement bias: the measurements of some individuals appeared to be strongly affected by expectations; those of others, not at all. In our sample of 13, the combined average discrepancies for the two spheres ranged from 13.9% to 53.5%, with standard deviations of 6.9 for the first sphere and 17.6 for the second. For each individual in the control group of fifteen scientists who had no preconceptions about their measurements, the discrepancy between the theoretical value (based on that individual's measurement of the diameter and the incorrect formula) and his or her capacity measurements was calculated. These discrepancies ranged from 28.2% to 61.8%, with a standard deviation of 10. In our sample of 13, the reported discrepancies for the cylinder ranged from .02% to 14.6%, with a standard deviation of 3.6.

For those who fitted measurements to expectations, the process of adjustment involved first assigning a realistic value to the diameter of the ball. Later on, upon discovering the large discrepancy, they measured and re-measured the diameter, often using a variety of approaches, until they came up with a smaller value. Apparently, even simple observations with a ruler fall within a range of probable values; these participants appear to have chosen the lowest reasonable value within this range. They all ascribed the discrepancies that remained to a variety of measurement errors, e.g., difficulties of measuring the diameter, the inevitability of errors in operations of this kind, the difficulty of measuring the volume of a liquid in a box, and the rudimentary measurement tools at their disposal.

Conceptual Shift

In the portion of this experiment involving the first sphere, only one scientist of the 13 who did not know the formula expressed some doubts about the formula, and all 13 continued to use it, after they uncovered the discrepancy, in subsequent paper-and-pencil questions involving spheres.

As we have seen, the segment involving the second sphere was deliberately designed to help participants break away from the formula. Although this segment failed for the most part to achieve its objective, it did make a difference. After calculating the theoretical and experimental volumes of the second sphere, and after being reminded of the smaller discrepancy they observed for the cylinder, scientists in this group were asked: "Is the difference between the cylinder and the sphere significant?" Twelve felt that it was; only one (who obtained a 1.4% discrepancy for the cylinder and a 3.9% discrepancy for the second sphere) felt that it was not. Participants were then asked to answer and provide an explanation to the question: "Should mathematicians re-examine their long-established formula for the volume of the sphere?" Seven felt that they should not, three were unsure, and three felt that they should.

In the instructional setting created here, perhaps the most significant test of the tendency to cling to discredited beliefs is operational: unqualified use of the wrong formula in paper-and-pencil questions calling for volume determinations of spheres. The question about a sphere with an identical diameter (20 cm) to the one for which participants uncovered the discrepancy is of special interest. Would they unreservedly use the formula, would they use it but note their uncertainty, or would they rely on experimental results? Would they take advantage of the absence of time limits to derive a more correct formula, either empirically or through accepted mathematical procedures?

After uncovering the discrepancy for the second sphere, 12 of the 13 scientists continued the unqualified use of the incorrect formula. As we have seen, one mechanism for resolving the apparent conflict involved partial adjustments of measurements in order to bring them into closer conformity with expectations. A second common response involved the invocation of experimental error in an effort to explain the discrepancies that remained. Thus, most participants attributed the large discrepancy to such things as "vagaries of experimental error," "difficulties in establishing an accurate measure of the diameter of a real-life sphere," the thickness of the spheres' walls, or deviations of the two spheres from ideal spheres. Despite the unlimited time available in the segment involving the second sphere, participants did not, to our knowledge, attempt to refute these explanations either through direct, and fairly simple, tests or by referring to their earlier work with the cylinder.

In this segment, only one scientist rejected the validity of the formula on the verbal level, used her observations of a 50% discrepancy to adjust the wrong formula (coming up with an inelegant but entirely correct 2/3 × .785D³), and used this revised formula to answer subsequent questions about the volume of spheres.
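
Her correction is in fact exact once .785 is read as π/4 (a brief check of our own):

$$\tfrac{2}{3} \times .785D^{3} = \tfrac{2}{3}\cdot\tfrac{\pi}{4}D^{3} = \tfrac{\pi}{6}D^{3} \approx .52D^{3}.$$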

The 6 scientists who walked into the instructional setting knowing the correct volume formula for the sphere faced a markedly different task than their 13 counterparts who did not know it. Upon encountering the unexpectedly large discrepancy, they did not have to fall back on their own resources to defy the incorrect formula; they merely had to revert to the formula they already knew. The large discrepancies, and the feelings of discomfort they engendered, one might surmise, would prompt them to abandon the wrong formula and use the formula which they remembered and which fitted in so well with their own observations.

In the segment involving the first sphere, only one (here referred to as Participant A) of these 6 scientists summarily rejected the incorrect formula upon encountering the large discrepancy and used the formula he knew in all subsequent questions. Another (B) summarily rejected it in writing but had spent so much time trying to resolve the observed discrepancy, as well as the conflict between the formula he remembered and the formula he was given, that he was unable to answer subsequent questions about the volume of spheres. Three others vacillated between the two formulas. One (C) used the incorrect formula but qualified his answer with the comment "using your formula." In one question, which did not merely involve a numerical answer but a determination of the comparative sizes of a cylinder and a sphere, he used the correct formula. Another (D) used the false formula throughout, but gave two answers--based on the correct one he remembered and the false one he was given--in answering the last paper-and-pencil sphere question. Another (E) used the correct formula throughout the entire exercise, then erased all her answers and replaced them with answers based on the incorrect formula. Only one participant (F) behaved exactly as did the majority of participants who did not know the formula--verbally accepting the validity of the formula and employing it unquestioningly in answering all questions about the volume of spheres.

By the second sphere, the picture changed. Participants A and B rejected the formula again in writing, and both used the correct formula to answer volume questions. Participant C wrote down that the formula given was probably wrong, and went on to use the formula he remembered in all subsequent questions involving spheres. After discovering the large discrepancy, participant D felt that the given volume formula of the sphere was "highly suspect"; he continued to use the incorrect formula but qualified each answer with the comment: "by the .785D³ method." Participants E and F did not feel the volume formula needed to be re-examined, and continued to use it throughout this second part.

The retrospective reflections of these six participants throw some light on their conduct. Participant A: "Being able to compare my experimental value with the formula from my memory was most reassuring. Had I not remembered the formula, things would have been awkward." Participant C: "I was puzzled. . . . After double-checking my measurements and my math, I plugged in the formula I remembered and discovered the formula given was incorrect." Participant D: "I knew the volume of a sphere is (4/3)πr³ and I was not happy about the discrepancies in the answers. . . . I originally assumed (4/3)πr³ = .785D³ -- until the first measurement. . . . [I was] very puzzled, very uncomfortable. I could not reconcile the difference and I knew (4/3)πr³ gave a much better answer. But still, I doubted what I knew to be correct and assumed that I made a fundamental error somewhere. I did know the correct formula at the outset--thus I qualified my answers as 'by the .785D³ method.' It's difficult to imagine that one could be deliberately deceived in an exercise like this, thus I was prone to doubt myself--even in the face of the large error." Participant E: "[I was] very puzzled, very uncomfortable. I believed my measurements were incorrect and that I had recalled incorrectly the formula for sphere volume. I accepted the formula because I became confused about my recollection of the formula." Participant F: "I was puzzled. I thought I was measuring the dimension with large percentage of error. Back in my mind, I thought the volume of the sphere was (4/3)πr³, but when I saw this one, I believed it in total. The difficulty of switching from one theoretical scheme to another played a part [in my decision to use the theoretical formula despite the discrepancy I uncovered]. I didn't want to take the formula as incorrect. After work, I thought I was making mistakes in my measurements. . . . When I found [the error] was 50%, I tried to make it up, so that the error was 20-30%."

Additional Controls

To rule out the remote possibility that there is something peculiar about scientists which accounts for their conduct, we carried out identical observations with a group of four mathematically proficient students who did not know the volume formula of the sphere. We shall not detail here the results from this small group, except to note that they confirmed in all essential respects the results obtained with the larger group of natural scientists.

In another variation (Nissani & Maier, 1991), 28 undergraduates completed a take-home exercise in which the cognitive conflict revolved around the circumference of an ellipse. These students also found it difficult to reject the false formula they were given, suggesting that the results reported in this paper are not ascribable to the difficulty of dealing with three-dimensional objects.

DISCUSSION

Our results on belief-dependence of observations suggest that, even when one deals with simple tasks such as measuring length with a ruler, and even when these tasks are undertaken by a select group of highly trained individuals, many people tend to resolve a conflict between firm beliefs and sense data by adjusting the data. In the present setting, however, this process of adjustment rarely takes place outside reasonable boundaries. Needless to say, these results are entirely consistent with the position that "observation is inferential, so that any given observation might be influenced by theory, but the inferential processes in observation are not so loose as to allow us to make any observation we want" (Thagard, 1988, p. 151).

The tendency to cling to strongly held beliefs in the face of overwhelming evidence against them is a recurring feature of human affairs, formal and informal learning, experimental psychology, and history (Ball, Farr, & Hanson, 1989; Nissani, 1991). Most participants in this study were unable to relinquish unreasonable beliefs, even when these beliefs had just suffered seemingly decisive refutations. Our results therefore provide an additional striking demonstration of this human tendency in a laboratory setting.

At this stage, it is too early to attempt a causal analysis of these observations. The most straightforward explanation would invoke the cognitive difficulty of switching from one belief to another as the decisive factor. Similarly, Milgram proposed obedience to authority as the underlying factor, while Asch suggested conformity as the underlying factor in his observations (both reviewed in Milgram, 1974). In all three cases, a causal analysis may prove far more complicated than initially believed. The observed obedience in Milgram's experiments, for example, may be traceable in part to belief perseverance (Nissani, 1990). It could be similarly argued that, in the present set-up, obedience, social conformity, trust, absence of a reasonably sound alternative concept, and other factors may have shaped the observed outcome. In our view, this important question can only be determined by future experimentation.

Refutations--regardless of their strength--do not automatically trigger conceptual shift; they only serve to set in motion a gradual internal process. A few signposts on the mind's journey from one belief to another tentatively suggest themselves. The first stage seems to involve discomfort and denial. At this point, we stick to the old belief, bypassing the discrediting evidence through contrived explanations and/or by beating reality "into line" (Kuhn, 1974, p. 135). This seems to be followed by doubt, vacillation, and tension. We are by now aware of the seeming irreconcilability of the old belief and the new evidence, but we are still reluctant to decisively replace the old with the new on the verbal level, and we are even more wary of letting the implications of this new evidence guide our actions. The importance of this second phase to intellectual progress received attention outside the laboratory long ago. John Dewey, for instance, wrote that "thinking takes its departure from specific conflicts in experience that occasion perplexity and trouble. Men do not, in their natural estate, think when they have no troubles to cope with, no difficulties to overcome" (Dewey, 1920, pp. 138-139). With the passage of time, and with the mounting of refutations, the shift is complete: we now believe the new theory and let it guide our actions.

In many instances, of course, this conjectural process stagnates forever at the first stage. In others, it may move more rapidly than the evidence presented here suggests. Although the transition appears to take place instantly in some cases, it may be preceded even there--perhaps unconsciously--by an incubation period. The combined use of (i) a much briefer version of our teaching manual involving the circumference of an ellipse, (ii) a discrepancy sufficiently large to trigger a gradual belief change in most participants, and, possibly, (iii) think-aloud protocols may provide a powerful tool for routinely triggering conceptual shifts in the laboratory and for studying their underlying cognitive and affective aspects.

Although our results underscore people's disinclination to abandon beliefs in the face of seemingly decisive refutations, they need not be interpreted as confirming the irrationalist side in the perennial debate over human rationality. To escape from the paradoxical situation in which they find themselves, our subjects must embrace one or more unlikely notions (deception, consistent error in an otherwise imaginatively written and conceived instructional manual, or a millennia-long fundamental flaw in mathematical theory). Under such conditions, doubting one's ability to carry out ruler measurements and simple multiplications may make sense. We hope that variations of the procedure described here will throw a clearer light on the connection between conceptual change and rationality.

Now and then, scholars, students, and instructors must meet the challenge of conceptual shift. Typically, progress depends on their ability to revise or abandon minor working hypotheses. More rarely, progress hinges on a given discourse community's ability to move from one major paradigm to another. The experimental results presented here and elsewhere lend a small measure of support to the contention (Kuhn, 1974; Strike & Posner, 1985) that the difficulty of switching from one belief to another is a major impediment to intellectual and social progress. Given its decisive importance, this contention deserves far more experimental attention than it has so far received.

Some theorists plausibly insist on the centrality of conceptual change in the process of learning. Learning, according to this view, is not an additive process. Rather, learning arises from the interaction between current experience and prior conceptions; it involves the assimilation of new concepts and the transformation of old ones. Its essence can perhaps be captured by the metaphor of embryological development, not by that of the stepwise construction of brick walls. This view, and the partial failure of traditional instructional approaches to induce desired conceptual shifts, has led to such innovative instructional practices as the Interview-About-Instances technique, ancillary knowledge strategies, and dialogue-based strategies. The experiments described here suggest that, if anything, these strategies, as well as the conceptual change view of learning and understanding (Strike & Posner, 1985), underrate the challenge and importance of conceptual shift. Our experimental procedure may provide a useful tool for recognizing the central role of conceptual shift in learning, acquiring additional insights concerning this process, and applying these insights in the classroom (cf. Resnick, 1987).

ACKNOWLEDGMENTS

We warmly thank the many participants in this experiment for making it possible, and, at the same time, for making it a memorable, pleasant, and rewarding experience. Clifford L. Maier, Lauren B. Resnick, Norma Shifrin, James F. Voss, and two anonymous reviewers have kindly shared with us their comments on earlier drafts of this manuscript. This research was supported by a small research grant from Wayne State University.

REFERENCES

Cooper, J., & Fazio, R. H. (1984). A new look at dissonance theory. Advances in Experimental Social Psychology, 17, 229-266.

Dewey, J. (1957). Reconstruction in Philosophy. Boston: Beacon Press. (Original work published 1920)

Driver, R., & Easley, J. (1978). Pupils and paradigms: A review of literature related to concept development in adolescent science students. Studies in Science Education, 5, 61-84.

Eagly, A. H., & Chaiken, S. (1984). Cognitive theories of persuasion. Advances in Experimental Social Psychology, 17, 267-359.

Festinger, L., Riecken, H. W., & Schachter, S. (1956). When Prophecy Fails. Minneapolis: University of Minnesota Press.

Hardyck, J. A., & Braden, M. (1962). Prophecy fails again: A report of a failure to replicate. Journal of Abnormal and Social Psychology, 65, 135-141.

James, W. (1907). Pragmatism. New York: Longmans, Green and Co.

Karmiloff-Smith, A., & Inhelder, B. (1975). "If you want to get ahead, get a theory." Cognition, 3, 195-212.

Kuhn, T. S. (1974). The Structure of Scientific Revolutions (Second Edition). Chicago: University of Chicago Press.

Milgram, S. (1974). Obedience to Authority. New York: Harper & Row.

Milgram, S. (1984). Cyranoids. A paper presented at the 1984 meeting of the American Psychological Association.

Nisbett, R., & Ross, L. (1980). Human Inference. Englewood Cliffs: Prentice Hall.

Nissani, M. (1989a). A hands-on instructional approach to the conceptual shift aspect of scientific discovery. Journal of College Science Teaching, 19: 105-107.

Nissani, M. (1989b). An experimental paradigm for the study of belief perseverance and change. Psychological Reports, 65: 19-24.

Nissani, M. (1990). A cognitive reinterpretation of Stanley Milgram's observations on obedience to authority. American Psychologist, 45, 1384-1385.

Nissani, M. (1991a). Lives in the Balance: The Cold War and American Politics, 1945-1990. Wolfeboro, New Hampshire: Longwood Academic, in press.

Nissani, M. & Maier, C. (1991b). Further explorations of belief perseverance: Reconciling incompatible beliefs and observations concerning the circumference of the ellipse. In preparation.

Resnick, L. B. (1987). Education and Learning to Think. Washington, DC: National Academy Press.

Ross, L., & Anderson, C. A. (1982). Shortcomings in the attribution process: On the origins and maintenance of erroneous social assessments. In D. Kahneman, P. Slovic, & A. Tversky (Eds.), Judgment under uncertainty: Heuristics and biases. Cambridge: Cambridge University Press.

Strike, K. A., & Posner, G. J. (1985). A conceptual change view of learning and understanding. In L. H. T. West & A. L. Pines (Eds.), Cognitive Structure and Conceptual Change (pp. 211-231). Orlando: Academic Press.

Thagard, P. (1988). Computational Philosophy of Science. Cambridge, Massachusetts: The MIT Press.

West, L. H. T., & Pines, A. L. (Eds.). (1985). Cognitive Structure and Conceptual Change. Orlando: Academic Press.
