
© Psychological Society of South Africa. All rights reserved. South African Journal of Psychology, 40(3), 2010, pp. 272-281
ISSN 0081-2463

Effect of a course in research methods on scientific thinking among psychology students
Ashraf Kagee
Stellenbosch University, South Africa

Saalih Allie
University of Cape Town

Anthea Lesch
Stellenbosch University

This study followed a quasi-experimental design to determine the effect of a course in research methods on undergraduate students' ability to reason scientifically. Two classes of students in their first and second year of study were asked to participate in the study. The second year class (n = 171) was taught a course in research methods, while the first year class (n = 201) was taught a course in developmental psychology. An instrument consisting of a series of vignettes was administered to all students at the beginning and at the end of the quarter in which these courses were taught. Total scores on the instrument were used to determine the extent of scientific thinking. Analysis of variance showed a non-significant difference between the groups at pretest and a significant difference (p < 0.05) at posttest. These results were interpreted to mean that the research methods course was responsible for increasing students' level of scientific thinking.

Keywords: causality; psychology students; research methods; scientific thinking

The broad purposes of teaching any science-based discipline comprise two distinct aspects that need to be addressed. On the one hand there is a body of broadly accepted knowledge, while on the other hand there are the processes that lead to the creation and legitimization of knowledge within the discipline. A perusal of popular first year textbooks ranging from physics (Halliday, Resnick, & Walker, 2007) to psychology (Swartz, De la Rey, & Duncan, 2004) shows that the emphasis is on the first aspect, namely content knowledge, while the processes of science are usually only mentioned briefly. One of the effects of this approach, coupled with traditional assessment practices, is that scientific knowledge appears authoritative and unchallengeable. Consequently, following completion of a first year course, students may embrace epistemologies that are weaker than those they held prior to the course, as has been shown, for example, in studies of first year physics students using the Maryland Physics Expectations (MPEX) instrument, which was administered before and after their first year physics courses (Redish, Saul, & Steinberg, 1998).
Given the need for students to become familiar with a discipline, the amount of material covered
in most undergraduate courses is usually considerable. However, without an understanding and
appreciation of the nature of scientific processes and the way scientific knowledge is constructed
there is little on the surface to distinguish between accepted discipline based knowledge and
knowledge claims based on folklore or whimsy. Graduates who claim some level of certification in
a science-related discipline but who have not developed the tools to enter these debates are more
likely to undermine the scientific enterprise than to promote scientific thinking amongst society at
large.

Scientific thinking in psychology


The nature of psychology as a discipline is such that it is more difficult to separate out the role of
personal experience from disciplinary content than in, say, physics. It is, therefore, critical that part
of the training of psychology students explicitly addresses aspects of scientific thinking with the aim
that they are able to critique knowledge claims and arguments about human behaviour from a scientific perspective. Legitimate forms of reasoning within the scientific paradigm are often loosely
referred to as scientific thinking. Examples of cognitive processes involved in scientific thinking
include induction, deduction, analogy, problem-solving and causal reasoning (Dunbar & Fugelsang,
2005). The issue of scientific thinking has not received widespread attention in the literature on teaching in psychology, though the related notions of promoting critical thinking and stimulating the use of higher order cognitive processes in students have received a fair amount of attention. Current
definitions of critical thinking describe it as a higher order thinking process in which individuals
apply information to analyse, make inferences about and evaluate knowledge claims, and recognize
and solve problems (Angelo, 1995; Beyer, 1985; Lewis & Smith, 2001). This body of research
highlights the crucial importance of developing these cognitive processes in students, whilst simultaneously acknowledging the difficulties in achieving this goal through teaching.
According to Lewis and Smith (2001) all disciplines need both lower and higher order thinking
in order to generate knowledge. These authors point out that psychologists tend to see higher order
thinking as problem-solving due to the discipline's roots in experimentation and research. In psychology, therefore, developing students' ability to apply higher order thinking skills is viewed as
challenging them to interpret, analyse and manipulate information as distinct from lower order
thinking which involves the routine, mechanical application of previously acquired information
(Newman, 1990).
The development of higher order thinking skills is often perceived as being achieved via courses
in research methods within the psychology curriculum. A typical research methods course in psychology tends to be centred around the technical aspects of scientific investigation, such as research
design, experimentation, and quasi-experimentation. Thus, there is a tacit assumption that engaging
successfully with these technical aspects will also lead to a deeper understanding of some of the key
themes associated with scientific reasoning. However, it is not easy to test routinely for thinking
skills. The degree to which the course may have had an impact on scientific thinking is usually
inferred from the results of formal testing that emphasize the technical issues of design rather than
deeper conceptual understanding. Hence, it would be assumed (to some extent at least) that a student
who is deemed successful, based on the results of traditional formal assessment, would possess a
sound understanding of the way in which scientific knowledge is constructed and the tentative
nature of scientific tenets. Such an expectation is reasonable as a research methods course highlights
many ideas that are associated with scientific thinking.
Concepts typically emphasized in research methods courses include the role of empirical
evidence, the nature of conclusions based on probabilistic thinking, the uses of falsification (Popper,
1963) and the differences in weight attached to different types of confirmatory evidence (Stanovich,
2004). In this study, we aimed to facilitate among students an awareness and understanding of the
possible errors in reasoning that may occur when drawing conclusions based on observation. These
errors include the following: failing to seek alternative explanations or causes for observed phenomena, focusing disproportionately on information that appears most vivid to the exclusion of other
less salient data, using the notion of “post hoc ergo propter hoc”, assuming that correlation equals
causation, relying on testimonial and anecdotal evidence, regarding intuition as a valid source of
evidence, confirmation bias, placing the burden of proof on the skeptic rather than the claimant and
failing to detect confounds in attributing causality. These elements, described in detail in Table 1,
were used as markers of scientific thinking in this study. We report on an investigation into the effect
of a course in research methods on various facets of scientific thinking among psychology students.

METHOD
Participants
First and second year psychology students at a large residential South African university were invited
to participate in the study. They were informed about the study in class by a researcher and asked
to respond to a questionnaire that assessed the aspects of scientific thinking discussed above. Both
groups of students (first and second year) were asked to complete the pretest and the posttest in
class, administered before and after the research methods course, respectively. As an expression of
appreciation for their participation they were given a R30 lunch voucher on completion of the
posttest. The study was approved by the university ethics committee. Students were informed that they could choose not to participate, and that declining to do so would not affect their course results.

Procedures
The study followed a quasi-experimental two-group pretest-posttest research design. Second year
psychology students constituted the experimental group as they completed the research methods
course. The first year students who took a course in developmental psychology but not in research
methods were the comparison group. Both groups were assessed at the beginning and at the end of
the final quarter of the 2008 academic year, which was when the research methods course was presented. The actual
content learning of each course was assessed by a combination of a class assignment (25%), a class
test (25%) and a final examination (50%).

Description of the intervention


The research methods course was delivered in a large auditorium attended by about 250 students.
The lecturer made liberal use of teaching aids such as PowerPoint slides, a chalkboard and an overhead projector. Modes of delivery included traditional lecturing and extensive use of lecturer-student
interaction. In-class discussion included eliciting students’ insights into problems of constructing new
knowledge through research. The syllabus included a discussion of the epistemological assumptions
of science, components of scientific theory, hypothesis testing, sampling theory, various research
designs, internal and external validity including threats to validity and psychometric theory. In terms
of scientific reasoning, students’ attention was called to the pitfalls associated with making causal
attributions and the stringent criteria that need to be met when making claims involving causal
reasoning. Table 1 summarizes the key concepts that were explicitly addressed in the course. The
developmental psychology course addressed traditional concepts in developmental psychology and
did not specifically include aspects of research design.

Instrument
We developed an 11-item instrument that required participants to read a vignette and respond to specific questions reflecting the aspects of scientific thinking of interest, as articulated in Table 1. The response options were binary: responses in the scientific direction were coded 2 and responses in the non-scientific direction were coded 1. Item scores were summed, yielding a minimum possible total of 11 and a maximum possible total of 22. Appendix 1 contains one
of the 11 vignettes which were presented as part of the assessment instrument. In this particular
example the idea was to test whether or not students were aware of the possibility of alternative
explanations. Thus the sceptical stance of Walker is more compatible with accepted scientific reasoning than the conclusion of Rider. We constructed each item, including the vignette, on the basis of the epistemological principles covered in the course, for example, the hypothetical counterfactual condition, temporal order between causes and effects, and the use of correlational data in making causal
claims. Thus the scale was not used as a psychometric instrument possessing internal consistency,
but as a summation of individual items.
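The scoring scheme described above can be sketched in a few lines; the coding (2 for a scientific response, 1 otherwise, summed over 11 items) is taken from the text, while the sample responses are hypothetical.

```python
# Sketch of the scoring scheme: 11 binary items, coded 2 for a response in
# the scientific direction and 1 otherwise, summed to a total of 11 to 22.
# The sample responses below are hypothetical.

def score_responses(responses):
    """responses: list of 11 booleans, True = scientific direction."""
    if len(responses) != 11:
        raise ValueError("The instrument has exactly 11 items.")
    return sum(2 if r else 1 for r in responses)

# A hypothetical participant answering 7 of 11 items in the scientific direction:
total = score_responses([True] * 7 + [False] * 4)
print(total)  # 7*2 + 4*1 = 18
```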

Table 1. Key underlying concepts used as markers of scientific thinking


Hypothetical counterfactual: When presented with an apparent cause and effect relationship, scientific thinking requires that one imagine whether the effect would also be observed in the absence of the apparent cause. This envisaged hypothetical counterfactual set of circumstances permits the conclusion that the stated or apparent cause is indeed the cause of the observed effect. In experimental terms such a set of circumstances is given expression in the form of a control or comparison group. In the absence of a control or comparison condition it is often considered scientifically incorrect to make a causal attribution (Campbell & Stanley, 1963), even though change in the apparent cause is observed in tandem with change in the apparent effect.

The availability heuristic: The availability heuristic is a rule of thumb or cognitive shortcut whereby one bases a prediction of an outcome on the vividness and emotional impact of an event rather than on actual probability (Ruscio, 2000). People generally make a judgment based on what they remember, rather than on complete data. The availability heuristic is particularly used for judging the frequency or likelihood of events. Thus people often remember information about a few cases and assume that this is representative of a population.

Reversed burden of proof: The burden of proof in science rests on the person who makes the scientific claim, not on the sceptic or critic (Shermer, 1997). It is therefore inappropriate to expect that the sceptic should demonstrate that a claim (e.g. the effectiveness of a new technique) is false. Instead, the proponent of the claim must show that the claim is likely to be true. Thus, if evidence in favour of the effectiveness of a certain psychological procedure is not forthcoming, a reasonable response is one of scepticism rather than a retort that there is no evidence against the procedure and therefore the procedure is valid. The assumption that a claim is likely to be correct because there is no compelling evidence against it has been termed the ad ignorantiam fallacy (Walton, 1998).

Reliance on intuition: There is an assumption among many psychology students that clinical training will develop in them an intuitive sense about the clients with whom they work. Psychologists are typically called upon to make assessments, diagnoses, and predictions about events pertaining to clients such as future violence, recidivism, hospitalisation, diagnosis, prognosis, and suicide attempts. However, in most studies comparing actuarial and clinical methods of prediction, actuarial methods significantly outperformed clinical methods (Ægisdóttir et al., 2006). Invariably, objective data such as test results yield outcomes that are superior to what is known as "clinical intuition" (Meehl, 1954). Yet many psychologists in training are schooled into believing that their task is to develop a special sense of their patients that may be obtained by verbal interaction. A more detailed discussion of the clinical and actuarial methods of prediction is available elsewhere (Kagee, 2006).

Post hoc ergo propter hoc: Post hoc ergo propter hoc is a Latin phrase that translates as "after this, therefore because of this". The fallacy lies in the mistaken notion that simply because one event happens after another, the first event was a cause of the second (Pinto, 1995). While there are many sequences of events that are both temporally and causally related, temporality is only one of several conditions for two events to be causally connected. In and of itself temporality does not equal causation. A simple example is engaging in a ritual, such as hand-clapping or finger-snapping before an examination, believing that this behaviour will cause high performance in the examination.

Overreliance on testimony: Without dispute, anecdotes are good educational devices. However, they are not generally useful as a basis for generalisation or as evidence, as they are typically not representative of the population of cases from which they are drawn (Casscells, Schoenberger, & Graboys, 1978). For any empirically demonstrated relationship there may be outliers that run counter to the apparent relationship. This does not mean that the relationship is invalid but merely that exceptions to the rule do occur. For example, in general, men are taller than women; however, in some instances women may be taller than men. Similarly, there is an undisputed relationship between smoking and lung cancer. Yet everyone knows an elderly person who has smoked heavily for decades but whose health is excellent. An individual case does not invalidate the empirically demonstrated relationship between smoking and cancer. When discussing empirically demonstrated relationships between variables, students may sometimes cite individual cases that go contrary to the data, mistakenly believing that individual cases invalidate such relationships.

Correlation equals causation: If two variables are shown to correlate with each other, a naïve explanation would be to say that one causes the other. Thus if the correlation coefficient between shoe size and reading ability among children is found to be high, it would be erroneous to conclude that having large feet causes children to read better. An alternative explanation is that reading well causes feet to grow, but a more likely explanation is that age, associated with cognitive development, results in physical growth as well as an increase in reading ability. Age is therefore a third variable in the equation that is the causal agent. The conclusion, therefore, is that correlation by itself does not equal causation and is only one of several conditions to be satisfied for a causal relationship to be determined (Kerlinger & Lee, 2000).

Confirmation bias: Confirmation bias is selective thinking whereby one tends to notice and look for events that confirm pre-existing beliefs, and to ignore or undervalue the relevance of those that contradict them (Stanovich, 2004). Confirmation bias occurs when a hypothesis is generated and evidence is sought that supports its tenability, to the exclusion of evidence that refutes it. It is thus an error of inference toward confirmation of the hypothesis being tested. A determined advocate of a belief can therefore find at least some supportive evidence for virtually any claim (Lilienfeld, Lynn, & Lohr, 2004).

Hindsight bias: Hindsight bias is the tendency to state, after an event has occurred, that the event was predictable (Hawkins & Hastie, 1990). It is an inclination to see past events as predictable and reasonable to expect after the fact, rather than before they have occurred.

RESULTS
Description of the sample
A total of 201 first year and 171 second year students agreed to participate in the study at the
beginning of the fourth quarter of the 2008 academic year. Only those students who had completed
the pretest were allowed to complete the posttest. Thus at the second assessment point, close to the
end of the course, 78 first year and 118 second year students had been retained in the study. The
decrease between pre-test and posttest may be attributed to the fact that many students who
completed the questionnaire at pretest did not attend class when the posttest questionnaire was
administered. The final sample consisted of 24 (12.2%) males and 172 (87.8%) females. The faculty breakdown of the students was as follows: Arts and social sciences: 106 (54.4%); Science: 22 (11.3%); Health science: 48 (24.6%); Theology: 1 (0.5%); Other: 18 (9.2%). The mean age of the sample was 20 years (SD = 2).

Study results
The mean score for both the first and second year groups at pretest was 17.3, indicating that the
groups were equivalent at pretest in terms of the variable of interest, that is, the metric of scientific
thinking. The mean scores for the first and second year groups at posttest were 16.7 and 18.2,
respectively. An analysis of variance, as presented in Table 2, indicated a significant difference
between the two groups at posttest. Follow up t-tests revealed significant overall differences between
the pretest and posttest scores for each group. We also conducted significance testing by means of
a series of z tests for differences between proportions to determine whether, at posttest, second year
students endorsed individual items in the scientifically compatible direction at a greater rate compared with first year students. Figure 1 presents these results. As can be seen, of 11 comparisons 7
were significant (items 1, 2, 5, 7, 8, 10, and 11). These items measured awareness of alternative
explanations for phenomena (Item 1); the absence of a counterfactual condition (Item 2); the concept
of post hoc ergo propter hoc (Item 5); the limited utility of testimonial and anecdotal evidence in
making causal attributions (Item 7); the difference between intuition and empirical evidence in decision-making (Item 8); limitations imposed by sampling bias (Item 10); and the role of confounding variables in limiting causal claims (Item 11). The results indicate that on these questions a greater
proportion of second year students endorsed the scientifically compatible response.
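A two-proportion z test of the kind used for these item-level comparisons can be sketched as follows. The endorsement counts are hypothetical, since the paper reports only which comparisons reached significance; the posttest group sizes (118 second years, 78 first years) are taken from the text.

```python
# Sketch of a two-proportion z test as used for the item-level comparisons.
# Endorsement counts below are hypothetical illustrations.
from math import sqrt, erf

def two_proportion_z(x1, n1, x2, n2):
    """z statistic for H0: p1 == p2, using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    return (p1 - p2) / se

def two_sided_p(z):
    """Two-sided p-value from the standard normal CDF (via the error function)."""
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Hypothetical item: 94 of 118 second years vs 47 of 78 first years endorse
# the scientifically compatible option.
z = two_proportion_z(94, 118, 47, 78)
print(round(z, 2), two_sided_p(z) < 0.05)
```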

Figure 1. Percentages of students who selected the "scientifically compatible" options at posttest

DISCUSSION
With regard to the group who took the research methods course there is evidence that there was an
increase in the rate of students’ overall endorsement of scientifically compatible responses at posttest
compared to pretest. The inclusion of a comparison group that was not exposed to the course and
whose scores did not increase permits such a conclusion. Also, on the majority of items, the students
who attended the course endorsed scientifically compatible options more frequently than the first
year developmental psychology students. These two observations, taken together, may be interpreted
to mean that the course was responsible for increasing scientific reasoning ability among students.
It therefore appears that the process of teaching students the technical aspects of research methods
in psychology may help in shifting their underlying epistemological reasoning, resulting in an increased ability to reason scientifically.

Table 2. Results of Analysis of Variance

                              SS       df     MS       F       sig.
  Total T1  Between groups    2.89     1      2.89     0.91    .34
            Within groups     529.81   167    3.17
            Total             532.70   168
  Total T2  Between groups    108.87   1      108.87   27.99   .00
            Within groups     688.53   177    3.89
            Total             797.40   178
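The entries in Table 2 can be cross-checked from the reported sums of squares and degrees of freedom, since each mean square is SS/df and F is the ratio of the between- and within-group mean squares:

```python
# Cross-check of Table 2: MS = SS / df and F = MS_between / MS_within.
def f_ratio(ss_between, df_between, ss_within, df_within):
    ms_between = ss_between / df_between
    ms_within = ss_within / df_within
    return ms_between / ms_within

# Pretest (Total T1): F should be small, as the groups were equivalent.
f_pre = f_ratio(2.89, 1, 529.81, 167)
# Posttest (Total T2): F should be large, as the groups differed.
f_post = f_ratio(108.87, 1, 688.53, 177)
print(round(f_pre, 2), round(f_post, 2))  # 0.91 27.99, matching Table 2
```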

With regard to the control group, who followed a traditional first year teaching sequence, it was interesting to observe a significant reduction in scores. This effect is similar to that found in traditional content-based first year physics courses, for example, where a general deterioration of expectations, attitudes and epistemology has been measured over the period of instruction (Redish, Saul, & Steinberg, 1998).
This study brings into focus the question of mainstreaming the teaching of scientific reasoning
skills within the general psychology curriculum. Our data suggest that such mainstreaming is entirely
possible within the context of a research methods course. We argue that integrating the teaching of
scientific reasoning skills within other courses such as abnormal, developmental, and social psychology in the undergraduate curriculum warrants consideration. Courses of this nature may provide
the contextual space within which scientific reasoning skills may be facilitated while students are
simultaneously exposed to their content. While this idea lies outside of the scope of the present study,
we raise it in the hope that it may be considered in the future. However, the results from the control
group are somewhat concerning and have serious implications for the way in which science is taught
at first year level. The epistemological beliefs of students, including their views about the nature and construction of knowledge, deeply affect how students approach learning (Hammer, 1994; Elby, 1999). Thus, in the context of teaching physics, studies have been carried out in which epistemological issues have been explicitly introduced into the curriculum. Elby (1999) has reported that the inclusion of such a strand has assisted substantially in changing students' beliefs about the nature of knowledge.

Implications for future research


The present study was based on a pilot intervention aimed at increasing scientific reasoning ability
among students. Such a process requires further refinement and evaluation so as to create visible and
more clearly articulated links between the technical aspects of research design and the underlying
scientific reasoning on which they rest. The specific didactic methods to accomplish this require
development, refinement, and evaluation.
The data did not permit us to examine the specific cognitive processes that informed the students' responses to the vignettes that formed part of the assessment instrument. The next step in this
line of research is to explore qualitatively the reasoning processes that led students to endorse the
items they did. Such processes may best be uncovered by asking students to explicate their thinking
processes when responding to questions that assess scientific reasoning and then analysing these
reflections.
The instrument used in this study requires further refinement and validation. It is not known
whether the individual items cohere sufficiently with one another so as to permit the underlying
construct of scientific reasoning to be identified. Reliability analyses are required to determine the
internal consistency of the scale as a whole and item analysis is required to determine the performance of individual items. Such data will inform further refinement of the instrument so as to
yield an optimal assessment of the construct of scientific reasoning.

Concluding remarks
While at one level the technical aspects of research methods form an integral component of psychology curricula in most psychology departments around the world, it is at the deeper level of scientific reasoning that an impact can be made on how people conceptualize the world they inhabit. Technical knowledge is seldom retained if not used frequently, but changes brought on by understanding aspects of scientific reasoning may persist longer and thus influence the way psychology
graduates appraise knowledge claims. However, there is evidence that leaving such endeavours for
research methods courses taught after the first year of study is not optimal given our findings that
content driven courses may have a negative effect on the scientific mindset of students. This is in
keeping with the finding of Schommer (1990), for example, that "epistemological beliefs appear to affect the critical interpretation of knowledge" (p. 501); that is, it was a question not of students being able to recall prominent information in the passages but rather of what they concluded from the information. When one encounters content material that is tentative, strong beliefs in the certainty of knowledge lead to the distortion of information in order to be consistent with this belief. The
recognition of this problem in the context of physics has brought about attempts to incorporate
epistemological themes and explicit development of scientific abilities into first year teaching (Etkina
et al., 2006; Etkina et al., 2008).
In the context of the prominence of popular and folk psychology in many societies around the
world, the results of this study are potentially important. Popular psychology is evident in the form
of television programmes that show distressed persons receiving psychological help, various self-
help books, and long-cherished folk wisdom about human nature. In many instances these sources
of knowledge have questionable scientific bases, even though they appeal to the popular imagination
and appear to make intuitive sense. However, intuition and even untrained observation can yield
inaccurate conclusions (Myers, 2002). For psychology to be able to make superior knowledge claims
about human nature, as opposed to lay, popular, and folk understandings, it is essential that psychology curricula incorporate explicit strands that address the way in which scientific knowledge
about human behaviour is constructed.

ACKNOWLEDGEMENTS


We thank the following people for their insightful comments while we were undertaking the study
and during the drafting of the paper: Dedra Demaree, Eugenia Etkina, Dylan Fincham and Brenda
Liebowitz. We also acknowledge the Fund for Innovation and Research into Learning and Teaching
(FIRLT), Stellenbosch University, and the National Research Foundation, South Africa, for financial
support for this project.

REFERENCES
Ægisdóttir, S., White, M.J., Spengler, P.M., Maugherman, A.S., Anderson, L.A., Cook, R.S., Nichols, C.N.,
Lampropoulos, G.K., Cohen, G., & Rush, J. (2006). The Meta-Analysis of Clinical Judgment Project:
Fifty-Six Years of Accumulated Research on Clinical Versus Statistical Prediction. The Counseling
Psychologist, 34, 341-382.
Angelo, T.A. (1995). Classroom assessment for critical thinking. Teaching of Psychology, 22, 6-7.

Beyer, B.K. (1985). Critical thinking: What is it? Social Education, 49, 270-276.
Campbell, D.T., & Stanley, J.C. (1963). Experimental and quasi-experimental designs for research.
Chicago: Rand McNally.
Casscells, W., Schoenberger, A., & Graboys, T. (1978). Interpretation by physicians of clinical laboratory
results. New England Journal of Medicine, 299, 999-1001.
Dunbar, K., & Fugelsang, J. (2005). Scientific thinking and reasoning. In K. Holyoak, & R. Morrison
(Eds), Cambridge handbook of thinking & reasoning (pp. 705-725). New York, NY: Cambridge
University Press.
Elby, A. (1999). Helping physics students learn how to learn. American Journal of Physics (Physics Education Research
Supplement), 69, S54-S64.
Etkina, E., Van Heuvelen, A., White-Brahmia, S., Brookes, D., Gentile, M., Rosengrant, D., & Warren, A.
(2006). Scientific abilities and their assessment. Physical Review Special Topics – Physics Education
Research, 2, 020103-1 – 020103-15.
Etkina, E., Karelina, A., & Ruibal-Villasenor, M. (2008). How long does it take? A study of student
acquisition of scientific abilities. Physical Review Special Topics – Physics Education Research, 4,
020108-1 – 020108-15.
Halliday, D., Resnick, R., & Walker, J. (2007). Fundamentals of Physics Extended, 8th Edn. New York:
Wiley.
Hammer, D. (1994). Epistemological beliefs in introductory physics. Cognition and Instruction, 12,
151-183.
Hawkins, S.A., & Hastie, R. (1990). Hindsight: Biased judgments of past events after the outcomes are
known. Psychological Bulletin, 107, 311-327.
Kagee, A. (2006). Where is the evidence in South African clinical psychology? South African Journal of
Psychology, 36, 233-248.
Kerlinger, F., & Lee, H. (2000). Foundations of behavioral research. Orlando, FL: Harcourt College Publishers.
Lewis, A., & Smith, D. (2001). Defining higher order thinking. Theory Into Practice, 32, 131-137.
Lilienfeld, S.O., Lynn, S.J., & Lohr, J.M. (2004). Science and pseudoscience in clinical psychology: Initial
thoughts, reflections, and considerations. In S.O. Lilienfeld, S.J. Lynn, & J.M. Lohr, Science and
pseudoscience in clinical psychology (pp. 1-16). New York: Guilford.
Meehl, P.E. (1954). Clinical versus statistical prediction: A theoretical analysis and a review of the evidence. Minneapolis: University of Minnesota Press.
Myers, D.G. (2002). Intuition: Its powers and perils. New Haven, CT: Yale University Press.
Newman, F.M. (1990). Higher order thinking in teaching social studies: A rationale for the assessment of
classroom thoughtfulness. Journal of Curriculum Studies, 22, 41-56.
Pinto, R.C. (1995). Post hoc ergo propter hoc. In H.H. Hansen, & R.C. Pinto (Eds), Fallacies: Classical and contemporary readings. University Park, PA: Penn State Press.
Popper, K.R. (1963). Conjectures and refutations: The growth of scientific knowledge. London: Routledge
& Kegan Paul.
Redish, E.F., Saul, J.M., & Steinberg, R.N. (1998). Student expectations in introductory physics. American
Journal of Physics, 66, 212-224.
Ruscio, J. (2000). Risky business: Vividness, availability, and the media paradox. Skeptical Inquirer, 24, 22-26.
Schommer, M. (1990). The effect of beliefs about the nature of knowledge on comprehension. Journal of
Educational Psychology, 82, 498-504.
Shermer, M. (1997). Why people believe weird things: Pseudoscience, superstition and other confusions of our time. New York: Freeman.
Stanovich, K.E. (2004). How to think straight about psychology. Boston: Allyn and Bacon.
Walton, D.N. (1998). A pragmatic theory of fallacy. Argumentation, 12, 115-123.
Swartz, L., De la Rey, C., & Duncan, N. (Eds). (2004). Introduction to Psychology. Cape Town: Oxford
University Press.

APPENDIX 1. Example of a vignette used in the assessment instrument


Professor Rider announces to a group of workers that he wants to study the effect of playing music on their
productivity. After a week of playing to them he finds that productivity has indeed increased. He then turns up
the volume and after another week finds that the productivity has increased further. He thus concludes that
music causes an increase in productivity. His colleague Professor Walker, however, says this is not a correct
conclusion.

With whom do you most strongly agree, Professor Rider or Professor Walker?
☐ Professor Rider
☐ Professor Walker
