ABSTRACT. This study explores the impact of ‘Scientific Communication’ (SC) skills
instruction on students’ performances in scientific literacy assessment tasks. We present a
general model for skills instruction, characterized by explicit and spiral instruction,
integration into content learning, practice in several scientific topics, and application of
performance tasks. The model was applied through an instructional program that focuses
on the following learning skills: information retrieval, scientific reading and writing,
listening and observing, data representation, and knowledge presentation. Throughout the
7th–8th grades, 160 students learned the whole program or one of its components:
structured instruction (SI) of SC skills, or performance tasks (PT). A comparison group of
42 students did not receive instruction of SC skills. Students’ performances were assessed
through a questionnaire and a complex task that measured students’ scientific content
knowledge, SC skills, and the quality of the final products. Results indicated that students
who learned the whole program or one of its components achieved higher scores in all
categories than the comparison group students. High achievers can benefit from just one
component of the program: either structured instruction (SI) or learning from practice
(PT). However, they can hardly acquire SC skills spontaneously. Low and average
achievers require both components of the SC program to improve their performances.
Results show that without planned intervention, the spontaneous attainment of SC skills
occurs only to a limited extent. Systematic teaching of skills can make a significant
difference. The explicit instruction of skills integrated into scientific topics, the
opportunities to implement the skills in different contexts, the role of performance tasks
as ‘assessment for learning’—all these features are important and necessary for improving
students’ scientific literacy. Our general model of skills instruction can be applied to the
instruction of other high-order skills. Its application can lead to the realization of the
central goal of science education: literate students possessing scientific knowledge.
KEY WORDS: learning skills, LSS – learning skills for science, performance-based assessment, scientific communication, scientific literacy
INTRODUCTION
scientific attitudes (Gauld & Hukins, 1980) and the abilities required to construct an understanding of science, to apply these ideas to realistic problems and issues involving science, technology, society, and the environment, and to inform and persuade other people to take action based on these science ideas (AAAS, 1993; Yore, Bisanz, & Hand, 2003).
This involves the ability to conduct lifelong, independent learning (Bol &
Strage, 1996; DeBoer, 2000). One important means of independent
learning is the capability of successfully implementing high-order skills
such as inquiry and problem-solving skills (BSCS, 1993; Schneider,
Krajcik, Marx & Soloway, 2002) as well as thinking and learning skills
(Berliner, 1992; Bol & Strage, 1996; Campbell, Kaunda, Allie, Buffler &
Lubben, 2000). Thus, students should acquire the ability to engage in and
to conduct exploratory activities, to locate and retrieve information, to
critically evaluate information, to organize and analyze the information, to
draw evidence-based conclusions, and to present the acquired knowledge
(AAAS, 1993; National Research Council, 1996; PISA, 2003).
It is well established that the ability to critically evaluate information is essential. Yet students often seem to see science as a codified body of knowledge that is essentially beyond challenge (Laugksch, 2000).
Science education textbooks and programs often fail to develop critical
thinking. Science education literature is replete with assertions and claims
about scientists’ ways of thinking, for example, the ‘scientific mind’ (Coll,
Taylor & Lay, 2008). Gauld & Hukins (1980) referred to it as the ‘scientific
attitude’. Gauld (1982) described the scientific attitude as follows:
"The scientific attitude as it appears in the science education literature embodies the
adoption of a particular approach to solving problems, to assessing ideas and information
or to making decisions. Using this approach, evidence is collected and evaluated
objectively so that the idiosyncratic prejudices of the one making the judgment do not
intrude. No source of relevant information is rejected before it is fully evaluated and all
available evidence is carefully weighed before the decision is made. If the evidence is
considered to be insufficient, then judgment is suspended until there is enough
information to enable a decision to be made. No idea, conclusion, decision or solution
is accepted just because a particular person makes a claim, but is treated skeptically and
critically until its soundness can be judged according to the weight of evidence which is
relevant to it. A person who is willing to follow such a procedure (and who regularly does so)
is said by science educators to be motivated by the scientific attitude" (Gauld 1982, p. 110).
Likewise, Zohar & Dori (2003) emphasize that high-order skills and
‘high’ literacy are essential for tackling the complexities of contemporary
life. Since information and knowledge are growing at a far more rapid
rate than ever before, the meaning of ‘knowing’ has shifted from being
able to remember and repeat information, to being able to find and use it
effectively. However, although the above-mentioned goals for science
education are well accepted, it is not yet clear how ‘scientific literacy’ and
‘lifelong learning’ are to be achieved (Anderson & Helms, 2001).
The study presented here attempts to address this question. It suggests that the structured instruction of learning skills in science studies, together with the practice of these skills via performance tasks, enhances students’ achievements and scientific literacy. This article illustrates a model for teaching skills in science education, applied through the program ‘Scientific Communication’ for the acquisition of high-order learning skills.1 It then describes a study aimed at investigating the impact of ‘Scientific Communication’ instructional interventions on students’ performances in complex assessment tasks.
Figure 1. The ‘Scientific Communication’ program focuses on six high-order learning skills. Each of these high-order skills is composed of specific skills and sub-skills; examples shown in the figure include: table, graph, scheme, article, textbook, lecture, report, demonstration, library, electronic database, model, poster, scientific abstract, video, essay, multimedia report, experts, and oral presentation.
TEACHING SCIENTIFIC COMMUNICATION SKILLS IN SCIENCE STUDIES
RESEARCH METHODOLOGY
Research Sample
Students (N=202) from four different junior high schools (JHS) participated in the study. All of the schools were urban, and most students were from the
TABLE 1
The study sample was divided into two main groups: the ‘Scientific Communication’
group (SC group), and the comparison group
TABLE 2
Indicators for similarity of the study sample groups: (a) prior general scientific knowledge, (b) prior academic level (columns: N, average score, SD)
TABLE 3
The ‘Learning Situations’ questionnaire: description of items and examples of students’
phrases/keywords
coded according to criteria that specified good practice of the skills. This process was validated by two science education researchers, both with extensive research experience, and one expert in qualitative research.
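The validation of the coding is reported qualitatively; the paper cites Fleiss (1981) for statistical methods but does not state which agreement statistic, if any, was computed. As an illustration only, a minimal sketch of quantifying agreement between two hypothetical coders with Cohen’s kappa (all data invented):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two raters assigning categorical codes:
    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    agreement rate and p_e is the agreement expected by chance
    from each rater's marginal code frequencies."""
    assert len(coder_a) == len(coder_b)
    n = len(coder_a)
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes: 'G' = good practice, 'P' = partial, 'N' = none
a = ['G', 'G', 'P', 'N', 'G', 'P', 'P', 'N', 'G', 'G']
b = ['G', 'P', 'P', 'N', 'G', 'P', 'G', 'N', 'G', 'G']
print(round(cohens_kappa(a, b), 3))  # → 0.677
```

Values above roughly 0.6 are conventionally read as substantial agreement.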
(2) The ‘Update Report’ Complex Assessment Task
The ‘Update Report’ is an extended task (lasting 3–4 weeks) that was
designed to assess students’ scientific communication capabilities. To ensure that none of the students had experienced similar tasks in their previous school studies, we verified this with their teachers (of all disciplines) from their JHS years and their last year of elementary school.
The ‘Update Report’ complex assessment task was designed as an
accumulation of three short performance tasks that served as one of the
two major components of the ‘Scientific Communication’ program. To
ORNIT SPEKTOR-LEVY ET AL.
TABLE 4
Scoring rubric for the ‘Update Report’ complex assessment task (score levels: 1, 3, 5)

Category: SC skills
– Variety of sources. The student gathered information from a variety of sources such as scientific books and articles, Internet sites, experts, and government reports. (1: 1 source; 3: 2 sources; 5: 3 sources and more)
– Relevancy & reliability. The student selected relevant information concerning the pollutant; the information is reliable and professional. (1: low relevancy & unreliable; 3: relevant but not all reliable; 5: relevant & reliable)
– Quality of text. The text is: clear and understandable; written by the student in his/her own words; includes information from different sources; no more than three pages. (1: 1 criterion; 3: 3 criteria; 5: all criteria)
– Visual representation. The visuals (graphs, tables, schemes, etc.): present data about the pollutant; are scientifically correct; are clear and well-designed illustrations. (1: 2 criteria; 3: 3 criteria; 5: all criteria)

Category: Final products
– Computerized presentation. Regarding the slides: text is easy to read; good use of colors; illustrations are clear. (1: 2 criteria; 3: 3 criteria; 5: all criteria)
Research Design
Table 5 summarizes the overall design of the research. As previously
mentioned, the ‘General scientific knowledge’ questionnaire and the
ranking of ‘Prior academic level’ were used to compare the different
study groups. The ‘Learning situations’ questionnaire was administered
as a pre/post-test and the ‘Complex assessment task’ was administered as
a post-test to students from all four study groups.
TABLE 5
The study design and sample

JHS students               ‘General scientific    ‘Learning      Structured    Short          Complex      ‘Learning
                           knowledge’ + ‘Prior    situations’    instruction   performance    assessment   situations’
                           academic level’ quest. quest. (pre)   (SI)          tasks (PT)     task         quest. (post)
SC group (N=160)  SI+PT    ✓                      ✓              ✓             ✓              ✓            ✓
                  SI       ✓                      ✓              ✓             −              ✓            ✓
                  PT       ✓                      ✓              −             ✓              ✓            ✓
Comparison group (N=42)    ✓                      ✓              −             −              ✓            ✓
RESULTS
TABLE 6
Students’ descriptions of strategies they use to accomplish two SC assignments
[Columns: No. of keywords; Pre and Post percentages for the comparison group (N=42) and the SC groups (N=160): SI+PT, SI, PT]
The number of keywords used in the descriptions and the percentage of students who mentioned them are indicated (Pre and Post). SC = Scientific Communication; SI = Structured Instruction; PT = Performance Tasks.
TABLE 7
Students’ achievements in the complex assessment task ‘Update Report’: adjusted means (SE) by category and study group

Category              SI+PT (N=57)   SI (N=53)    PT (N=50)    Comp. group (N=42)   F (groups)   Groups’ relationship (statistically
                                                                                                 equal groups are in parentheses)
Knowledge             69.0 (3.0)     61.0 (3.1)   62.0 (3.2)   48.0 (3.5)           7.0***       Comp.G < (SI, PT, SI+PT)
  Criteria: information about the pollutant; main concepts & terms; chemical processes; the air as a mixture of gases; change of matter phases; effects of air pollution on man & the environment; ways to decrease pollution
SC skills             71.0 (3.0)     60.0 (3.1)   63.0 (3.2)   44.0 (3.5)           11.5***      Comp.G < (SI, PT, SI+PT); SI < SI+PT
  Criteria: information: variety of sources, reliable & relevant; text: clear & professional; representation of quantitative data; computerized presentation
Quality of products   57.0 (2.8)     50.0 (2.9)   51.0 (3.0)   32.0 (3.2)           11.8***      Comp.G < (SI, PT, SI+PT)
  Criteria: text: comprehensive, edited according to demands; analysis of data & conclusions; presentation: edited according to demands; integration of text & presentation; bibliography
Final score           66.0 (2.5)     57.0 (2.6)   59.0 (2.7)   42.0 (2.9)           13.3***      Comp.G < (SI, PT, SI+PT); SI < SI+PT
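The adjusted means reported above control for a covariate such as prior knowledge. The paper does not detail the computation, but a standard ANCOVA-style adjustment, sketched here with invented numbers, fits a pooled within-group slope and shifts each group mean to the grand covariate mean:

```python
def adjusted_means(groups):
    """ANCOVA-style adjusted group means.

    groups: {name: (covariate_scores, outcome_scores)}.
    Uses the pooled within-group slope b_w, then
    adj_mean_g = mean_y_g - b_w * (mean_x_g - grand_mean_x).
    """
    def mean(v):
        return sum(v) / len(v)

    all_x = [x for xs, _ in groups.values() for x in xs]
    grand_x = mean(all_x)

    # Pooled within-group slope: sum of within-group covariances
    # over sum of within-group variances of the covariate.
    num = den = 0.0
    for xs, ys in groups.values():
        mx, my = mean(xs), mean(ys)
        num += sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        den += sum((x - mx) ** 2 for x in xs)
    b_w = num / den

    return {g: mean(ys) - b_w * (mean(xs) - grand_x)
            for g, (xs, ys) in groups.items()}

# Hypothetical covariate (prior knowledge) and outcome (final score)
data = {
    'SI+PT': ([60, 70, 80], [62, 66, 72]),
    'Comp.': ([50, 60, 70], [38, 42, 48]),
}
print(adjusted_means(data))
```

The adjustment raises the mean of the group whose covariate mean lies below the grand mean and lowers the other, so group comparisons are made at a common level of prior knowledge.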
Figure 2. Regression analysis and correlation between students’ scores in the ‘General Scientific Knowledge’ questionnaire (x-axis: mean score, 0–100) and students’ final scores in the complex assessment task ‘Update Report’ (y-axis: mean final score, 0–100). Per-group regressions: Comparison group (N=42), R²=0.0011; Structured Instruction (SI) + Performance Tasks (PT) (N=57), R²=0.0717; Structured Instruction (SI) (N=53), R²=0.2065; Performance Tasks (PT) (N=50), R²=0.163.
The students who experienced the whole program (SI+PT) achieved the highest average final score on the complex task. The regression analysis (R²=0.0717) implies that the combined intervention had a similar positive effect on students across achievement levels. One more finding that emerges from Figure 2 is that in all the intervention groups (SC groups), the high achievers attained similarly high scores. This may imply that low and average achievers require both components of the ‘Scientific Communication’ program to improve their performances.
However, it seems that high achievers can benefit from just one
component of the program: either structured instruction or learning from
practice (the short performance tasks), and are able to acquire the
complementary competencies of scientific communication independently.
Figure 2 shows that even high achievers can hardly acquire scientific
communication skills spontaneously, as indicated by those students who
were part of the comparison group.
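The per-group R² values in Figure 2 come from simple linear regressions of final task score on prior general scientific knowledge. A minimal sketch of that computation, with invented scores for a single hypothetical group:

```python
def r_squared(x, y):
    """R^2 of a simple linear regression of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    # For simple regression, R^2 equals the squared Pearson correlation.
    return sxy ** 2 / (sxx * syy)

# Hypothetical prior-knowledge and final-task scores for one group
prior = [45, 55, 60, 70, 80, 90]
final = [50, 52, 58, 61, 66, 74]
print(round(r_squared(prior, final), 3))  # → 0.97
```

A near-zero R², as in the comparison group, means prior knowledge barely predicts final-task performance; a larger R², as in the SI group, means the spread in final scores tracks prior knowledge more closely.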
DISCUSSION
In this study we found that JHS students who had learned from the
program ‘Scientific Communication’, or one of its components, improved
their use of professional terminology, and could describe in detail their
performance in situations that required learning skills. We also found that
these students performed significantly better on a complex scientific
literacy-oriented task than students who did not experience any
components of the program. This advantage was apparent with regard to the ‘scientific communication’ skills as well as the content knowledge.
Interestingly, students who experienced only short performance tasks, without any explicit instruction in skills, achieved scores similar, in most categories, to those of students who experienced only structured instruction in skills.
The effect of performance tasks on students’ achievements is not
surprising. These findings are in agreement with the approach of
‘assessment for learning’ and ‘formative assessment’. Black & Wiliam
(1998), who surveyed numerous studies and educational innovations,
concluded that a good test could be a learning as well as a testing
occasion. They also reported that programs that included practicing
formative assessment produced meaningful and often substantial gains in
learning. However, effective programs using formative assessment
involve far more than the addition of a few performance tasks to an
existing program. The tasks have to be justified in terms of the learning
goals that they serve, and they can work only if opportunities for students
to communicate their evolving understandings are incorporated into the
planning. Thus, opportunities for students to express their understanding
should be planned for every teaching occasion, because this will initiate
the interaction through which formative assessment aids learning. That is
why it is preferable to integrate into the learning frequent short tasks
rather than infrequent longer ones. The feedback through these assess-
ment tools provides each student with guidance on how to improve
(Black & Wiliam, 1998).
How can we explain this effect in the context of our study? We
designed the short performance tasks so that the complex assessment task
(the post-test) would reflect the skills required for performing the short
tasks. Thus, students who carried out the short tasks learned and practiced
the required competencies that helped them cope with the complex
assessment task.
In other words, the performance tasks provided opportunities for
learning and thus enhanced students’ scientific communication skills. It is
reasonable to assume that if we had tested the students on skills that were
not practiced via the performance tasks, their achievement levels would
be lower than those of students who explicitly learned the skills through
‘structured instruction’. This conjecture should be studied further.
Students who experienced only structured instruction of skills, and did
not experience performance tasks, were exposed to this kind of
assessment only in the post-test complex task. Therefore, we believe that
these students had missed the opportunity for ‘assessment for learning’.
Yet, since they did score higher than students who did not receive any
skills instruction (the comparison group), we can conclude that the
structured instruction involved in the program is an important and crucial
component. The combination of the structured instruction of skills and
the practice of short performance tasks results in a synergistic effect and
tends to be the most effective method.
Although the instructional program improved students’ performances,
the level of achievement of students for the complex assessment task was
not high. There are some possible explanations: (1) the criteria and levels
that we defined in the scoring rubric of the ‘Update report’ task may have
been too demanding for JHS students. (2) The duration of our
intervention, 2 years, was probably not long enough to improve student
achievement so that they would reach the very high level of performance
that we anticipated. This explanation is supported by other studies that
emphasize the notion that attainment of skills must take place throughout
the school years, from the early stages of elementary school, until the
more academic studies stage (Gibbs, 1981; Hogan, 1999; Zohar & Dori,
2003). (3) The nature of performance-based assessment: different
assessment formats may require different competencies of students.
Ruiz-Primo et al. (2002) discussed the sensitivity of different assessment
methods in measuring scientific literacy. They designed a multilevel-
multifaceted approach in order to measure the extent, structure, and
precision of declarative, procedural, and strategic knowledge at different
periods, beginning from the time that a curriculum was implemented.
Such an approach may be considered in future studies. (4) The initial
level of students’ relevant skills was lower than we expected. Note that according to international standardized tests recently administered in our country (TIMSS, 2003; PISA, 2003), JHS students have difficulties with basic skills such as reading comprehension and extracting the main ideas from texts. These findings support the fourth explanation and add to the
importance and relevance of teaching ‘Scientific Communication’ skills.
Although the complex assessment task assessed students’ final
products, we lacked information on the actual strategies and techniques
students used while implementing the skills. A clue to this can be found
CONCLUDING REMARKS
ACKNOWLEDGEMENTS
We would like to thank Prof. Joe Krajcik from the University of Michigan
for his insightful and thoughtful comments and for the illuminating
discussions.
NOTE
1. The ‘Scientific Communication’ program was adopted and published in the UK, where it is known as LSS – ‘Learning Skills for Science’.
REFERENCES
Bol, L., & Strage, A. (1996). The contradiction between teachers’ instructional goals and their assessment practices in high school biology courses. Science Education, 80, 145–163.
BSCS (1993). Developing biological literacy (pp. 107–124). Dubuque, IA: Kendall/Hunt.
Bybee, R. W. (1977). Achieving scientific literacy: From purpose to practice. Portsmouth,
NH: Heinemann.
Bybee, R. W., & Ben-Zvi, N. (1998). Science curriculum: transforming goals to practices.
In B. J. Fraser, & K. G. Tobin (Eds.), International handbook of science education (pp.
487–498). Dordrecht: Kluwer Academic Publishers.
Campbell, B., Kaunda, L., Allie, S., Buffler, A., & Lubben, F. (2000). The
communication of laboratory investigations by university entrants. Journal of Research
in Science Teaching, 37, 839–853.
Castello, M., & Monereo, C. (1999). Teaching learning strategies in compulsory secondary
education. 8th European Conference for Research on Learning and Instruction, Sweden.
Champagne, A. B., Lovitts, B. E., & Callinger, B. J. (Eds.) (1990). This year in school
science. 1990: Assessment in the service of instruction. Washington, DC: American
Association for the Advancement of Science.
Coll, R. K., Taylor, N., & Lay, M. C. (2008). Scientists’ habits of mind as evidenced by
the interaction between their science training and religious beliefs. International Journal
of Science Education, 1–31, iFirst Article. Available at: http://pdfserve.informaworld.
com/82088_902013943_787688349.pdf.
Davidson, J. E., & Sternberg, R. J. (1985). Competence and performance in intellectual
development. In E. D. Neimark, R. De Lisi, & J. L. Newman (Eds.), Moderators of
competence (pp. 43–76). Hillsdale, NJ: Lawrence Erlbaum Associates.
DeBoer, G. E. (2000). Scientific literacy: Another look at its historical and contemporary
meaning and its relationship to science education reform. Journal of Research in
Science Teaching, 37, 582–601.
Dienes, Z., & Berry, D. (1997). Implicit learning: Below the subjective threshold.
Psychonomic Bulletin & Review, 4, 3–23.
Edelson, D. C. (1998). Realising authentic science learning through the adaptation of
science practice. In B. J. Fraser, & K. G. Tobin (Eds.), International handbook of
science education (pp. 317–331). Dordrecht: Kluwer Academic Publishers.
Erickson, G., & Meyer, K. (1998). Performance assessment tasks in science: What are
they measuring? In B. J. Fraser, & K. G. Tobin (Eds.), International handbook of
science education (pp. 761–789). Dordrecht: Kluwer Academic Publishers.
Eylon, B., & Linn, M. C. (1988). Learning and instruction: an examination of four research
perspectives in science education. Review of Educational Research, 58, 251–301.
Fleiss, J. L. (1981). Statistical methods for rates and proportions. New York: Wiley.
Gibbs, G. (1981). Teaching students to learn: A student-centered approach. Great Britain:
Open University.
Garvey, W. D., & Griffith, B. C. (1972). Communication and information processing
within scientific disciplines: Empirical findings for psychology. Information Storage
and Retrieval, 8, 123–126.
Gauld, C. F. (1982). The scientific attitude and science education: A critical reappraisal.
Science Education, 66, 109–121.
Gauld, C. F. (2005). Habits of mind, scholarship and decision making in science and
religion. Science & Education, 14, 291–308.
Gauld, C. F., & Hukins, A. A. (1980). Scientific attitudes: A review. Studies in Science Education, 7, 129–161.
Scherz, Z., Michman, M., & Tamir, P. (1985). Preparing academically disadvantaged
students. Journal of College Science Teaching, March-April, 395–401.
Scherz, Z., Spektor-Levy, O., & Eylon, B. (2005). Scientific Communication: An
instructional program for high-order learning skills and its impact on students’
performance. In: K. Boersma, M. Goedhart, O. de-Jong & H. Eijkelhof (Eds.),
Research and the Quality of Science Education (pp. 231–243). Netherlands: Springer.
Schneider, R. M., Krajcik, J., Marx, R. W., & Soloway, E. (2002). Performance of
students in project-based science classrooms on a national measure of science
achievement. Journal of Research in Science Teaching, 39, 410–422.
Shamos, M. H. (1995). The myth of scientific literacy. New Brunswick, NJ: Rutgers University Press.
Simonneaux, L. (2001). Role-play or debate to promote students’ argumentation and
justification on an issue in animal transgenesis. International Journal of Science
Education, 23, 903–927.
Spektor-Levy, O., Eylon, B. & Scherz, Z. (2008). Teaching communication skills in
science: Tracing teacher change. Teaching and Teacher Education, 24, 462–477.
Squire, J. (1983). Composing and comprehending: Two sides of the same basic process.
Language Arts, 60, 581–589.
Straka, G. A., Nenniger, P., Spevacek, G., & Wosnitza, M. (1996). A model for motivated
self-directed learning. Education, 53, 19–29.
TIMSS (1999). Science items. Available at: http://isc.bc.edu/timss1999i/pdf/t99science_items.pdf.
TIMSS (2003). Available at: http://nces.ed.gov/timss/.
Wiggins, G. (1998). Educative assessment: Designing assessments to inform and improve
student performance. San Francisco, Calif.: Jossey-Bass.
Wolf, D. P. (1993). Assessment as an episode of learning. In R. E. Bennet, & W. C. Ward
(Eds.), Construction versus choice in cognitive measurement. NJ: Lawrence Erlbaum.
Yore, L. D., Bisanz, G. L., & Hand, B. M. (2003). Examining the literacy component of
science literacy: 25 years of language arts and science research. International Journal of
Science Education, 25, 689–725.
Zohar, A., & Dori, Y. J. (2003). Higher order thinking skills and low-achieving students:
Are they mutually exclusive? The Journal of the Learning Sciences, 12, 145–181.
Zohar, A., & Nemet, F. (2002). Fostering students’ knowledge and argumentation skills through dilemmas in human genetics. Journal of Research in Science Teaching, 39, 35–62.
Ornit Spektor-Levy
Science Education, The School of Education
Bar Ilan University
Ramat Gan, 52900, Israel
E-mail: levyo@mail.biu.ac.il