
ORNIT SPEKTOR-LEVY, BAT-SHEVA EYLON and ZAHAVA SCHERZ

TEACHING SCIENTIFIC COMMUNICATION SKILLS IN SCIENCE
STUDIES: DOES IT MAKE A DIFFERENCE?
Received: 14 September 2007; Accepted: 13 January 2009

ABSTRACT. This study explores the impact of ‘Scientific Communication’ (SC) skills
instruction on students’ performances in scientific literacy assessment tasks. We present a
general model for skills instruction, characterized by explicit and spiral instruction,
integration into content learning, practice in several scientific topics, and application of
performance tasks. The model was applied through an instructional program that focuses
on the following learning skills: information retrieval, scientific reading and writing,
listening and observing, data representation, and knowledge presentation. Throughout the
7th–8th grades, 160 students learned the whole program or one of its components:
structured instruction (SI) of SC skills, or performance tasks (PT). A comparison group of
42 students did not receive instruction of SC skills. Students’ performances were assessed
through a questionnaire and a complex task that measured students’ scientific content
knowledge, SC skills, and the quality of the final products. Results indicated that students
who learned the whole program or one of its components achieved higher scores in all
categories than the comparison group students. High achievers benefited from just one
component of the program: either structured instruction (SI) or learning from practice
(PT). However, they could hardly acquire SC skills spontaneously. Low and average
achievers required both components of the SC program to improve their performances.
Results show that without planned intervention, the spontaneous attainment of SC skills
occurs only to a limited extent. Systematic teaching of skills can make a significant
difference. The explicit instruction of skills integrated into scientific topics, the
opportunities to implement the skills in different contexts, the role of performance tasks
as ‘assessment for learning’—all these features are important and necessary for improving
students’ scientific literacy. Our general model of skills instruction can be applied to the
instruction of other high-order skills. Its application can lead to the realization of the
central goal of science education: literate students possessing scientific knowledge.

KEY WORDS: learning skills, LSS (learning skills for science), performance-based
assessment, scientific communication, scientific literacy

INTRODUCTION

Scientific literacy is a major goal of science education. In attempting to
find an appropriate meaning of scientific literacy, several definitions have
been proposed (AAAS, 1990; Linn, diSessa, Pea & Songer, 1994;
Shamos, 1995; Bybee & Ben-Zvi, 1998; DeBoer, 2000; PISA, 2003).
Over the last decade, science educators have relied on standards-based
definitions of science literacy and have defined science literacy as

[International Journal of Science and Mathematics Education © National Science Council, Taiwan (2009)]

scientific attitudes (Gauld & Hukins, 1980) and the abilities required to
construct understanding of science, to apply these ideas to realistic
problems and issues involving science, technology, society, and the
environment, as well as to inform and persuade other people to take action
based on these science ideas (AAAS, 1993; Yore, Bisanz, & Hand, 2003).
This involves the ability to conduct lifelong, independent learning (Bol &
Strage, 1996; DeBoer, 2000). One important means of independent
learning is the capability of successfully implementing high-order skills
such as inquiry and problem-solving skills (BSCS, 1993; Schneider,
Krajcik, Marx & Soloway, 2002) as well as thinking and learning skills
(Berliner, 1992; Bol & Strage, 1996; Campbell, Kaunda, Allie, Buffler &
Lubben, 2000). Thus, students should acquire the ability to engage in and
to conduct exploratory activities, to locate and retrieve information, to
critically evaluate information, to organize and analyze the information, to
draw evidence-based conclusions, and to present the acquired knowledge
(AAAS, 1993; National Research Council, 1996; PISA, 2003).
It is well established that the ability to critically evaluate information is
essential. Students often seem to see science as a codified body of
knowledge that is essentially beyond challenge (Laugksch, 2000).
Science education textbooks and programs often fail to develop critical
thinking. Science education literature is replete with assertions and claims
about scientists’ ways of thinking, for example, the ‘scientific mind’ (Coll,
Taylor & Lay, 2008). Gauld & Hukins (1980) referred to it as the ‘scientific
attitude’. Gauld (1982) described the scientific attitude as follows:

"The scientific attitude as it appears in the science education literature embodies the
adoption of a particular approach to solving problems, to assessing ideas and information
or to making decisions. Using this approach, evidence is collected and evaluated
objectively so that the idiosyncratic prejudices of the one making the judgment do not
intrude. No source of relevant information is rejected before it is fully evaluated and all
available evidence is carefully weighed before the decision is made. If the evidence is
considered to be insufficient, then judgment is suspended until there is enough
information to enable a decision to be made. No idea, conclusion, decision or solution
is accepted just because a particular person makes a claim, but is treated skeptically and
critically until its soundness can be judged according to the weight of evidence which is
relevant to it. A person who is willing to follow such a procedure (and who regularly does so)
is said by science educators to be motivated by the scientific attitude" (Gauld 1982, p. 110).

Scientific attitude can be described as ‘habits of mind’. According to
Gauld’s analysis, habits of mind for scientists can include open-mindedness,
skepticism, rationality, objectivity, mistrust of arguments from
authority, suspension of belief, and curiosity. A number of these habits of
mind at first sight seem incompatible (for example, open-mindedness and
skepticism). However, it is the interplay of these habits of mind that
results in the scientific attitude. According to Merton (1976), the scientific
ethos comprises a ‘sociological ambivalence’ in which an interplay exists
between norms and counter-norms: Universalism - Particularism;
Communism - Solitariness; Disinterestedness - Interestedness; Organized
skepticism - Organized dogmatism; Emotional neutrality - Emotional
commitment; Rationality - Non-rationality. Merton (1976) wrote that
‘only through such structures of norms and counter-norms can the various
functions of a role be effectively discharged’ (p. 58).
Thus, it is no surprise to find in science curriculum documents, such as
"Science for all Americans", scientific habits of mind listed as important
for schools to develop (though not as complex and varied as Gauld and
others discussed):
“Quantitative, communication, manual, and critical-response skills are essential for
problem solving, but they are also part of what constitutes science literacy more
generally. That is why they are brought together here as scientific habits of mind rather
than more narrowly as problem-solving skills or more generally as thinking skills...”
(AAAS, 1993, Chap. 12).

Likewise, Zohar & Dori (2003) emphasize that high-order skills and
‘high’ literacy are essential for tackling the complexities of contemporary
life. Since information and knowledge are growing at a far more rapid
rate than ever before, the meaning of ‘knowing’ has shifted from being
able to remember and repeat information, to being able to find and use it
effectively. However, although the above-mentioned goals for science
education are well accepted, it is not yet clear how ‘scientific literacy’ and
‘lifelong learning’ are to be achieved (Anderson & Helms, 2001).
The study presented here attempts to address this question. It
suggests that the structured instruction of learning skills in science studies,
together with the practice of these skills via performance tasks, enhances
students’ achievements and scientific literacy. This article illustrates a model
for teaching skills in science education, applied through the program
‘Scientific Communication’ in order to acquire high-order learning skills.1
It describes a study aimed at investigating the impact of ‘Scientific
Communication’ instructional interventions on the performances of
students in complex assessment tasks.

Instruction and Assessment of Skills—Diverse Approaches


Relevant research in the fields of cognitive science, learning and
instruction, and scientific literacy has revealed different agendas and
approaches concerning the instruction of skills and the assessment of
scientific literacy capabilities. The following short review of the literature
represents the theoretical basis of the study presented here.
Some researchers and educators claim that skills and capabilities
develop by self-directed learning and by completing learning tasks
throughout the studies (Hudgins, Riesenmy, Mitchell, Klein & Navarro,
1994; Straka, Nenniger, Spevacek & Wosnitza, 1996; Bennett, 1999).
This approach of self-directed learning of skills relies on students’
motivation and initiatives (Kerstiens, 1998), which are not always part of
students’ learning characteristics (or habits). Others claim that skills are
attained through structured instruction involving planned learning
opportunities that require the performance of skills and the coaching of
students in the process (Shamos, 1995; Castello & Monereo, 1999). Most
studies do not support the claim that students acquire skills spontaneously,
and in fact show that they need continuous scaffolding and direct
instruction (Scherz, Michman, & Tamir, 1985; Eylon & Linn, 1988;
Dienes & Berry, 1997; Krajcik, Blumenfeld, Marx, Bass & Fredricks,
1998; Campbell et al., 2000; Kirkwood, 2000). This approach contends
that only teaching that promotes reflective, conscious, metacognitive
supervision of students’ use of their knowledge, and especially their
procedural knowledge, can guarantee that what is learned will be
implemented and transferred (Castello & Monereo, 1999; Campbell et
al., 2000). According to another approach, the integrated approach, the
instruction of skills should be in the context of subjects learned in class,
as an integral part of the learning activities. The learning activities
incorporate tasks that aid in understanding the concepts and terms of the
subject matter, and at the same time, engage the student in acquiring skills
(Squire, 1983; Roth & Roychoudhury, 1993; Hara, 1997). In the
literature, one can find variations concerning the length of time needed
for attaining skills: from short-term efforts, usually implemented through
intensive courses (Oosterhuis-Geers, 1993; Hogan, 1999), to long-term
efforts (Bangert-Drowns, Hurley, & Wilkinson, 2004; Klein, 2006). Some
of the long-term approaches support spiral instruction, whereby students
learn and practice advanced skills throughout the school years. Each year
they are introduced, in depth, to different skills and sub-skills. The
acquisition of skills includes a repetition of the same skill in different
scientific contexts and implementation of the skill in various learning
situations (Hogan, 1999; Zohar & Dori, 2003). Thus, the instruction of
skills must be well planned and structured.
The fact that students are introduced to skills does not necessarily lead
to their actual use of these skills. Davidson & Sternberg (1985) defined
the difference between competence and performance: competence is the
availability of skills and logical structures such as information processes,
knowledge, and representational processes, whereas performance is the
utilization of competence as mediated by the requirements of a given task.
How can one enhance the probability that competence will be realized
in performance? Edelson (1998) claims that integrating acquisition of
content and skills together into the design of learning activities provides
an opportunity to increase students’ ability to carry out authentic activities
while also achieving deeper content understanding and better skills’
performances. Since assessment should be carried out as an integral part
of the teaching and learning process, it should also take place as an
authentic activity whose completion requires the performance of content
knowledge and skills. Such an authentic assessment serves as a learning
opportunity: “Good classroom assessments are not only measures of
learning, but genuine episodes of learning themselves” (Wolf, 1993, p.
213). Moreover, Black & Wiliam (1998) contend that opportunities for
students to express their understanding and capabilities should be
incorporated into any teaching occasion. They called for the implementation
of formative assessment. This includes all activities that teachers
and students undertake to get information that can be used diagnostically
to alter teaching and learning. The feedback gained through these
assessment tools provides each student with guidance on how to improve.
Formative assessment occurs "when the evidence is actually used to adapt
the teaching work to meet the needs" (Black & Wiliam, 1998).
An important element of formative assessment is performance-based
assessment. Performance-based assessment requires individuals to apply
their knowledge and skills in context, not merely to complete a task on
cue (Wiggins, 1998). Thus, performance-based assessments should be
meaningful and engaging to students. In performance-based assessments,
students are requested to show what they can do, and are given an
authentic task, which is then judged using a specific set of criteria.
Performance tasks often have more than one acceptable solution: they
may require that a student create a response to a problem and then explain
or defend it. The process involves the use of higher-order thinking skills
(e.g., cause-and-effect analysis, deductive or inductive reasoning,
experimentation, and problem solving). Performance tasks may be used
primarily for assessment at the end of a period of instruction, but are
frequently used for learning as well as assessment.
Performance-based assessment represents a set of strategies for
applying knowledge, skills, and work habits (Champagne, Lovitts, &
Callinger, 1990; Kulm & Malcom, 1991; Erickson & Meyer, 1998). This
approach is upheld by the OECD’s Program for International Student
Assessment (PISA, 2003), which uses performance-based assessment to
assess students’ levels in scientific literacy.

General Model for Skills Instruction


A review of the literature of the different approaches to skills instruction,
along with our field experience, supported the design of an instructional
model (Scherz, Spektor-Levy, & Eylon, 2005; Spektor-Levy, Eylon, &
Scherz, 2008) consisting of two major components, structured instruction,
and performance-based assessment:

(a) Structured instruction is characterized by the following factors:


• Explicit instruction: The actual instruction of all the components of
the skills is annotated and emphasized. Students should be aware of
the process of skills acquisition and should reflect on it.
• Spiral instruction: Throughout the school years, the students practice
generic skills such as gathering and analyzing information, representing
information, and presenting knowledge. However, each year students
are introduced to different skills and sub-skills in depth and continue to
practice them several times in the course of their science studies.
• Integration and practice: Teachers are expected to tailor the
instruction and practice of skills into specific content areas. The
integration is attained through the use of a framework of general
activities that can be used in conjunction with specific contents.
These activities are designed to practice the different high-order
skills as well as the sub-skills in any given scientific area.
• Flexibility & modularity: The model is flexible and modular in a
way that enables teachers to choose the specific skills and activities
that will be implemented every year, as well as the content in which
they will be studied and practiced. Teachers can plan their own
sequence of instruction according to their agenda and adjust the
instructional activities to the level and needs of the students.
(b) Performance-based assessment: The assessment of students’
capabilities and skills is carried out through performance tasks that serve
as formative assessment. Each performance task is especially designed to
carry out and practice certain skills, and is integrated into the actual
science instruction.
In order to implement our model, we developed instructional materials,
namely, the program ‘Scientific Communication’ (Scherz, Eylon, & Bialer,
2008; Spektor-Levy, Eylon, & Scherz, 2008) that supports learners in
developing advanced learning skills and general strategies for realizing
independent learning and working towards scientific literacy.

THE SCIENTIFIC COMMUNICATION PROGRAM

The ‘Scientific Communication’ program focuses on oral and written
communication skills, i.e., the processes of speaking, listening, writing, and
reading. These abilities are highly valued within the scientific community.
Scientists who communicate well are successful in gaining support from
members of their own communities, funding agencies, and society at large.
These abilities are also greatly needed in scientists’ daily lives. For example,
the free flow of information and the unedited nature of the World Wide Web
increase the need for readers with sufficient knowledge, metacognitive
awareness, critical thinking, executive control of their reading
comprehension, and proficiency with diverse learning abilities (Yore et al., 2003).
The ‘Scientific Communication’ program was developed for students at
the Junior High School (JHS) level with the aim of acquiring learning
skills by integrating them into the science studies. The program focuses
mainly on the following skills: information retrieval, scientific reading,
scientific writing, listening and observing, information representation, and
knowledge presentation. We refer to these skills as ‘scientific
communication skills’. Like Garvey and Griffith (1972), who asserted that
“Communication is the essence of science”, we regard these skills as the
essential or basic skills that scientists have to master in
order to communicate their findings and ideas. Since scientific literacy is
a major goal of science education, an important component is teaching
students to communicate their knowledge and ideas, as scientists do.
Garvey and Griffith (1972) developed a model of the scientific
communication system that outlines the process by which research is
communicated and provides details of the various stages within a time
frame, from the initial concept to integration of the research as an
accepted component of scientific knowledge. Although since 1972,
computer-based information technologies have emerged that are beginning
to change the ways scientists use, produce, and disseminate
information (Hurd, 2000), still, the skills of information retrieval,
scientific reading, scientific writing, listening and observing, information
representation, and knowledge presentation are elementary skills one has
to master. These skills represent authentic science skills: mastering the
complexity of scientific texts, the need to read and understand scientific
symbols or common scientific data representations, the uniqueness of the
structure of scientific research papers and the growing body of electronic
professional databases, all of which require instruction in specific skills in
light of how scientists implement them.
Each of these scientific communication skills is composed of specific
skills and sub-skills (Figure 1). The performance of each of these skills
and sub-skills and the ability to integrate them into complex learning
assignments determine the level of one’s learning capabilities. For
example, consider the high-order learning skill of ‘scientific writing’
needed for the complex assignment of writing a scientific report. This
assignment requires the implementation of various skills and sub-skills
such as gathering information from different sources, sorting and
organizing information, processing it, as well as representing and
summarizing the data. Good integration of all these skills can indicate
the mastery of the high-order skill—‘scientific writing’.
• The goals: The main goals of the program ‘Scientific Communication’
are as follows: (1) to enhance the performance of scientific
communication skills and the scientific literacy of students; (2) to
furnish teachers with instructional materials and activities that can be
implemented in and integrated into a variety of scientific topics; and
(3) to design flexible instructional materials suitable for different
levels of students and for meeting the differing and specific needs of
the class and the teacher.

[Figure 1 layout: Information Retrieval (library, electronic database, experts); Scientific Reading (article, textbook); Listening & Observing (lecture, demonstration, scientific video); Scientific Writing (report, abstract, essay); Information Representation (table, graph, scheme, model); Knowledge Presentation (poster, multimedia report, oral presentation).]
Figure 1. The ‘Scientific Communication’ program focuses on six high-order learning
skills. Each of these high-order skills is composed of specific skills and sub-skills;
examples are shown.
• The instructional materials: The ‘Scientific Communication’
program is designed as a learning package that includes materials
that follow the two main components of the instructional model:

(a) Structured instructional materials: a framework of generic activities
that support the explicit instruction and practice of different
learning skills. These generic activities can be used in conjunction
with any given topic in science studies. Most of the activities are
challenging, demand creativity, and are designed to encourage
students to apply procedures for attaining skills.

The instructional materials include the activities for the students, an
extended guide for the teacher, and interfaces: relevant texts and articles
for several main topics of the science curriculum.

(b) Performance-based assessment tasks: The design of the
performance-based assessment tasks involved two steps: (1) Making a
decision regarding several clusters of skills (e.g., one of the clusters
included retrieving information from different sources, and writing a
summary). (2) Designing short performance tasks (1.5 h each) in which
students can demonstrate their mastery of each cluster of skills. These
clusters can be combined into complex performance tasks (1–4 weeks
long) that require mastery of a variety of scientific communication skills.

Each of the performance tasks includes a short authentic scientific
story and several related items through which the students have to
demonstrate their mastery of relevant skills. For example, in one of the
short performance tasks, students were asked to play the role of an expert
who works for an educational website. The experts had to answer a
student’s question concerning nutrition, basing their answer on
three different sources of information that they had received. The sources
included different representations of information (a graph, a table, and plain
text), and the experts had to write a one-page answer.
The short performance tasks create opportunities for the students to
implement the skills that were learned in class and serve as a tool for
formative assessment. More specifically, the tasks equip the teachers with
tools to evaluate students’ performances of the skills as well as their
knowledge of the subject, as part of the learning process. A detailed
scoring rubric was developed for each task. The teachers were instructed
to use the tasks as examples for developing similar ones by themselves.
In the next section, we present a study aimed at exploring the impact of
the two major components of the instructional model and program,
structured instruction of skills and performance-based assessment, on
JHS students. Our objective was to assess the impact of these two
components on students’ performances and achievements regarding their
scientific literacy capabilities.

RESEARCH

Context and Objectives


As was described before, the review of the literature presented in the first
part of this paper served as the theoretical basis of the study presented
here. The general model for skills instruction and the program Scientific
Communication, provide the framework and context of this study.
Consequently, we describe here a 2-year longitudinal study of JHS
students, which took place throughout the 7th and 8th grades. This study
investigated the attainment of skills through the program ‘Scientific
Communication’. We addressed the following questions:

1. How does the instruction of Scientific Communication skills affect
   students’ performances in scientific literacy oriented tasks?
   • Do students who have learned and practiced scientific communication
     skills perform better on complex tasks than students who have not?
   • How do the two components of the ‘Scientific Communication’
     program affect students’ performances on complex tasks?
2. What is the impact of the ‘Scientific Communication’ program and its
   components on the performance of students with different academic
   levels?

METHODOLOGY

Research Sample
Students (N=202) from four different JHSs participated in the
study. All of the schools were urban, and most students were from the
middle socio-economic level; these students had heterogeneous academic
abilities. The study was conducted throughout the 7th–8th grades. The
sample was divided into two main groups: the ‘Scientific Communication’
group (SC group) and the comparison group (Table 1).
The SC group was further divided into three subgroups: (1) students
who experienced structured instruction (SI) of scientific communication
skills, (2) students who experienced short performance tasks (PT), and (3)
students who experienced both components of the program: structured
instruction and short performance tasks (SI + PT).
The comparison group (N=42) and the PT subgroup of the SC group
(N=50) consisted of students who were randomly selected from four
classes in the same school. Students from the SI and the SI+PT groups
were randomly selected from three different urban JHSs (Table 1).
The four groups were compared according to the following:
• General scientific knowledge questionnaire: a multiple-choice test of 11
questions that were selected from the published form of the
international standards test TIMSS (TIMSS, 1999). Internal reliability
was measured according to the Kuder-Richardson formula 20, α=0.82.
• Teachers’ judgments of students’ prior academic levels, which were
based on rankings related to students’ knowledge and capabilities,
provided by their school science teachers and science coordinators.
The teachers judged the students according to specific instructions
and criteria that we gave them and then categorized them into three

levels: low-academic level, average-academic level, and high-academic
level (on a scale of 1 to 3, respectively). As shown in
Table 2, there were no significant differences between the sample
groups regarding these two indicators. This may imply that these
groups were similar at the beginning of the study.

TABLE 1
The study sample was divided into two main groups: the ‘Scientific Communication’
group (SC group) and the comparison group

Component of the ‘Scientific      Scientific Communication (SC) groups (N=160)    Comparison group
Communication’ program            SI (N=53)      PT (N=50)      SI+PT (N=57)      (N=42)
Structured Instruction (SI)       +              −              +                 −
Performance Tasks (PT)            −              +              +                 −

The SC group was sub-divided into three groups: two of the SC groups experienced one
component of the ‘Scientific Communication’ program, whereas the third group experienced
both components. The comparison group did not experience any part of the program.
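The internal reliability statistic used for the knowledge questionnaire, Kuder-Richardson formula 20 (KR-20), can be computed directly from a 0/1 item-response matrix. The sketch below is illustrative only; the toy `responses` data are made up for demonstration and are not the study’s data.

```python
# KR-20 sketch: rows are students, columns are items (1 = correct, 0 = wrong).
# KR-20 = (k / (k - 1)) * (1 - sum(p_j * q_j) / var(total scores)),
# where p_j is the proportion answering item j correctly and q_j = 1 - p_j.

def kr20(responses):
    n_students = len(responses)
    k = len(responses[0])                       # number of items
    # Sum of item variances p_j * q_j for dichotomous items
    item_variance_sum = 0.0
    for j in range(k):
        p = sum(row[j] for row in responses) / n_students
        item_variance_sum += p * (1 - p)
    # Variance of the students' total scores (population variance here;
    # some texts divide by n - 1 instead)
    totals = [sum(row) for row in responses]
    mean_total = sum(totals) / n_students
    total_variance = sum((t - mean_total) ** 2 for t in totals) / n_students
    return (k / (k - 1)) * (1 - item_variance_sum / total_variance)

# Hypothetical 4-student, 3-item matrix
responses = [[1, 1, 1], [1, 1, 0], [1, 0, 0], [0, 0, 0]]
print(kr20(responses))  # → 0.75
```

Values near 1 (such as the α=0.82 reported above) indicate that the items consistently rank the same students high or low.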

Data Collection and Analysis


The research data were collected through: (1) the ‘Learning Situations’
questionnaire, and (2) the ‘Update Report’ complex assessment task.
(1) The ‘Learning Situations’ Questionnaire

The questionnaire assessed the students’ ability to describe in detail
their experience regarding several scientific communication capabilities.
The students were presented with two learning situations and were asked
to describe in detail the strategies and methods they used to accomplish
the assignment in these situations. The students’ answers were analyzed
and graded according to the number of meaningful phrases or keywords,
which indicated a good performance of skills. Table 3 presents the two
learning situations ‘Scanning an article’ and ‘Navigation in the library’
and examples of phrases and keywords, indicating good practice of
scientific communication skills. Students’ answers were categorized and

coded according to criteria that specified good practice of the skills. This
process was validated by two science teaching researchers (with much
experience as science researchers), and one expert in qualitative research.

TABLE 2
Indicators for similarity of the study sample groups: (a) prior general scientific knowledge,
(b) prior academic level

                                N     Average score   SD
(a) Students’ prior general scientific knowledge
Comparison group                42    79.1            17.1
SC group      SI+PT             57    82.1            17
              SI                53    80.5            15
              PT                50    77.9            20.5
(b) Students’ prior academic level (on a scale of 1–3)
Comparison group                42    2.5             0.6
SC group      SI+PT             57    2.5             0.6
              SI                53    2.5             0.7
              PT                50    2.4             0.7
(SC Scientific Communication; SI Structured Instruction; PT Performance Tasks)

TABLE 3
The ‘Learning Situations’ questionnaire: description of items and examples of students’
phrases/keywords

‘Scanning an article’
  Description: Let us suppose that... You found a 10-page popular scientific article
  while searching for information as part of a task for your science class. You have
  5 min to decide whether the article is relevant for your task. Describe in detail
  how you would do it.
  Examples of phrases/keywords: I will check the following: Who is the author?
  Who is the publisher? What date was it published? Is there a bibliography?
  I will look at figures such as graphs and illustrations. I will read the abstract.
  I will skim the article.

‘Navigation in the library’
  Description: Let us suppose that... The science teacher asked you to search for
  three different sources of information in the library, each of which deals with a
  specific subject in science. The task has to be accomplished for the next lesson
  and has to be done in the library without any help from the librarian. Describe
  in detail how you would accomplish the task.
  Examples of phrases/keywords: I will search in the catalogue. I will find the
  Dewey Decimal Classification number. I will look in the index. I will check the
  table of contents. I will look in a lexicon. I will search in scientific journals.
  I will look for a textbook. I will consult a scientific encyclopedia.
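The keyword-based grading described above can be sketched as a count of key phrases found in a student’s answer. The phrase list and the naive substring matching below are illustrative assumptions for the ‘Scanning an article’ situation, not the authors’ actual coding scheme, which was validated by expert judges.

```python
# Illustrative sketch of keyword-based grading: the score is the number of
# distinct key phrases an answer contains. The phrase list is a made-up
# example; matching is naive substring search, so short phrases could
# over-match in longer words (e.g. "graph" inside "paragraph").

KEY_PHRASES = [
    "author", "publisher", "bibliography", "abstract",
    "skim", "graph", "illustration", "table of contents",
]

def score_answer(answer):
    """Count how many distinct key phrases appear in the answer."""
    text = answer.lower()
    return sum(1 for phrase in KEY_PHRASES if phrase in text)

print(score_answer("I will read the abstract and check who the author is."))  # → 2
```

Summing such counts over a student’s two answers yields the questionnaire score used to compare the study groups.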
(2) The ‘Update Report’ Complex Assessment Task
The ‘Update Report’ is an extended task (lasting 3–4 weeks) that was
designed to assess students’ scientific communication capabilities. In order
to ensure that none of the students had encountered similar tasks in their
previous school studies, we verified this with their teachers (across all
disciplines) from their JHS years and their last year of elementary school.
The ‘Update Report’ complex assessment task was designed as an
accumulation of three short performance tasks that served as one of the
two major components of the ‘Scientific Communication’ program. To
ORNIT SPEKTOR-LEVY ET AL.

accomplish the complex task, the students had to apply ‘Scientific
Communication’ skills, such as searching for information from a variety
of sources (scientific books and articles, professional Internet sites,
experts, etc.), analyzing and evaluating the data, writing a report,
preparing illustrations, and designing a television broadcast.
The task was related to scientific topics from materials and earth
sciences (the atmosphere, air pollution, and the mutual influence of man
and environment). The student, who had to play the role of an expert in
atmospheric sciences, was asked to prepare a 5-min broadcast presenta-
tion about one of the causes of air pollution. The presentation had to
include an oral report (up to three pages of text) accompanied by relevant
visual aids (up to five slides), such as pictures, figures, graphs, and tables.
The instructions for completing the task were very detailed: how to
start, how to conduct the whole task step by step, how long the text
should be, what should be shown in the PowerPoint presentation, what
kind of resources are recommended, how the 5-min broadcast should be
planned, etc. Along with the detailed instructions, the students were
provided with the necessary scaffolding for achieving good results: they
showed their manuscripts to the science teacher, could consult the teacher
about the design of the presentation, and the teacher was always available
to answer questions. The students were also informed
about the criteria of assessment. Three main categories were defined: content
knowledge (knowledge), scientific communication skills (SC skills), and the
quality of the final products: broadcasting text and slides. For each category,
general criteria were developed and the levels of performance were defined
on a scale of 0–5 in order to determine the abilities of the students. Table 4
illustrates examples from the scoring rubric.
The process of developing and validating the task, ‘Update Report’ and
its scoring rubric (as well as the short performance tasks in the
intervention), involved a gradual refinement through interaction with
different groups of experts. The first step was a review by eight science-
teaching experts and researchers. After correcting and preparing an
improved version, an external committee of six science educators and
policy-makers prepared a second review. The improved task was assessed
for validity by three experienced science teachers and an expert in
alternative assessment. At that stage, the task was pilot-tested with JHS
students from schools that would not participate in the extended study.
Only after this stage was the task presented in its final version to the
students who participated in the main study.
Students’ productions were assessed, according to the scoring rubric,
by an experienced teacher who was unaware of the trial conditions and
TABLE 4
The complex performance task ‘Update Report’: examples from the scoring rubric
concerning the performance of scientific communication (SC) skills and final products

Performance on each criterion was scored on a scale of 0–5; the rubric anchors the
levels 1, 3, and 5.

Category: SC skills
– Variety of sources. The student gathered information from a variety of sources such
  as scientific books and articles, Internet sites, experts, and government reports.
  (1: 1 source; 3: 2 sources; 5: 3 sources and more)
– Relevancy & reliability. The student selected relevant information concerning the
  pollutant; the information is reliable and professional.
  (1: low relevancy & unreliable; 3: relevant but not all reliable; 5: relevant & reliable)
– Quality of text. The text is: clear and understandable; written by the student in
  his/her own words; includes information from different sources; no more than three
  pages. (1: 1 criterion; 3: 3 criteria; 5: all criteria)
– Visual representation. The visuals (graphs, tables, schemes, etc.): present data about
  the pollutant; scientifically correct; clear and well-designed illustrations.
  (1: 2 criteria; 3: 3 criteria; 5: all criteria)

Category: Final products
– Computerized presentation. Regarding the slides: text is easy to read; good use of
  colors; illustrations are clear; smart use of special effects; no more than 5 slides.
  (1: 2 criteria; 3: 3 criteria; 5: all criteria)
the study hypotheses as well as by one of the researchers. A comparison
of the scoring showed that there was a good degree of agreement (Fleiss,
1981; Agresti, 1990): Weighted kappa (KW) = 0.448 (SE=0.03; Z=12.94).
Whenever a disagreement was detected, the two evaluators discussed the
discrepancies and tried to adjust them. In the few cases where the
discrepancies were not resolved, the teacher’s judgment was chosen.
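For readers who want to reproduce this kind of agreement statistic, a linearly weighted kappa can be computed directly from two raters' scores on an ordinal scale. The sketch below is our own self-contained illustration (function and variable names are assumptions), not the authors' actual computation:

```python
import numpy as np

def weighted_kappa(rater_a, rater_b, categories=6):
    """Linearly weighted kappa for two raters scoring on an ordinal
    0..categories-1 scale (the rubric here used 0-5)."""
    a, b = np.asarray(rater_a), np.asarray(rater_b)
    # Observed joint distribution of the two raters' scores
    obs = np.zeros((categories, categories))
    for i, j in zip(a, b):
        obs[i, j] += 1
    obs /= len(a)
    # Expected joint distribution under independence (from the marginals)
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))
    # Linear disagreement weights: |i - j| scaled to [0, 1]
    idx = np.arange(categories)
    w = np.abs(idx[:, None] - idx[None, :]) / (categories - 1)
    return 1.0 - (w * obs).sum() / (w * exp).sum()
```

Perfect agreement yields 1.0, while agreement no better than chance yields a value near 0; intermediate values such as the 0.448 reported above indicate moderate agreement.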
ANCOVA procedures were used to analyze the data and the potential
differences across study groups, with post-test scores of the complex task
as the dependent variable, and the pre-test ‘General scientific knowledge’
score as the covariate.
The data were assessed for linearity of the covariate, equality of slopes,
and independence of the covariate to satisfy assumptions for ANCOVA.
No violations were identified.
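The adjusted means that ANCOVA produces (reported later in Table 7) correct each group's post-test mean for its distance from the grand pre-test mean, using the pooled within-group regression slope. The following is an illustrative sketch under that definition, with our own function name and data; it omits the F-tests and assumption checks of a full ANCOVA:

```python
import numpy as np

def adjusted_means(pre, post, groups):
    """ANCOVA-style adjusted group means: each group's post-test mean,
    corrected for its pre-test mean via the pooled within-group slope."""
    pre, post, groups = map(np.asarray, (pre, post, groups))
    labels = np.unique(groups)
    # Pooled within-group slope of post on pre
    sxy = sxx = 0.0
    for g in labels:
        x, y = pre[groups == g], post[groups == g]
        sxy += ((x - x.mean()) * (y - y.mean())).sum()
        sxx += ((x - x.mean()) ** 2).sum()
    b = sxy / sxx
    grand = pre.mean()
    return {g: post[groups == g].mean() - b * (pre[groups == g].mean() - grand)
            for g in labels}
```

A group that happened to start with a higher pre-test mean is adjusted downward, so the comparison across groups reflects the covariate-corrected outcome.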

Research Design
Table 5 summarizes the overall design of the research. As previously
mentioned, the ‘General scientific knowledge’ questionnaire and the
ranking of ‘Prior academic level’ were used to compare the different
study groups. The ‘Learning situations’ questionnaire was administered
as a pre/post-test and the ‘Complex assessment task’ was administered as
a post-test to students from all four study groups.

TABLE 5
The study design and sample

                         Pre                                Intervention                   Post
JHS students    ‘General scientific    ‘Learning     Structured    Short          Complex      ‘Learning
                knowledge’ + ‘Prior    situations’   instruction   performance    assessment   situations’
                academic level’        quest.        (SI)          tasks (PT)     task         quest.

SC group   SI+PT        ✓              ✓             ✓             ✓              ✓            ✓
N=160      SI           ✓              ✓             ✓             −              ✓            ✓
           PT           ✓              ✓             −             ✓              ✓            ✓
Comparison group        ✓              ✓             −             −              ✓            ✓
N=42

RESULTS

How does the instruction of Scientific Communication skills affect
students' performances in scientific literacy-oriented tasks?
The ‘learning situations’ questionnaire presented descriptions of two
learning situations (Table 3). It was introduced to students at the
beginning of the 7th grade and at the end of 8th grade. Students were
asked to record in detail how they would accomplish the assignments.
Students’ responses were analyzed as previously described.
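The coding step behind these results counts how many rubric keywords and phrases appear in a student's free-text answer. The snippet below is a hypothetical illustration of that step only: the keyword list is a small subset drawn from Table 3, and the real coding was done by human raters, not by string matching:

```python
# Hypothetical sketch of the keyword-coding step. KEYWORDS is an
# illustrative subset taken from Table 3, not the full coding scheme.
KEYWORDS = ["author", "publisher", "abstract", "bibliography", "skim"]

def count_keywords(response, keywords=KEYWORDS):
    """Return how many distinct rubric keywords appear in a response."""
    text = response.lower()
    return sum(1 for kw in keywords if kw in text)
```

For example, a response mentioning the abstract and the author would be coded as two keywords.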
In both ‘learning situations’, there was no significant difference
between the groups regarding the average number of keywords and
phrases that students used at the beginning of the 7th grade (Table 6).
However, at the end of the 8th grade, students who had experienced the

TABLE 6
Students’ descriptions of strategies they use to accomplish two SC assignments

                            Pre                                     Post
No. of         Comp. group    SC groups N=160 (%)      Comp. group    SC groups N=160 (%)
keywords       N=42 (%)       SI+PT    SI      PT      N=42 (%)       SI+PT    SI      PT

Assignment 1: “Scanning an article”
0              33.3           33.3     39.7    9.1     20.0           8.1      21.5    21.0
1              50.0           39.4     27.9    63.6    45.0           27.4     24.6    42.1
2              16.7           21.2     17.6    27.3    35.0           29.0     26.1    31.6
≥3             –              6.1      14.7    –       –              35.5     27.6    5.3
Average no.    0.9            1.1      1.7     1.3     1.1            2.5***   2.2***  1.3
*** P < 0.005: (Comp., PT) < (SI+PT, SI)

Assignment 2: “Navigation in the library”
0              8.3            6.5      5.0     7.1     4.1            6.0      7.9     23.8
1              45.8           60.9     45.0    46.4    45.8           32.0     36.8    38.1
2              29.2           21.7     30.0    32.1    37.5           30.0     22.4    28.6
≥3             16.7           10.8     20.0    14.3    12.5           32.0     32.9    9.5
Average no.    1.5            1.7      1.8     1.6     1.6            1.9**    1.9**   1.2
** P < 0.01: (Comp., PT) < (SI+PT, SI)

The number of keywords used in their descriptions and the percentage of students that mentioned
them are indicated (Pre and Post). SC Scientific Communication; SI Structured Instruction; PT
Performance Tasks
‘structured instruction’ (SI) of scientific communication skills and those
who had experienced ‘structured instruction’ plus the short performance
tasks (SI+PT) improved their performances and described their practice in
the different situations, in a more detailed and professional way than the
other groups (the comparison group and PT). The improvement was
manifested by a higher percentage of students that could mention three or
more keywords in describing their process for carrying out the assignment
and in describing their practice of skills (Table 6). It was also indicated by
a significant difference in the average number of keywords that they used.
Such an improvement was found to a lesser extent within the
‘performance tasks’ (PT) group on the first assignment and was not
attained at all for the second assignment. Similarly, students from the
comparison group did not improve their ability to use keywords and
professional terms when describing scientific communication strategies.
The above data indicate that the ‘structured instruction’ component of
the ‘Scientific Communication’ program contributes to students’ aware-
ness of scientific communication skills and improves their ability to
explain their scientific communication procedures. ‘Structured instruc-
tion’ combined with the ‘performance tasks’ contributed even more to
students’ performances in both learning assignments.
The results described thus far are based on students’ personal
descriptions of how they carried out scientific communication assign-
ments. However, in order to evaluate the actual performances of students,
we could not rely only on their declarations in questionnaires. The impact
of the instruction of scientific communication skills on students’
achievements and their performance of these skills was therefore
examined in the context of a complex assessment task that provided
students with the opportunity to use their learning capabilities. The next
section reports on these results.

Do students who have learned and practiced scientific communication
skills perform better in complex tasks than students who have not?
At the end of the 8th grade, all the students in the study were asked to
accomplish the complex assessment task: ‘Update Report’.
Table 7 presents students’ achievements on this complex assessment task.
The results indicate the advantage of the SC groups over the
comparison group that did not learn the program or any of its
components. Students from all the SC groups performed much better in
all three categories: knowledge, SC skills, quality of final products, and
also in the final score.
TABLE 7
Students’ scores in the complex assessment task ‘Update Report’ according to four categories

Adjusted mean (SE) per group: SI+PT (structured instruction + performance tasks, N=57) |
SI (structured instruction, N=53) | PT (performance tasks, N=50) | Comp. group (N=42) |
F | Groups’ relationship (statistically equal groups are in parentheses)

Knowledge: 69.0 (3.0) | 61.0 (3.1) | 62.0 (3.2) | 48.0 (3.5) | 7.0*** | Comp. < (SI, PT, SI+PT)
  Criteria: – Information about the pollutant – Main concepts & terms – Chemical processes
  – The air as a mixture of gases – Change of matter phases – Effects of air pollution on
  man and the environment – Ways to decrease pollution

SC skills: 71.0 (3.0) | 60.0 (3.1) | 63.0 (3.2) | 44.0 (3.5) | 11.5*** | Comp. < (SI, PT, SI+PT); SI < SI+PT
  Criteria: – Information: variety of sources, reliable & relevant – Text: clear &
  professional – Representation of quantitative data – Computerized presentation

Quality of products: 57.0 (2.8) | 50.0 (2.9) | 51.0 (3.0) | 32.0 (3.2) | 11.8*** | Comp. < (SI, PT, SI+PT)
  Criteria: – Text: comprehensive, edited according to demands – Analysis of data &
  conclusions – Presentation: edited according to demands – Integration of text &
  presentation – Bibliography

Final score: 66.0 (2.5) | 57.0 (2.6) | 59.0 (2.7) | 42.0 (2.9) | 13.3*** | Comp. < (SI, PT, SI+PT); SI < SI+PT

The adjusted mean and standard error (SE) are presented. *** P < 0.0005
These findings indicate that although all interventions focused on
attaining scientific communication skills, differences between the inter-
vention and the comparison groups were found, not only in categories
that are directly related to the ‘Scientific Communication’ program (SC
skills and the quality of the final products) but also in the category of
content knowledge in science.
The highest average scores of students in the SC groups were attained
in the category ‘SC skills’. This was not the case in the comparison
group.
In all four groups, the average scores relating to the quality of the final
products were relatively lower than the scores in the other two categories.
In examining the adjusted mean scores of students’ achievements from
the intervention groups (SC groups) (Table 7), one can note that in
general, the scores were not very high. We suggest possible reasons that
may explain this in the discussion section.

How do the two components of the 'Scientific Communication' program
affect students’ performances in complex tasks?
Our results indicate that the highest scores in all three categories of the
complex assessment task were achieved by those students who had
learned through both the components of the program ‘Scientific
Communication’: structured instruction and practice of skills in short
performance tasks (SI+PT) (Table 7). Students in the comparison group
who did not experience any intervention through the ‘Scientific
Communication’ program, scored significantly lower in all categories.
The results also indicate that students who had carried out short
performance tasks (PT) had average scores similar to those who had
experienced structured instruction (SI) of skills (Table 7). Importantly,
the achievements of students from these two sub-groups were signifi-
cantly higher than the comparison group in all three categories, plus the
final score. Apparently, the completion of short performance tasks
provided opportunities to practice several scientific communication skills
that were assessed later in an integrated form by the complex assessment
task ‘Update Report’ (as was described before). It is reasonable to assume
that implementing these short performance tasks during the study gave
the students the opportunity to experience this type of assessment and to
practice the skills prior to the complex assessment task.
These results emphasize the importance of having created opportunities
to implement and demonstrate learning skills and scientific communica-
tion skills during the studies, in addition to teaching their methodology.
The findings show that without planned intervention, the spontaneous
attainment of scientific communication skills occurs only to a limited extent.
The lack of an intervention, as was the case with the comparison group, led to
significantly lower achievements regarding content understanding, SC skills
performance, and the quality of learning products.
The application of structured instruction, together with performance-
based assessment, seems to have the strongest effect, which is in
accordance with current approaches to curriculum change, as will be
discussed in the last section of this article.

What is the impact of the 'Scientific Communication' program and its
components on the performance of students with different academic levels?
To address this question, we examined the correlation between students’
scores in the general scientific knowledge questionnaire and the scores in
the complex assessment task ‘Update Report’. Interestingly, similar
patterns were found for all three categories of the ‘Update Report’ task:
knowledge, SC skills, and the final products, as well as in the final score.
Figure 2 presents the regression analysis of the final scores as a
representative example. As shown, the average final score of students in
the comparison group, who did not experience any component of the
‘Scientific communication’ program, was very low in the ‘Update Report’
task (∼40). Moreover, the regression line (R² = 0.0011) indicates a very weak
correlation between the final scores of students for the complex task and their
knowledge levels, as indicated by the ‘general scientific knowledge’
questionnaire. These results imply no differences in the performances of
the complex task by students in the comparison group who have different
learning levels. The regression line in the comparison group behaved
similarly in all other categories of the complex task (not shown here).
With the SC groups, the regression analysis revealed different patterns.
The regression line of the structured instruction (SI) group (R² = 0.2065)
indicates some correlation between the two variables. A similar but less
pronounced effect was found in the performance tasks (PT) group
(R² = 0.163). This implies that these two interventions differentially influenced
those students from various levels: students that scored lower in the ‘general
scientific knowledge’ questionnaire tended to score lower on the complex
task. Conversely, students that achieved high scores in the ‘general scientific
knowledge’ questionnaire achieved high scores for the complex task.
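The R² values discussed here (and shown in Figure 2) come from simple linear regressions of the final ‘Update Report’ score on the ‘general scientific knowledge’ score. A minimal, self-contained sketch of how such a value is computed (our own code, not the authors’ analysis):

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination for a simple linear regression of y on x."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    slope, intercept = np.polyfit(x, y, 1)      # least-squares fit
    residuals = y - (slope * x + intercept)
    ss_res = (residuals ** 2).sum()             # unexplained variation
    ss_tot = ((y - y.mean()) ** 2).sum()        # total variation
    return 1.0 - ss_res / ss_tot
```

Values near 0 (as in the comparison group) mean prior knowledge barely predicts the final score; larger values (as in the SI group) mean the two scores move together.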
As mentioned before, students who experienced the combined
intervention: structured instruction and performance tasks (SI+PT)
[Figure 2: scatter plot of students’ mean final score in the ‘Update Report’ task
(y-axis, 0–100) against their mean score in the ‘General scientific knowledge’
questionnaire (x-axis, 0–100), with a regression line per group: Comparison group
(N=42), R² = 0.0011; Structured Instruction (SI) + Performance Tasks (PT) (N=57),
R² = 0.0717; Structured Instruction (SI) (N=53), R² = 0.2065; Performance Tasks (PT)
(N=50), R² = 0.163]
Figure 2. Regression analysis and correlation between students’ scores in the ‘General
Scientific Knowledge questionnaire’ and students’ final scores in the complex assessment
task ‘Update Report’

achieved the highest average final score on the complex task. The
regression analysis (R² = 0.0717) implies that the combined intervention
had a similar positive effect on students from different levels. One more
finding that emerges from Figure 2 is that in all intervention groups (SC
groups), the high achievers accomplished similar high scores. This may
imply that low and average achievers require both components of the
‘Scientific Communication’ program to improve their performances.
However, it seems that high achievers can benefit from just one
component of the program: either structured instruction or learning from
practice (the short performance tasks), and are able to acquire the
complementary competencies of scientific communication independently.
Figure 2 shows that even high achievers can hardly acquire scientific
communication skills spontaneously, as indicated by those students who
were part of the comparison group.

DISCUSSION

In this study we found that JHS students who had learned from the
program ‘Scientific Communication’, or one of its components, improved
their use of professional terminology, and could describe in detail their
performance in situations that required learning skills. We also found that
these students performed significantly better on a complex scientific
literacy-oriented task than students who did not experience any
components of the program. This advantage was apparent with regard to
the ‘scientific communication’ skills, as well as the content knowledge.
Interestingly, students who experienced only short performance tasks,
without any explicit instruction in skills, achieved scores similar to those
who experienced only structured instruction in skills, in most categories.
The effect of performance tasks on students’ achievements is not
surprising. These findings are in agreement with the approach of
‘assessment for learning’ and ‘formative assessment’. Black & Wiliam
(1998), who surveyed numerous studies and educational innovations,
concluded that a good test could be a learning as well as a testing
occasion. They also reported that programs that included practicing
formative assessment produced meaningful and often substantial gains in
learning. However, effective programs using formative assessment
involve far more than the addition of a few performance tasks to an
existing program. The tasks have to be justified in terms of the learning
goals that they serve, and they can work only if opportunities for students
to communicate their evolving understandings are incorporated into the
planning. Thus, opportunities for students to express their understanding
should be planned for every teaching occasion, because this will initiate
the interaction through which formative assessment aids learning. That is
why it is preferable to integrate into the learning frequent short tasks
rather than infrequent longer ones. The feedback through these assess-
ment tools provides each student with guidance on how to improve
(Black & Wiliam, 1998).
How can we explain this effect in the context of our study? We
designed the short performance tasks so that the complex assessment task
(the post-test) would reflect the skills required for performing the short
tasks. Thus, students who carried out the short tasks learned and practiced
the required competencies that helped them cope with the complex
assessment task.
In other words, the performance tasks provided opportunities for
learning and thus enhanced students’ scientific communication skills. It is
reasonable to assume that if we had tested the students on skills that were
not practiced via the performance tasks, their achievement levels would
be lower than those of students who explicitly learned the skills through
‘structured instruction’. This conjecture should be studied further.
Students who experienced only structured instruction of skills, and did
not experience performance tasks, were exposed to this kind of
assessment only in the post-test complex task. Therefore, we believe that
these students had missed the opportunity for ‘assessment for learning’.
Yet, since they did score higher than students who did not receive any
skills instruction (the comparison group), we can conclude that the
structured instruction involved in the program is an important and crucial
component. The combination of the structured instruction of skills and
the practice of short performance tasks results in a synergistic effect and
tends to be the most effective method.
Although the instructional program improved students’ performances,
the level of achievement of students for the complex assessment task was
not high. There are some possible explanations: (1) the criteria and levels
that we defined in the scoring rubric of the ‘Update report’ task may have
been too demanding for JHS students. (2) The duration of our
intervention, 2 years, was probably not long enough to improve student
achievement so that they would reach the very high level of performance
that we anticipated. This explanation is supported by other studies that
emphasize the notion that attainment of skills must take place throughout
the school years, from the early stages of elementary school, until the
more academic studies stage (Gibbs, 1981; Hogan, 1999; Zohar & Dori,
2003). (3) The nature of performance-based assessment: different
assessment formats may require different competencies of students.
Ruiz-Primo et al. (2002) discussed the sensitivity of different assessment
methods in measuring scientific literacy. They designed a multilevel-
multifaceted approach in order to measure the extent, structure, and
precision of declarative, procedural, and strategic knowledge at different
periods, beginning from the time that a curriculum was implemented.
Such an approach may be considered in future studies. (4) The initial
level of students’ relevant skills was lower than we expected. Note that
according to international standardized tests recently administered in our
country (TIMSS, 2003; PISA, 2003), JHS students have difficulties in
basic skills such as reading comprehension and learning the main ideas
from texts. These findings support the fourth explanation and add to the
importance and relevance of teaching ‘Scientific Communication’ skills.
Although the complex assessment task assessed students’ final
products, we lacked information on the actual strategies and techniques
students used while implementing the skills. A clue to this can be found
from students’ descriptions of their strategies and skills in certain learning
activities (the learning situation questionnaire). Students who had
experienced structured instruction seemed to possess a better ability to
explain and to reflect on their scientific communication strategies than
those who had experienced performance tasks only. Other studies in the
literature show that structured and explicit instruction does enhance
students’ abilities (Zohar & Nemet, 2002; Patronis, Potari, & Spiliotopoulou,
1999; Simonneaux, 2001). It is possible that the explicit and structured
instruction, using the scientific communication materials, contributed to
students’ awareness of their strategies and to their meta-cognitive
capabilities regarding these skills. Further research is needed in this
area.
Our general model for skills instruction was designed to address the
needs and level of all JHS students. This approach reflects the rationale
that scientific literacy education should be imparted to all students as
future citizens (American Association for the Advancement of Science,
1993; National Research Council, 1996). Therefore it was important to
examine the applicability of the model to students with different learning
achievements. The results of our research indicate that the general model
we offer for skills instruction did improve the performance of all students,
low, average, and high-achievers. However, experiencing only one
component of this model contributed mainly to average and high-
achieving students. These findings are in agreement with other studies
that found that low achievers encountered difficulties in attaining higher-
order skills and further instruction and scaffolding designed especially for
their capabilities were needed (Onwuegbuzie, Slate, & Schwartz, 2001;
Zohar & Dori, 2003; Rivard & Straw, 2000).

CONCLUDING REMARKS

Improvement of students’ performances as a result of the structured
instruction of scientific communication skills and performance tasks has
important implications. Structured instruction in skills supported by a
well-defined instructional model, as described here, is not widespread. On
the contrary, science curricula often declare the development of skills as a
central goal but provide minimal guidance regarding how to teach the
skills in class. There is a hidden assumption that the acquisition of skills
takes place spontaneously as a consequence of the different learning
activities in class. Our study shows that developing learning skills may
occur to some extent without any formal instruction. However, formal
and systematic teaching of skills can make a significant difference. The
explicit instruction of skills integrated into scientific topics, the
opportunities to implement the skills in different contexts, the role of
performance tasks as ‘assessment for learning’—are all features important
and necessary for improving students’ scientific literacy.
Our model of skills instruction can be applied to the instruction of
other high-order and advanced skills such as thinking skills, inquiry, and
problem-solving skills. Thus, our general model has the potential to
enable teachers and educators to promote the acquisition of scientific
literacy. Its application can lead to the realization of the central goals of
science education: literate students possessing scientific knowledge.

ACKNOWLEDGEMENTS

We would like to thank Prof. Joe Krajcik from the University of Michigan
for his insightful and thoughtful comments and for the illuminating
discussions.

NOTE
1. The ‘Scientific Communication’ program was adopted and published in the UK,
where it is known as LSS – ‘Learning Skills for Science’.

REFERENCES

Agresti, A. (1990). Categorical data analysis. New York: Wiley.
American Association for the Advancement of Science (AAAS) (1990). Science for all
Americans: Project 2061. New York: Oxford University Press.
American Association for the Advancement of Science (1993). Benchmarks for science
literacy. New York: Oxford University Press. Available at: http://www.project2061.org/
publications/bsl/online/bolintro.htm.
Anderson, R. D., & Helms, J. V. (2001). The ideal of standards and the reality of schools:
Needed research. Journal of Research in Science Teaching, 38, 3–16.
Bangert-Drowns, R. L., Hurley, M. M., & Wilkinson, B. (2004). The effects of school-
based writing-to-learn interventions on academic achievement: A meta-analysis. Review
of Educational Research, 74, 29–58.
Bennett, S. W. (1999). Skills taxonomy driver for designing an independent learning
course in environmental chemistry. Pure and Applied Chemistry, 71, 851–857.
Berliner, D. C. (1992). Redesigning classroom activities for the future. Educational
Technology, 32, 7–13.
Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through
classroom assessment. London: School of Education, King’s College.

Bol, L., & Strage, A. (1996). The contradiction between teacher’s instructional goals and their
assessment practices in high school biology courses. Science Education, 80, 145–163.
BSCS (1993). Developing biological literacy (pp. 107–124). Dubuque, IA: Kendall/Hunt.
Bybee, R. W. (1997). Achieving scientific literacy: From purpose to practice. Portsmouth,
NH: Heinemann.
Bybee, R. W., & Ben-Zvi, N. (1998). Science curriculum: transforming goals to practices.
In B. J. Fraser, & K. G. Tobin (Eds.), International handbook of science education (pp.
487–498). Dordrecht: Kluwer Academic Publishers.
Campbell, B., Kaunda, L., Allie, S., Buffler, A., & Lubben, F. (2000). The
communication of laboratory investigations by university entrants. Journal of Research
in Science Teaching, 37, 839–853.
Castello, M., & Monereo, C. (1999). Teaching learning strategies in compulsory secondary
education. 8th European Conference for Research on Learning and Instruction, Sweden.
Champagne, A. B., Lovitts, B. E., & Callinger, B. J. (Eds.) (1990). This year in school
science. 1990: Assessment in the service of instruction. Washington, DC: American
Association for the Advancement of Science.
Coll, R. K., Taylor, N., & Lay, M. C. (2008). Scientists’ habits of mind as evidenced by
the interaction between their science training and religious beliefs. International Journal
of Science Education, 1–31, iFirst Article. Available at: http://pdfserve.informaworld.
com/82088_902013943_787688349.pdf.
Davidson, J. E., & Sternberg, R. J. (1985). Competence and performance in intellectual
development. In E. D. Neimark, R. De Lisi, & J. L. Newman (Eds.), Moderators of
competence (pp. 43–76). Hillsdale, NJ: Lawrence Erlbaum Associates.
DeBoer, G. E. (2000). Scientific literacy: Another look at its historical and contemporary
meaning and its relationship to science education reform. Journal of Research in
Science Teaching, 37, 582–601.
Dienes, Z., & Berry, D. (1997). Implicit learning: Below the subjective threshold.
Psychonomic Bulletin & Review, 4, 3–23.
Edelson, D. C. (1998). Realising authentic science learning through the adaptation of
science practice. In B. J. Fraser, & K. G. Tobin (Eds.), International handbook of
science education (pp. 317–331). Dordrecht: Kluwer Academic Publishers.
Erickson, G., & Meyer, K. (1998). Performance assessment tasks in science: What are
they measuring? In B. J. Fraser, & K. G. Tobin (Eds.), International handbook of
science education (pp. 761–789). Dordrecht: Kluwer Academic Publishers.
Eylon, B., & Linn, M. C. (1988). Learning and instruction: An examination of four research
perspectives in science education. Review of Educational Research, 58, 251–301.
Fleiss, J. L. (1981). Statistical methods for rates and proportions. New York: Wiley.
Garvey, W. D., & Griffith, B. C. (1972). Communication and information processing
within scientific disciplines: Empirical findings for psychology. Information Storage
and Retrieval, 8, 123–126.
Gauld, C. F. (1982). The scientific attitude and science education: A critical reappraisal.
Science Education, 66, 109–121.
Gauld, C. F. (2005). Habits of mind, scholarship and decision making in science and
religion. Science & Education, 14, 291–308.
Gauld, C. F., & Hukins, A. A. (1980). Scientific attitudes: A review. Studies in Science
Education, 7, 129–161.
Gibbs, G. (1981). Teaching students to learn: A student-centered approach. Milton
Keynes: Open University Press.
Hara, K. (1997). A comparison of three methods of instruction for acquiring information
skills. Educational Research, 39, 271–286.
Hogan, K. (1999). Thinking aloud together: A test of an intervention to foster students’
collaborative scientific reasoning. Journal of Research in Science Teaching, 36, 1085–1109.
Hudgins, B. B., Riesenmy, M. R., Mitchell, S., Klein, C., & Navarro, V. (1994). Teaching
self-direction to enhance children’s thinking in physical science. Journal of Educational
Research, 88, 15–27.
Hurd, J. M. (2000). The transformation of scientific communication: A model for 2020.
Journal of the American Society for Information Science, 51(14), 1279–1283.
Kerstiens, G. (1998). Studying in college, then & now: An interview with Walter Pauk.
Journal of Developmental Education, 21, 20–24.
Kirkwood, M. (2000). Infusing higher-order thinking and learning to learn into content
instruction: A case study of secondary computing studies in Scotland. Journal of
Curriculum Studies, 32, 509–535.
Klein, P. D. (2006). The challenges of scientific literacy: From the viewpoint of second-
generation cognitive science. International Journal of Science Education, 28(2–3), 143–178.
Krajcik, J. S., Blumenfeld, P. C., Marx, R. W., Bass, C. M., & Fredricks, J. (1998).
Inquiry in project-based science classrooms: Initial attempts by middle school students.
Journal of the Learning Sciences, 7, 313–350.
Kulm, G., & Malcom, S. M. (1991). Science assessment in the service of reform.
Washington, DC: American Association for the Advancement of Science.
Laugksch, R. C. (2000). Scientific literacy: A conceptual overview. Science Education,
84, 71–94.
Linn, M. C., diSessa, A., Pea, R. D., & Songer, N. B. (1994). Can research on science
learning and instruction inform standards for science education? Journal of Science
Education and Technology, 3, 7–15.
Merton, R. K. (1976). Sociological ambivalence and other essays. New York: Free Press.
National Research Council (1996). National science education standards. Washington,
DC: National Academy Press.
Onwuegbuzie, A. J., Slate, J. R., & Schwartz, R. A. (2001). Role of study skills in graduate-
level educational research courses. The Journal of Educational Research, 94, 238–246.
Oosterhuis-Geers, J. (1993). PROcedure to promote effective and efficient study skills
(PROPES) with PA-students. Paper presented at the Annual Meeting of the American
Educational Research Association. Atlanta, GA.
Patronis, T., Potari, D., & Spiliotopoulou, V. (1999). Students’ argumentation in decision
making on a socio-scientific issue: Implications for teaching. International Journal of
Science Education, 21, 745–754.
PISA (2003). Available at: http://www.pisa.oecd.org/science/struct.htm.
Rivard, L. P., & Straw, S. B. (2000). The effects of talk and writing on learning science:
An exploratory study. Science Education, 84, 566–593.
Roth, W. M., & Roychoudhury, A. (1993). The development of science process skills in
authentic contexts. Journal of Research in Science Teaching, 30, 127–152.
Ruiz-Primo, M. A., Shavelson, R. J., Hamilton, L., & Klein, S. (2002). On the evaluation
of systemic science education reform: Searching for instructional sensitivity. Journal of
Research in Science Teaching, 39, 369–393.
Scherz, Z., Eylon, B., & Bialer, L. (2008). Professional development in Learning Skills for
Science (LSS): The use of an evidence-based framework. International Journal of
Science Education, 30, 643–668.
Scherz, Z., Michman, M., & Tamir, P. (1985). Preparing academically disadvantaged
students. Journal of College Science Teaching, March-April, 395–401.
Scherz, Z., Spektor-Levy, O., & Eylon, B. (2005). Scientific Communication: An
instructional program for high-order learning skills and its impact on students’
performance. In K. Boersma, M. Goedhart, O. de Jong, & H. Eijkelhof (Eds.),
Research and the quality of science education (pp. 231–243). Dordrecht: Springer.
Schneider, R. M., Krajcik, J., Marx, R. W., & Soloway, E. (2002). Performance of
students in project-based science classrooms on a national measure of science
achievement. Journal of Research in Science Teaching, 39, 410–422.
Shamos, M. H. (1995). The myth of scientific literacy. New Brunswick, NJ: Rutgers University Press.
Simonneaux, L. (2001). Role-play or debate to promote students’ argumentation and
justification on an issue in animal transgenesis. International Journal of Science
Education, 23, 903–927.
Spektor-Levy, O., Eylon, B. & Scherz, Z. (2008). Teaching communication skills in
science: Tracing teacher change. Teaching and Teacher Education, 24, 462–477.
Squire, J. (1983). Composing and comprehending: Two sides of the same basic process.
Language Arts, 60, 581–589.
Straka, G. A., Nenniger, P., Spevacek, G., & Wosnitza, M. (1996). A model for motivated
self-directed learning. Education, 53, 19–29.
TIMSS (1999). Science items. Available at: http://isc.bc.edu/timss1999i/pdf/t99science_
items.pdf.
TIMSS (2003). Available at: http://nces.ed.gov/timss/.
Wiggins, G. (1998). Educative assessment: Designing assessments to inform and improve
student performance. San Francisco, CA: Jossey-Bass.
Wolf, D. P. (1993). Assessment as an episode of learning. In R. E. Bennet, & W. C. Ward
(Eds.), Construction versus choice in cognitive measurement. Hillsdale, NJ: Lawrence
Erlbaum Associates.
Yore, L. D., Bisanz, G. L., & Hand, B. M. (2003). Examining the literacy component of
science literacy: 25 years of language arts and science research. International Journal of
Science Education, 25, 689–725.
Zohar, A., & Dori, Y. J. (2003). Higher order thinking skills and low-achieving students:
Are they mutually exclusive? The Journal of the Learning Sciences, 12, 145–181.
Zohar, A., & Nemet, F. (2002). Fostering students’ knowledge and argumentation
skills through dilemmas in human genetics. Journal of Research in Science
Teaching, 39, 35–62.

Ornit Spektor-Levy
Science Education, The School of Education
Bar Ilan University
Ramat Gan, 52900, Israel
E-mail: levyo@mail.biu.ac.il

Bat-Sheva Eylon and Zahava Scherz

The Department of Science Teaching
The Weizmann Institute of Science
Rehovot, Israel
E-mail: bat-sheva.eylon@weizmann.ac.il
E-mail: zahava.scherz@weizmann.ac.il
