Reading Psychology, 29:86–115, 2008
Copyright © Taylor & Francis Group, LLC
ISSN: 0270-2711 print / 1521-0685 online
DOI: 10.1080/02702710701853625

READING COMPREHENSION: EFFECTS OF
INDIVIDUALIZED, INTEGRATED LANGUAGE ARTS
AS A READING APPROACH WITH STRUGGLING READERS

DANA G. THAMES, CAROLYN REEVES, RICHARD KAZELSKIS,
and KATHLEEN YORK
The University of Southern Mississippi
CHARLOTTE BOLING
University of West Florida
KAVATUS NEWELL
University of Mary Washington
YING WANG
Mississippi Valley State University

This study examined the effects of individualized, integrated language arts as
a reading approach on struggling readers’ comprehension scores obtained from
oral narrative, silent narrative, and silent expository passages at three levels:
below-grade, on-grade, and above-grade levels. Students (N = 93) in grades
four through eight, who were reading below grade level, participated in the study.
Treatment group students (n = 51) received individualized, integrated language
arts as a reading approach once a week in place of basal reading instruction.
Comparison group students (n = 42) received basal reading instruction for the
duration of the study. Multivariate analysis of covariance was used to analyze
posttest Analytical Reading Inventory (ARI) comprehension scores. Several sta-
tistically significant ( p < .001) differences in comprehension performance were
found for on-grade-level scores and for above-grade-level scores, but few differences
were found between treatment and comparison groups on below-grade-level scores.
All statistically significant differences favored students in the treatment group.
The findings of the study strongly suggest that the use of individualized, inte-
grated language arts as a method for teaching reading is an effective approach
for improving the reading comprehension performance of struggling readers.

The research reported herein was partially funded by Grant Number H326L000001
from the U.S. Department of Education, Office of Special Education Programs.
Address correspondence to Dr. Dana G. Thames, Professor of Literacy, The University
of Southern Mississippi, Department of Curriculum, Instruction, & Special Education, 118
College Drive #5057, Hattiesburg, MS 39406-0001. E-mail: dana.thames@usm.edu


The debate over which instructional reading approach best
promotes reading comprehension continues. Historically, the de-
bate has focused on traditional approaches versus holistic, student-
centered approaches. Holistic approaches for teaching reading are
characterized by instruction that integrates “. . . speaking, listening,
writing, and reading into a unified approach to literacy instruc-
tion . . . to make conscious the connection between the student’s
emotional and personal life and the materials being presented”
(Harris & Hodges, 1995, p. 108), while traditional approaches cen-
ter around the use of a commercially produced program, such as
a basal reading program, which usually includes graded student
texts (readers), workbooks, teaching manuals, and supplemental
materials for use in developmental reading instruction.
Proponents of holistic, student-centered reading instruction
view reading as a meaning-making process (Goodman, 1984;
Weaver, 1990). Also, the National Reading Panel, who assessed
the status of reading research as well as the impact of various ap-
proaches used to teach children to read, emphasized that reading
is a meaning-making process since comprehension “. . . requires
an intentional and thoughtful interaction between the reader and
the text” (National Institute of Child Health and Human Develop-
ment [NICHD], 2000, p. 13). Although holistic, student-centered
pedagogy is rooted in a strong tradition supported by reading
research (Goodman, 1989; Stephens, 1990), empirical evidence
from comparative studies of holistic approaches and basal reading
programs has not consistently favored holistic approaches over
basal reading programs, and vice versa, in terms of students’ read-
ing performance (Stahl, McKenna, & Pagnucco, 1994; Stahl &
Miller, 1989). In truth, both traditional approaches and holistic,
student-centered approaches have enjoyed equally modest advan-
tages in different contexts (Bottomley, Truscott, Marinak, Henk,
& Melnick, 1999).
In 1997, a panel funded by the NICHD issued a report enti-
tled Preventing Reading Difficulties in Young Children (Snow, Burns,
& Griffin, 1998). The report called for a “balanced approach”
to reading instruction, which is currently referred to as com-
prehensive reading instruction. Literacy educators who support
a balanced approach claim that it combines the best practices
from both traditional approaches and holistic, student-centered
approaches for teaching reading (Blair-Larsen & Williams, 1999).

As a proponent of balanced reading instruction, Adams (2001)
urges teachers to help students learn to read “through reading,
and writing, and spelling, and language play, and conceptual ex-
ploration, and all manner of engagement with text, in relentless
enlightened balance” (pp. 314–315). Teachers wishing to follow
Adams’ advice need a clear understanding of what constitutes a
balanced approach to reading instruction. Unfortunately, agree-
ment among educators on a single definition of the balanced ap-
proach does not exist. Nevertheless, educators do have a set of
principles to inform their use of balanced reading instruction.
In 1993, the International Reading Association’s Balanced Read-
ing Instruction Special Interest Group adopted seven principles
of balanced instruction (Blair-Larsen & Williams, 1999). A close
examination of the principles reveals the embodiment of most of
the components of an integrated language arts approach to lit-
eracy instruction as well as holistic strategies, such as the use of
trade books as well as published teaching materials; the use of
authentic assessments in addition to norm-referenced standard-
ized tests; the integration of all the language arts processes (i.e.,
listening, speaking, reading, writing, viewing, and visual represen-
tation) within the context of reading instruction; the need for
teacher flexibility; and the importance of individualizing instruc-
tion to meet the needs of each student (Blair-Larsen & Williams,
1999).
Despite the fact that balanced reading instruction is sup-
ported by the International Reading Association, some schools
have shifted toward direct instruction characterized by teacher-
centered reading approaches and the use of standardized,
norm-referenced assessments. Holistic, student-centered ap-
proaches, such as the integrated language arts approach to
teaching reading, are often dismissed or ignored as a result of
pressures from high-stakes testing and teacher overload, largely
because student-centered approaches are viewed as placing more
demands on teachers’ preparation time and requiring more in-
structional time than do streamlined, direct approaches. A review
of the literature related to the use of integrated language arts as an
approach for teaching reading suggests that it may be an effective
alternative to traditional approaches especially for students who
struggle to read (Clary, 1992; Eldredge & Butterfield, 1986;
Thames & Reeves, 1994; Walmsley & Walp, 1990), but empirical
research is needed to determine the extent to which such an
approach may, or may not, influence reading comprehension
performance.
It may be that stronger interest in using student-centered ap-
proaches for reading instruction would be stimulated by research
that identifies ways in which data from informal measures may be
used to both inform and assess instruction that is student-centered.
The Informal Reading Inventory (IRI), which has a long-standing
tradition in reading assessment (Jongsma & Jongsma, 1981), has re-
ceived minimal attention as a measure of reading comprehension
in published studies designed to examine the effects of instruc-
tional approaches on reading comprehension. An IRI provides in-
formation about the reader’s “. . . strengths, weaknesses, and strate-
gies in word identification and comprehension” based on both oral
and silent reading performance (Harris & Hodges, 1995, p. 116).
When the goals of classroom reading instruction are (a) to deter-
mine reading levels of students so that texts of suitable difficulty
can be selected for instruction; (b) to evaluate individual progress
in reading so that specific abilities and instructional needs of stu-
dents may be identified; and (c) to determine which students may
be grouped together for small-group instruction that focuses on
their particular literacy needs, the data obtained from IRIs will be
of great value to the classroom teacher (Anderson, 1977a; Caldwell,
1985; Gerke, 1980; Helgren-Lempesis & Mangrum, 1986; Henk,
1987; Jongsma & Jongsma, 1981). Although teachers often choose
to make their own IRIs by using passages from basal texts, com-
mercially prepared instruments are available, and according to
Anderson (1977b), commercial IRIs provide “more valid, more
reliable, and potentially more useful diagnostic information than
teacher-made IRIs” (p. 99). Among the advantages of commercial
IRIs are the inclusion of multiple reading selections for each grade
level, evaluation of the student’s prior knowledge about the topic,
evaluation of the student’s recall and understanding of a passage
through retelling, and evaluation of the student’s ability to answer
specific kinds of comprehension questions (ranging from literal
to inferential comprehension).
Because there is a large population of poor readers (Fletcher
& Lyon, 1998), many of whom are being taught with traditional
approaches, such as basal reading instruction, and because few in-
structional programs are using diagnostic information from IRIs
to inform instruction, there is a need for research studies which ex-
plore the effects of using individualized (student-centered), holis-
tic approaches to literacy learning based on diagnostic informa-
tion. Further, few empirical studies have been located that focus
on improving the reading performance of student samples, partic-
ularly samples comprised primarily of African American students,
by using assessment information for the purpose of informing liter-
acy instruction via integrated language arts as a reading approach.
It appears that studies designed to examine the effectiveness of
individualized (student-centered), holistic approaches to literacy
learning, using diagnostic information both to inform and to assess
the impact of such approaches, are needed.
The purpose of this study was to determine the effects of
using individualized, integrated language arts as a reading ap-
proach on the comprehension performance of struggling read-
ers as measured by the Woods and Moe (1999) sixth edition of
the Analytical Reading Inventory (ARI). It was anticipated that the
individualized, integrated language arts approach would improve
the comprehension scores of students who comprised the treat-
ment group. Our specific research questions were (a) What are
the effects of an individualized, integrated language arts approach
to reading on struggling readers’ total comprehension scores ob-
tained from oral narrative, silent narrative, and silent expository
passages at three levels: below-grade, on-grade, and above-grade
levels? and (b) What are the effects of an individualized, inte-
grated language arts approach to reading on struggling readers’
comprehension scores for each of four types of comprehension
questions at three levels: below-grade, on-grade, and above-grade
levels?
Individualized, integrated language arts as a reading ap-
proach, as used in this study, is defined as literacy instruction that
engages students in reading, writing, listening, speaking, viewing,
and visual representation activities designed to meet their individ-
ual reading needs, using selected trade books and expository texts
related to the topical interests of the students. Basal reading in-
struction, which was used exclusively with the comparison group
in this study, is defined as teacher-directed lessons based on a col-
lection of student texts and workbooks, teacher’s manuals, and
supplemental materials designed to teach developmental reading
in the elementary grades.

Method

Participants

Prior to the study, 110 students in grades four through eight were
randomly selected from a list of students identified by classroom
teachers as being in the lowest quartile in reading performance
based on results from the reading subtest of the norm-referenced
assessment mandated by the state. The 110 students were randomly
assigned to treatment and comparison groups. At the end of the
study, complete data were not available for 17 students so their data
were excluded from analysis. Thus, a total of 93 students remained
in the study for its duration.
The treatment group (n = 51) consisted of 24 females and
27 males; 49 of the students were African American and two were
European American. The comparison group (n = 42) consisted
of 13 males and 29 females; 26 of the students were African Ameri-
can students, 8 were European American, and racial identification
was not designated for 8 students. The distribution by gender and
ethnicity across grade levels was similar for the treatment and com-
parison groups.
The predominantly African American sample attended a low-
socioeconomic neighborhood school located in the southeastern
region of the United States. One additional student, in grade five,
came from a nearby school. By grade level, 25.8% of the students
were in grade four, 29.0% were in grade five, 21.5% were in grade
six, 10.8% were in grade seven, and 12.9% were in grade eight. At
the time of this study, basal reading instruction was being used in
the schools.

Instrument

This study used students’ scores on comprehension passages from
the Analytical Reading Inventory (ARI), developed by Woods and
Moe (1999), as pre- and post-measures. The ARI is a comprehensive
assessment instrument used to measure the reading performance
and behaviors of students in grades one through nine. Adminis-
tered individually, the ARI is designed to allow the examiner to
observe, analyze, and record data about a reader’s word recogni-
tion performance (word lists), reading miscues, comprehension of
narrative and expository passages, fluency, and personal behaviors
exhibited during assessment. In this study, only the comprehen-
sion data from the ARI were used.
Forms A, B, and C of the ARI are alternate forms that may be
used to assess narrative reading, with the form selected to assess
oral narrative reading being eliminated as a choice for assessing
silent narrative reading (e.g., if Form A is used for the oral read-
ing assessment, then Form B or C is used for the silent reading
assessment). Forms S and SS of the ARI are alternate forms that
may be used to assess expository reading related to science and so-
cial studies topics, respectively, with each form containing graded
expository passages ranging in difficulty from Level 1 to Level 9.
For all forms of the ARI, each narrative and expository passage
is followed by six to eight comprehension questions designed to
assess four types of comprehension: Retells in Fact (RIF), Puts In-
formation Together (PIT), Connects Author and Reader (CAR),
and Evaluates and Substantiates (EAS). The RIF questions are lit-
eral and the answers require explicitly stated facts from the text.
An example of a RIF question is “Who are the characters in the
story?” The PIT questions are both literal and implied. To answer
them correctly, the reader must combine two or more explicitly
stated facts located either next to one another or in different
places in the passage. Also, PIT questions may require the reader
to put together explicitly stated facts with information implied
in the passage. An example of a PIT question is “What is John’s
problem?” To answer the CAR questions correctly, the reader must
combine her prior knowledge with the author’s choice of words
and phrases used in the passages. An example of a set of CAR
questions is “What do you know about the word sympathy? What
does sympathy have to do with the story?” To answer the EAS ques-
tions correctly, the reader must make a judgment, generate an
opinion, generate an emotional response, or make a prediction
based upon inference drawn from the passage. Also, the reader
must substantiate her response with explicit information from
the passage. An example of a set of EAS questions is “How do
you think Mark’s grandmother felt? You think this because. . . ?”
Each type of comprehension question is represented one or more
times following each passage, so that all four types of compre-
hension are assessed after the reader has completed reading the
passage.

Before the student reads an entire ARI passage (for both oral
and silent reading assessments), the proctor tells the student to
read the title and the first two sentences of the passage, and then
the proctor asks the student to predict what the rest of the passage
will be about. After the student finishes reading a passage (for both
oral and silent reading assessments), the proctor says to the stu-
dent, “Retell everything that you can remember from the passage,
and I will write down what you say.” The retelling is followed by
the proctor asking a set of six to eight comprehension questions,
representing RIF, PIT, CAR, and EAS types of comprehension; the
proctor marks on his or her copy a plus sign (+) beside questions
answered correctly, a minus sign (−) beside questions answered
incorrectly (as well as writing the student’s incorrect response to
the question), and writes 1/2 beside two-part questions when half of
the question is answered correctly. Scores for comprehension are
obtained by counting the number of incorrect responses to ques-
tions; a scoring guide for each passage is provided to indicate the
designated number of incorrect responses associated with each of
three reading comprehension levels: independent (0–1 incorrect
response), instructional (1–2 incorrect responses), and frustration
(3–4+ incorrect responses).
For the purposes of our study, the modified version of the
ARI described by Woods and Moe (1999) was used to assess read-
ing comprehension, which included below-grade, on-grade, and
above-grade levels of comprehension assessment. Thus, the com-
prehension data consisted of scores for total comprehension and
for type-of-question comprehension, calculated as the percentage
correct per passage (i.e., oral narrative, silent narrative, and silent
expository) at each of the graded levels (below-grade, on-grade,
and above grade) for each student. Each total comprehension
score represented the percentage of correct responses for the en-
tire set of questions following each passage. Each type-of-question
comprehension score represented the percentage of correct re-
sponses for each of the four question types (RIF, PIT, CAR, and
EAS) per passage.
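To make this scoring concrete, the short sketch below shows one way the total and type-of-question percentage scores could be computed. The data structure and function names are our own illustration rather than anything from the ARI materials, and they assume each question has already been marked correct (1.0), half correct (0.5), or incorrect (0.0) and tagged with its question type.

```python
# Illustrative sketch only (not from the ARI manual): computing total and
# type-of-question comprehension scores as percentages correct per passage.
from collections import defaultdict

# Hypothetical responses for one passage: (question_type, credit), where
# credit is 1.0 (correct), 0.5 (half of a two-part question), or 0.0 (incorrect).
passage_responses = [
    ("RIF", 1.0), ("RIF", 1.0), ("PIT", 0.5), ("PIT", 0.0),
    ("CAR", 1.0), ("EAS", 0.0),
]

def total_comprehension(responses):
    """Percentage of correct responses across the entire question set."""
    return 100.0 * sum(credit for _, credit in responses) / len(responses)

def type_comprehension(responses):
    """Percentage correct computed separately for RIF, PIT, CAR, and EAS."""
    earned, asked = defaultdict(float), defaultdict(int)
    for qtype, credit in responses:
        earned[qtype] += credit
        asked[qtype] += 1
    return {qtype: 100.0 * earned[qtype] / asked[qtype] for qtype in asked}

print(total_comprehension(passage_responses))  # about 58.3 for this example
print(type_comprehension(passage_responses))   # per-type percentages
```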
While the ARI administrator’s manual does not provide di-
rect information about score reliability and validity, it does con-
tain information about the instrument’s development and valida-
tion. The narrative passages of the ARI were written, field tested,
computer analyzed, and revised several times. The content of the
narrative passages is based on the reading interests of students at
various grade levels, and the writing style of the passages is grade ap-
propriate. To maintain the integrity of the assessment instrument,
the same topics occur across alternate forms for each graded level.
For example, all three passages at Level 9 are about meeting critical
environmental needs within communities. This consistency allows
examiners to use alternate forms with some degree of confidence.
The vocabulary used in the narrative passages was governed by a
computer analysis of the number of difficult words, using Readabil-
ity Calculations (Micro Power and Light, 1995a). The expository
passages represent the style of writing found in science and social
studies textbooks. The science and social studies passages were se-
lected from various textbooks used in grades one through nine.
Readability formulas were used to establish grade-level validation
for each narrative passage in Forms A, B, and C. In addition, the
extent to which words differ within the narrative and expository
passages was calculated using Vocabulary Diversity Score (Micro
Power and Light, 1995b) procedures to determine the progres-
sive difficulty of the graded passages and to ensure consistency
within grade levels and across forms (Woods & Moe, 1999). The
instrument was field tested, using approximately 200 students in
grades two through eight, and revisions were made in passages and
comprehension questions based on feedback from the students.
Anderson (1977a) evaluated three commercially prepared
IRIs, the Classroom Reading Inventory (Silvaroli, 1976), the Sucher-
Allred Reading Placement Inventory (Sucher & Allred, 1973), and the
Analytical Reading Inventory (ARI) by Woods and Moe (1977). An-
derson found that the ARI had the greatest potential for collect-
ing sound diagnostic data because it was built on the most recent
knowledge from the literature. A search of the literature did not
reveal a more recent critique of commercially produced IRIs than
Anderson’s study (1977a).

Procedures

Administrators of schools in the local area gave permission for the
research to be conducted and supported the researchers’ request
to randomly select and assign students to treatment and compar-
ison groups. Prior to beginning the study, graduate students in
literacy education and preservice teachers who had been trained
to administer the ARI conducted the reading comprehension as-
sessments of students in the treatment and comparison groups.
Forms A, B, and S of the ARI were used to assess oral narrative,
silent narrative, and silent expository reading comprehension at
three levels: below grade, on grade, and above grade level; results
from these assessments served as the pretest data for the study.
The week after the 20th treatment session, ARI posttest compre-
hension data were collected using forms B, C, and SS to assess
oral narrative, silent narrative, and silent expository reading, re-
spectively, at three levels: below grade, on grade, and above grade
level.

TREATMENT GROUP
Meetings were held with classroom teachers whose students
would be participating in the study to explain the instructional
treatment approach and to arrange for preservice teachers to visit
their classrooms once per week to observe their assigned students
in the context of the students’ regular reading block instruction.
A total of twenty 90-minute instructional sessions constituted the
treatment period, with 10 once-a-week sessions occurring during
the fall semester and 10 once-a-week sessions occurring during the
spring semester.
For the fall semester, preservice teachers were paired with
students in the treatment group using random assignment pro-
cedures. Each preservice teacher had completed three courses in
literacy instruction and was enrolled in the fourth course, which
focused on reading assessment and instruction and required that
each preservice teacher use diagnostic reading information to plan
lessons and instruct an assigned student within the context of a
school setting.
During the first two weeks of the fall semester, each preservice
teacher studied the results of his assigned student’s ARI data, which
had been obtained prior to the initiation of the study; observed the
assigned student in his classroom and visited with the classroom
teacher; and administered an informal interest inventory and the
Elementary Reading Attitude Survey (McKenna & Kear, 1990) to the as-
signed student. Based on the data obtained from these formal and
informal sources, each preservice teacher prepared a written, de-
tailed, diagnostic analysis of their assigned student’s strengths and
weaknesses; this information was used by the preservice teacher to
prepare a series of integrated language arts lessons that centered
on the student’s diagnosed reading needs, using trade books and
expository texts on topics that were of interest to the student. All
lesson plans were reviewed by the reading course instructor and
revised by preservice teachers as needed, prior to the lessons being
taught to students.
During the subsequent 10 weeks, preservice teachers met with
their assigned students once a week to provide one-on-one instruc-
tion via the integrated language arts lessons they had designed. All
instructional sessions took place once a week in the school’s cafe-
teria, during the school’s designated 90-minute block for reading
instruction, using reading carrels to create a private space for each
session. Also, each preservice teacher observed once each week in
the classroom of their assigned student during the reading block
period, as treatment-group students continued to receive basal
instruction on all class days except for the once-a-week, individu-
alized instruction.
Near the end of the fall semester, each preservice teacher
wrote a case study report documenting the reading progress made
by his assigned student based on the authentic portfolio of the
student’s responses to the one-on-one instructional sessions; the
report included specific recommendations for continuing the stu-
dent’s growth in reading performance. The case study report was
part of the requirements for completing the course in reading
assessment and instruction; each preservice teacher reviewed the
case study report with the course instructor, who sometimes sug-
gested additional recommendations for continuing the student’s
growth. A copy of the final case study report for each of the treat-
ment students was made available to the classroom teacher and to
the student’s parent or guardian.
For the subsequent spring semester, a new group of preservice
teachers was randomly assigned to the treatment-group students,
thus continuing the individualized, integrated language arts in-
structional treatment to the end of that semester. The new group
of preservice teachers used the case study reports and instructional
recommendations that had been prepared by the fall-semester
preservice teachers to develop integrated language arts lessons
designed to meet their assigned students’ specific reading needs,
thus building on the progress that had been made with students
during the fall semester. For the remainder of the spring semester,
the procedures followed by the preservice teachers were the same
as those used during the fall semester, with the exception that ARI
forms B, C, and SS were used for the posttest measure of reading
comprehension performance.
When developing the individualized, integrated language arts
lessons, the preservice teachers followed a specific lesson plan
format that included both narrative (trade book) and expository
(information) texts, used context clues to teach new vocabulary,
and provided before-, during-, and after-reading components that
used listening, speaking, reading, writing, viewing, visual repre-
sentation, and metacognitive activities to enhance comprehen-
sion of the narrative and expository texts. A copy of a sample
lesson plan is presented in Appendix A. For each of the lesson
plans, the topic of the reading materials was selected by the pre-
service teacher based upon the student’s interests and reading
needs as identified through interest and attitude surveys and ob-
servations by the preservice teacher. Additional types of expository
reading materials were often used in the lessons as well, such as
newspapers, magazine articles, and Internet-text selections. For
example, if the student were interested in learning about Native
Americans, the preservice teacher’s lesson might include the fol-
lowing: viewing and talking about paintings of Native Americans
and nature; listening to the teacher read the trade book Brother
Eagle, Sister Sky; having the student read a short Internet article
about Chief Seattle; asking the student to tell why he believes
Chief Seattle’s speech is important; writing a biopoem describing
Chief Seattle; and illustrating the poem with a drawing of Chief
Seattle.
Also, instructional strategies such as the Directed Reading-
Thinking Activity (DRTA), List-Group-Label, ReQuest, Graphic
Organizers, and Think Alouds were incorporated into the pre-
service teacher’s lesson plans as part of the course requirements.
Each of the preservice teacher’s lesson plans was examined by the
course instructor prior to being implemented, and the course in-
structor observed a portion of each preservice teacher’s instruc-
tional session each week, making notes about the extent to which
the instructional activities appeared to be meeting the student’s
reading needs. The observational notes were used to provide feed-
back to each preservice teacher on a weekly basis. Also, once a
week the course instructor observed a portion of the basal reading
instruction offered in those classrooms in which treatment group
students were enrolled.

COMPARISON GROUP
Students in the comparison group received basal reading in-
struction daily from their classroom teachers, during the 90-minute
reading block, for the duration of the study. Reading materials for
the comparison group included the text selections contained in
the basal reading program. Classrooms that contained comparison
students were observed by the course instructor, who made weekly
classroom observations during the reading block period, to ensure
that classroom teachers followed the teacher’s guide which accom-
panied the basal reader series. The basal reading series focused on
the skills of vocabulary acquisition, word recognition, and compre-
hension. Posttest ARI assessments of students in the comparison
group occurred during the same time frame as that of the treat-
ment group and were conducted by graduate students majoring
in literacy education.

Analysis of Data

For the purposes of this study, data were excluded from an anal-
ysis if any data points were missing or if on-grade-level total com-
prehension scores exceeded 75% (this was the case with two stu-
dents), as 75% is usually considered to be an adequate level of com-
prehension. A multivariate analysis of covariance was carried out
for each of the below-grade-level, on-grade-level, and above-grade-
level comprehension measures for oral narrative, silent narrative,
and silent expository passage categories. There were four depen-
dent variables for each analysis (RIF, PIT, CAR, and EAS types of
comprehension). The initial pretest total comprehension score for
the corresponding passage category was used as the covariate in
each analysis.
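As a rough illustration of this analytic setup, the sketch below fits a multivariate model with the four comprehension scores as dependent variables, group membership as the factor, and the pretest total comprehension score entered as a covariate. The data frame, column names, and the use of statsmodels are assumptions made for illustration; they are not the authors' actual data or software.

```python
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

# Hypothetical posttest scores (percent correct) for one passage category,
# with group membership and the pretest total comprehension covariate.
df = pd.DataFrame({
    "group":   ["treatment"] * 4 + ["comparison"] * 4,
    "pretest": [35, 42, 28, 50, 33, 40, 31, 48],
    "RIF":     [62, 70, 55, 80, 40, 45, 38, 52],
    "PIT":     [50, 58, 44, 66, 30, 36, 28, 41],
    "CAR":     [48, 52, 40, 60, 29, 33, 26, 39],
    "EAS":     [44, 50, 38, 58, 27, 31, 25, 37],
})

# MANCOVA-style model: the four dependent variables are predicted from
# group membership while adjusting for the pretest covariate.
model = MANOVA.from_formula("RIF + PIT + CAR + EAS ~ C(group) + pretest", data=df)
print(model.mv_test())  # Wilks' lambda, Pillai's trace, etc., for each term
```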
The dependent variable scores tended to be negatively
skewed, and the means and standard deviations tended to be pro-
portional. Therefore, the data were analyzed in their original form
and using both a square-root and an arcsine transformation.
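The two transformations mentioned here can be written compactly. The fragment below applies a square-root and an arcsine (angular) transformation to percentage-correct scores; the arcsin(sqrt(p)) form shown is the common variance-stabilizing version for proportion data and is our assumption about the specific variant used.

```python
import numpy as np

# Hypothetical percentage-correct scores (0-100) for one dependent variable.
scores = np.array([0.0, 25.0, 58.3, 75.0, 100.0])
proportions = scores / 100.0

sqrt_scores = np.sqrt(proportions)                 # square-root transformation
arcsine_scores = np.arcsin(np.sqrt(proportions))   # angular (arcsine) transformation

print(sqrt_scores)
print(arcsine_scores)  # values in radians, ranging from 0 to pi/2
```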
In order to control the type I error rates across the nine multi-
variate analyses, a Bonferroni adjustment was used. Thus, only the
multivariate analyses resulting in associated probabilities less than
.05/9 = .0055 were considered statistically significant.
Statistically significant multivariate findings were followed up
with univariate analyses of covariance, using the initial pretest total
comprehension scores for the corresponding passage categories as
the covariates. Homogeneity of regression was examined for each
of the univariate analyses. To further control type I error rates, the
Bonferroni adjustment for the use of four variables was applied
within each analysis by using the .05/4 = .0125 level.
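A minimal sketch of one such follow-up univariate ANCOVA, with the homogeneity-of-regression check (the group-by-pretest interaction) and the Bonferroni-adjusted alpha, might look like the following; the variable names and values are illustrative only.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: one dependent variable (RIF), group membership, and
# the pretest total comprehension covariate.
df = pd.DataFrame({
    "group":   ["treatment"] * 4 + ["comparison"] * 4,
    "pretest": [35, 42, 28, 50, 33, 40, 31, 48],
    "RIF":     [62, 70, 55, 80, 40, 45, 38, 52],
})

# Homogeneity of regression: the group x pretest interaction should be
# non-significant before the adjusted group means are interpreted.
check = smf.ols("RIF ~ C(group) * pretest", data=df).fit()
print(check.pvalues["C(group)[T.treatment]:pretest"])

# ANCOVA proper: group effect on RIF adjusted for the pretest covariate,
# judged against the Bonferroni-adjusted alpha of .05 / 4 = .0125.
ancova = smf.ols("RIF ~ C(group) + pretest", data=df).fit()
alpha = 0.05 / 4
print(ancova.pvalues["C(group)[T.treatment]"], alpha)
```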
Additionally, univariate analyses of covariance were carried
out for the total comprehension scores for below-grade, on-grade,
and above-grade levels for each of the three passage categories. The
initial pretest total comprehension scores for the corresponding
passage categories were used as covariates.

Results

Because results of the analyses were virtually identical for the orig-
inal and transformed scores, only the results based on the analyses
of the original scores are reported since these scores are in a more
natural metric. The results of the multivariate analyses of covari-
ance for the original scores are presented in Table 1. Statistically
significant ( p < .05/9 = .0055) multivariate results were found for
on-grade-level and above-grade-level scores for the silent narrative
and silent expository passage scores and for the above-grade-level
scores for the oral narrative passage. No statistically significant re-
sults were found for the below-grade-level passage scores for any
of the three types of passages. The multivariate eta-squared values
ranged from .345 to .521, indicating large effects across all analyses
(Cohen, 1988).

TABLE 1 Summary of Results of Multivariate Analyses of Covariance for the
Original Scores

Reading Category     Grade Level      F      df       p       Eta-squared
Silent Narrative     Below          3.78    4/52
                     On             6.71    4/51    <.001     .345
                     Above          8.06    4/46    <.001     .412
Oral Narrative       Below          1.33    4/40
                     On             3.63    4/40
                     Above          6.71    4/39    <.001     .408
Silent Expository    Below          2.38    4/55
                     On            10.93    4/55    <.001     .443
                     Above         14.95    4/55    <.001     .521

TABLE 2 Summary of Results of Univariate ANCOVAs Comparing Treatment
and Control Groups Across Reading Category: Original Scores

                     Silent Narrative        Oral Narrative          Silent Expository
Level/Type           F      p      Eta2      F      p      Eta2      F      p      Eta2
Below Grade Level
  Total              0.91                    3.94                    6.83   <.01   .105
On Grade Level
  RIF               12.96  <.001   .232                             19.38  <.001   .250
  PIT                1.78                                           44.26  <.001   .433
  CAR                0.03                                            6.33   <.01   .098
  EAS                0.20                                           10.92   <.01   .158
  Total              3.78                   17.86   <.001   .249    28.78  <.001   .332
Above Grade Level
  RIF               20.11  <.001   .324     27.94   <.001   .363    43.17  <.001   .427
  PIT                8.09   <.01   .162     24.91   <.001   .337     6.47   <.01   .100
  CAR                0.38                    0.64                   15.46  <.001   .210
  EAS                1.17                    6.41    <.01    .116    25.96  <.001   .309
  Total             11.38   <.01   .213     25.94   <.001   .346    39.55  <.001   .405

The results of the follow-up univariate analyses for type-of-
comprehension question scores and total comprehension scores
are presented in Table 2. The assumption of homogeneity of re-
gression was found to hold for all analyses. The reported probabili-
ties are one-tailed, reflecting the a priori prediction of differences
favoring the treatment group. For the silent expository passage
scores, statistically significant differences were found for each of
the four dependent variables (i.e., RIF, PIT, CAR, EAS) for both
the on-grade-level and above-grade-level scores. For the silent nar-
rative passage category, RIF score differences were found for both
the on-grade-level and above-grade-level scores, and PIT score dif-
ferences were found for above-grade-level scores. RIF, PIT, and
EAS score differences were found for above-grade-level scores for
the oral narrative passage category. All statistically significant dif-
ferences favored the treatment group (Tables 3, 4, and 5).

TABLE 3 Summary Data: Silent Narrative

                     Pre                         Post                        Adj. Post (a)
               Exp.          Comp.         Exp.          Comp.           Exp.      Comp.
Level/Type     M      SD     M      SD     M      SD     M      SD       M         M

Below Grade Level (b)
  RIF        33.96  37.79  44.68  29.92  62.96  31.20  41.16  33.43    65.33     39.10
  PIT        37.59  34.62  35.71  28.63  51.26  32.25  41.13  31.60    52.35     40.18
  CAR        34.26  39.66  45.16  39.31  51.56  40.36  47.06  36.12    54.51     44.94
  EAS        32.41  36.57  53.23  48.07  44.44  34.20  51.61  45.61    45.32     50.85
  Total      35.85  27.83  44.71  21.52  52.93  22.55  44.61  26.46    54.92     42.88
On Grade Level (c)
  RIF        34.88  31.31  31.13  32.25  58.56  28.18  24.97  37.88    58.64     24.89
  PIT        29.67  31.50  25.30  28.75  38.96  31.08  12.20  26.19    39.01     12.16
  CAR        15.74  26.08  24.17  34.42  29.63  33.99  25.83  36.84    29.69     25.78
  EAS        23.15  32.20  30.00  45.20  41.67  42.18  26.67  35.92    41.73     26.61
  Total      27.26  23.08  27.50  24.49  44.00  20.67  19.83  27.38    44.06     19.78
Above Grade Level (d)
  RIF        45.00  41.02  27.22  31.21  55.40  32.01  10.19  29.63    52.99     12.42
  PIT        26.32  31.31  21.59  31.57  35.32  24.10   5.85  16.60    34.09      6.99
  CAR         9.00  22.65  22.22  34.90  17.00  29.51   9.26  27.86    16.32      9.89
  EAS        35.00  43.90  23.15  38.56  33.00  38.00   7.41  26.69    30.76      9.49
  Total      31.04  25.91  23.00  26.87  36.88  20.10   7.89  20.82    35.25      9.40

(a) Total pre-scores used as covariate.
(b) n(exp) = 27, n(comp) = 31.
(c) n(exp) = 27, n(comp) = 30.
(d) n(exp) = 25, n(comp) = 27.

For total comprehension scores, statistically significant results
were found for all analyses except three: the below-grade-level
scores on the silent narrative and oral narrative passage categories
and the on-grade-level scores on the silent narrative passage cate-
gory. Again, all statistically significant results favored the treat-
ment group (Tables 3, 4, and 5). The assumption of homogene-
ity of regression did not hold for the total comprehension scores
for the above-grade-level silent expository reading. Therefore, a
groups-by-trials (treatment, comparison by pretest, posttest) anal-
ysis of variance was carried out for these scores. A statistically signifi-
cant ( p < .01) groups-by-trials interaction was found. Examination
of the cell means indicated that the mean comprehension scores
for the treatment group increased from pre- to post-assessments,
while the mean comprehension scores of the comparison group
decreased slightly. These results support the viability of the indi-
vidualized, integrated language arts instructional approach used
with the treatment group.

TABLE 4 Summary Data: Oral Narrative

                     Pre                         Post                        Adj. Post (a)
               Exp.          Comp.         Exp.          Comp.           Exp.      Comp.
Level/Type     M      SD     M      SD     M      SD     M      SD       M         M

Below Grade Level (b)
  RIF        63.83  23.40  68.91  29.87  78.61  31.46  73.57  29.69    79.87     72.31
  PIT        44.57  30.54  52.17  35.29  64.65  26.03  50.65  28.69    64.24     51.07
  CAR        32.61  33.23  48.91  36.52  36.96  22.45  42.39  32.36    38.30     41.05
  EAS        55.43  39.14  57.61  42.26  63.04  37.59  63.04  43.22    62.66     63.43
  Total      48.70  23.43  56.61  24.49  61.35  14.44  56.17  24.94    61.69     55.83
On Grade Level (c)
  RIF        51.39  27.12  59.78  29.34  72.83  28.68  35.52  40.15    72.76     35.59
  PIT        51.47  31.84  39.93  29.62  51.48  26.45  38.83  36.64    51.38     39.93
  CAR        23.57  30.63  26.09  28.68  33.70  30.72  31.52  38.60    33.49     31.73
  EAS        50.00  33.71  45.65  39.64  41.30  46.84  46.74  45.42    41.27     46.97
  Total      44.65  17.18  44.09  18.50  52.61  19.39  37.26  32.76    52.48     37.39
Above Grade Level (d)
  RIF        49.26  37.12  56.09  40.37  60.87  33.95  21.59  38.03    62.64     19.74
  PIT        34.78  33.12  44.73  45.27  42.43  32.84  18.18  32.92    43.60     19.97
  CAR        20.65  27.85  35.23  43.41  18.48  27.40  15.91  28.40    19.57     14.77
  EAS        30.43  36.12  40.91  45.35  25.00  36.15  15.91  35.81    26.08     14.78
  Total      37.83  22.40  43.91  34.94  40.13  22.75  17.68  31.47    41.38     16.37

(a) Total pre-scores used as covariate.
(b) n(exp) = n(comp) = 23.
(c) n(exp) = n(comp) = 23.
(d) n(exp) = 23, n(comp) = 22.

Discussion

This study examined the effects of using individualized, integrated
language arts as a reading approach, offered once a week by pre-
service teachers, on the reading comprehension scores of strug-
gling readers. The results strongly favored students who partici-
pated in the individualized, integrated language arts lessons, with
statistically significant ( p < .001) multivariate effects found for on-
grade-level comparisons on silent narrative and silent expository
passages and for above-grade-level comparisons on silent narrative,
oral narrative, and silent expository passages. Of the five statisti-
cally significant univariate effects related to on-grade-level compar-
isons, four were on the silent expository passage category. Twelve
of the 15 above-grade-level comparisons of performance on the
three types of passages were statistically significant, and five of
these comparisons of comprehension performance were on the
silent expository passage category. Differences between the two
groups for below-grade-level comprehension appeared to be minimal,
since the only statistically significant univariate effect was for total
comprehension scores.

TABLE 5 Summary Data: Silent Expository

                     Pre                         Post                        Adj. Post (a)
               Exp.          Comp.         Exp.          Comp.           Exp.      Comp.
Level/Type     M      SD     M      SD     M      SD     M      SD       M         M

Below Grade Level (b)
  RIF        39.70  32.70  39.79  28.19  52.52  25.29  37.50  31.95    52.65     37.40
  PIT        26.85  35.30  39.71  42.69  50.33  32.13  39.24  28.70    51.31     38.46
  CAR        22.22  28.02  30.15  37.33  48.15  39.18  27.21  36.60    50.18     25.59
  EAS        31.48  37.08  32.35  45.86  29.63  34.69  35.29  43.57    30.44     34.65
  Total      32.15  20.64  37.71  25.60  46.67  19.22  36.71  20.99    49.41     36.12
On Grade Level (c)
  RIF        28.11  19.82  24.09  25.28  41.07  38.03   7.82  20.15    41.09      7.81
  PIT        14.19  32.19  42.65  44.80  36.56  20.18   4.91  16.74    36.55      4.92
  CAR        16.67  26.85  13.97  27.66  24.07  37.00   5.88  17.47    24.08      5.88
  EAS        48.15  40.39  14.71  33.78  38.89  37.55  10.29  29.59    38.89     10.20
  Total      25.78  16.39  25.88  23.17  35.15  18.62   8.32  19.70    35.15      8.32
Above Grade Level (d)
  RIF        31.26  33.55  18.15  30.38  27.22  23.38   0.97   5.66    26.37      1.65
  PIT        16.37  23.89  23.82  33.22  21.89  28.59   4.91  20.35    21.09      5.54
  CAR         8.33  23.00  11.03  23.19  28.70  35.15   2.21  12.86    28.04      2.73
  EAS        24.07  37.65  22.06  41.18  40.74  34.77   4.41  18.94    39.90      5.08
  Total      22.48  20.88  17.79  22.89  29.56  20.60   2.97  12.45    28.83      3.55

(a) Total pre-scores used as covariate.
(b) n(exp) = 27, n(comp) = 34.
(c) n(exp) = 27, n(comp) = 34.
(d) n(exp) = 27, n(comp) = 34.

The findings related to the expository passage category were
particularly promising because, in general, students have more
difficulty comprehending expository texts (Calfee & Curley, 1984;
McGee & Richgels, 1985; Meyer & Rice, 1984; Pappas, 1993;
Piccolo, 1987; Vacca & Vacca, 1996). This outcome may have oc-
curred because each lesson plan included an expository text re-
lated to the theme of the trade book, with associated language arts
activities that encouraged students to derive meaning through a
range of connected literacy activities; also, the one-on-one instruc-
tional setting provided ample opportunities for language interac-
tions between the preservice teacher and her student. This kind
of instructional approach contrasts with traditional instructional
approaches for teaching science, social studies, or mathematics,
which often lack the support of language arts activities to facili-
tate the learning of subject-matter content and are usually taught
in group settings rather than through one-on-one interactions.
The role of individualized, integrated language arts instruction in
students’ comprehension of expository text deserves further inves-
tigation.
While the results of this study strongly suggest that individu-
alized, integrated language arts instruction as a reading approach
may be effective in improving the comprehension performance
of struggling readers, the findings need to be considered in light
of possible limitations of the study. The students’ attitudes toward
reading could have influenced their performance. Since all the stu-
dents in this study needed help with reading, it is possible that some
students’ attitudes about reading were negative, causing them to
put forth less effort during the assessment procedures. Also, it is
important to keep in mind that we sampled reading behavior on
a specific day and time, and Jongsma and Jongsma (1981) remind
us that, “On another set of passages, given on another day, you
might get different results” (p. 704). Another limitation is that the
inclusion of zero scores on comprehension questions, obtained by
students who attempted to read passages that were at their frus-
tration level, may have influenced the results, although both the
treatment and comparison groups contained a few students whose
scores were zero on various types of comprehension questions.
Since the design of our study included below-grade-, on-grade-,
and above-grade-level assessments of struggling readers, there were
some passages that were at the frustration level for some students.
Finally, it was not possible to control for passage familiarity. Dur-
ing the administration of the ARI, it was observed that some of
the students in both the treatment and comparison groups were
familiar with the content of two of the ARI passages because these
readings were based on the storylines of books that are popular
among children and young adults (i.e., The Incredible Journey and
The Outsiders). Students’ prior knowledge of the content of these
books may have positively impacted their comprehension scores
on these passages.
Although it is clear that the use of individualized, integrated
language arts instruction as a reading approach had a positive
impact on the comprehension of treatment-group students, we ac-
knowledge that this approach may not have been the sole factor
influencing comprehension performance. A large part of the suc-
cess of this study may have come from the meaningful, one-on-one
interactions between the student participant and her preservice
teacher. Meaning-making is a process that is facilitated by such so-
cial interaction (Vygotsky, 1962). Some educators may complain
that it is too daunting a task to expect teachers to individualize in-
struction and to give one-on-one attention to students, yet the lit-
erature supports the need for individualized instruction for those
who struggle with school learning (Blair-Larsen & Williams, 1999;
VanSciver, 2005). The findings of this study imply, however, that
through the use of certain methodologies it is possible for teach-
ers to individualize lessons and facilitate meaningful social inter-
actions during reading instruction. For example, after adminis-
tering and interpreting an informal reading inventory and inter-
viewing students to determine their interests, teachers can group
students into small learning communities based on their reading
instructional needs and topical interests. Students in these collab-
orative teams can work on lessons designed to meet their reading
needs as they engage in teacher-recommended activities that in-
volve listening, speaking, reading, writing, and critical thinking
experiences at their current levels of performance. These kinds
of meaningful group interactions allow each student to make con-
tributions based on her strengths, thus performing as a learning
community in which much is learned from one another. While
students are working with their collaborative teams, teachers can
pull individual students away from the team for 10–15 minutes
to give one-on-one attention to a specific literacy need that was
identified by the student’s performance on the informal reading
inventory.
Another important implication of this study is that preservice
teachers can be successfully trained to administer and use informal
reading inventories to diagnose reading needs and to construct lit-
eracy lessons centered around the literacy needs of students. We
believe that with the population of poor readers continuing to in-
crease (Gonzalez, Brusca-Vega, & Yawkey, 1997; Graves, Van den
Broek, & Taylor, 1996), institutions of higher learning have a re-
sponsibility to train future literacy educators to effectively adminis-
ter and interpret informal assessments to meet the individual read-
ing needs of students in their classrooms and to equip preservice
teachers with the skills needed to create and provide alternatives
to traditional reading instruction for those students who have not
experienced much success with traditional approaches.
The findings of this study contribute in three ways to the lit-
erature related to the use of individualized, integrated language
arts instruction as a reading approach and the use of informal as-
sessments. First, the strongest implication of the findings of this
study is that individualized, integrated language arts instruction as
a reading approach designed to meet the student’s reading needs
appears to enhance the reading comprehension of struggling read-
ers. The findings underscore the benefits of reading instruction
that encourages students to refine their abilities to think critically,
talk, listen, read, write, and visually represent ideas from narrative
and expository texts, and the results support the use of differen-
tiated instruction for students based on assessment information.
Since our sample was comprised mostly of African American stu-
dents from low socioeconomic neighborhoods, the findings make
a significant contribution to the literature related to effective in-
structional approaches for students with similar backgrounds.
Second, the importance of designing reading instruction for
struggling readers based on the results of diagnostic, informal as-
sessments was clearly indicated by our findings. For example, if a
student scored low on a particular type-of-comprehension ques-
tion (e.g., RIF, PIT, CAR, or EAS) across passages, then the pre-
service teacher’s lesson plans would include activities and teacher-
student interactions that focused on strategies for improving the
student’s understanding of how to use information from the text
in combination with his or her own critical thinking processes to
answer the various types of questions. Although Shell and Hanna
(1981) advised against interpreting low scores on particular types
of comprehension questions (measured by IRIs) as indicating a
deficit in a particular area, because they believe there are too few
questions of a particular type to be a reliable indicator of a weak-
ness, we found that using this information to plan instruction in
the context of a holistic approach benefits students’ comprehen-
sion skills. It is important to clarify that in our study, the entire
lesson for each instructional session did not focus on a particular
weakness but that a portion of each lesson consistently addressed
identified weaknesses. More research is needed to determine the
extent to which holistic instruction that includes attention to one
or more type(s) of comprehension questions, as identified through
informal assessment, influences comprehension performance of
struggling readers.
Third, the findings of the study provide evidence that assess-
ment data, obtained from the ARI or from some other informal
reading inventory, may be used both to inform instruction and to
evaluate program and group performance. This may be the first
study that has used diagnostic data for the purposes of individ-
ual assessment, instruction, and analyses of group performance,
and while there may be some researchers who question using di-
agnostic data to evaluate group performance, we believe that the
outcomes of this study demonstrate the value of using diagnostic
data to examine the performance of individual students as well as
groups of students; the results of group analyses based on diag-
nostic data may be used to identify the salient traits exhibited by
groups of students as well as to identify strengths and weaknesses
of instructional approaches designed to address the specific liter-
acy needs of students. We are hopeful that the results of this study
will prompt additional study of the effectiveness of individualized,
integrated language arts instruction as a reading approach on the
comprehension performance of elementary students in other ge-
ographical locations and settings.

References

Adams, M. (2001). Theoretical approaches to reading instruction. In E. Cushman,
E. Kintgen, B. Kroll, & M. Rose (Eds.), Literacy: A critical sourcebook (pp. 309–
315). New York: Bedford/St. Martin’s.
Anderson, W. (1977a). Commercial informal reading inventories: A comparative
review. Reading World, 17, 99–104.
Anderson, W. (1977b). Informal reading inventories commercial or conven-
tional? Reading World, 17, 64–68.
Blair-Larsen, S., & Williams, K. (1999). The balanced reading program: Helping all
students achieve success. Newark, DE: International Reading Association.
Bottomley, D., Truscott, D., Marinak, B., Henk, W., & Melnick, S. (1999).
An affective comparison of whole language, literature-based, and basal
reader literacy instruction. Reading Research and Instruction, 38(2), 115–
129.
Caldwell, J. (1985). A new look at the old informal reading inventory. The Reading
Teacher, 39, 168–193.
Calfee, R. C., & Curley, R. (1984). Structures of prose in content areas. In J. Flood
(Ed.), Understanding reading comprehension: Cognition, language, and the structure
of prose (pp. 161–180). Newark, DE: International Reading Association.
Clary, L. M. (1992). Integrated language arts for adolescents with learning dis-
abilities. The Clearing House, 65(5), 315–318.
Cohen, J. (1988). Statistical power analysis for the behavioral sciences. Hillsdale, NJ:
Lawrence Erlbaum.
Eldredge, J. L., & Butterfield, D. (1986). Alternatives to traditional reading in-
struction. The Reading Teacher, 40, 32–37.
Fletcher, J. M., & Lyon, G. R. (1998). Reading: A research-based approach. In W.
M. Evers (Ed.), What’s gone wrong in America’s classrooms? (pp. 49–90). Stanford,
CA: Hoover Institution Press.
Gerke, R. (1980). Critique of informal reading inventories: Can a valid instruc-
tional level be obtained? Journal of Reading Behavior, 22(2), 155–158.
Gonzalez, V., Brusca-Vega, R., & Yawkey, T. (1997). Assessment and instruction of cul-
turally and linguistically diverse students with or at-risk of learning problems. Boston:
Allyn & Bacon.
Goodman, K. S. (1984). Unity in reading. In A. C. Purves & O. S. Niles (Eds.),
Becoming readers in a complex society (pp. 79–114). Chicago: University of Chicago
Press.
Goodman, K. S. (1989). Whole-language research: Foundation and development.
Elementary School Journal, 90(2), 207–221.
Graves, M. F., Van den Broek, P., & Taylor, B. M. (Eds.). (1996). The first R: Every
child’s right to read. New York: Teachers College Press.
Harris, T. L., & Hodges, R. E. (Eds.). (1995). The literacy dictionary: The vocabulary
of reading and writing. Newark, DE: International Reading Association.
Helgren-Lempesis, V., & Mangrum, C. (1986). An analysis of alternate-form reli-
ability of three commercially-prepared informal reading inventories. Reading
Research Quarterly, 21(2), 209–215.
Henk, W. (1987). Reading assessments of the future: Toward precision diagnosis.
The Reading Teacher, 40, 860–870.
Jongsma, K. S., & Jongsma, E. A. (1981). Test review: Commercial informal read-
ing inventories. The Reading Teacher, 34, 697–705.
McGee, L. M., & Richgels, D. J. (1985). Teaching expository text to elementary
students. The Reading Teacher, 38, 739–748.
McKenna, M., & Kear, D. (1990). Measuring attitude toward reading: A new tool
for teachers. The Reading Teacher, 43, 626–639.

Meyer, B. J. F., & Rice, G. E. (1984). The structure of text. In P. D. Pearson (Ed.),
Handbook of reading research. New York: Longman.
Micro Power and Light. (1995a). Readability calculations. Dallas, TX: Author.
Micro Power and Light. (1995b). Vocabulary assessor. Dallas, TX: Author.
National Institute of Child Health and Human Development. (2000). Report of
the national reading panel: Teaching children to read (NIH Pub. No. 00-4754).
Washington, DC: U.S. Government Printing Office.
Pappas, C. C. (1993). Is narrative “primary”? Some insights from kindergartners’
pretend reading of stories and information books. Journal of Reading Behavior,
25, 97–129.
Piccolo, J. A. (1987). Expository text structure: Teaching and learning strategies.
The Reading Teacher, 40, 838–847.
Shell, L. M., & Hanna, G. S. (1981). Can informal reading inventories reveal
strengths and weaknesses in comprehension subskills? The Reading Teacher, 35,
263–268.
Silvaroli, N. (1976). Classroom reading inventory (3rd ed.). Dubuque, Iowa: William
C. Brown.
Snow, C., Burns, M., & Griffin, P. (1998). Preventing reading difficulties in young
children. Washington, DC: National Academy Press.
Stahl, S. A., McKenna, M. C., & Pagnucco, J. (1994). The effects of whole language
instruction: An update and a reappraisal. Educational Psychologist, 29(4), 175–
185.
Stahl, S. A., & Miller, P. A. (1989). Whole Language and language experience
approaches for beginning reading: Quantitative research synthesis. Review of
Educational Research, 59, 87–116.
Stephens, D. (1990). Research on whole language: Support for a new curriculum.
Katonah, New York: Richard C. Owen.
Sucher, F., & Allred, R. (1973). Sucher-Allred reading placement inventory. Oklahoma
City: Economy.
Thames, D. G., & Reeves, C. K. (1994). Poor readers’ attitudes: Effects of using
interests and trade books in an integrated language arts approach. Reading
Research and Instruction, 33(4), 293–308.
Vacca, R. T., & Vacca, J. L. (1996). Content area reading (5th ed.). New York: Long-
man.
VanSciver, J. H. (2005). Motherhood, apple pie, and differentiated instruction.
Phi Delta Kappan, 86(7), 534–535.
Vygotsky, L. S. (1962). Thought and language (E. Hanfmann & G. Vakar, Eds. &
Trans.). Cambridge, MA: MIT Press.
Walmsley, S., & Walp, T. (1990). Toward an integrated language arts curriculum
in elementary school: Philosophy, practice and implications. The Elementary
School Journal, 90(3), 252–259.
Weaver, C. (1990). Understanding whole language: From principles to practice.
Portsmouth, NH: Heinemann.
Woods, M. L., & Moe, A. J. (1977). Analytical reading inventory. Columbus, OH:
Merrill.
Woods, M. L., & Moe, A. J. (1999). Analytical reading inventory (6th ed.). Columbus,
OH: Merrill.
Appendix A

Sample Lesson Plan, CIR 412L

PRESERVICE TEACHER’S NAME

STUDENT’S NAME

Instructional Time: 8–9:30 a.m.     Grade: 2

Storybook

Numeroff, L. (1998). If You Give a Pig a Pancake. New York: Laura Geringer Book.

Information Text

Burns, D. (1990). Sugaring Season: Making Maple Sugar. Minneapolis: Carolrhoda.

Children’s Dictionary

Thorndike, E. L., & Barnhart, C. L. (1997). Thorndike Barnhart Student Dictionary. Glenview, IL: Scott Foresman Addison Wesley.
Student’s Reading Needs

The student needs to learn how to use compound words to decode grade-appropriate
words. She needs to retell stories in her own words, conveying the beginning,
middle, and end of the main events in the story in proper order. She also has
difficulty answering literal and inferential questions about what she has read.
Student’s Reading Strengths

The student reads with some voice expression. She enjoys reading
aloud, using the popcorn strategy. She is positive about attending
the reading sessions with the tutor.
Objective(s) for This Lesson

• The student will identify and circle words in sentences that provide context clues to the meanings of the underlined vocabulary words.
• The student will make predictions about what will happen in the story based on the title and pictures on the cover before reading and prove or revise her predictions based on the text and picture cues during reading.

• The student will identify smaller words in compound words to decode new words.

• The student will retell the main events in the story in their proper sequence, using a circular graphic organizer and picture cues.

Vocabulary

Word         Definition
Homesick     Missing home and family while away
Suitcase     A flat rectangular traveling bag
Mailbox      A box that holds letters and envelopes
Wallpaper    Paper with decorations for pasting on walls

Presenting Vocabulary in Sentence Contexts

(Underlined words omitted on sentence strips)

1. The pig was homesick and wanted to go to the farm to see her
family.
2. She packed her suitcase before she traveled back to the farm.
3. You’ll have to give her some envelopes and stamps and take her
to the mailbox.
4. She’ll ask for wallpaper and glue to decorate her walls at home.

Integrated Language Arts Reading Lesson

Note: All of the following language arts components are integrated throughout
the lesson: listening, speaking, reading, writing, viewing, visual representation,
and metacognitive/cognitive thinking. Parenthetical references at the end of
each subsection identify language arts components integrated in the preceding
section.
Lesson Description

Before Reading the Storybook

• Preteach the Vocabulary Words

In advance, write the words home, sick, wall, paper, mail, box, suit, and case
on eight 3” by 5” note cards. Mix up the cards, place them in front of the
student, and read the words aloud, pointing to and pronouncing each one
correctly. Ask,
“Do you notice anything special about these words?” Help the
student discover that the words can be combined to form com-
pound words: homesick, wallpaper, mailbox, and suitcase. Explain
that a compound word is two words combined to make one
word. Next, present vocabulary in context by showing the stu-
dent the four sentences on sentence strips. Point out that each
sentence is missing a word. Say, “One of the four compound
words completes this sentence best. Can you guess which one?”
Have the student write the missing word in the blank. Ask, “How
did you know which word fit best here?” Help the student find
and circle the clue words in the sentences that hint at the mean-
ing of the missing word. For example, in the following sentence
the words envelopes and stamps provide clues to the meaning
of the vocabulary word mailbox: “You’ll have to give her some
envelopes and stamps and take her to the mailbox.” Guide the
student in finding and circling the clue words and filling in the
blank in the other sentences in the same manner.
(listening, speaking, reading, viewing, writing, thinking)

• Make a Prediction

Introduce the story. Read the title to the student and show her
the picture on the cover of the book. Ask: “What do you think
happens when you give a pig a pancake?” “Do you think this is
what will happen in the book?” Look at the pictures together.
Ask the student, “Do you want to add anything to or change
anything in your prediction based on the pictures?” Tell the
student to write her prediction on the white board. Say, “Let’s
read and see what happens.”
• Introducing the Concept of Circular Stories

Introduce the concept of circular stories by drawing half of a circle on a sheet
of paper. Pass the paper to the student to draw the rest of the circle. When the
student has completed the circle, ask, “How do you know where the circle will
end?” Explain that today’s storybook is a circular story, so it follows the path
of a circle and ends where it begins.
(listening, speaking, reading, viewing, writing, visual represen-
tation, thinking)

During the Reading of the Storybook

Read a few pages of the story and stop at a predetermined stopping point that
invites prediction. Ask the student to predict what will happen next. Continue
reading the story aloud, taking turns with the student. Pause at other
predetermined points in the story and ask, “What do you think will happen next?”
(listening, speaking, reading, thinking)

After Reading the Storybook

At the end of the story, discuss how each event caused another event to take
place, emphasizing the cause/effect pattern in the book. Also discuss how the
story ended up in the same place it began. Remind the student that this is what
a circular story does.

• Retelling the Story

Tell the student that she is going to retell the story, using pictures
as cues. Spread out a set of cards with pictures from the story.
Ask the student, “Can you tell me which of these events in the
story came first?” Have the student sequence the pictures of the
events in the appropriate order. Encourage “look backs” if the
student has difficulty remembering the order of events. Tell the
student to sequence the cards by laying them out in a circle on
the table (forming a graphic organizer). Then have the student
retell the story in her own words, using the cards in the circle to
prompt her retelling of the events. Write the student’s retelling
on paper as she dictates it. Tell the student to read her retelling
aloud and give her positive feedback regarding the strengths of her retelling.
Give her some tips for future retellings.

• Writing a Creative Story

Next, tell the student she will write a creative circular story that begins,
“If you give a ____ a ____.” Ask, “What animal do you want to be the main
character in your story?” Use a circular graphic organizer on mapping paper to
draft the student’s story. Ask, “What food item will you give to your animal at
the beginning of the story?” Write the story as the student dictates it. Continue
in the same manner until the student dictates several cause/effect elements in
her creative story. Ask the student, “How can you get back to the beginning
now?” Discuss how she might move her story in a circular direction back to the
beginning. Have the student read her creative story aloud. Say, “Good job. You
created your own circular story.”
(listening, speaking, reading, writing, visual representation,
thinking, viewing)

Before Reading the Information Text

Tell the student that you will read information to her about making
maple syrup. Say, “Because you told me in the interview that your
favorite food is pancakes covered with lots of maple syrup, I chose
this information book with pictures for us to explore, so we can
learn more about maple syrup.”

During Reading the Information Text

Tell the student to listen as you read aloud the information on the process of
making maple syrup. Read the information to the student. Show the pictures and
stop at predetermined stopping points to talk about the process with the student.

After Reading the Information Text

Say, “We are going to retell how maple syrup is made. What do you
remember about how it is made or the steps in the process?”
Spread cards with picture cues out in front of the student. Say, “I
have prepared these pictures to help you retell. Put the pictures in order. Use
‘look backs’ if needed.” Guide the student through the sequencing process. Once
the cards are in order, let the student use the photos to retell the process.

Conclusion

Tell the student, “Today you accomplished many things. You learned about using
small words to decode compound words, using context clues to figure out the
meanings of words, and using circular story maps and pictures to retell stories
and information that you learned about maple syrup.”
