
Running head: IMPACT ON STUDENT LEARNING: ASSESSMENT PROJECT

Impact on Student Learning: Assessment Project


Jerry S. Pollatos
Florida SouthWestern State College
MAE4940 Internship in Middle/Secondary Education with Math Emphasis
Prof. Mary Robertson
December 7th, 2014


Abstract
The purpose of this assignment is to allow the teacher candidate to create an instructional unit on a chapter or unit chosen by the teacher candidate and his or her mentor teacher. During the instructional unit, the teacher candidate creates a pretest and a posttest in order to analyze the results and modify lesson plans and pacing. My mentor teacher and I decided to base this project on the unit of the Algebra 1 textbook covering systems of linear equations and inequalities. Both assessments were created and administered by me, in collaboration with my mentor teacher, Ms. Costello. I assessed both of my 8th grade classes, periods 3 and 5, and the results are discussed in depth in the following report.


Impact on Student Learning: Assessment Project


Introduction
Classroom assessment is the process of collecting, synthesizing, and interpreting information to aid in classroom decision making. Assessment helps educators determine and modify classroom management, instruction, student learning, and planning. Teachers use various methods of assessment all day long, from deciding whom to call on to answer a question to seating students in a way that boosts the effectiveness of the learning environment. This project focuses on using pretest and posttest data to show the impact on students' learning across an entire unit.
During the fall 2014 semester, I was placed at L.A. Ainger Middle School as a final, full-takeover intern. My mentor teacher, Ms. Costello, is very experienced and has taught gifted 6th, 7th, and 8th grade math classes for many years. Ms. Costello focuses on cooperative learning and reinforces it through active investigations of mathematical concepts and essential questioning techniques, with minimal direct instruction. Independence, success, and the ability to use math every day are the goals she has set for all of her students.
My mentor teacher and I agreed to use the unit on Systems of Equations and Inequalities in our 8th grade Algebra 1 Honors class, as those students would be the least affected by the loss of two days of instruction to administering the pre- and posttest. This decision was based mostly on the shift to the Florida Common Core State Standards: the 7th grade class needs to learn new standards that it did not meet last year, and the 6th grade class has a great deal of new material added to its curriculum that it never needed to master at this level prior to the switch.


The purpose of this project is to select a unit that comprises a majority of my student teaching time, use pre- and post-assessments within it, and use the data collected from them not only to drive my instruction but also to assess my own effectiveness as an educator objectively, in order to find areas to develop or improve.
For this project, I devoted four weeks, or 20 days, of instruction to the unit contained within chapter 6 of our Pearson Algebra 1 Honors textbook, Solving Systems of Linear Equations and Inequalities. The unit met all of the Florida Common Core State Standards set forth by the district in the curriculum MAP on the cPalms website. For more details, refer to Appendix A: Standards & Instructional Map.
Demographics
The classroom dynamic is one of openness and inquiry. In this project, I am assessing Ms. Costello's 8th grade classes, nearly all of whom have had her as a math teacher since 6th grade. The environment in these two classrooms, I believe, is the catalyst of my instruction. Teaching a TAG (Talented and Gifted) class is exciting and fun every day because the students really care about their learning and are comfortable doing so, surrounded by like-minded friends. All of my students love math and continually try to practice and improve; in my classroom, the most popular students are typically the ones with the highest grades. Because of Ms. Costello's self-discovery and investigation teaching model in 6th and 7th grade, I often find students reading ahead in the book to be more prepared. This is almost unheard of in middle school, but it is a great skill to have prior to high school and college.
In terms of gender, my classrooms combined are fairly even. One class has 11 boys and 9 girls, while the other has 8 boys and 8 girls. In my opinion, gender has nothing to do with a student's ability to learn or retain information, and I have never understood why so many people compare students in this way. In the spirit of equality, I will not be analyzing test score data in terms of gender, as the results would not be reliable and such a comparison is the consequence of an ignorant 20th century mindset.
The ESE population is immense; however, it consists only of gifted, exceptionally gifted, and ADHD students, a different side of the ESE spectrum than most people normally think about. Also, across both periods, I have no ELL students. While my ESE population is a little unorthodox, it provides its own challenges. I have almost no true accommodations for my students set forth by the school's ESE committee besides extended time, preferred seating due to eyeglasses, and allowing students to stand up and stretch periodically, none of which affect my teaching, as I take these factors into account with or without an IEP. The only difficulty lies in the small quirks of my brilliant and brainy students. For instance, some students may have mild panic attacks when they have trouble understanding concepts, which can sometimes hinder my instruction due to outbursts. Also, students occasionally request more work in order to practice; this, however, affects my planning more than my instruction, as I need to include extension activities in my plans.
At the beginning of the year, my mentor teacher and I reviewed all of our classes' FCAT scores in order to help map our instruction throughout the year and identify students who would need to be observed and instructed more carefully. All of our students scored within the level 4 and 5 bracket in mathematics, meaning that they have shown mastery of all sections assessed. This means that the students are very advanced among their peers and that, as an educator, Ms. Costello is very effective.


Historically in 8th grade, Ms. Costello breaks her students into two groups, and this year was no exception. In order to embrace her self-discovery approach to learning, she splits the class into an Independent Study (I.S.) group and a Direct Instructional (D.I.) group. The I.S. students work at their own pace, independently or within a small group. Their only in-class assignment is note taking, so that by the end of the year they have all of the textbook's information in their notebooks, ultimately creating their own textbook. The Direct Instructional group, as its name implies, is the group of students whom I instruct on a daily basis. It is made up mostly of type B personality students who struggle with note taking and organization; I believe that this in no way hinders their ability to learn. These two groups also provide an interesting sample set, as the data I gathered from them effectively form an experimental group (the D.I. group) and a control group (the I.S. group).
Pretest
As previously stated, this assessment project is aimed at the unit of the textbook covering Systems of Linear Equations and Inequalities. Systems, in my opinion, are the backbone of all algebraic thinking, reasoning, and problem solving; they provide a framework for quickly and effectively solving mathematical problems using equations.
I decided to make my pretest simple: four questions, each showing mastery of at least one
of the concepts learned in the unit (refer to Appendix B: Pretest Sample). The first question of
the test simply requires students to find the solution using the substitution method and is worth
three points; one point for using the correct method and one for each of the results, x and y. The
second question is much like the previous question; however, it requires students to use the
elimination method rather than substitution. One point was awarded for using the correct method, while the other two points were awarded for a correct x and y solution. For both of these
questions, I allowed for partial credit. If the students got the wrong solution due to one math
error, but the rest of their work was correct, I would award them one point of the three.
The 3rd and 4th questions provided more of a challenge and were likewise worth more points. The third question was a word problem requiring students to utilize at least half of the unit's concepts and synthesize the given information in order to create a system of equations and find its solution. In this question, worth 4 points, students were graded on four criteria: identifying the variables, setting up the correct system of equations, and finding the correct solution for each of the two variables. The fourth and final question of the pretest encompassed all of the concepts learned throughout the entire unit. The question involved graphing a system of linear inequalities and was worth a total of 5 points. In order to receive all points, students needed to correctly graph the system of inequalities, provide three possible solutions to the system (1 point each), and include answers contained within two different quadrants. Question 4 was by far the most difficult question, as it drew on all of the skills and concepts of the unit as a whole.
While assessing content mastery within the unit, I always try to assess prior knowledge as well. This can be seen in the third question, where students are required to identify their variables, a point I try to make every day while teaching. Students also needed to provide answers found in two different quadrants in the final question. During the previous unit, I noticed that students had trouble remembering the quadrants, so I tried to incorporate them into my lessons as much as possible and also assess mastery of the concept as it pertains to other units.


These two concepts had been learned and mastered earlier in the year, so they should have provided the students with easy points, while also allowing me to identify and isolate misconceptions students were having with them, in order to help modify and drive my instruction.
The pretest was administered on October 17th, just as students began the unit. The students had roughly 30 minutes to take it, because the other portion of class time before the assessment was dedicated to an anticipatory set on graphing to find solutions. My mentor teacher advised me to do this, and I agreed with her that it would be a shame to waste a full day of class administering a pretest.
Pretest Data Analysis
Administering a pretest is a great way to get a baseline of what students already know, in order to map their academic and intellectual growth from the start of a unit until its end. In most cases, a pretest given before an instructional unit will result in many zeros. As can be seen in the graph and table below, which combine both classes and groups, I did get many zeros (roughly 35% of students), but I was surprised by how well some students did; teaching gifted students can be unique in this regard.


[Figure: Pretest Periods Combined: frequency of pretest scores (out of 15), both periods and both groups combined]

Score (of 15):  7  6  5  4  3  2  1  0
Frequency:      1  0  4  3  2 10  2 12
(No student scored above 7 out of 15.)

As can be seen above, the data that surprised me most were the outliers. While a majority of students scored very low, which was expected, five of my students scored at least 30%, meaning 20% of my students were able to correctly answer roughly one-third of the test.

[Figure: Period 3 Pretest Scores: frequency of pretest scores (out of 15), broken down into the I.S. and D.I. groups]

The above graph compares the pretest data from my third period class. One line indicates the frequency of scores obtained by the I.S. group, while the other shows the score frequency of the D.I. group. As can be seen, the Independent Study group scored higher on average than the Direct Instructional group. The mean score of the I.S. group was about 29%, while the D.I. group's was 8.2%. These averages, however, are not directly comparable, as the I.S. group comprised only three students, while the D.I. group contained thirteen.

[Figure: Period 5 Pretest Scores: frequency of pretest scores (out of 15), broken down into the I.S. and D.I. groups]

Similar to the previous graph, this graph compares the pretest score data of both groups in my 5th period class. The D.I. scores in period 5 are very similar to those from period 3; however, the I.S. scores in period 5 are drastically different from period 3. This is due to the number of students in the I.S. group: in this period, the Direct Instructional group had nine students, and the Independent Study group also had nine students, three times as many as in period 3. The data shown above are more reliable because both groups have equivalent population sizes, which makes comparing them easier. While this period has more students, its group means were lower than period 3's: the D.I. group averaged 5.8%, while the I.S. group averaged 23%. It is also worth mentioning that one student scored 47% on the pretest, so his or her posttest score should make for an interesting comparison.


[Figure: Pretest Scores Period Comparison: frequency of pretest scores (out of 15), by period]

Score (of 15): 15 14 13 12 11 10  9  8  7  6  5  4  3  2  1  0
Period 3:       0  0  0  0  0  0  0  0  0  0  1  3  1  4  1  6
Period 5:       0  0  0  0  0  0  0  0  1  0  3  0  1  6  1  6

Finally, the above graph compares the scores obtained by both periods side by side. The two periods scored relatively close to one another on average, although 5th period had a larger range of scores. Within each instructional group, 5th period scored lower than 3rd period; when the two groups are combined, however, 3rd period averaged about 12% while 5th period averaged about 14%, because a larger share of 5th period's students belonged to the higher-scoring I.S. group.
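To make this composition effect concrete, the short Python sketch below recomputes each period's combined pretest mean as a weighted average of its group means, using the rounded group averages and group sizes reported above (the results differ slightly from the exact figures because the group means are rounded).

# Combined pretest mean as a weighted average of group means.
# Group means (rounded percentages) and group sizes come from the
# discussion above; they are illustrative, not the raw score data.
def combined_mean(groups):
    """groups: list of (group_mean_percent, group_size) pairs."""
    total = sum(size for _, size in groups)
    return sum(mean * size for mean, size in groups) / total

period_3 = [(8.2, 13), (29.0, 3)]   # (D.I. mean %, n), (I.S. mean %, n)
period_5 = [(5.8, 9), (23.0, 9)]

print(round(combined_mean(period_3), 1))   # about 12, matching the ~12% above
print(round(combined_mean(period_5), 1))   # about 14, matching the ~14% above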
Instructional Reflection
Before we began the unit, Ms. Costello and I collaborated on the pace at which I would teach it. Unit 6 is broken into six sections in the textbook, and I wanted students to have at least three days of practice on each section. This gave me 18 days of instruction, plus two days for the pre- and posttest, totaling 20 days. The instructional map can be found in Appendix A for reference. While planning the instruction, I also needed to be mindful of the pace of summative assessments, which would be graded and provide both students and parents with feedback on their progress in the course.


The first day of the unit consisted of administering the pretest along with an anticipatory set activity, which was aimed at preparing students for the next lesson by reviewing what they knew about graphing equations. It went very well and gave the students a good foundation for section 6-1 (solving systems of equations by graphing).
The actual instructional unit began on Tuesday, October 21st, when students learned how to solve a system of equations by graphing. This entire section was planned to be taught in one day, which concerned me a little due to the time it takes to actually graph equations. However, since technology is a cornerstone of 21st century education, the students did not need to graph the equations by hand, as they had graphing calculators. The students had also mastered graphing prior to this unit, so it was relatively simple to compress the section into just one day. The students had a little trouble classifying solutions as one, zero, or infinitely many; however, I assured them that the concept would become clearer as we progressed through the unit, which it in fact did. The next day, the students tested out of the section through a TI-Nspire calculator-based assessment.
Section 6-2 concerned solving systems of equations using substitution. This is a big section, as the concept learned here will follow the students for the rest of their studies in mathematics. Many of the students initially had trouble with the concept due to anxiety about fractions; however, after much modeling, I showed them how to handle fractions so that the solutions came more easily. Students often forget that a fraction is simply a division of two numbers, so instead of converting a fraction to a decimal before multiplying, I showed them that they can simply multiply numerators with numerators and denominators with denominators. After I modeled this a few times, the students' anxiety about fractions quickly dissipated, and the following lessons went smoothly. The students were also amazed by how simple it was to classify the solutions of a system using algebra: when x and y each come out equal to a single value, there is one solution; when the result is an always-true statement, such as 2 = 2, there are infinitely many solutions; and when the statement is false, such as 2 = 1, there is no solution. Due to the importance of this section, Ms. Costello wanted the students to be well versed in substitution, so she decided to both quiz and test the students on this section, and the results were very good.
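As an illustration of this classification rule (a sketch of the underlying algebra, not a handout from the unit), a system written as a1x + b1y = c1 and a2x + b2y = c2 can be solved and classified in a few lines; the zero-determinant cases correspond to the "2 = 2" and "2 = 1" statements the students encountered.

# Solve and classify a 2x2 linear system:
#   a1*x + b1*y = c1
#   a2*x + b2*y = c2
def classify_system(a1, b1, c1, a2, b2, c2):
    det = a1 * b2 - a2 * b1
    if det != 0:                      # the variables resolve to single values
        x = (c1 * b2 - c2 * b1) / det
        y = (a1 * c2 - a2 * c1) / det
        return "one solution", (x, y)
    # The variables cancel out; what remains is either an always-true
    # statement (like 2 = 2) or a false one (like 2 = 1).
    if a1 * c2 - a2 * c1 == 0 and b1 * c2 - b2 * c1 == 0:
        return "infinitely many solutions", None
    return "no solution", None

print(classify_system(1, 1, 5, 1, -1, 1))   # one solution: (3.0, 2.0)
print(classify_system(1, 1, 5, 2, 2, 10))   # infinitely many (same line)
print(classify_system(1, 1, 5, 1, 1, 7))    # no solution (parallel lines)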
The ninth day of instruction covered section 6-3, solving systems of equations using elimination, and I was met with mixed reactions. Some of the students felt that substitution was easier than elimination, and others the reverse, so about half of the class was resistant to using a new method. It was not until I modeled the usefulness of each method in different situations, and showed how a combination of the two is often the most efficient way to solve a system, that they started to see its value. At this point, the students had become very comfortable with systems of equations and only ran into trouble with small mathematical errors. After three days of instruction and a day off for Election Day, students took a summative assessment on the section before we moved on.
The twelfth day of instruction marked the halfway point of the unit. From here, the students would need to take what they had learned so far and apply it in new ways. The next section, 6-4, concerned using systems of equations to solve real-world problems. This entire section was based on decoding word problems and using systems to find the solution. I thought that this was a great section because it provided the students with real-world applications of systems of equations. Students often ask, "When am I going to use this?" The answer this time was simple, as the entire section focused on giving the students a multitude of examples of when and where they can use systems.
The students also really liked this section, even though it involved word problems, because it gave them the opportunity to apply the concepts to their own lives. While teaching this unit, I created several word problems to be solved with systems of equations. The students really enjoyed it because I tried to make each word problem applicable to their lives and also used students' names in each question. Some of the questions involved choosing which videogame console to buy, or which pay-to-play game would cost more after a year. We spent two days on this section, and the students not only had a lot of fun in class but also showed me that they had mastered the topic.
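Here is a hypothetical example in the spirit of those word problems (it is not one of the actual class or test questions): two pay-to-play games with different upfront and monthly costs, where a small system of equations shows when their total costs are equal.

# Hypothetical break-even problem.  Game A costs $60 upfront plus $10 per
# month; Game B costs $20 upfront plus $15 per month.  Let m = months and
# c = total cost, giving the system:
#   c = 60 + 10*m
#   c = 20 + 15*m
upfront_a, monthly_a = 60, 10
upfront_b, monthly_b = 20, 15

# Substitution: set the two expressions for c equal and solve for m.
months = (upfront_a - upfront_b) / (monthly_b - monthly_a)   # 40 / 5 = 8
cost = upfront_a + monthly_a * months                        # 60 + 80 = 140

print(f"The games cost the same after {months:.0f} months (${cost:.0f}).")
# Before month 8 Game B is cheaper; after month 8 Game A is cheaper.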
The following section was 6-5, in which students simply graphed linear inequalities. This section was aimed at teaching students how to graph an inequality and identify which side of the line contains the solutions. I liked that the textbook included this section rather than jumping straight into graphing systems of linear inequalities, as it gave students time to practice graphing a single inequality. During this section, students learned about the different types of inequalities and how each of them leads to a different solution set, even when the equation of the boundary line is the same. The section also gave me the opportunity to explain how to use a test point, which is a very helpful technique for checking work in the following section. Due to the simplicity of this section, we spent only one day on it, which was not a problem because about 90% of the students were very comfortable with it.
The last section of the unit was 6-6, solving systems of linear inequalities. This is one of the most difficult concepts to master in Algebra because it takes only one small mistake to get a totally wrong answer. Students need to be mindful of the inequalities and when to flip them. They need to remember where to shade and whether the line is solid or dashed. Students also need to be extra careful with their arithmetic because, if their final inequalities are wrong, their test points will not work properly. Keeping to my original plan of finishing the unit in 20 days, we had only 3 days to cover this crucial section. While the students struggled a little during the first days of the section, especially with arithmetic, they slowly became more and more proficient. By the third and final day, almost the entire class could graph a system of linear inequalities correctly about 80% of the time on their first try.
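The test-point technique mentioned above is easy to demonstrate. The sketch below uses a made-up system of two inequalities (not a question from the unit) and checks whether candidate points satisfy both, which is exactly the verification students can run after shading their graphs.

# Hypothetical system of linear inequalities:
#   y >  2*x - 3     (dashed boundary: points on the line are excluded)
#   y <= -x + 4      (solid boundary: points on the line are included)
# A point is a solution of the system only if it satisfies BOTH inequalities.
def is_solution(x, y):
    return y > 2 * x - 3 and y <= -x + 4

for point in [(0, 0), (1, 2), (3, 1), (-2, 5)]:
    verdict = "is a solution" if is_solution(*point) else "is NOT a solution"
    print(point, verdict)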
The final two days of the unit were dedicated to testing. The 19th day consisted of the
students testing out of the second half of the unit, covering sections 4, 5, and 6. This was a
summative assessment and counted as a grade. The 20th and final day was dedicated to
administering my posttest, the results of which will be discussed later in this project.
Posttest
The posttest was administered on Monday, November 17th, and was very similar to the pretest. I did this on purpose: testing the students with the same, or very similar, questions as the pretest yields more valid results, since it assesses the same skills. I did, however, modify it slightly.
The posttest, as can be seen in Appendix C: Posttest Sample, is remarkably similar to the pretest. I used the same template and the same outline. Both tests have the same number of questions, and each question is worth the same number of points. The only difference, besides the name, can be found in question 2. I decided to modify question 2 because it was too simple. To truly assess whether the students understand elimination, they have to be able to manipulate each equation of the system, and I wanted the manipulation to be less apparent than on the pretest. On the pretest, the students simply had to subtract the two equations, or multiply one of them by negative one and then add. On the posttest, the students had to find a common multiple of the coefficients before deciding what number to multiply each equation by.
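To show the kind of extra step involved, here is a sketch of elimination on a made-up system in the posttest's style (not the actual test question), where neither pair of coefficients matches until both equations are scaled.

from fractions import Fraction

# Hypothetical system:
#   4x + 6y = 10
#   6x - 9y = -3
# No coefficients line up directly, so both equations are scaled before
# adding: multiplying the first by 3 and the second by -2 makes the
# x-coefficients 12 and -12, which cancel.
a1, b1, c1 = 4, 6, 10
a2, b2, c2 = 6, -9, -3

m1, m2 = 3, -2
assert m1 * a1 + m2 * a2 == 0          # x is eliminated

y = Fraction(m1 * c1 + m2 * c2, m1 * b1 + m2 * b2)   # 36/36 = 1
x = (c1 - b1 * y) / a1                               # back-substitute

print(f"x = {x}, y = {y}")             # x = 1, y = 1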


Pretest and Posttest Data Analysis


By the 20th day of the unit, I was confident that my students had mastered the concepts. On the previous day, the students had taken Ms. Costello's summative assessment and earned a very large number of A's, with only two students initially scoring below 70%. A few of the students had to retake that test because they had been called out of class halfway through the exam.
To ensure that my assessment was valid and reliable, as previously stated, I kept the exam mostly the same. After I had graded the students' posttests, however, I was startled by the average scores and how they compared to Ms. Costello's assessment. The average of the summative assessments was in the low B range, while the average of the posttest scores was below 70%.

[Figure: Combined Average Scores: mean pretest and posttest percentages and percentage-point increase, by instructional group and combined]

                     D.I. Group    I.S. Group    Combined
Pretest Average          6.70%        25.93%       13.08%
Posttest Average        67.92%        70.00%       67.38%
Percent Increase        61.23%        44.07%       54.31%

As can be seen in the table above, which compares the average test scores of all students in both periods, differentiated by group and combined in the rightmost column, the students' scores have been converted into percentages rather than ratios. The first row shows the students' pretest mean percentage, the second their posttest mean, and the third the percentage-point increase between the two assessments.
The average percentage scored by students was significantly lower than on Ms. Costello's assessment, which makes me question the validity of my assessment. After much reflection, I identified a few possible reasons. The students may not have tried as hard on my exam because they took it second and knew it would not be graded. Eight students in period 3 did not finish the exam because they were called out by the front office; this contributed to 3rd period having a majority of low scores. My assessment may also have been too hard to earn an A on, because it had only four questions and was out of 15 points. Finally, approximately 50% of the students' final summative assessment scores were based on a take-home portion of the exam, which ensured them a higher grade and many more possible points.

[Figure: 3rd Period Posttest Completion Rate: 8 students (62%) did not finish the posttest; 5 students (38%) finished]

The above graph displays the large share of students who did not finish their exams in the 3rd period class. Because only 38% of students actually completed the exam, the posttest data cannot be considered valid for all students in the class. Many of the students who did not complete the exam nevertheless earned a good number of the 15 points, while other students who work at a slower pace ended up receiving very low scores even though they had made no errors in the work they completed. While the average percentage score across both periods is low, in assessing validity the eight students who did not finish must be considered in the analysis; had they finished, the mean score would surely have been higher.
The more important statistic is the percentage-point increase, which shows how much of what the students learned during the unit was retained and mastered. The group that I personally instructed increased its score by about 60 percentage points on average. My instructional group also earned nearly the same posttest average as the more advanced group working at a faster pace. I attribute this to the smaller amount of practice the I.S. group received compared to my D.I. group.

[Figure: Period 3 Average Scores: mean pretest and posttest percentages and percentage-point increase, by instructional group and combined]

                     D.I.          I.S.          Combined
PreTest Average          8.21%        28.89%       12.08%
PostTest Average        61.03%        71.11%       62.92%
Percent Increase        52.82%        42.22%       50.83%


The above table shows the mean percentage scores of my 3rd period class. It follows the same format as the previous one: the data are differentiated by instructional group, with the combined scores in the rightmost column, and the rows show the pretest average, the posttest average, and the percentage-point increase. This was the period in which eight students were called out for a science fair competition, so many of them attempted only the first two problems. As explained previously, each question was worth a different number of points, and the first two questions together were worth only six of the fifteen possible points.

[Figure: Period 5 Average Scores: mean pretest and posttest percentages and percentage-point increase, by instructional group and combined]

                     D.I.          I.S.          Combined
PreTest Average          5.19%        22.96%       14.07%
PostTest Average        74.81%        68.89%       71.85%
Percent Increase        69.63%        45.93%       57.78%

In contrast, the above table shows the mean percentage scores of my 5th period class. In this class everybody completed the exam, and, as can be seen, the percentages are on average significantly higher than in 3rd period. In this period the D.I. group's average posttest score was about 74%, an increase of roughly 70 percentage points over its pretest average. The table also shows that the D.I. group scored higher on average than the I.S. group. I had initially assumed this would happen, because the D.I. group not only went over the material at a slower pace, gaining more practice time with the concepts, but also had the opportunity to be taught directly rather than learning independently. The students in my D.I. group were assisted and assessed each day to build the greatest possible mastery of each concept and skill, while the I.S. group was assessed only once a week with minimal feedback. This statistic also speaks to the heightened intellect of the students in the I.S. group, as many of them could master the concepts independently at an accelerated rate.

[Figure: Period 5 Posttest Comparison: frequency of posttest scores (out of 15), by instructional group (D.I. vs. I.S.)]

The above graph shows 5th period's score frequency on the posttest, compared by instructional group. As can be seen, the points earned by students in each group are very similar. On average the D.I. group did earn slightly more points than the I.S. group, but the margin is only about one point. While the chart does show the D.I. group achieving a higher average, so small a margin suggests that on a larger scale the scores would be essentially equivalent.


[Figure: Combined Posttest Percent Frequency: number of students in each ten-percentage-point score range (100, 90+, 80+, ..., 1+), by period]

The above histogram shows the frequency of percentages achieved by students in both 3rd and 5th period. Comparing the two periods, it is clear that period 5 achieved higher scores than period 3. Unlike the previous ones, this histogram depicts the data not as a ratio of points earned but as a range of percentages. It shows that a majority of test scores in both classes fall between 60 and 70 percent.
This graphic also reveals an alarming issue with the reliability of my grading scale. Because of the small number of points that can be earned on the assessment, missing just one question drastically affects a student's grade. After studying and analyzing the data, I came to the conclusion that the usual grading scale is not a valid measure of a student's success on this assessment. Because of how my assessment was structured, it would have been very hard for a student to reach the upper percentiles, as the assessment was based on only 15 possible points.
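A quick sketch of the arithmetic behind this concern: on a 15-point test each point is worth 100/15, or roughly 6.7 percentage points, so a single lost point moves a score most of the way down a traditional ten-point letter-grade band.

# Percentage grade for every possible point total on a 15-point assessment.
TOTAL_POINTS = 15

for points in range(TOTAL_POINTS, 9, -1):          # show the top of the scale
    print(f"{points:>2}/{TOTAL_POINTS} = {100 * points / TOTAL_POINTS:5.1f}%")

# Each step is 100/15 (about 6.7 percentage points): one missed point takes a
# perfect paper from 100% to 93.3%, and three missed points take it to 80%.
print("Step per point:", round(100 / TOTAL_POINTS, 2), "percentage points")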


[Figure: Assessment Possible Scores: the percentage grade corresponding to each possible point total out of 15]

As can be seen in the above graph, each point earned or lost shifted a student's grade by nearly 7 percentage points. This means that if a student missed even one point on the exam, their grade dropped by the better part of a letter grade. To address this issue with the assessment, I first had to analyze the points earned on each question. This would give me a good idea of how thoroughly students had mastered each question, because each question was dedicated to one skill. I could then organize the data into graphs that provide a visual for ascertaining how much of each concept students understood and how much of their grade was lost to small errors in their math. Below is a series of graphs depicting all possible points for each question of the exam (refer to Appendix C: Posttest Sample for the specific questions).


[Figure: Question 1 Points Earned: number of students earning each rubric point (Substitution, Correct x, Correct y, 1 error); Period 3 (16 students) vs. Period 5 (18 students)]

This histogram provides a visual representation of how many points students earned on question one. The legend indicates which period each bar represents and how many students are in each period. It can be seen that all but two students in each period were able to use substitution correctly within an equation. The next two sets of data show that all but about four students were able to find the correct solution for x and y, allowing for at most one mathematical error. The final set shows the number of students who did not determine the correct values of x and y because of one error in their work.

[Figure: Question 2 Points Earned: number of students earning each rubric point (Elimination, Correct x, Correct y, 1 error); Period 3 (16 students) vs. Period 5 (18 students)]

Much like the previous histogram, this one represents the points earned on question two. The points earned on this question, and the method of solution, are very similar to question one, and the students earned nearly the same distribution of points. It appears that the students had less difficulty using elimination than substitution, because a larger number of students were able to solve for the correct values of the variables; however, a few students were still making small mathematical errors that resulted in an incorrect solution.

[Figure: Question 3 Points Earned: number of students earning each rubric point (I.D. variables, Correct EQ, Correct x, Correct y, 1 error); Period 3 (16 students) vs. Period 5 (18 students)]

This data surprised me. One of the concepts I tried to model as much as possible while teaching section 6-4 was identifying variables. Despite this being a major focus of my instruction, many students failed to do it, even though it was explicitly written in the directions. One possible conclusion is that the directions were not clear enough. In future assessments I will have to make the directions clearer by bolding or underlining key words, or possibly by listing explicitly how students are graded on each question. Almost all students were able to correctly identify the equations of the system using the information from the word problem; a majority of the students who missed points on this question did so because they did not finish the exam due to the science fair.


[Figure: Question 4 Points Earned: number of students earning each rubric point (1 solution, 2 solutions, 3 solutions, 2 quadrants, Correct graph, 1 error, DNF); Period 3 (16 students) vs. Period 5 (18 students)]

Finally, the points earned on the fourth and final question are presented in the above histogram. As previously stated, this was the most difficult question on the assessment, as it required students to utilize all of the skills learned in the unit. A majority of students in both classes did everything correctly, with the exception of one or two mathematical errors in deriving the inequalities of the lines in the system. That small miscalculation had a snowball effect, as it sharply limited the total points a student could earn. While many of the students ended up with incorrect graphs, they nevertheless recognized that the solutions needed to lie in two quadrants, and a majority of them displayed their answers accordingly. The data also show that a majority of the 3rd period students did not attempt or complete this question, which is why many of them had such low point totals.
The results shown in the previous four histograms indicate that while the students did not earn high percentages of the possible points, they did in fact show proficiency in the content; just one error drastically affected their grades. The implication is that my future assessments need to allow students more opportunities to earn points. Question four, in particular, limited the points students could earn: if they showed proficiency in graphing and selecting points but their inequalities were incorrect, they had demonstrated proficiency but did not earn a score that reflected it.
Final Reflection
I felt really good after I completed this unit. The delivery of each of my lessons was fluid and comprehensive. I modeled all concepts, methods, and solutions in such a way as to ensure each student had clear and effective notes. Each day I pointed out the little steps that were crucial to finding the correct solution and, through oral questioning, assessed the students' ability to recognize the algorithms in each solution.
Initially I was surprised by my students' low mean percentages on the posttest. I learned that assessments are not always as clear-cut as I had thought. While the questions do need to assess each of the concepts for an assessment to be reliable, the formatting of the assessment also needs to be finely tuned so that the assessment itself remains valid. I thought that my assessment was perfect, but I failed to realize that its structure and formatting inhibited students' ability to achieve a high score.
I have now learned that valid and reliable assessments are not always easy to achieve. Much thought and experimentation must go into designing assessments, so that they not only show proficiency in concepts but also allow students to earn a high percentage.
Education, by definition, is learning. This project has taught me a great deal about the impact I have on students' learning and about the design and implementation of assessments. While I did make a few errors, I learned where I was wrong and have made an effort to correct those errors for future assessments. With time and practice all people improve; as an educator I will try to improve every day in some way. This project taught me more about assessments than any other project, because it was based on experimental data rather than theoretical conjecture.


Appendix A: Standards Met & Instructional Map

10-17-14 (1 day): Anticipatory Activity: Graphing (15 min); Pretest (30 min). Formative Assessment.
10-21-14 (1 day): Students will solve systems by graphing and then classify the system (Sec 6-1). Standards: MAFS.912.A-REI.3.6, MAFS.912.N-Q.1.2, MAFS.912.N-Q.1.3. Homework: Text p. 364 #31-38; Workbook p. 173.
10-22-14 (1 day): Students test on 6-1. Summative Assessment. Standards: MAFS.912.A-REI.3.6. Homework: Take-home test.
10-23-14 to 10-27-14 (3 days): Students will solve systems of linear equations using substitution, and identify 1, 0, and infinite solutions (Sec 6-2). Standards: MAFS.912.A-CED.1.3, MAFS.912.A-REI.3.5, MAFS.912.A-REI.3.6. Homework: Text p. 371 #11-22, #23-31; Workbook p. 175.
10-28-14 to 10-29-14 (2 days): Quiz and test on 6-2, Using Substitution. Summative Assessment. Homework: WB p. 177; take-home test.
10-30-14 to 11-3-14 (3 days): Students will understand how to solve systems of linear equations using elimination, and identify 1, 0, and infinite solutions (Sec 6-3). Standards: MAFS.912.A-CED.1.3, MAFS.912.A-REI.3.5, MAFS.912.A-REI.3.6. Homework: Text p. 378 #7-14, #15-26; WB p. 180 #19-27.
11-4-14: No school (Election Day).
11-5-14 (1 day): Test on 6-3. Summative Assessment.
11-6-14 to 11-7-14 (2 days): Students will understand how to set up and solve systems of equations to solve real-world problems (Sec 6-4). Standards: MAFS.912.A-CED.1.3, MAFS.912.A-REI.3.5, MAFS.912.A-REI.3.6. Homework: Text p. 386 #7-16, #19-28.
11-10-14 (1 day): Students will graph linear inequalities and recognize graphical solutions to linear inequalities in two variables (Sec 6-5). Standards: MAFS.912.A-CED.1.3, MAFS.912.A-REI.4.12. Homework: Text p. 393 #8-21.
11-11-14 to 11-13-14 (3 days): Students will solve systems of inequalities by graphing and identify solutions (Sec 6-6). Standards: MAFS.912.A-REI.4.12. Homework: Text p. 399 #7-15, #16-27; WB p. 189.
11-14-14 (1 day): Test on systems of linear inequalities. Summative Assessment. Homework: WB p. 193; take-home test.
11-17-14 (1 day): Posttest on Unit 6. Formative Assessment.


Appendix B: Pretest Sample


Appendix C: Posttest Sample


Appendix D: Data Tables and Graphs

[Figure: Pretest Periods Combined: frequency of pretest scores (out of 15), both periods and both groups combined]

Score (of 15):  7  6  5  4  3  2  1  0
Frequency:      1  0  4  3  2 10  2 12
(No student scored above 7 out of 15.)

The above graph represents the pretest scores earned by students in both classes combined.

[Figure: Pretest Scores Period Comparison: frequency of pretest scores (out of 15), by period]

Score (of 15): 15 14 13 12 11 10  9  8  7  6  5  4  3  2  1  0
Period 3:       0  0  0  0  0  0  0  0  0  0  1  3  1  4  1  6
Period 5:       0  0  0  0  0  0  0  0  1  0  3  0  1  6  1  6

The above graph compares, by period, the pretest scores earned by students.


[Figure: Period 3 Pretest Scores: frequency of pretest scores (out of 15), broken down into the I.S. and D.I. groups]

The above graph represents the pretest score frequency earned by students in 3rd period. The
data is further segregated to show the frequency of scores between the I.S. and the D.I. groups.

[Figure: Period 5 Pretest Scores: frequency of pretest scores (out of 15), broken down into the I.S. and D.I. groups]

The above graph represents the pretest score frequency earned by students in 5th period. The
data is further segregated to show the frequency of scores between the I.S. and the D.I. groups.


[Figure: Period 3 Combined Score: pretest vs. posttest score frequency (out of 15) for all 3rd period students]

The above graph represents the frequency of scores earned by all students within the 3rd period class, comparing pretest and posttest scores.

[Figure: Period 3 D.I. Group Score: pretest vs. posttest score frequency (out of 15) for the 3rd period D.I. group]

The above graph represents the frequency of scores earned by students within the D.I. group of the 3rd period class, comparing pretest and posttest scores.


[Figure: Period 3 I.S. Group Score: pretest vs. posttest score frequency (out of 15) for the 3rd period I.S. group]

The above graph represents the frequency of scores earned by students within the I.S. group of the 3rd period class, comparing pretest and posttest scores.

[Figure: Period 5 Combined Score: pretest vs. posttest score frequency (out of 15) for all 5th period students]

The above graph represents the frequency of scores earned by all students within the 5th period class, comparing pretest and posttest scores.


[Figure: Period 5 D.I. Group Score: pretest vs. posttest score frequency (out of 15) for the 5th period D.I. group]

The above graph represents the frequency of scores earned by students within the D.I. group of the 5th period class, comparing pretest and posttest scores.

[Figure: Period 5 I.S. Group Score: pretest vs. posttest score frequency (out of 15) for the 5th period I.S. group]

The above graph represents the frequency of scores earned by students within the I.S. group of the 5th period class, comparing pretest and posttest scores.

[Figure: Total Combined Scores: pretest vs. posttest score frequency (out of 15), both instructional groups and both periods combined]

The above graph compares the pre and posttest scores earned by students in both instructional
groups and both periods combined.

[Figure: Period 5 Posttest Comparison: frequency of posttest scores (out of 15), by instructional group (D.I. vs. I.S.)]

The above graph compares the posttest scores earned by students of the D.I. and I.S.
instructional groups within 5th period.


[Figure: Combined Posttest Scores: frequency of posttest scores (out of 15), by period, with instructional groups combined]

The above graph compares the frequency of scores earned by students on the posttest by period.
(Instructional groups are combined)

[Figure: Combined Posttest Percent Frequency: number of students in each ten-percentage-point score range (100, 90+, 80+, ..., 1+), by period]

Similar to the previous graph, the above graph compares the frequency of scores earned by students on the posttest, by period. In this graph, however, the scores are shown as ranges of percentages rather than as a ratio out of 15.


[Figure: Period 3 Average Scores: mean pretest and posttest percentages and percentage-point increase, by instructional group and combined]

                     D.I.          I.S.          Combined
PreTest Average          8.21%        28.89%       12.08%
PostTest Average        61.03%        71.11%       62.92%
Percent Increase        52.82%        42.22%       50.83%

The above table shows the average percentage scores earned by students in 3rd period. The first column of data represents the D.I. group, the second the I.S. group, and the third both instructional groups combined. The rows compare the pretest and posttest averages as well as the average percentage-point increase.

[Figure: Period 5 Average Scores: mean pretest and posttest percentages and percentage-point increase, by instructional group and combined]

                     D.I.          I.S.          Combined
PreTest Average          5.19%        22.96%       14.07%
PostTest Average        74.81%        68.89%       71.85%
Percent Increase        69.63%        45.93%       57.78%

The above table shows the average percentage scores earned by students in 5th period. The first column of data represents the D.I. group, the second the I.S. group, and the third both instructional groups combined. The rows compare the pretest and posttest averages as well as the average percentage-point increase.

[Figure: Combined Average Scores: mean pretest and posttest percentages and percentage-point increase, by instructional group and combined]

                     D.I. Group    I.S. Group    Combined
Pretest Average          6.70%        25.93%       13.08%
Posttest Average        67.92%        70.00%       67.38%
Percent Increase        61.23%        44.07%       54.31%

The above table shows the average percentage scores earned by students in both periods combined. The first column of data represents the D.I. group, the second the I.S. group, and the third both instructional groups combined. The rows compare the pretest and posttest averages as well as the average percentage-point increase.


[Figure: Assessment Possible Scores: the percentage grade corresponding to each possible point total out of 15]

The above graph illustrates the range of percentages possible on the pre- and posttest assessments due to their structure. As can be seen, each point shifts the total percentage by nearly 7 percentage points.

[Figure: 3rd Period Posttest Completion Rate: 8 students (62%) did not finish the posttest; 5 students (38%) finished]

The above circle graph illustrates the portion of the 3rd period class that completed the posttest assessment. The incomplete exams were due to a majority of those students being called out for the science fair.


[Figure: Question 1 Points Earned: number of students earning each rubric point (Substitution, Correct x, Correct y, 1 error); Period 3 (16 students) vs. Period 5 (18 students)]

The above graph illustrates how many points were earned on the first question of the posttest. The y-axis represents the frequency with which students earned each point.

[Figure: Question 2 Points Earned: number of students earning each rubric point (Elimination, Correct x, Correct y, 1 error); Period 3 (16 students) vs. Period 5 (18 students)]

The above graph illustrates how many points were earned on the second question of the posttest. The y-axis represents the frequency with which students earned each point.


[Figure: Question 3 Points Earned: number of students earning each rubric point (I.D. variables, Correct EQ, Correct x, Correct y, 1 error); Period 3 (16 students) vs. Period 5 (18 students)]

The above graph illustrates how many points were earned on the third question of the posttest. The y-axis represents the frequency with which students earned each point.

[Figure: Question 4 Points Earned: number of students earning each rubric point (1 solution, 2 solutions, 3 solutions, 2 quadrants, Correct graph, 1 error, DNF); Period 3 (16 students) vs. Period 5 (18 students)]

The above graph illustrates how many points were earned on the fourth question of the posttest. The y-axis represents the frequency with which students earned each point.
