
Tutor's Name: Gary Chitty

Tutorial Time: Wednesday 11am-1pm


Date Required: June 16th 11:59PM

Assessment Task 2, Part B: Reflecting on Assessment Design


Word count: 1,600 or equivalent
(not including tables or headings)

Notes for tutor:

As my group did not give me specific feedback on my assessment approach at the bottom of the rubric, I will also draw on my notes of the feedback my group gave me verbally. Both the rubric feedback and this written feedback are included in the same appendix.

Please also keep in mind that the hyperlinks in the document take you to the specific locations mentioned. I have included as many as possible to make it easy to navigate through the appendices and the contents of this document.

Introduction: Learning and Teaching Context

The overall context:

The learning sequence of four lessons was taught as part of a collaborative teaching program at the placement school, where all students in Years 3 and 4 were grouped according to ability. For this placement, I was given the highest-ability group to teach; students in this reading group were working 6-12 months ahead of their Year 4 peers. The topic taught across the four-lesson sequence was the reading skill of prediction in texts and how best to approach it. Each session involved a whole-class big book reading to model prediction skills, followed by three activity rotations for students to reinforce them: guided reading and two independent tasks, until the final session, in which students completed their post-test (See Appendix 1).

Key influences on my assessment approach:

There were a couple of key influences on my assessment approach, such as the feedback between my mentor and me in the second week (See Appendix 1). As the placement progressed, the ideas initially developed for the assessment approach changed considerably by the time of execution; this was due to discussions with the mentor teacher that occurred throughout the teaching rounds (See Appendix 1 & feedback). Additionally, research into particular assessment styles influenced the approach I adopted in my learning sequence. As I was interested in implementing some kind of pre-test, I was warned that it was best to first consider what makes the pre- and post-test design effective and to adapt that into my own assessment approach. For example, the Boston University Medical Campus states that a pre-test may be used to alter the content of the course to "build upon weakness" (BUMC n.d., p.2), which is what I ensured with my pre-test. Based on the scores in Table 2: Pre-test, I changed the focus questions for the guided reading session and identified some focus students to be given additional help throughout the learning sequence.

Assessment Design and Development

Rationale for Assessment Task:

In this lesson sequence, the learning outcome is for students to use the text processing strategy of predicting. This is achieved by using features of the text, such as the title, captions, pictures or written text, to make reasonable predictions and build overall comprehension of the text. The pre- and post-test design of the assessment task aimed not only to assess students' prior knowledge and understanding of prediction but also to inform the teacher of any areas to focus on in the sequence. The post-test reviewed the learning that took place to gauge whether or not the lessons produced a significant improvement in understanding of the prediction skill when compared to the knowledge assessed in the pre-assessment.

Type of assessment created and implemented:

This pre- and post-test design is separated into a diagnostic and a summative assessment task (See Table 1).
Table 1: Summary of Assessment Task and Criteria

Assessment Task:
Pre- and Post-Test on Prediction.

Type of Assessment:
The pre-test is a diagnostic assessment that seeks to identify any prior knowledge of the lesson sequence topic, while the post-test is a summative assessment that seeks to reveal the level of improvement in understanding when compared to the pre-test results.

When will assessment take place?
The pre-assessment will take place in the very first session of the lesson sequence, before delving into what prediction is, in order to see what students know beforehand as well as what they don't know, which will inform the lesson sequence focus.

The post-assessment will take place in the very last (fourth) session of the lesson sequence and will be used to check student improvement and outcomes of the lesson sequence and to inform students of their progress. (The feedback for this assessment was shown to students the next day, when they received verbal feedback in addition to their written feedback.)

Links to Curriculum:
Interpreting, analysing, evaluating. "Read different types of texts for specific purposes by combining phonic, semantic, contextual and grammatical knowledge using text processing strategies, including monitoring meaning, skimming, scanning and reviewing" (VCELY278).

Assessment Criteria:
The results from the post-assessment will be assessed against a student-friendly rubric (See Appendix 2).

Feedback to Students:
Student feedback will be presented in written format, with some extra oral feedback also provided when students receive their written feedback. (Feedback on their current class work will also be provided to help them make some changes to their approach before the post-assessment; see Appendix 2 for in-class work sample feedback.) By providing some written and oral feedback before the post-assessment (formative), students can work on developing their predicting skills knowing some of their areas needing improvement. Through giving specific feedback like this on the post-assessment as well as formatively, students will be more aware of their learning and able to develop strategies for their weaknesses in the future.

Assessment Support Material:
A PowerPoint to provide the material for some of the questions on the pre-test; a hand-out for students to fill in for the pre-test; and the book Teacup by Rebecca Young and Matt Ottley to help students answer the questions on the post-test handout. (See Appendix 2)
Why does my assessment task demonstrate validity and reliability?

In terms of validity, I attempted where possible to reduce the possibility of error in my pre- and post-tests by going over all the questions with the whole class, or giving students a chance to read them, before allowing them to start. As my mentor mentioned in her feedback in Appendix 1, to further increase this validity I should have written the questions in the order they were being tested, rather than having to remind students that a given question was on the top right, top left and so on. Tomlinson and Moon state that for a test to be reliable, it raises the question of "consistency or stability of results", and that "the more consistent a student's score on multiple assessments [are], the more reliable the measure" (2013, p.124). This is demonstrated in my assessment tasks when looking at Table 2, as the results for questions across the various learning areas remain consistent in the post-test (See Appendix 2 for the assessment task and materials).
Implementation of Assessment Strategy

Whole class assessment data

Table 2: Summary of Class results for pre assessment task

Table 2.1: Summary of Class results for post assessment task

Note: Each area is scored out of a maximum of 4, for a total result of 12.

Analysis of data

o What did the students learn?

According to the data, the students learned how to better identify clues in a book to help them make accurate and valid predictions. They also improved their ability to make predictions, as some students started to explain the clues that they used to help them predict.
o What most students appeared to understand, areas of need, and the need for greater challenge that became apparent for students:

By the end, every student knew how to identify at least two clues to help them predict. They were all also able to make a reasonable prediction about the text, but only some were able to explain the clues that helped them come up with that prediction. Thus, the area of need I noticed was that many students did not explain the clues they used in making predictions.

For those who knew how to identify and explain clues, I should have challenged them by asking them to quote from the text as evidence, to work on their descriptions of pictures, and to include more detail.

There were a couple of students who missed many of my lessons due to extra-curricular events such as cross country and district athletics, and who revealed gaps in their knowledge around listing clues and using them in their predictions; for example, students 2 and 18 were only present for two of my sessions.

Whole-class achievement: how will I modify my teaching practice?

As a whole, most of the students in the class definitely improved their prediction skills, some more than others, particularly the students who were present for every session, such as students 1, 3 and 20, who made significant improvements on their pre-test results.

However, I believe some of the students would have done better had I made one question in my post-test more explicit. For example, the difference between a score of 3 and 4 in the rubric for understanding predictions requires only an example, which I did not ask for in the question (See Appendix 2, and here for the rubric). Thus, in the future I will modify my teaching practice by ensuring that the test is explicit enough for students to accurately demonstrate their understanding.

Feedback on students' assessment work samples and analysis

o Analysis of student assessment work sample #1 (See Appendix 2: Work sample)

Student 7 was one of the students who showed a significant improvement when comparing the pre-test and post-test results. They were able to explain why it is important to predict and to explain in detail a prediction beyond the text that was read to them, supported by clues that were also explained.

o Analysis of student assessment work sample #2 (See Appendix 2: Student 1)

Student 1 made a huge improvement when it came to making predictions; he was also one of my focus students in one of the smaller reading groups. He was able to explain the importance of predicting, list clues that could help him predict and make a reasonable prediction about the text. However, Student 1 could benefit from working on explaining his clues a bit more.

o Analysis of student assessment work sample #3 (See Appendix 2)

Student 9 has a good grasp of the importance of predicting in books and understands why we need to use predicting as a skill. However, when it comes to making his own predictions he uses a very limited amount of detail, and in the future he needs to work on explaining the clues that help him to predict (See Work Sample 2 & Feedback in Appendix 2).

How does my feedback support each student's learning and address their individual
needs and learning goals?

My feedback supports each student's learning and individual needs by addressing not only areas of strength but also at least one area they could improve on. The feedback is written in the form of a feedback sandwich (see week 4 class notes), showing students what they can do but also what they can do to challenge themselves (see, for example, Appendix 2: Feedback to Student 7).

Reflecting Critically and Stance on Assessment Practice


Reflection on assessment design and teaching

o What role did peer and mentor feedback and the literature play in your assessment
design? What did you learn from implementing this assessment approach?

While peer feedback gave me some ideas (See Peer feedback & peer feedback rubric), it was the mentor feedback that was instrumental in my final assessment design. My mentor teacher provided me with feedback each week, including feedback addressing my initial design (See Appendix 3: Initial design).

Initially, I wanted to make my pre- and post-tests two mind maps to reveal student understanding, though my mentor's concern was that, without any specific questions to guide the students, it would not be a reliable way of showing what students had understood. Though this style of pre- and post-test had worked really well for me in mathematics, it was not an appropriate test for literacy.

The other, more notable change was reverting my summative task from a role play back to an individual assessment (See Appendix 3: Week 7 feedback). When I explained how it was to be implemented, my mentor appreciated what I was trying to do, but she played the devil's advocate and asked how I would ensure the validity of the assessment, since I would be unable to tell "whose ideas were whose and at what level each individual understands" (See Appendix 3: Week 7 feedback). Thus I made the final, lasting change in my assessment approach to the pre- and post-test design I ended up using (See Appendix 2).

I learned that assessment, like teaching, is an art in and of itself. Looking back at all the design changes I went through before ultimately implementing my assessment, I noticed how I moved from a surface way of thinking about assessment to becoming more aware of its wider ramifications, for example the notions of validity, reliability, error and teacher bias, and how they all interconnect. For instance, I became aware that questions which weren't explicit reduced the validity of the test and thus increased the chance of student error (Tomlinson & Moon 2013, p.124; see modification of teaching practice).

My Teacher Stance on Assessment

o Did you notice any changes in your stance from AT1? Explain.
o What role does assessment reporting play in student learning? What are my teacher
beliefs about assessment, and what is its purpose?

My stance on assessment has changed drastically since assessment task one. Though I had read the theory about assessment and understood it, I didn't really think deeply about assessment until I actually put it into practice with the students in my classroom. It was then that I made connections between the theory I had read and what I was implementing in the classroom.

I believe that assessment reporting plays an integral part in teaching and that it is an invaluable tool to "support and enhance [the] learning" of students (Shepard 2008, p.96). In discussions with my mentor (see Appendix 3: Week 6 feedback), she would consistently prompt me to consider each individual student's level and needs and to find a way to be inclusive in my teaching and assessment design. For example, one student did not often come to class and refused to work with any other student, which would have made the role-play task very difficult for them to complete (See Appendix 4: focus student notes). It is my firm belief that the purpose of assessment is to inform teaching practice in the classroom in order to make both the teaching and learning process more effective for students, for example by tweaking lessons to suit a common area of difficulty (Shepard 2008, p.98).

REFERENCES:
Boston University Medical Campus n.d., Pre- and Post-testing, retrieved 24 May 2017, <http://www.bumc.bu.edu/fd/files/PDF/Pre-andPost-Tests.pdf>

Shepard, LA 2008, 'The Role of Assessment in a Learning Culture', Journal of Education, vol. 189, no. 1/2.

Tomlinson, CA & Moon, TR 2013, Assessment and Differentiation: A Framework for Understanding, ASCD Publishing, Alexandria, VA.

Victorian Curriculum and Assessment Authority 2016, English Curriculum, levels 4, 5 and 6, retrieved 25 May 2017, <http://victoriancurriculum.vcaa.vic.edu.au/english/curriculum/f-10?y=4&y=5&y=6&s=R&s=W&s=SL&layout=2>
APPENDICES:
(Go back)
Appendix 1: Overall Lesson Planners for both weeks of the lesson sequence.
Disregard Friday's session in week 8; it is not part of my lesson sequence. (That session was done at the request of my mentor, as she was away in the last week of placement and wanted something to revise with the students.)

Overview of pre-test:

Four questions aimed to gauge whether or not students had much prior knowledge in each particular area of prediction as a skill. They tested whether students knew much about which clues to use and how they would approach a prediction task, as well as asking them to describe prediction in their own words.
Week 8 Overall Rotation Planner, Thursday was changed.

Change for Thursday:


POST-TEST: As a whole class, students are to complete a post-test which aims to discover whether or not students have learned and picked up on the various cues used for predictions from the whole-class big books and guided reading sessions, followed by some open questions to test their comprehension of the skill and why it might be important to know. Before starting this test, the teacher is to read the book Teacup by Rebecca Young and Matt Ottley to the students.
Appendix 2: Assessment Task Materials in Full
Pre-test:
PowerPoint slides that guided the pre-test questions:
Post-assessment:
Book cover:
Rubric for pre-test results:

Rubric for post-test results:

The two rubrics are similar in that they assess the same overall skills, with minor tweaks to suit the style of the pre- and post-tests.
Formative Assessment samples from workbooks:
Evidence: 3x student assessment work samples and feedback

Student 7 work sample and feedback: (Go back)


Student 7 feedback: Go back

Transcript of feedback:

____, I really like the level of detail you used when explaining the clues that helped you to make
predictions. Maybe next time you could challenge yourself by taking quotes from the text to include
as evidence and describe the pictures in more detail using your WOW words! Great effort!
Student 1 work sample:
Student 1 feedback:

Transcript of feedback:

____, I really liked the amount of detail you used to describe your clues and I can really tell
you understand prediction well because of how you explained it. Maybe next time to
challenge yourself, try and think, "What other details can I predict?" Spectacular work!
Student 9 work sample:
Work sample student 9 part 2 (go back):
Student 9 feedback:

Transcript of feedback:

____, I can really tell that you understand why we predict and what clues can help us.
However, next time to challenge yourself a bit more I think you should try and explain the
clues that helped you to predict in more detail. Great effort!
Appendix 3: Peer and Mentor feedback on proposed assessment design

Initial assessment approach feedback from AT 2, Part A:

Transcript of feedback:

In the initial discussion with Laura about her assessment, she suggested the students create a
mind map about what they understood about prediction. This is a good idea, however she
would need to have set questions to guide them of what they were expected to show. We also
discussed that some students may write very little, not because it was a reflection of their understanding, but because mind maps are very open & sometimes unclear. The students in this group are mostly above the expected level; with set questions Laura would get a better understanding of where they are starting from. We decided to include pictures & reading text. Laura took on this feedback well & immediately changed her pre-test to better cater to the students' abilities, to get a clearer starting-point assessment Signed Mentor Teacher

Mentor feedback from week 6 (1st week of placement) (go back)

Transcript of mentor feedback:

When discussing assessment, we talked about what information she wanted to get, the level of the students, what they may already know and how to implement the assessment to get the desired outcome. After this discussion she planned and implemented a clear pre-assessment with clear expected outcomes. Next time I would order the questions with no. 1 starting at the top left and then continue in order; otherwise students may complete the wrong questions and their understanding wouldn't be shown on the test Signed Mentor Teacher

Mentor feedback from week 7 (go back)

Transcript of Mentor feedback:


We discussed that the role play task was better as a class learning activity rather than an assessment. This is because when they complete the role play you are unable to tell whose ideas were whose and at what level each individual understands. This is why the individual written form was chosen, also as a simpler form to keep track of, refer to and score Signed Mentor Teacher.

Relief teacher feedback from end of week 2, after post assessment (go back)

Transcript of supervising relief teacher feedback:

The assessment worked well. I liked the fact that the children were able to give their own ideas about why we make predictions; most gave some reasonable explanations. The assessment had a good mix of open questions, and children demonstrated their understanding by listing clues they may find in books. The 2nd side was more specific to the book; this monitored the understanding of the text.
Peer feedback Rubric & notes from class:

Transcript of overall feedback:

Perhaps too much text on slides


Transcript of notes:
Feedback suggestions: attempt to incorporate peer assessment; be more explicit about learning outcomes; give more learning context of the classroom, guided reading, and rotations.

Appendix 4: Reflections on lesson plans/day

Week 4 notes on feedback:


Notes made on focus students: