LESSON PROPER:
HOW TO DEVELOP A RUBRIC
Step 1: Determine the type of rubric you wish to use - holistic or analytic (Carriveau, 2010).
Step 2: Identify what you want to assess. These form the criteria for the assessment. These are
usually part of the description of the assignment or task.
Step 3: Identify the characteristics to be rated (rows).
Specify the skills, knowledge, and/or behaviors that you will be looking for.
Limit the characteristics to those that are most important to the assessment.
Step 4: Identify the levels of mastery/scale (columns).
Tip: Aim for an even number (I recommend 4) because when an odd number is used, the
middle tends to become the "catch-all" category.
Step 5: Describe each level of mastery for each characteristic (cells).
Describe the best work you could expect using these characteristics. This
describes the top category.
Describe an unacceptable product. This describes the lowest category.
Develop descriptions of intermediate-level products for intermediate
categories. Important: Each description and each category should be mutually
exclusive.
Focus your descriptions on the presence of the quantity and quality that you
expect, rather than on the absence of them. However, at the lowest level, it would
be appropriate to state that an element is "lacking" or "absent" (Carriveau, 2010).
Keep the elements of the description parallel from performance level to
performance level. In other words, if your descriptors include quantity, clarity,
and details, make sure that each of these outcome expectations is included in
each performance-level descriptor.
Step 6: Try out the rubric.
Apply the rubric to an assignment.
Share with colleagues.
Tip: Faculty members often find it useful to establish the minimum score needed for the student
work to be deemed passable. For example, faculty members may decide that a "1" or "2" on a 4-
point scale (4 = exemplary, 3 = proficient, 2 = marginal, 1 = unacceptable) does not meet the
minimum quality expectations. They may set their criterion for success as: 90% of the students
must score 3 or higher. If assessment study results fall short, action will need to be taken.
Step 7: Discuss with colleagues. Review feedback and revise.
Important: When developing a rubric for program assessment, enlist the help of colleagues.
Rubrics promote shared expectations and grading practices, which benefit faculty members and
students in the program.
DIRECTIONS FOR RUBRIC CALIBRATION
Below are directions for the rubric calibration process that are provided on the University of
Hawai'i at Manoa assessment website (Hawai'i, 2012).
Suggested materials for a scoring session:
Copies of the rubric
Copies of the "anchors": pieces of student work that illustrate each level of
mastery. Suggestion: have 6 anchor pieces (2 low, 2 middle, 2 high)
Score sheets
Extra pens, tape, post-its, paper clips, stapler, rubber bands, etc.
Hold the scoring session in a room that:
Allows the scorers to spread out as they rate the student pieces
Has a chalk or white board
Process:
1. Describe the purpose of the activity, stressing how it fits into program assessment plans.
Explain that the purpose is to assess the program, not individual students or faculty, and
describe ethical guidelines, including respect for confidentiality and privacy.
2. Describe the nature of the products that will be reviewed, briefly summarizing how they
were obtained.
3. Describe the scoring rubric and its categories. Explain how it was developed.
4. Analytic: Explain that readers should rate each dimension of an analytic rubric
separately, and they should apply the criteria without concern for how often each score
(level of mastery) is used. Holistic: Explain that readers should assign the score or
level of mastery that best describes the whole piece; some aspects of the piece may not
appear in that score and that is okay. They should apply the criteria without concern for
how often each score is used.
5. Give each scorer a copy of several student products that are exemplars of different
levels of performance. Ask each scorer to independently apply the rubric to each of
these products, writing their ratings on a scrap sheet of paper.
6. Once everyone is done, collect everyone's ratings and display them so everyone can see
the degree of agreement. This is often done on a blackboard, with each person in
turn announcing his/her ratings as they are entered on the board. Alternatively, the
facilitator could ask raters to raise their hands when their rating category is announced,
making the extent of agreement very clear to everyone and making it very easy to
identify raters who routinely give unusually high or low ratings.
7. Guide the group in a discussion of their ratings. There will be differences. This discussion
is important to establish standards. Attempt to reach consensus on the most appropriate
rating for each of the products being examined by inviting people who gave different
ratings to explain their judgments. Raters should be encouraged to explain by making
explicit references to the rubric. Usually consensus is possible, but sometimes a split
decision is developed, e.g., the group may agree that a product is a "3-4" split because it
has elements of both categories. This is usually not a problem. You might allow the group
to revise the rubric to clarify its use, but avoid allowing the group to drift away from the
rubric and learning outcome(s) being assessed.
8. Once the group is comfortable with how the rubric is applied, the rating begins.
Explain how to record ratings using the score sheet and explain the procedures.
Reviewers begin scoring.
9. If you can quickly summarize the scores, present a summary to the group at the end of
the reading. You might end the meeting with a discussion of five questions:
Are results sufficiently reliable?
What do the results mean? Are we satisfied with the extent of students'
learning?
Who needs to know the results?
What are the implications of the results for curriculum, pedagogy, or student
support services?
How might the assessment process, itself, be improved?
TIPS FOR DEVELOPING A RUBRIC
Find and adapt an existing rubric. It is rare to find a rubric that is exactly right for your
situation, but you can adapt an already existing rubric that has worked well for others
and save a great deal of time. A faculty member in your program may already have a
good one.
Evaluate the rubric. Ask yourself:
o Does the rubric relate to the outcome(s) being assessed?
o Does it address anything extraneous? (If yes, delete.)
o Is the rubric useful, feasible, manageable, and practical? (If yes, find multiple
ways to use the rubric, such as for program assessment, assignment grading, peer
review, student self-assessment.)
Benchmarking - collect samples of student work that exemplify each point on the scale
or level. A rubric will not be meaningful to students or colleagues until the
anchors/benchmarks/exemplars are available.
Anticipate that you will be revising the rubric.
Share effective rubrics with your colleagues.
SUMMARY:
After making the rubric, hand out the rubric to your students along with the assignment. Discuss
in class how it will be used. When students turn in their completed work, grade it using the
rubric. (Make copies of the rubric so that you are filling one out for each student.) Circle
appropriate items on the rubric when grading, add comments if necessary, and add up the points.
Make photocopies of the completed rubrics. Staple a completed (filled-out) rubric to each
assignment and return the work to students. Analyze and reflect on the results. Were there any
dimensions/primary traits that had low overall scores? Focus your improvement efforts on those
aspects. Report on your results and the improvements made as a result of the assessment.
EVALUATION:
Directions: Read each statement below carefully. Place a T on the line if you think the statement is
TRUE. Place an F on the line if you think the statement is FALSE.
___________1. Rubrics do not promote shared expectations and grading practices which benefit
teachers and students in the program.
___________2. When developing a rubric for program assessment, you should be the only one
responsible for developing it.
___________3. It is rare to find a rubric that is exactly right for your situation, but you can adapt
an already existing rubric that has worked well for others and save a great deal of time.
___________4. In making a group rubric, always attempt to reach consensus on the most
appropriate rating for each of the products being examined by inviting people who gave different
ratings to explain their judgments.
___________5. Give each scorer a copy of several student products that are exemplars of
different levels of performance.
REFERENCES:
Carriveau, R. (2010). Connecting the Dots. Denton, TX: Fancy Fox Publications, Inc.
Rogers, G. (2011, July). Best practices in assessing student learning. The Institute on
Quality Enhancement and Accreditation. Fort Worth, Texas, USA: Southern
Association of Colleges and Schools Commission on Colleges.
University of Hawai'i. (2012, August 22). Assessment. Retrieved from
http://www.manoa.hawaii.edu/assessment/
--------------------------------NOTHING FOLLOWS--------------------------------