
Module 001 Nature and Roles of Assessment

At the end of this module you are expected to:


1. Identify terms related to the nature and roles of assessment
2. Determine the difference as well as the relationship among the processes of measurement, testing, assessment and evaluation
3. Analyze basic concepts and principles of the nature and roles of assessment

Nature and Roles of Assessment

Assessment lies at the core of the learning process. It is done to gauge and enhance student learning. Teachers, students and stakeholders of education should understand what assessment really is all about, why it is needed and how it is connected to measurement, testing and evaluation.

Assessment is an important element in the curriculum development process. It is used in determining the learning needs of students, monitoring their progress and examining their performance against the identified student learning outcomes. Assessment is implemented at the following phases of instruction:

• Pre-assessment (before instruction)
• Formative assessment (during instruction)
• Summative assessment (after instruction)

According to De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., it is imperative that educators are aware of the emphasis of outcome-based education (OBE) in terms of assessment, because the Commission on Higher Education (CHED) requires OBE to be implemented across all programs.

What is an outcome-based curriculum all about? According to OBE Principles and Process (n.d.), retrieved August 28, 2018, from http://cei.ust.hk/teaching-resources/outcome-based-education/institutional-resources/obe-principles-and-process#1, outcome-based education refers to the process of clearly focusing and organizing everything in the system of education around what is important for all students to be able to do successfully at the end of their learning experiences.
Course Module

Likewise, according to OBE Principles and Process (n.d.), retrieved August 28, 2018, from http://cei.ust.hk/teaching-resources/outcome-based-education/institutional-resources/obe-principles-and-process#1, an outcome-based education curriculum means starting with a clear picture of what is relevant for students to be able to do, then organizing the curriculum, instruction and assessment so as to ensure that the said learning ultimately happens. The OBE curriculum has the following principles according to the same source:
1. Clarity of focus
What teachers do should be focused on what they want students to know, understand and be able to do. The focus of teachers should be on helping students develop the knowledge, skills and attitudes which will enable them to achieve the intended outcomes that have been clearly articulated.

2. Designing down
Curriculum design should start with a clear definition of the intended outcomes which students are to achieve by the end of the program. Once this has been done, all instructional decisions are made to ensure achievement of the desired end result.

3. High expectations
In order to encourage students to engage deeply in what they are learning, teachers should establish high and challenging performance standards. Successful learning promotes more successful learning, so students should be helped to achieve high standards.

4. Expanded opportunities
Teachers should strive to provide expanded opportunities for all students. This is based on the idea that not all learners can learn the same thing in the same time and in the same way. If students are given appropriate opportunities, they can achieve high standards.

Measurement, Testing, Assessment and Evaluation

MEASUREMENT
According to De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., page 2, measurement comes from the Old French word mesure, meaning "limit or quantity". Measurement is basically a quantitative description of a characteristic or attribute of an object. In education, measurement concerns what teachers measure and what instruments they use in measuring. Teachers are interested in determining how much learning a student has acquired compared to a standard or criterion, or in reference to other students in a group (norm-referenced). Teachers usually measure particular elements of learning like students' readiness to learn, demonstration of specific skills, recall of facts, or their ability to analyze and solve applied problems. To obtain pertinent information, teachers utilize tools or instruments like oral presentations, written reports, portfolios and rubrics.
TESTING

What is testing? According to Testing Definition in the Cambridge English Dictionary (n.d.), retrieved August 28, 2018, from https://dictionary.cambridge.org/us/dictionary/english/testing, testing refers to the practice or act of giving tests so as to measure someone's knowledge or ability, while according to De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., pp. 2-3, testing refers to a systematic and formal procedure for gathering information. A TEST is a tool which consists of a set of questions given during a fixed period of time under comparable conditions for all students. It is utilized to measure a construct and make decisions. Educational tests may be used to measure the progress of students' learning; such tests are formative in purpose, or summative when a comprehensive test covers a more extended time frame. Teachers score tests in order to obtain numerical descriptions of students' performance.

According to the same source, the test is the most dominant form of assessment. It is a traditional assessment and may not be the best way to measure how much students have learned, but tests still provide valuable information about the learning and progress of students.
What are the purposes of a test? According to Francisco, M. (2015, December 07), Types of Test, retrieved August 28, 2018, from https://www.slideshare.net/ManilynFrancisco/type-of-test-55896544, the purposes of a test are the following:
1. To assess students' understanding of a given topic within a subject and determine what they have learned;
2. To evaluate the progress of students in a subject within a given period of time;
3. To assess the strengths and weaknesses of students for focused assistance or individual instruction;
4. To identify who will receive awards, scholarships and recognition;
5. To provide a basis of qualification for entry into a program, school, higher education, scholarship or internship;
6. To gain college credit (advanced placement exams);
7. To measure the effectiveness of a teacher or school, as in teacher evaluation and high-stakes testing.

ASSESSMENT
According to The Role of Measurement and Assessment in Teaching (n.d.), retrieved August 28, 2018, from https://www.geneseo.edu/sites/default/files/sites/education/p12resources-assessment-informs-teaching.pdf, assessment refers to procedures used to gain information about students' learning and to form value judgments concerning learning progress. The general principles of assessment, according to the same source, are the following:

1. Clearly stating what is to be assessed is the highest priority.
2. An assessment procedure should be selected because of its relevance to the characteristics or performance to be measured.
3. Comprehensive assessment requires a variety of procedures.
4. Proper use of assessment procedures requires an awareness of their limitations.
5. Assessment is a means to an end, not an end in itself. Assessment information aids us in making decisions about students, instruction and curriculum. It should never be utilized just to give a grade, even when grading is one of its purposes. Everyone should know and understand what the purpose of a specific type of assessment is.

EVALUATION

According to Differences between Testing, Assessment, and Evaluation (n.d.), retrieved August 28, 2018, from http://tutorials.istudy.psu.edu/testing/testing2.html, evaluation refers to the process of making judgments based on criteria and evidence.

Likewise, according to De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., p. 10, evaluation comes in after the data have been collected from an assessment task. It is the process of judging the quality of a performance or course of action. Assessment data gathered by the teacher have to be interpreted in order to make sound decisions about the students and the teaching-learning process. Evaluation is carried out by both the teacher and his students to uncover how the learning process is developing.

RELATIONSHIP AMONG MEASUREMENT, TEST AND EVALUATION


According to Bhat, T. R. (2017, February 15), Introduction to test, measure and evaluation, retrieved August 29, 2018, from https://www.slideshare.net/numbjzz/introduction-to-test-measure-and-evaluation, the three terms measurement, test and evaluation are interrelated. The test is the specific instrument for measurement, and test administration is a measurement process; without a test, measurement is not possible. Measurement is a technique necessary for evaluation. It represents the status of certain properties or attributes and is a terminal process. Measurement describes a situation, and evaluation judges its worth or value. Measurement is a technique of evaluation, and a test is a tool of measurement. The terms measurement, test and evaluation are thus distinct but related. A teacher obtains measures from tests so as to make a fair evaluation about a specific characteristic or trait of the students. An evaluation involves one or more tests, and a test involves one or more measurements.
[Figure: Venn diagram showing the overlap of tests, measurements and evaluation, with numbered areas 1-5 as discussed below.]

According to De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., page 10, tests provide quantitative measures, and the results of tests may or may not be used for evaluation. In the figure above, area 1 is evaluation which does not involve tests or measurements. An example is the utilization of qualitative descriptions to describe students' performance. Observations are non-test procedures which can be used to diagnose learning problems among students. Area 2 is the non-test measures for evaluation. An example of a non-test measure for evaluation used by teachers in assigning grades is ranking. Area 3 is where all three converge. Examples of these are teacher-made tests. Area 4 pertains to non-evaluative test measures. Examples of these are test scores utilized in correlational studies. And area 5 pertains to measures which are non-test and non-evaluative, like assigning numerical codes to responses in a research study. Nominal scales used in labeling educational attainment are an example of these.
Assessment: Nature and Purposes

According to De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., pp. 7-10, the term assessment comes from the Latin word assidere, which means "to sit beside a judge." It signifies that assessment is tied up with evaluation. What is the other meaning of assessment? Aside from the definitions given beforehand, assessment also refers to the collection of information about the performance of learners using different methods and tools. It can be classified according to the following categories:
1. Maximum performance
When students are motivated to perform well, maximum performance is achieved. Results of assessments of maximum performance show what students can do at their level best: their achievements and abilities. Students are encouraged to aim for a high score in this category. Examples of measures of maximum performance are aptitude and achievement tests.

2. Typical performance
A typical performance measure manifests what students will do or choose to do. It assesses how the ability of the learner is demonstrated on a regular basis. It is more focused on the learner's level of motivation rather than his optimal ability. Measures for this are interest, attitude and personality inventories, peer appraisals and observation techniques, which give insights into the learner's interests, personality traits and potential career preferences.

According to the same source, the purposes of assessment are the following:
1. Assessment for Learning
It pertains to assessment tasks which are diagnostic and formative in nature and which are utilized to identify learning needs, monitor students' academic progress during a unit or block of instruction, and guide instruction. On-going and immediate descriptive feedback concerning their performance is given to students. Based on assessment results, teachers can make adjustments, when necessary, in their methods of teaching and strategies to support learning. They can decide whether there is a need to differentiate instruction or to design learning activities which are more appropriate so as to clarify and consolidate students' knowledge, understanding and skills.
Examples:
• Pre-tests
• Quizzes
• Written assignments
• Focused questions
• Concept maps
2. Assessment as Learning
It employs activities or tasks which provide students with opportunities to monitor and further their own learning: to think about their personal learning habits and how to adjust their learning strategies so as to achieve their goals. Metacognitive processes like self-regulation and self-reflection are involved here so as to allow students to use their strengths and work on their weaknesses by directing and regulating their learning. Students are accountable and responsible for their own learning. It is also formative, and may be given at any phase of the learning process.
Examples:
• Self-assessment rubrics
• Peer assessment rubrics
• Portfolios

3. Assessment of Learning
It is summative in nature. It is done at the end of a task, unit, process or period. Its purpose is to provide evidence of the level of achievement of students in relation to curricular outcomes. It is used for purposes of grading, evaluation and reporting. The students concerned, their parents and other stakeholders are given evaluative feedback. Assessment of learning provides the foundation for decisions on the placement and promotion of students.
Examples:
• Unit tests
• Final projects

RELEVANCE OF ASSESSMENT
In the same source, it was discussed that assessment has importance and works for the following:
1. Students
Students become actively engaged in the process of learning through the different constructive and learner-centered assessment tasks. Because of these, students take responsibility for their own learning and also learn to monitor changes in their learning patterns with the guidance of the teacher. Such assessments lead to better student achievement.

2. Teachers
Assessment gives teachers information about the knowledge and performance base of students. Because of it, the most effective teaching methods and approaches can be revealed. Direction is provided as to how teachers can help learners more and what they should do next. Assessment procedures also support instructors' decisions on managing instruction, assessing students' competence, placing students into levels of education programs, assigning grades to learners, guidance and counseling, certifying competence, and selecting students for education opportunities.

3. Parents
Education is a shared partnership. Parents should be involved in the assessment process. They are a valued source of assessment information on the learning habits and educational history of their children, especially for preschoolers who do not yet understand their developmental progress. Children's needs for appropriate intervention are also identified through assessment data.

4. Administrators and program staff
Administrators and school planners utilize assessment in identifying the strengths and weaknesses of a program. They are able to designate program priorities, assess options and lay down plans for improvement. Assessment data are utilized to make decisions regarding the promotion or retention of students and arrangements for faculty development programs.
5. Policymakers
Assessment provides information about the achievements of students, which in turn reflects the quality of education being provided by the school. With the said information, different government agencies can set or modify standards, give rewards or sanctions to schools, and direct educational resources.
Results of assessment also serve as a basis for the formulation of new laws. A good example is RA 10533, or the K to 12 Enhanced Basic Education Act of 2013, wherein one of the rationales for the implementation of the said law was the low scores obtained by Filipino pupils in standardized tests like the National Achievement Test (NAT) and the Trends in International Mathematics and Science Study (TIMSS).

Types of Test

According to De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., pp. 3-5, tests can be classified according to the following:
A. Mode of response
1. Oral test - Answers to this test are spoken. Oral communication skills can be measured using this type of test. Students' understanding of theories, concepts and procedures may be checked using this.

2. Written tests - Students select or provide a response to a prompt. A written test has its positive points and can be administered to a large group at one time. Students' written communication skills can be measured by written tests. Lower and higher cognition levels can be assessed by using written tests, provided that questions are phrased appropriately. Assessment of a wide range of topics is enabled through this type of test.
FORMS OF WRITTEN TEST:
• Multiple choice
• True or false (alternate response)
• Matching type
• Completion
• Identification
• Short answer
• Essays
3. Performance tests - These are usually called performance assessments. These require learners to demonstrate their ability or skills to perform specific actions. These include inquiry tasks, problem-based learning, exhibits, demonstration tasks, presentation tasks and capstone performances. These tasks are designed to be meaningful, authentic, in-depth and multidimensional. Some of the drawbacks are cost and efficiency.

B. Ease of quantification of response
1. Objective tests - An objective test can be corrected and quantified easily. Items in the test have a specific or single convergent response. It includes multiple choice, completion, true-false and matching type.

2. Subjective tests - Varied responses are elicited by a subjective test. Here, a certain question may have more than one answer. These include extended and restricted-response essays. This type of test is not easy to check because students have the liberty to write their own answers to a test question. Test answers are usually different. Scores in these tests are usually influenced by the personal opinion or judgment of the person doing the scoring.
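The distinction above can be made concrete: because each objective item has a single convergent response, scoring reduces to comparing every answer against a fixed key. The following Python sketch is purely illustrative (the function name, key and response data are not from the source):

```python
def score_objective_test(answer_key, responses):
    """Score an objective test by item-by-item comparison with the key.

    answer_key and responses are lists of option letters, one per item.
    Each objective item has exactly one correct (convergent) answer,
    so no scorer judgment is involved.
    Returns (number of correct items, percentage score).
    """
    if len(answer_key) != len(responses):
        raise ValueError("response sheet must cover every item")
    correct = sum(1 for key, resp in zip(answer_key, responses) if key == resp)
    return correct, 100.0 * correct / len(answer_key)

# Example: a hypothetical 5-item multiple-choice quiz
key = ["B", "D", "A", "C", "B"]
student = ["B", "D", "C", "C", "B"]
raw, pct = score_objective_test(key, student)
print(raw, pct)  # 4 items correct -> 80.0 percent
```

A subjective test cannot be scored this way; essay responses need a rubric and human judgment, which is why the two categories differ in ease of quantification.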

C. Mode of administration
1. Individual test - It is a test given to one person at a time. Individual cognitive and achievement tests are used to gather extensive information about each student's cognitive functioning and his ability to process and perform specific tasks.

2. Group test - It is administered to a group of examinees or a class of students simultaneously. It was developed to address the practical need of testing many examinees at once.

D. Test constructor
1. Standardized tests - These tests are prepared by specialists who are versed in assessment principles. These are administered to a large group of examinees or students under similar conditions. Scoring procedures and interpretation are consistent. Manuals and guides are available to aid in the administration and interpretation of test results.

2. Non-standardized tests - These tests are usually prepared by teachers and usually administered to one or a few classes to measure achievement in a course or subject.

E. Mode of interpreting results
1. Tests that yield norm-referenced interpretations - In these tests, the performance of students is measured in relation to the performance of a group on the same tests. Comparisons are made and the student's relative position is determined.

2. Tests that allow criterion-referenced interpretations - In these tests, the performance of a student is described against an agreed-upon or pre-established performance level or criterion. The criterion is the domain of subject matter, that is, the range of well-defined instructional outcomes or objectives.
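The same raw score can be read both ways, which makes the contrast between the two modes of interpretation easy to see. The sketch below is illustrative only (the class scores and the 75-point criterion are hypothetical, not from the source): a norm-referenced reading locates the score within a group, while a criterion-referenced reading compares it to a pre-established performance level.

```python
def norm_referenced_rank(score, group_scores):
    """Norm-referenced reading: percentage of the group the score exceeds."""
    below = sum(1 for s in group_scores if s < score)
    return 100.0 * below / len(group_scores)

def criterion_referenced(score, criterion=75):
    """Criterion-referenced reading: compare to a pre-established level."""
    return "meets criterion" if score >= criterion else "below criterion"

# Hypothetical class scores on the same test
scores = [55, 60, 62, 68, 70, 72, 74, 78, 85, 92]
student = 72

# Norm-referenced: where does 72 stand within this group?
print(norm_referenced_rank(student, scores))  # exceeds 50.0 percent of the group

# Criterion-referenced: does 72 reach the agreed-upon mastery level of 75?
print(criterion_referenced(student))  # below criterion
```

Note that the student looks average under the norm-referenced reading but has not yet reached mastery under the criterion-referenced one; the interpretation, not the score, differs.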

F. Nature of answer
1. Personality tests - These tests measure one's personality and behavioral style. There are no right or wrong answers in these tests. They are utilized in recruitment as they help employers determine how a potential employee will respond to different work-related activities.

2. Achievement tests - These measure students' learning as a result of instruction and training experiences. They serve as a basis for promotion to the next level or grade.

3. Aptitude tests - These tests determine the potential of students to learn and do new tasks. They aid in choosing the best line of work for an individual based on his interests and skills.

4. Intelligence tests - The learner's mental ability or innate intelligence is measured in these tests.

5. Sociometric tests - Interpersonal relationships in a social group are measured in these tests.

6. Trade or vocational tests - These tests assess the knowledge, skills and competence of an individual in a particular occupation. Trade tests may consist of a practical test and a theory test.

Roles of Assessment

According to De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., pp. 21-24, the roles of assessment are the following:
1. Placement assessment
It is basically utilized to determine a learner's entry performance. Assessment is done by the teacher at the beginning of instruction, through a pre-test, to check whether students possess the prerequisite skills needed prior to instruction. The teacher can provide learning experiences to help students when the prerequisite skills are insufficient. If the students are ready, the teacher can proceed with the planned instruction. Example: an arithmetic test for students who are about to take elementary algebra.
It is also utilized to see if the students have already acquired the intended outcomes. The contents of a placement pre-test are items which measure students' knowledge and skills in reference to the learning targets.

2. Formative assessment
It mediates the teaching and learning process. It is both teacher-centered and learner-centered. It occurs during instruction and is context-specific. It is utilized as feedback to enhance teaching and improve the learning process. Formative assessment results are recorded for the purpose of monitoring students' learning progress.

3. Diagnostic assessment
The intention of diagnostic assessment is to identify learning difficulties during instruction. Commonly held misconceptions in a subject can be detected using a diagnostic test. It is utilized to detect causes of persistent learning difficulties despite the pedagogical remedies applied by the teacher, and it is not part of students' marks of achievement.

4. Summative assessment
It is done at the end of instruction to determine the extent to which students have attained the learning outcomes. It is utilized for assigning and reporting grades or used to certify mastery of concepts and skills. A written examination at the end of the school year to determine who passes and who fails is an example of this assessment.
Module 002 Appropriateness and Alignment of
Assessment Methods to Learning Outcomes

At the end of this module you are expected to:

1. Identify terms related to the appropriateness and alignment of assessment methods to learning outcomes
2. Analyze basic concepts, principles and procedures in identifying learning outcomes, taxonomies of learning domains, learning targets and assessment methods
3. Determine the different levels, processes and action verbs that may be used for the different taxonomies of learning domains
4. Examine the different types of assessment methods

Appropriateness and Alignment of Assessment Methods to Learning Outcomes


Assessment in the classroom begins with asking the reason for assessment. After this, what to assess should be the next question to answer. This pertains to the student learning outcomes: what the teachers would like students to know and be able to do after taking up a certain unit or section. After defining the outcomes or targets, the next thing to answer is how the teacher will assess the said targets. To provide students with opportunities which are rich in breadth and depth and to promote deep understanding, the assessment tools which measure the said learning outcomes should be parallel to those outcomes or targets.

Identifying Learning Outcomes

According to De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., p. 34, a learning outcome refers to a particular level of knowledge, skills and values which a student has acquired at the end of a period or unit of study as a result of his engagement in a set of meaningful and appropriate learning experiences. An organized set of learning outcomes enables teachers to plan and deliver appropriate instruction and to design valid assessment tasks and strategies.
The Assessment Process (n.d.), retrieved September 9, 2018, from https://www.missouristate.edu/assessment/89380.htm, cited the four steps in the assessment cycle and these are the following:
1. Clearly define and identify learning outcomes.

Three to five learning outcomes should be formulated for each unit so as to describe what students should be able to do, know and appreciate.

2. Select appropriate assessment measures and assess the learning outcomes.
Ways of assessing the learning outcomes are selected and utilized. It is recommended to focus on direct measures of learning. More often than not, student performance levels for each outcome are described and assessed with the use of rubrics. It is relevant to identify how the data will be collected and who will be responsible for the collection of data.

3. Analyze the results of the outcomes assessed.
It is significant to analyze and report the assessment results in a meaningful way.
4. Adjust or improve programs following the results of the learning outcomes assessed.
This is a critical step of the assessment process. Adjustments or improvements should be made based on the results of the assessment process. The said results should be utilized to improve teaching and learning.

In step number 1, it was clearly stated that learning outcomes should be clearly defined and identified. What is the purpose of identifying them? According to Identifying Learning Outcomes and Selecting Assessment Tasks (n.d.), retrieved September 9, 2018, from https://teachingcommons.yorku.ca/resources/elearning/identifying-learning-outcomes-and-selecting-assessment-tasks/, its purpose is to express the desired results of a particular learning experience. Once learning outcomes are identified, teachers need to decide how to ask their students to evidence their learning through assessment tasks. Assessment tasks refer to activities students will undertake, during and after the process of learning, so as to confirm whether or not the outcome has in fact been achieved. The said activities are used to tell how well students are learning in relation to the stated learning outcomes and to provide students with feedback. One of the most relevant determinants of assessment tasks is that the assessment has to be consistent with the identified learning outcomes.

What are the characteristics of student learning outcomes? According to Taking Aim at Student Learning (n.d.), retrieved September 9, 2018, from https://vp.studentlife.uiowa.edu/assets/Taking-Aim-Presentation.pdf, the desired characteristics of student learning outcomes are the following:
• Align with the goals of the division, department and institution
• Describe a behavior which is
✓ Specific
✓ Meaningful
✓ Measurable
✓ Attainable
• Describe a single behavior
• Describe knowledge, skills and habits of mind
Taxonomy of Lear ning Domains

Learning outcomes are statements of performance expectations. Cognitive, affective and psychomotor are the domains of learning, characterized by change in learning behavior. According to Taxonomies of Learning (n.d.), retrieved September 9, 2018, from https://bokcenter.harvard.edu/taxonomies-learning, in designing learning objectives, the types of work which students will do to demonstrate the achievement of the desired outcomes should be planned and thought of.

According to Three Domains of Learning - Cognitive, Affective, Psychomotor (n.d.), retrieved September 9, 2018, from https://thesecondprinciple.com/instructional-design/threedomainsoflearning/, the said three domains (cognitive, affective and psychomotor) should be known and utilized by the teacher in constructing the lesson and, of course, in assessing the students' learning.

A. COGNITIVE

According to Learning Domains - Student Life Learning & Assessment (n.d.), retrieved September 9, 2018, from https://www.emporia.edu/studentlife/learning-and-assessment/guide/domains.html, the cognitive domain deals with how we acquire, process, and utilize knowledge. It is the "thinking" domain.
A newer version was spearheaded by one of Bloom's former students, Lorin Anderson, and Bloom's original partner, David Krathwohl. The two highest forms of cognition have been reversed. In the new version, the steps change to verbs: remembering, understanding, applying, analyzing, evaluating and, the highest and last function, creating.

Bloom's Taxonomy - Revised

According to Revised Bloom's Taxonomy (n.d.), retrieved September 9, 2018, from http://www.celt.iastate.edu/teaching/effective-teaching-practices/revised-blooms-taxonomy, a statement of a learning objective contains a verb and an object. The cognitive process dimension represents a continuum of increasing cognitive complexity, from remember to create.
To understand the contents of the revised Bloom's taxonomy better, Table 1, drawn from the same source and from De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., page 35, may be considered and utilized:

Table 1 - Cognitive Levels, Processes and Sample Action Verbs

1. Remembering
   Processes: recognizing (identifying); recalling (retrieving)
   Action verbs: define, identify, describe, label, list, match, outline, name, select, state, reproduce

2. Understanding
   Processes: interpreting (paraphrasing, clarifying, representing, translating); classifying (categorizing); exemplifying (illustrating); summarizing (generalizing, abstracting); comparing (mapping, matching, contrasting); inferring (predicting, concluding); explaining (constructing models)
   Action verbs: describe, convert, estimate, distinguish, extend, paraphrase, summarize, rewrite, generalize

3. Applying
   Processes: implementing (using); executing (carrying out)
   Action verbs: change, apply, compute, classify (examples of a concept), demonstrate, modify, discover, predict, relate, show, prepare, operate

4. Analyzing
   Processes: differentiating (distinguishing, discriminating, selecting, focusing); organizing (finding coherence, outlining, integrating, structuring); attributing (deconstructing)
   Action verbs: arrange, analyze, compare, associate, infer, contrast, organize, support (a thesis)

5. Evaluating
   Processes: critiquing (judging); checking (detecting, monitoring, testing, coordinating)
   Action verbs: compare, appraise, contrast, conclude, criticize, judge, evaluate, support (a judgment), verify, justify

6. Creating
   Processes: planning (designing); generating (hypothesizing); producing (constructing)
   Action verbs: create, construct, extend, generate, formulate, synthesize, classify (infer the classification system)

B. AFFECTIVE

According to Learning Domains - Student Life Learning & Assessment. (n.d.). Retrieved September 9, 2018, from https://www.emporia.edu/studentlife/learning-and-assessment/guide/domains.html, the affective domain deals with values, attitudes and emotions. It is the "valuing" domain.

Figure 2 - Krathwohl's Taxonomy of Affective Learning (levels shown: Receiving, Responding, Valuing, Organizing, Internalizing)

Krathwohl's Taxonomy of Affective Learning. (n.d.). Retrieved September 9, 2018, from https://www.researchgate.net

To understand better, Table 2 contains the levels, descriptions and sample action verbs that may be used in writing learning outcomes. Contents are from De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., page 37 and Learning Taxonomy - Krathwohl's Affective Domain. (n.d.). https://global.indiana.edu/documents/Learning-Taxonomy-Affective.pdf.

Table 2 - Affective Levels, Descriptions and Sample Action Verbs

1. Receiving
   Description: refers to the student's willingness to attend to something in the environment, ranging from simple awareness that a thing exists to selective attention on the part of the learner; it represents the lowest level of learning outcomes in the affective domain.
   Action verbs: choose, ask, follow, describe, identify, locate, give, hold, point to, name, select, use, reply

2. Responding
   Description: students actively participate; the student participates in and reacts to something in the environment, and shows some new behavior as a result of an experience.
   Action verbs: assist, answer, conform, comply, greet, help, label, perform, present, practice, recite, report, write, select, tell

3. Valuing
   Description: concerned with the value or worth a student attaches to a particular object, phenomenon, or behavior; this ranges in degree from the simpler acceptance of a value to the more complex level of commitment (assuming responsibility); learning outcomes in this area are concerned with behavior that is stable and consistent enough to make the value clearly identifiable.
   Action verbs: describe, complete, differentiate, follow, form, explain, invite, initiate, justify, read, report, share, select, work, study

4. Organizing
   Description: the concern of this level is bringing together different values, resolving conflicts between them, and beginning to build an internally consistent value system; comparing, relating, and synthesizing values are the focus; learning outcomes may be concerned with the conceptualization of a value or with the organization of a value system; instructional objectives relating to the development of a philosophy of life fall into this category.
   Action verbs: alter, adhere, combine, arrange, explain, defend, generalize, integrate, identify, order, modify, prepare, relate, organize, synthesize

5. Internalizing values (characterization by a value or value complex)
   Description: acting consistently with the new value; learning outcomes at this level cover a broad range of activities, but the major emphasis is on behavior that is typical and characteristic of the student.
   Action verbs: discriminate, display, act, influence, modify, listen, practice, perform, qualify, question, solve, serve, use, verify
C. PSYCHOMOTOR

According to De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., p. 36, the focus of the psychomotor domain is physical and mechanical skills involving brain coordination and muscular activity. It answers the question of what actions teachers want students to be able to perform. It deals with physical or manual skills.

To understand better, Table 3 contains the levels, definitions and sample action verbs that may be used in writing learning outcomes under the psychomotor domain. Contents are from De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., page 36 and Psychomotor Domain. (n.d.). Retrieved September 9, 2018, from http://users.rowan.edu/~cone/curriculum/psychomotor.htm
Table 3 - Psychomotor Levels, Definitions and Sample Action Verbs

1. Observing
   Definition: active mental attending to a physical event
   Action verbs: detect, describe, distinguish, relate, differentiate, select

2. Imitating
   Definition: attempted copying of a physical behavior
   Action verbs: begin, explain, display, proceed, move, react, state, volunteer

3. Practicing
   Definition: trying a particular activity over and over
   Action verbs: construct, bend, calibrate, differentiate, fasten, measure, grasp, mix, organize, operate, mend, manipulate

4. Adapting
   Definition: fine-tuning; making minor adjustments in the physical activity in order to perfect it
   Action verbs: combine, arrange, construct, create, originate, design, reorganize, rearrange

Types of Assessment Methods

According to De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., 37-39, assessment methods are categorized as follows:

1. Selected-response
In a selected-response format, students answer a question or a problem by selecting from a given set of options. Selected-response items are objective and efficient because there is only one best answer. Items are easy to grade, and the teacher can assess and score these types of assessment quickly.
Teachers commonly assess students using questions and items such as
• Multiple choice - consists of a STEM, which may be in the form of a question or statement, with four or five answer choices; the incorrect choices are called distracters.

• True or False (alternate response) - these are binary-choice items.

• Matching type - consists of a column of descriptions (stems) and a column of words, images or phrases. Students review each stem and match it with a word, phrase, or image from the response list.

2. Constructed-response
The constructed-response type is more useful in targeting higher levels of cognition. It is a subjective type. It demands that students produce or create their own answers in response to a problem, question or task. Items may fall under any of the following categories:
• Brief constructed-response items - only short responses from students are required. Examples are the following:
✓ Sentence completion - students fill in the blank at the end of each statement
✓ Short answers to open-ended questions
✓ Labeling a diagram
✓ Answering a Math problem by showing their solutions

• Performance assessments - students are required to perform a task rather than select from a given option set. Students have to come up with a more elaborate and extensive response or answer. These are also called authentic or alternative assessments. Students are required to demonstrate what they can do through problems, activities or exercises. When grading performance tasks, a scoring rubric with the performance criteria is needed. It may use analytic scoring or a holistic rubric: in an analytic rubric, different dimensions and performance characteristics are identified and marked separately, while in a holistic rubric the overall process or product is rated.

Performance tasks provide students with opportunities to apply their knowledge and skills in a real-world context. Performance tasks may be skill-oriented or product-based. Students have to produce or create learning evidences, or to do something and exhibit their skills.
PRODUCTS:
✓ Written reports
✓ Poems
✓ Projects
✓ Portfolios
✓ Spreadsheets/worksheets
✓ Audio-visual materials
✓ Web pages
✓ Journals
✓ Tables
✓ Graphs
✓ Reflection papers
✓ Illustrations/models

PERFORMANCE/SKILLS-BASED ACTIVITIES:
✓ Speech
✓ Debate
✓ Recital
✓ Teaching demonstration
✓ Dramatic reading
✓ Role play
✓ Athletics
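The analytic/holistic rubric distinction described earlier can be illustrated with a minimal sketch. The criterion names and the 1-5 scale below are assumptions made for illustration, not part of the module:

```python
# Sketch of the two rubric styles (illustrative only).

def analytic_score(ratings):
    """Analytic rubric: each dimension is marked separately, then combined."""
    return sum(ratings.values())

def holistic_score(overall_level):
    """Holistic rubric: a single overall judgment of the process or product."""
    return overall_level

# Hypothetical essay rated on three dimensions, 1-5 each.
essay = {"content": 4, "organization": 3, "mechanics": 5}
print(analytic_score(essay))   # 12 out of a possible 15
print(holistic_score(4))       # 4 out of 5
```

The analytic form makes feedback per dimension possible; the holistic form is faster to apply but less diagnostic.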
• Essay assessments - essay assessments involve answering a question or proposition in written form. Here, students are allowed to express themselves and demonstrate their reasoning.
✓ Restricted response - an essay item which requires only a few sentences; questions are more focused.
✓ Extended response - allows more flexibility on the part of the students; responses are more complex and longer.
Essays require much thought on the part of the teacher. Essay questions should be clear so that students can organize their thoughts quickly and answer directly. In scoring essays, a rubric is also used.

• Oral questioning - a common assessment method during instruction for checking student understanding. Oral questioning may take the form of an interview or conference when done formally. By mastering the art of questioning, the teacher can keep students on their toes, receive acceptable responses, elicit different types of reasoning from the students and at the same time strengthen their confidence.

3. Teacher observations
These are a form of on-going assessment, usually done in combination with oral questioning. Teachers regularly observe students to check on their understanding. The teacher can tell whether learning is taking place in the classroom by watching how students respond to oral questions and behave during individual and collaborative activities. It is beneficial if teachers make observational or anecdotal notes to describe how students learn in terms of concept building, communication skills and problem solving.

This assessment method can also be used in assessing the effectiveness of teaching strategies and academic interventions. Information gathered from observations may reveal strengths and weaknesses of individual students and of the class as a whole.

4. Student self-assessment
In this process, students are given a chance to reflect on and rate their own work and to judge how well they have performed in relation to a set of assessment criteria. Students do the tracking and evaluating of their own performance or progress.
SELF-MONITORING TECHNIQUES:
• Activity checklists
• Self-report inventories
✓ Questionnaires or surveys which reveal students' attitudes and beliefs about themselves and others
• Diaries

Through self-assessment exercises, students are provided with an opportunity to reflect on their performance and monitor their progress in learning; such exercises motivate them to do well and give feedback to the teacher, which the latter can utilize to improve the subject.

Learning Targets and the Assessment Methods

According to De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., 40-42, another name for a learning outcome is a learning target. A learning target refers to a performance description which includes what the learners should know and be able to do. The criteria used to judge the performance of students are included in the learning targets. Likewise, according to DIGITAL CHALKBOARD. (n.d.). Retrieved September 9, 2018, from https://www.mydigitalchalkboard.org/portal/default/Content/Viewer/Content?action=2&scid=505706&scild=15337, there are four types of instructional learning targets: knowledge targets, reasoning targets, skill targets and product targets. For assessing them, the following guidelines on the use of assessment methods for the said targets are given by De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., 40-42:
1. Knowledge and simple understanding
• It refers to the mastery of substantive subject matter as well as procedures.
• Constructed-response and selected-response items are best for assessing low-level learning targets in terms of efficiency and coverage.
• Pencil-and-paper tests are good for determining concepts, facts, procedures, principles and response patterns.

2. Deep understanding and reasoning
• Involves the higher-order thinking skills of evaluating, analyzing and synthesizing.
• Essays are best for assessing or checking deep understanding and reasoning.
• Oral questioning can also be used to assess deep understanding and reasoning but is less time-efficient than essays.
• Performance tasks are also effective in assessing deep understanding and reasoning (e.g., action research).
• An interpretative exercise may be considered too.

3. Skills
• Performance assessment is the superior method for this target.
• When it is used in a meaningful and real-life context, it becomes authentic.
• Performance assessments are suited to applications with less-structured problems where problem identification, collection, integration, organization and evaluation of information, and originality are emphasized.

4. Products
• These are tangible and substantial outputs that showcase students' understanding of concepts and skills and their ability to apply, analyze, evaluate and integrate them.
• Performance tasks are used so as to adequately assess products.
• Observation can be used to watch and inspect how students bring the product elements together.
• Based on a set of learning criteria, self-assessment and peer evaluation in formative assessment may allow students to reflect and make judgments about the quality of their own work and that of their peers.

5. Affective
• Affect refers to the attitudes, values and interests students manifest.
• Self-assessment is the best method for this learning target; it may take the form of students' responses to self-report affective inventories using rating scales.
• Observation is also used in assessing affective qualities like honesty/integrity, wellness, personal discipline, etc.
• Oral questioning may also be used in assessing affective traits.
Module 003 Validity and Reliability

At the end of this module you are expected to:

1. Identify terms related to validity and reliability
2. Analyze concepts, principles and procedures in ensuring the validity and reliability of the different methods of assessing the learning of students
3. Examine the sources of validity and reliability evidences

Validity and Reliability


When gathering information or evidences about student achievement, both validity and reliability are considered. Validity alone will not ensure that an assessment has high quality. The reliability of test results should also be checked.

Validity

What is validity? According to De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., p. 52, validity is a term derived from the Latin word validus, which means strong. With regard to assessment, an assessment is considered valid if it measures what it is supposed to measure. For teachers, validity pertains to the accuracy of the inferences which teachers make about students based on the information gathered from an assessment. It implies that if there are strong and sound evidences of the extent of students' learning, teachers' evaluation of their students' performance is valid.

Likewise, as cited already from the same source, if an assessment measures the actual knowledge and performance with respect to the intended outcomes and not something else, it is considered valid.

Likewise, according to Assessment / Quality Test Construction | Special Connections. (n.d.). Retrieved September 11, 2018, from http://www.specialconnections.ku.edu/?q=assessment/quality_test_construction, validity is the degree to which theory or evidence supports any conclusions or interpretations about a certain student based on his test performance. To make it simple, it is how one knows that an English test measures students' English ability and not their Math ability.

Evidences for Validity

According to Sources of Validity Evidence. (n.d.). Retrieved September 11, 2018, from http://www.siop.org/principles/pages13to26.pdf, making inferences from the results of a selection procedure to the subsequent work behavior or outcome performance should be based on evidences that will support the said inferences.

According to De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., pp. 53-57, the following are the sources of validity:
1. Content-related evidence
Content-related evidence for validity refers to the extent to which the test covers the entire content domain. If a summative test covers a unit with four topics, the assessment should contain items from each topic. Adequate sampling of the content makes this possible. The performance of a student in the test may be utilized as an indicator of his content knowledge. For instance, if a Grade 11 student was able to correctly answer 85% of the items in a Science test about matter, the teacher may conclude that the said student knows 85% of the content area.

Face validity is possessed by a test which appears to adequately measure the learning outcomes and content. It is based on the opinion of the one reviewing it, which is subjective, and it is considered non-scientific or non-systematic.
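The idea of adequate content sampling can be sketched as a quick check that every topic in a unit is represented on the test. The topic names and item counts below are invented for illustration:

```python
from collections import Counter

# Hypothetical unit with four topics and a 30-item summative test.
unit_topics = {"matter", "energy", "forces", "earth science"}
item_topics = (["matter"] * 10 + ["energy"] * 8
               + ["forces"] * 7 + ["earth science"] * 5)

coverage = Counter(item_topics)        # number of items per topic
missing = unit_topics - set(coverage)  # topics with no items at all

print(missing)                                          # set() -> every topic is sampled
print(round(coverage["matter"] / len(item_topics), 2))  # 0.33 share of items on "matter"
```

A non-empty `missing` set, or a share wildly out of proportion to a topic's weight in the unit, would signal inadequate content sampling.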
2. Criterion-related evidence
The degree to which test scores agree with an external criterion is called criterion-related evidence. It is related to external validity. Criterion-related evidence examines the relationship between an assessment and another measure of the same trait. Three types of criteria are
• Achievement test scores
• Grades, ratings and other numerical judgments made by the teacher
• Career data

The achievement test measures the extent to which students have acquired knowledge about a particular area or topic, or mastered certain skills at a specific time, as a result of planned training or instruction. A summative test on Entrepreneurship given to Grade 11 students at the end of the first quarter can serve as a criterion. A readiness test in Entrepreneurship can be compared to the periodical test results through correlation. To validate newly developed personality measures, established personality inventories can be used as criteria. To determine validity, vocational interest survey results can be compared to career data.

According to the same source, criterion evidence is of two types and these are the following:
1. Concurrent validity - provides an estimate of students' current performance in relation to previously validated or established measures.

2. Predictive validity - pertains to the usefulness or power of test scores to predict future performance. For example, can scores in the admission test (predictor) be used to predict college success (criterion)? If there is a significantly high correlation between entrance examination scores and first-year grade point averages (GPA) assessed a year later, then there is predictive validity, and the entrance examination is a good measurement tool for student selection. Likewise, an aptitude test given to high school students is predictive of how well they will do in college; here, the grade point average is the criterion. Job proficiency ratings, twenty-first-century skills like citizenship, and annual income may also be used by educators as criteria.

3. Construct-related evidence
Let's start with the term construct. A construct refers to an individual characteristic which explains some aspect of behavior. Construct-related evidence is an assessment of the quality of the instrument used. It measures the extent to which the assessment is a meaningful measure of an unobservable trait or characteristic.

A good construct has a theoretical basis, which means that it should be operationally defined or explained unambiguously to differentiate it from other constructs. In establishing construct validity, two methods may be used:

a. Convergent validation - occurs when measures of constructs which are expected to be related are in fact observed to be related.

b. Divergent (discriminant) validation - occurs when constructs which are expected to be unrelated are in reality observed not to be related.

UNIFIED CONCEPT OF VALIDITY

According to De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., pp. 57-59, Messick (1989) proposed a unified concept of validity which is based on an expanded theory of construct validity. It addresses the meaning of scores and social values in test interpretation and test use. In the said concept of unified validity, considerations of content, criteria and consequences are integrated into a construct framework for the empirical testing of rational hypotheses about score meaning and theoretically important relationships. Six distinct aspects of construct validity were presented:
• Content - parallel to content-related evidence; it calls for content relevance and representativeness.
• Substantive - refers to the theoretical constructs and empirical evidences.
• Generalizability - examines how score properties and interpretations generalize to and across population groups, contexts and tasks; this is also called external validity.
• Structural - assesses how well the scoring structure matches the construct domain.
• External - includes convergent and discriminant evidences taken from multitrait-multimethod studies.
• Consequential - deals with the intended and unintended effects of assessment on teaching and learning.

Assessment Method Validity

Validity should also be considered in assessment methods other than the traditional assessments. According to De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., pp. 59-60, developing performance assessments involves three steps:
1. Defining the purpose.
It involves identifying the essential skills students need to develop and the content worthy of understanding. In order to have validity evidence with regard to content, the performance assessment should be reviewed by qualified content experts.

2. Choosing the activity.
From the same source, five recommendations which are intrinsically associated with assessment validity were given:
• The selected performance should reflect a valued activity.
• Completing the performance assessment should provide a valuable learning experience.
• Statements of goals and objectives should be clearly aligned with the measurable outcomes of the activity in the performance test.
• Performance tasks should not examine unintended or extraneous variables.
• Performance assessments should be free from bias.

3. Developing criteria for scoring.
A rubric or rating scale has to be created for scoring. Teachers must be careful that distracting factors, such as students' handwriting and the neatness of the product, do not affect their ratings. To avoid lowering the validity of the performance assessment, personal idiosyncrasies should be avoided.

Oral questioning has high validity under controlled conditions. The validity of the assessment can be ensured through a checklist which defines the outcomes to be covered and the criteria or standards to be achieved. There should be a standard or structured list of questions for summative purposes.

For observations, the behavior of interest should be accurately described by the operational and response definitions. If evidence is properly recorded and interpreted, it will be considered highly valid. Validity will be stronger if additional assessment strategies, like surveys, interviews and quantitative methods such as tests, are utilized alongside observation.

For self-assessment, students should be informed of the domain in which the task is embedded to increase validity. Students should be taught, and at the same time learn, how to assess their work objectively based on clearly defined criteria and to dismiss any interest bias. Students may be induced to make accurate assessments of their own performance if they are informed that their self-assessment ratings will be compared to those made by their peers and teacher.

No single method of data collection or single type of instrument or tool can assess the different learning and development outcomes in a school program, so teachers should learn how to utilize a wide range of assessment tools to build a complete profile of the strengths, weaknesses and intellectual achievements of students. Direct assessment methods should be combined with indirect assessment methods like interviews, surveys and focus groups, because while direct assessments examine actual evidences of student outcomes, indirect assessments gather perceptive data or feedback from students or other individuals who may have important information about the quality of the learning process.

According to AMLE - Association for Middle Level Education. (n.d.). Ensuring Valid, Effective, Rigorous Assessments. Retrieved September 15, 2018, from https://www.amle.org/BrowsebyTopic/WhatsNew/WNDet/TabId/270/ArtMID/888/ArticleID/570/Ensuring-Valid-Effective-Rigorous-Assessments.aspx, a process which will ensure the utilization of valid, effective and demanding assessments for students is composed of the following steps:
1. Deconstruct the standards.
Identify the standards which will be addressed in a certain unit of study. After that, deconstruct each standard. This involves breaking the standard into different learning targets and aligning each learning target to varying achievement levels, which are the following:
1. Knowledge - focuses on knowing and understanding, like vocabulary, syntactic structures, numbers and numeration systems.

2. Reasoning - utilizing knowledge and understanding in solving problems and in interpreting information.

3. Performance skills - turning knowledge into action; processing.

4. Products - creating something tangible which will represent content application.

2. Align test items and thinking levels.
After deriving the specific learning targets from the standards, the assessment to use to determine whether the students have learned the material should be considered - either a paper-and-pencil assessment with matching, short-answer items and multiple-choice questions, or a performance-based assessment like a project or performance. The assessment to be used should be aligned with the learning targets. It should also cover a variety of critical-thinking levels.

3. Create valid and reliable assessments.
Consider validity and reliability in creating an assessment. It should measure what it is supposed to measure. If an assessment is valid, it will be reliable; however, a reliable test or assessment is not necessarily valid. To make a test both valid and reliable, guidelines in preparing traditional tests should be considered.

Accor ding to Constr ucting tests. [n.d.). Retrieved September 15, 2 018. from
http· / /www wasbington.edu /teaching /teacbing-resonrces/ preparing-to-
teach / constructing-tests/ . the fo1lowing ar e the gener al guidelines in constructing
tr aditional tests:
• Take in consideration reasons for testing
Jn giving a test, the r easons wm help the teacher determine feat ur es
s uch as for mat. length, level of detail r equired in answ er s and the time frame
fo r r et ur ning r esults to students.
Some o f the r easo ns fo r testing ar e to
✓ Monitor t he pr ogress of students so as for a teacher to adjust
t he cour se pace
✓ Motivate students
✓ Provid e daUl for stud en t's grade
✓ Challenge students to apply concep ts learned

• Maintain consistency
Teacher should consider the maintenance of consistency betw een the
co urse goals, met ho ds of teaching and the tests utilized to mea su re go al
achievement.

• Utilize testing met hods which ar e app rop r iate to }earning objectives
Met hod s should be app rop r iate to the }ear ning outcomes. For
instance multiple choice may be useful in demonstrating recaH and memory
b ut to demonstrate mor e independ ent analysis and syn thesis, an essay or
o pen-ended p roblem solving may be used.
• Help students prepare
By clarifying the course goals as well as review materials, teachers
may help students prepare for the test. In doing so, the test will
reinforce what the teacher wants his students to retain and learn.

• Utilize consistent language
In describing expected outcomes, the teacher should use consistent
language in stating goals and objectives, in talking in class and in writing
questions in the test.

• Design or construct test items which allow students to show a range of
learning
When the teacher is able to design or construct such test items,
students who were not able to fully master everything in a subject may still
be able to demonstrate how much they have learned.

4. Take the test items to the next level with rigor and importance.
Teachers should start with items they already have. With rigorous assessments,
the goal must be for the students to move up the ladder of Bloom's taxonomy. Adding
reflective components and encouraging creative and critical thought should be
considered. With such assessments, students will be able to:

• Think clearly and accurately.
• Determine and consider not only one meaning but multiple ones.
• Engage in disciplined thought and inquiry.
• Perform outside the norm.
• Transfer knowledge to different situations.
• Adjust approach when thrown a curve ball.
• Persevere and tolerate uncertainty.

5. Make assessment part of planning.
Prior to preparing lesson plans, planning how to assess effectively should be
considered. For summative tests, what the teacher wants his students to know and be
able to do should be included in the lesson planning. In doing so, the teacher will have
the assurance that students will be engaged in learning activities which will lead them
to be successful in taking the summative tests. Deconstructing standards and drafting
assessment items facilitates the said outcome.

Threats to Validity

According to De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1. Manila:
Adriana Publishing Co., Inc., page 60, the following are the defects in the construction of
assessment tasks which can make assessment inferences inaccurate:

TRADITIONAL TESTS AND PERFORMANCE TASKS:

1. Test directions which are unclear
2. Vocabulary and sentence structure which are complicated
3. Obscure statements
4. Insufficient time limits

BRIEF CONSTRUCTED-RESPONSE AND SELECTED-RESPONSE ITEMS:

1. Unsuitable difficulty level of test items
2. Poorly constructed test items
3. Test items not appropriate for the outcomes being measured
4. Tests that are too short
5. Improper test arrangement
6. Identifiable answer patterns

Reliability

What is reliability? According to De Guzman, Estefania S., Adamos, Joel L. (2015)
Assessment of Learning 1. Manila: Adriana Publishing Co., Inc., page 61, reliability refers to the
reproducibility and consistency in criteria and methods. If an assessment produces the same
results when given to students on two occasions, it is said to be reliable. Reliability pertains to the
obtained results of assessments and not to the test or any other assessment tools/instruments.
In addition, reliability is unlikely to reach 100% because no two tests will consistently
produce the same results. Some environmental factors may affect reliability.

Likewise, according to Developing Reliable Student Assessments | Center for Teaching and
Learning. (n.d.). Retrieved September 15, 2018, from https://ctl.yale.edu/ReliableAssessments,
reliability also refers to how well a score represents a student's ability, and it ensures that assessments
accurately measure student knowledge. A full test or rubric cannot be described as reliable or
unreliable because reliability refers to scores specifically. Through reliable scores, students are
able to grasp their level of development, and teachers are aided in improving their teaching. A
variety of methods are used to estimate score reliability, and instructors can make such reliability
methods transparent so as to motivate students to exert effort and to assure them of accuracy.
The same source provided examples of reliability measures, which are also considered
the different types of reliability, as follows:
1. Inter-rater - Two separate individuals evaluate and score a test, performance or essay
which is given in a subject. Scores from each of the evaluators
are correlated, and the correlation coefficient is used as an estimate of
reliability. Another statistical tool called Cohen's kappa may be utilized,
wherein the amount of agreement that could occur between the two evaluators
by chance is taken into account.
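To make the chance-corrected agreement concrete, the following is a minimal sketch of Cohen's kappa in Python. The pass/fail ratings from two evaluators and the function name are hypothetical, chosen only for illustration:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Agreement between two raters, corrected for agreement expected by chance."""
    n = len(rater_a)
    # Observed proportion of agreement
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected chance agreement, from each rater's marginal category frequencies
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    categories = set(rater_a) | set(rater_b)
    p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in categories)
    return (p_o - p_e) / (1 - p_e)

# Two evaluators scoring the same ten essays as Pass (P) or Fail (F)
a = ["P", "P", "F", "P", "F", "P", "P", "F", "P", "P"]
b = ["P", "P", "F", "P", "P", "P", "P", "F", "F", "P"]
print(round(cohens_kappa(a, b), 3))
```

Here the raters agree on 8 of 10 essays, but because both mostly award "Pass", much of that agreement could occur by chance, so kappa is noticeably lower than the raw 0.80 agreement rate.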

2. Test-retest - On separate occasions, individuals take the same test, and the scores
obtained can be correlated by teachers using the correlation coefficient as the estimate
of reliability. This approach should be sensitive to the amount of time and degree of
learning between test administrations because students learn from tests.
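The test-retest estimate is simply the Pearson correlation between the two administrations. A minimal sketch in Python, with hypothetical scores for six students tested twice:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Scores of six students on the same test given on two occasions (hypothetical)
first = [78, 85, 62, 90, 70, 88]
second = [80, 83, 65, 92, 68, 85]
print(round(pearson_r(first, second), 3))
```

A value near 1.0 indicates that students ranked similarly on both occasions, which is the stability the test-retest method is meant to demonstrate.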

3. Par al1e1 fo rm s - The same gr oup of individ uals ar e given wit h tw o equiva)ent tests
which measu re same concepts, knowledge, abilities and skills and the scor es can be
correlated b y teacher s. Corr elation coefficien t w hich is the estimate o f reliability w m be
utilized but teachers shou)d know that d esigning tw o separate b ut identica) tests ma be
so difficult.

4. Split-half - A test is divided into two sets of items. Scores gained by the students on one half
of the test are correlated with the scores of the said students on the other half of the test.
Splitting a test may be done in different ways, but teachers should be aware that the
method chosen will influence the correlation coefficient.

5. Cronbach's Alpha - When analyzing multiple choice tests or Likert-type scales, this is
the most commonly reported measure of reliability. It is the mean of all possible split-
half combinations, or the average or central tendency when a test is split against itself.
It can be calculated by teachers using Excel or any other statistical software package.
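Beyond spreadsheets, alpha is short enough to compute directly from its formula, alpha = k/(k-1) * (1 - sum of item variances / variance of total scores). A sketch with hypothetical Likert-type responses (five students, four items):

```python
def pvariance(xs):
    """Population variance of a list of numbers."""
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a students-by-items score matrix."""
    k = len(item_scores[0])                     # number of items
    items = list(zip(*item_scores))             # one column (tuple) per item
    totals = [sum(row) for row in item_scores]  # each student's total score
    return (k / (k - 1)) * (1 - sum(pvariance(i) for i in items) / pvariance(totals))

# Five students' responses to a four-item Likert-type scale (hypothetical)
scores = [
    [4, 5, 4, 5],
    [3, 4, 3, 4],
    [5, 5, 4, 5],
    [2, 3, 2, 2],
    [3, 3, 3, 4],
]
print(round(cronbach_alpha(scores), 3))
```

Values closer to 1.0 indicate that the items vary together across students, i.e., they appear to measure the same construct.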

Sources of Reliability Evidence

According to De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1.
Manila: Adriana Publishing Co., Inc., pp. 61-63, there are five classes of sources of reliability
evidence, and these are the following:
1. Evidence based on stability
Test-retest reliability is used to determine the stability of test results over time. It
supports that there is no considerable change in the construct between the first and
second testing. Timing is critical because characteristics may change if the time interval
is too long. A short gap between the testing sessions is not recommendable either,
because students may still recall their responses in the first test.

2. Evidence based on equivalent forms
Parallel forms of reliability support the equivalency of forms. Equivalent forms
are ideal for action research which will utilize pre- and post-tests, as well as for make-up
tests. New and different items should be thought of, but they should measure the same
construct.

3. Evidence based on internal consistency
Internal consistency signifies that a student who has mastery of the learning will get
all or most of the test items correct, while a student who has little mastery or knows nothing
about the subject matter will get most or all of the items wrong. The split-half method can
be used to check for internal consistency. For large questionnaires or tests with several
items measuring the same construct, split-half is effective. Items with low correlations
are either modified or removed to improve the reliability of the test.

4. Evidence based on scorer or rater consistency
Having multiple raters can increase reliability, just as several items in a test can
improve the reliability of a standardized test. Inter-rater reliability refers to the degree to
which different raters, observers or judges agree in their decisions regarding the
assessment. A wise selection and training of good evaluators and the utilization of applicable
statistical techniques are suggested to mitigate rating errors. In grading essays,
performance assessments, writing samples and portfolios, inter-rater reliability is useful.
In doing so, there should be a good variation of products to be evaluated, criteria in
scoring should be clear, and raters should be trained and knowledgeable on how to use the
observation instrument.
Spearman's rho or Cohen's kappa may be used to calculate the correlation
coefficient between or among the ratings so as to estimate inter-rater reliability.

5. Evidence based on decision consistency
Decision consistency displays how consistent the classification decisions are, not
how consistent the scores are. It is seen in situations when teachers or instructors
decide who will receive a PASS or FAIL mark, or who is considered to have mastery or not.
A good example is how levels of proficiency were adopted at the onset of the
implementation of K to 12 assessments, wherein grades of learners in K to 12 are reported
and described based on the levels of proficiency and not on their specific grades.

Reliability of Assessment Methods

According to De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1.
Manila: Adriana Publishing Co., Inc., page 65, well-constructed objective tests have better
reliability than performance assessments because of judgmental scoring. Depending on the
raters, inconsistent scores may be obtained, perhaps because of inadequate training of raters or
inadequate specifications of the rubrics used for scoring. In addition, there is limited
sampling of course content in a performance assessment. Reliability of performance
assessments may be raised by constraining the domain coverage or by structuring the responses.
Reliable scoring of performance assessments can be enhanced by using analytic and topic-specific
rubrics complete with exemplars, and by training the raters.
In improving oral examinations, increasing the number of questions, the number of
examiners and the response time, and using a rubric or marking guide which contains the standards
and criteria, may be considered.
With regard to observations, observation instruments should be comprehensive so as to
adequately sample occurrences and non-occurrences of behavior, but still manageable to conduct.
Data from direct observation can be enhanced through inter-observer agreement and
intra-observer reliability. Inter-observer agreement refers to the consistency of observation data
gathered by multiple observers and teachers, while intra-observer reliability pertains to the
consistency of the data on behavior collected multiple times by a single observer or teacher.

The same source identified some ways to improve the reliability of assessment
results, and these are the following:
1. Provide more time, more questions and more observations whenever practical so as to
lengthen the assessment procedure
2. Assess all the important aspects of the learning performance so as to broaden
the scope of the procedure
3. Improve objectivity by utilizing a systematic and more formal procedure for scoring
student performance
4. Employ inter-rater reliability so as to utilize multiple markers
5. Combine results from several assessments, especially when making critical decisions
6. Provide sufficient time for students to complete the assessment procedure
7. Teach students how to perform their best by providing practice and training and by
motivating them
8. Provide tasks that are neither too easy nor too difficult, and tailor the assessment to
each student's ability level when possible, so as to match the assessment difficulty to
the students' level of ability
9. Select assessment tasks which distinguish or discriminate the best from the least able
students so as to differentiate among students

Likewise, Developing Reliable Student Assessments | Center for Teaching and Learning.
(n.d.). Retrieved September 15, 2018, from https://ctl.yale.edu/ReliableAssessments provided
suggestions and recommendations on how to increase the reliability of the different methods of
assessment.
A. ESSAY OR PERFORMANCE-BASED ASSESSMENTS
1. Design a rubric - Rubrics aid the evaluators to focus on the same criteria across all
the submissions.

2. Grade item by item - If students are given multiple problems or essays, instructors
can grade the first problem/essay on each student's paper before grading the
second problem/essay.

3. Grade anonymously - To minimize the effect of bias in the grading process, the
teacher should grade anonymously. Teachers may do this by skipping over a
student's name when grading.

4. Train evaluators - Teachers as evaluators should be provided with training on how
to utilize rubrics or grading criteria in grading essay or performance-based
assessments.

B. MULTIPLE CHOICE TESTS/LIKERT-TYPE ITEMS

1. Design the assessment using a table of specifications - Assessments or tests of these
types should be guided by a table of specifications designed by the teacher
prior to the administration of the tests. It allows subscales to be created among the
multiple concepts being tested. For instance, a separate reliability coefficient may be
calculated for the items which test the first unit and the items which measure the
second unit.
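The subscale idea can be sketched in Python: use the table of specifications to group item columns by unit, then compute a reliability coefficient (here Cronbach's alpha) per unit. The unit names, item assignments and 0/1 scores below are hypothetical:

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for a students-by-items score matrix."""
    k = len(item_scores[0])
    item_vars = sum(pvariance(col) for col in zip(*item_scores))
    total_var = pvariance([sum(row) for row in item_scores])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical table of specifications: which item columns test which unit
table_of_specs = {"Unit 1": [0, 1, 2], "Unit 2": [3, 4, 5]}

# 0/1 item scores of five students on a six-item test (hypothetical)
rows = [
    [1, 1, 1, 1, 1, 1],
    [1, 1, 1, 0, 0, 1],
    [1, 1, 0, 1, 1, 1],
    [0, 0, 0, 0, 1, 0],
    [0, 0, 1, 0, 0, 0],
]

# One reliability coefficient per unit's subscale
for unit, cols in table_of_specs.items():
    subscale = [[row[c] for c in cols] for row in rows]
    print(unit, round(cronbach_alpha(subscale), 3))
```

Reporting a separate coefficient per unit, as the guideline suggests, tells the teacher whether each concept's items hang together, something a single whole-test coefficient can mask.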
