At the core of the learning process is assessment. Assessment is done to gauge
and enhance student learning. Teachers, students and stakeholders of education should
understand what assessment is really all about, why it is needed and how it is connected to
measurement, testing and evaluation.
What is Outcome-based Curriculum all about? According to OBE Principles and Process
(n.d.), retrieved August 28, 2018, from http://cei.ust.hk/teaching-resources/outcome-based-
education/institutional-resources/obe-principles-and-process#1, outcome-based education
refers to the process of clearly focusing and organizing everything in the system of education
around what is important for all students to be able to do successfully at the end of their learning
experiences.
Course Module
Likewise, according to OBE Principles and Process (n.d.), retrieved August 28, 2018, from
http://cei.ust.hk/teaching-resources/outcome-based-education/institutional-resources/obe-
principles-and-process#1, an outcome-based education curriculum means starting with a clear
picture of what is relevant for students to be able to do, then organizing the curriculum,
instruction and assessment so as to ensure that the said learning ultimately happens. The OBE
curriculum has the following principles according to the same source:
1. Clarity of focus
The focus of what teachers do should be on what the said teachers want students to
know, understand and be able to do. Helping to develop the knowledge, skills,
personalities and attitudes of students which will enable them to achieve the intended
outcomes which have been clearly articulated should be the focus of teachers.
3. High expectations
In order to encourage students to engage deeply in what they are learning, teachers
should establish high and challenging performance standards. Successful learning
promotes more successful learning, and so students should be helped to achieve high
standards.
4. Expanded opportunities
Teachers should strive to provide expanded opportunities for all students.
This is based on the idea that not all learners can learn the same thing in the same time and
in the same way. If students are given appropriate opportunities, they can
achieve high standards.
MEASUREMENT
According to De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of Learning
1, Manila: Adriana Publishing Co., Inc., page 2, measurement comes from the Old French
word mesure, meaning "limit or quantity". Measurement is basically a quantitative
description of a characteristic or attribute of an object. In education, measurement refers to
what teachers measure and what instruments they use in measuring. Teachers are interested
in determining how much learning a student has acquired compared to a standard or
criterion (criterion-referenced) or in reference to other students in a group (norm-referenced).
Teachers usually measure particular elements of learning like students' readiness to learn,
demonstration of specific skills, recall of facts, or the ability to analyze and solve applied
problems. To obtain pertinent information, teachers utilize tools or
instruments like oral presentations, written reports, portfolios and rubrics.
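The criterion-referenced and norm-referenced frames of reference described above can be made concrete with a short sketch. This is an illustrative example only; the class scores and the 75% mastery cutoff are invented for the illustration, not taken from the cited text:

```python
# Hypothetical class scores (percent-correct) on one test.
scores = {"Ana": 90, "Ben": 72, "Carla": 84, "Dan": 60, "Eva": 78}

CUTOFF = 75  # criterion-referenced: a fixed mastery standard (assumed value)

def criterion_referenced(name):
    """Interpret a score against a fixed standard, ignoring the rest of the group."""
    return "mastery" if scores[name] >= CUTOFF else "needs support"

def norm_referenced(name):
    """Interpret a score relative to the group: percent of peers who scored below."""
    others = [s for n, s in scores.items() if n != name]
    below = sum(s < scores[name] for s in others)
    return 100 * below / len(others)

print(criterion_referenced("Ben"))  # Ben measured against the 75% criterion
print(norm_referenced("Ben"))       # Ben measured against his classmates
```

The same raw score can lead to different conclusions under the two frames: a student may fall short of the criterion yet still outperform some peers.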
TESTING
According to the same source, the most dominant form of assessment is the test. It
is a traditional assessment and may not be the best way to measure how much students
have learned, but tests still provide valuable information about the learning and progress
of students.
What are the purposes of a test? According to Francisco, M. (2015, December 07),
Types of Test, retrieved August 28, 2018, from
https://www.slideshare.net/ManilynFrancisco/type-of-test-55896544, the purposes of
a test are the following:
1. To assess students' understanding of a given topic within a subject and
determine what they have learned;
2. To evaluate the progress of students in a subject within a given period of time;
3. To assess the strengths and weaknesses of students for focused assistance or
individual instruction;
4. To identify who will receive awards, scholarships and recognition;
5. To provide a basis of qualification for entry into a program, school, higher
education, scholarship or internship;
6. To gain college credit (advanced placement exams);
7. To measure the effectiveness of a teacher or school as part of teacher
evaluation (high-stakes testing).
ASSESSMENT
According to The Role of Measurement and Assessment in Teaching (n.d.), retrieved
August 28, 2018, from
https://www.geneseo.edu/sites/default/files/sites/education/p12resources-assessment-
informs-teaching.pdf, assessment refers to procedures used to gain information about the
learning of students and the formation of value judgments concerning learning progress. The
general principles of assessment according to the same source are the following:
EVALUATION
Likewise, according to De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment
of Learning 1, Manila: Adriana Publishing Co., Inc., p. 10, after the data have been collected
from an assessment task, evaluation comes in. It is the process of judging the quality of a
performance or course of action. To make sound decisions about the students and the
teaching-learning process, assessment data gathered by the teacher have to be
interpreted. To uncover how the learning process is developing, evaluation is carried out
both by the teacher and his students.
Assessment is closely tied up with evaluation. What's the other meaning of assessment? Aside from the
definitions given beforehand, assessment also refers to the collection of information about the
performance of learners using different methods and tools. It can be classified according to
the following categories:
1. Maximum performance
Maximum performance is achieved when students are motivated to perform
well. Results of assessments of maximum performance show what
students can do at their level best - their achievements and abilities. Students are
encouraged to aim for a high score in this category. Examples of measures of
maximum performance are aptitude and achievement tests.
2. Typical performance
A typical performance measure manifests what students will do or choose to
do. It assesses how the ability of the learner is demonstrated on a regular
basis. It is more focused on the learner's level of motivation rather than his
optimal ability. Measures for this are interest, attitude and personality inventories,
peer appraisals and observation techniques, which give insights into the learner's
interests, personality traits and potential career preferences.
According to the same source, the purposes of assessment are the
following:
1. Assessment for Learning
It pertains to assessment tasks which are diagnostic and formative in nature
and which are utilized to identify learning needs, monitor students' academic
progress during a unit or block of instruction, and guide instruction. On-going
and immediate descriptive feedback concerning their performance is given to
students. Teachers can make adjustments, when necessary, in their methods of
teaching and strategies to support learning, based on assessment results.
They can decide whether there is a need to differentiate instruction or design
learning activities which are more appropriate so as to clarify and
consolidate knowledge, understanding and skills of students.
Examples:
• Pre-tests
• Quizzes
• Written assignments
• Focused questions
• Concept maps
2. Assessment as Learning
It employs activities or tasks which provide students the opportunity to
monitor and further their own learning - to think about their personal
learning habits and how to adjust their learning strategies so as to achieve their
goals. Metacognitive processes like self-regulation and self-reflection are involved
here so as to allow students to use their strengths and work on their weaknesses
by directing and regulating their learning. Students are accountable and
responsible for their own learning. It is also formative and may be given at any
phase of the learning process.
Examples:
• Self-assessment rubrics
• Peer assessment rubrics
• Portfo lios
3. Assessment of Learning
It is summative in nature. It is done at the end of a task, unit, process or
period. Its purpose is to provide evidence of the level of achievement of students
in relation to curricular outcomes. It is used for purposes of grading, evaluation
and reporting. The concerned students, their parents and other stakeholders are
given evaluative feedback. Assessment of learning provides the foundation for
decisions on the placement and promotion of students.
Examples:
• Unit tests
• Final projects
RELEVANCE OF ASSESSMENT
In the same source, it was discussed that assessment has importance and works for
the following:
1. Students
Students become actively engaged in the process of learning through the
different constructive and learner-centered assessment tasks. Because of these,
students take responsibility for their own learning and learn also to monitor
changes in their learning patterns with the guidance of the teacher. Such
assessments lead to better student achievement.
2. Teachers
Assessment gives teachers information about the knowledge and
performance base of students. Because of it, the most effective teaching methods and
approaches can be revealed. Direction is provided as to how teachers can help
learners more and what they should do next. Assessment procedures also support
instructors' decisions on managing instruction, assessing the competence of
students, placing students into levels of education programs, assigning grades to
learners, guidance and counseling, certifying competence and selecting students
for education opportunities.
3. Parents
Education is a shared partnership. Parents should be involved in the
assessment process. They are a valued source of assessment information on the
learning habits and educational history of their children, especially for preschoolers who
Types of Test
C. Mode of administration
1. Individual test - It is a test given to one person at a time. Individual cognitive and
achievement tests are used to gather extensive information about each student's
cognitive functioning and his ability to process and perform specific tasks.
D. Test constructor
1. Standardized tests - Specialists who are versed in assessment principles
are the ones who prepare these tests. These are administered to a large
group of examinees or students under similar conditions. Scoring procedures
and interpretation are consistent. Manuals and guides are available to aid in the
administration and interpretation of test results.
F. Nature of answer
1. Personality tests - These tests measure one's personality and behavioral
style. There are no right or wrong answers in these tests. They are utilized in
recruitment as they help employers determine how a potential employee
will respond to different work-related activities.
2. Achievem ent tests - These meas ure the learning of students as a result of
instruction and tra ining experiences. They serv e as a bas is for promotion to
the next level or grade.
3. Aptitude tests - These tests determine the potential of students to learn and
do new tasks. They aid in choosing the best line of work for an individual
based on his interests and skills.
6. Trade or vocational tests - These tests assess the knowledge, skills and
competence of an individual in a particular occupation. Trade tests may
consist of a practical test and a theory test.
Roles of Assessment
2. Formative assessment
It mediates the teaching and learning process. It is both teacher-centered
and learner-centered. It occurs during instruction and is context-specific. It is
utilized as feedback to enhance teaching and improve the learning process.
Formative assessment results are recorded for purposes of monitoring students'
learning progress.
3. Diagnostic assessment
The intention of diagnostic assessment is to identify learning difficulties
during instruction. Commonly held misconceptions in a subject can be
detected using a diagnostic test. It is utilized to detect causes of persistent
learning difficulties despite the pedagogical remedies applied by the teacher, and
it is not part of students' marks of achievement.
4. Summative assessment
It is done at the end of instruction to determine the extent to which the
students have attained the learning outcomes. It is utilized for assigning and
reporting grades or to certify the mastery of concepts and skills. A written
examination at the end of the school year to determine who passes and who fails
is an example of this assessment.
Module 002 Appropriateness and Alignment of
Assessment Methods to Learning Outcomes
2. Select appropriate assessment measures and assess the learning outcomes.
Ways of assessing the learning outcomes are selected and utilized. It is
recommended to focus on direct measures of learning. More often than not,
student performance levels for each outcome are described and assessed with the use of
rubrics. It is relevant to identify how the data will be collected and who will be
responsible for the collection of data.
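Describing performance levels per outcome with a rubric, as mentioned above, can be sketched briefly in code. This is an illustrative example only; the criteria, level descriptors and the sample ratings are invented, not taken from the module:

```python
# A tiny analytic rubric: each criterion is rated on a 1-4 performance level.
# Criteria and descriptors are hypothetical.
RUBRIC = {
    "content accuracy": {4: "thorough", 3: "adequate", 2: "partial", 1: "minimal"},
    "organization":     {4: "clear", 3: "mostly clear", 2: "uneven", 1: "unclear"},
    "use of evidence":  {4: "strong", 3: "sufficient", 2: "weak", 1: "absent"},
}

def score_outcome(ratings):
    """Sum the level awarded per criterion and report it against the maximum."""
    total = sum(ratings.values())
    maximum = 4 * len(RUBRIC)
    return total, maximum

# One student's ratings, as a rater might record them while collecting data.
ratings = {"content accuracy": 3, "organization": 4, "use of evidence": 2}
total, maximum = score_outcome(ratings)
print(f"{total}/{maximum}")  # prints 9/12
```

Recording ratings per criterion, rather than a single holistic mark, is what makes it possible to report a student's performance level on each outcome separately.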
In step number 1, it was clearly stated that learning outcomes should be clearly
defined and identified. What is the purpose of identifying them? According to Identifying
Learning Outcomes and Selecting Assessment Tasks (n.d.), retrieved September 9, 2018,
from https://teachingcommons.yorku.ca/resources/elearning/identifying-
learning-outcomes-and-selecting-assessment-tasks, its purpose is to express the desired
results of a particular learning experience. Teachers need to decide on how to ask their
students to evidence their learning through assessment tasks once the learning
outcomes are identified. Assessment tasks refer to activities students will undertake so as to
confirm whether or not the outcome has in fact been achieved during and after the process
of learning. The said activities are used to tell how well students are learning in relation
to the stated learning outcomes and to provide students with feedback. One of the most
relevant determinants of assessment tasks is that the assessment has to be consistent with
the learning outcomes identified.
What are the characteristics of student learning outcomes? According to Taking
Aim at Student Learning (n.d.), retrieved September 9, 2018, from
https://vp.studentlife.uiowa.edu/assets/Taking-Aim-Presentation.pdf, the desired
characteristics of student learning outcomes are the following:
• Align with the goals of the division, department and institution
• Describe a behavior w hich i s
✓ Specific
✓ Meaningful
✓ Measurable
✓ Attainable
• Describe a single behavior
• Describe knowledge, skills and habits of mind
Taxonomy of Learning Domains
Learning outcomes are statements of performance expectations. Cognitive, affective and
psychomotor are the domains of learning characterized by change in learning behavior. According
to Taxonomies of Learning (n.d.), retrieved September 9, 2018, from
https://bokcenter.harvard.edu/taxonomies-learning, in designing learning objectives, the types of
work which students will do to demonstrate the achievement of desired outcomes should be
planned and thought of. The said three domains - cognitive, affective and psychomotor -
should be known and utilized by the teacher in constructing the lesson and, of course, in assessing
the students' learning.
A. COGNITIVE
1. Remembering
   Sample verb: Reproduce
2. Understanding - Interpreting (paraphrasing, clarifying, representing, translating), Classifying (categorizing), Exemplifying (illustrating), Summarizing (generalizing, abstracting), Comparing (mapping, matching, contrasting), Inferring (predicting, concluding), Explaining (constructing models)
   Sample verbs: Describe, Convert, Estimate, Distinguish, Extend, Paraphrase, Summarize, Rewrite, Generalize
3. Applying (Using) - Implementing (Using), Executing (Carrying out)
   Sample verbs: Change, Apply, Compute, Classify (examples of a concept), Demonstrate, Modify, Discover, Predict, Relate, Show, Prepare, Operate
4. Analyzing - Organizing (integrating), Attributing (deconstructing)
   Sample verb: Organize
5. Evaluating - Critiquing (Judging), Checking (Detecting, monitoring, testing, coordinating)
   Sample verbs: Compare, Appraise, Contrast, Conclude, Criticize, Judge, Evaluate, Support (a judgement), Verify, Justify
B. AFFECTIVE
According to Learning Domains - Student Life Learning & Assessment (n.d.),
retrieved September 9, 2018, from https://www.emporia.edu/studentlife/learning-and-
assessment/guide/domains.html, the affective domain deals with values, attitudes and
emotions. It is the "valuing" domain.
Figure: Krathwohl's Taxonomy of Affective Learning, showing levels such as Valuing and Organizing. From Krathwohl's Taxonomy of Affective Learning (n.d.), retrieved September 9, 2018, from ResearchGate.
To understand better, Table 2 contains the levels, descriptions and sample
action verbs that may be used in writing learning outcomes. Contents are from De
Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of Learning 1, Manila:
Adriana Publishing Co., Inc., page 37, and Learning Taxonomy - Krathwohl's
Affective Domain (n.d.), https://global.indiana.edu/documents/Learning-Taxonomy-
Affective.pdf.
(Table 2 fragments: at the highest, most complex level, the learner acts consistently with the new value; learning outcomes at this level cover a broad range of activities, but the major emphasis is on the typical characteristics of the student. Sample verbs include: Act, Hold, Influence, Listen, Modify, Name, Perform, Practice, Qualify, Question, Serve, Solve, Use, Verify.)
C. PSYCHOMOTOR
To understand better, Table 3 contains the levels, descriptions and sample action
verbs that may be used in writing learning outcomes under the psychomotor domain.
Contents are from De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of Learning
1, Manila: Adriana Publishing Co., Inc., page 36, and Psychomotor Domain (n.d.), retrieved
September 9, 2018, from http://users.rowan.edu/~cone/curriculum/psychomotor.htm.
2. Constructed-response
The constructed-response type is more useful in targeting higher levels of cognition.
It is a subjective type. It demands that students produce or create their own answers in
response to a problem, question or task. Items may fall under any of the following
categories:
• Brief constructed-response items - only short responses from students are
required. Examples of these are the following:
✓ Sentence completion - students fill in the blank at the end of each
statement
✓ Short answer to open-ended questions
✓ Labeling a diagram
✓ Answering a Math problem by showing their solutions
3. Teacher observations
This assessment method can also be used in assessing the effectiveness of
teaching strategies and academic interventions. Strengths and weaknesses of individual
students and the class as a whole may be revealed through the information gathered
from observations.
4. Student self-assessment
Students are given a chance to reflect on and rate their own work in relation
to a set of assessment criteria and judge how well they have performed in the process.
Students do the tracking and evaluating of their own performance or progress.
SELF-MONITORING TECHNIQUES:
• Activity checklists
• Self-report inventories
✓ Questionnaires or surveys which reveal students' attitudes and
beliefs about themselves and others.
• Diaries
3. Skills
• Performance assessment is the superior method for this assessment.
• When it is used in a meaningful and real-life context, it becomes authentic.
• Performance assessments are suited to applications with less-structured
problems where problem identification, collection, integration, organization
and evaluation of information, and originality are emphasized.
4. Products
• These are tangible and substantial outputs that showcase students'
understanding of concepts and skills and their ability to apply, analyze,
evaluate and integrate them.
• Performance tasks are used so as to adequately assess products.
• Observation can be used to watch and inspect how students bring the product
elements together.
• Based on a set of learning criteria, self-assessment and peer evaluation in
formative assessment may allow students to reflect and make judgments
about the quality of their work and that of their peers.
5. Affective
• Affect refers to the attitudes, values and interests students manifest.
• Self-assessment is the best method for this learning target - it may be in the
form of students' responses to self-report affective inventories using rating scales.
• Observation is also used in assessing affective qualities like honesty/integrity,
wellness, personal discipline, etc.
• Oral questioning may also be used in assessing affective traits.
Module 003 Validity and Reliability
Validity
What is validity? According to De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment
of Learning 1, Manila: Adriana Publishing Co., Inc., p. 52, validity is a term derived from the Latin
word validus, which means strong. With regards to assessment, it is considered valid if it
measures what it is supposed to measure. For teachers, validity pertains to the accuracy of the
inferences which teachers make about students based on the information gathered from an
assessment. It implies that if there are strong and sound evidences of the extent of students'
learning, teachers' evaluation of their students' performance is valid.
Likewise, as cited already from the same source, if an assessment measures the actual
knowledge and performance with respect to the intended outcomes, and not something else, it is
considered valid.
Likewise, according to Assessment / Quality Test Construction / Special Connections (n.d.),
retrieved September 11, 2018, from
http://www.specialconnections.ku.edu/?q=assessment/quality_test_construction, validity is the
degree to which theory or evidence supports any conclusions or interpretations about a certain
student based on his test performance. To make it simple, it is how one knows that an English test
measures students' English ability and not their Math ability.
Evidences fo r Validity
According to Sources of Validity Evidence (n.d.), retrieved September 11, 2018, from
http://www.siop.org/principles/pages13to26.pdf, making inferences from the results of a
selection procedure to the subsequent work behavior or outcome performance should be based
on evidences that will support the said inferences.
According to De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of Learning 1,
Manila: Adriana Publishing Co., Inc., pp. 53-57, the following are the sources of validity:
1. Content-related evidence
Content-related evidence for validity refers to the extent to which the test covers
the entire content domain. If a summative test covers a unit with four topics, the
assessment should contain items from each topic. Adequate sampling of content can be
done to materialize this. Performance of students in the test may be utilized as an
indicator of their content knowledge. For instance, if a Grade 11 student was able to
correctly answer 85% of the items in a Science test about matter, the teacher may
conclude that the said student knows 85% of the content area.
2. Criterion-related evidence
The degree to which test scores agree with an external criterion is called
criterion-related evidence. It is related to external validity. The relationship between
an assessment and another measure of the same trait is examined by criterion-related
evidence, and three types of criteria are examined.
3. Construct-related evidence
Let's start with the term construct. A construct refers to an individual characteristic
which explains some aspect of behavior. Construct-related evidence is an assessment of
the quality of the instrument used. The extent to which the assessment is a
meaningful measure of an unobservable trait or characteristic is measured by construct-
related evidence.
A good construct has a theoretical basis, which means that it should be operationally
defined or explained unambiguously to differentiate it from other constructs. In
establishing construct validity, two methods may be used and these are the following:
a. Convergent validation - it occurs when measures of constructs which are
related are in fact observed to be related.
According to De Guzman, Estefania S. and Adamos, Joel L. (2015), Assessment of
Learning 1, Manila: Adriana Publishing Co., Inc., pp. 57-59, Messick (1989) proposed a
unified concept of validity which is based on an expanded theory of construct validity. It
addresses the meaning of scores and social values in test interpretation and test use. In the
said concept of unified validity, considerations of content, criteria and consequences are
integrated into a construct framework for the empirical testing of rational hypotheses
about score meaning and theoretically important relationships. Six distinct
aspects of construct validity were presented and these are the following:
• Content - it is parallel to content-related evidence, which calls for content
relevance and representativeness.
• Substantive - it refers to the theoretical constructs and empirical evidences.
• Generalizability - how score properties and interpretations generalize to
and across population groups, contexts and tasks is examined in this aspect.
This is called external validity.
• Structural - how well the scoring structure matches the construct domain is
assessed in this aspect.
• External - convergent and discriminant evidences taken from multitrait-
multimethod studies are included in this aspect.
• Consequential - it deals with the intended and unintended effects of
assessment on teaching and learning.
Oral questioning has high validity in controlled conditions. The validity of assessment can be
ensured through a checklist which defines the outcomes to be covered and the criteria or
standards to be achieved. There should be a standard or structured list of questions for
summative purposes.
For observations, the behavior of interest should be accurately described by the
operational and response definitions. If evidence is properly recorded and interpreted, it will be
considered highly valid. If additional assessment strategies like surveys and interviews, and
quantitative methods like tests, are utilized with observation, validity will be stronger.
For self-assessment, students should be informed of the domain in which the task is
embedded to increase validity. Students should be taught, and at the same time learn, how to
assess their work objectively based on the clearly defined criteria and dismiss any interest bias.
Students may be induced to make accurate assessments of their own performance if they are
informed that their self-assessment ratings will be compared to those made by their peers and
teacher.
According to AMLE - Association for Middle Level Education (n.d.), Ensuring Valid,
Effective, Rigorous Assessments, retrieved September 15, 2018, from
https://www.amle.org/BrowsebyTopic/WhatsNew/WNDet/TabId/270/ArtMID/888/ArticleID/5
70/Ensuring-Valid-Effective-Rigorous-Assessments.aspx, a process which will ensure the
utilization of valid, effective and demanding assessments for students is composed of the
following steps:
1. Deconstruct the standards.
Identification of the standards which will be addressed in a certain unit of study should
be done. After which, deconstruct each standard. This involves breaking the said standard
into different learning targets and aligning each of the said learning targets to varying
achievement levels, and these are the following:
1. Knowledge - it focuses on knowing and understanding, like vocabulary,
syntactic structures, numbers and numeration systems.
2. Reasoning - utilizing knowledge and understanding in solving problems and
in interpreting information.
After deriving the specific learning targets from the standards, the assessment to use to
determine if the students have learned the material should be considered - either a paper-
and-pencil assessment with matching, short-answer items and multiple-choice questions, or
it may be in the form of a performance-based assessment like a project or performance. The
assessment to be used should be aligned with the learning targets. These should also cover
a variety of critical-thinking levels.
3. Create valid and reliable assessments.
Consider validity and reliability in creating assessments. An assessment should measure
what it is supposed to measure, and if an assessment is valid, it will be reliable. However, a
reliable test or assessment is not necessarily valid. To make a test both valid and reliable,
the guidelines in preparing traditional tests should be considered.
According to Constructing tests (n.d.), retrieved September 15, 2018, from
http://www.washington.edu/teaching/teaching-resources/preparing-to-
teach/constructing-tests/, the following are the general guidelines in constructing
traditional tests:
• Take into consideration the reasons for testing
In giving a test, the reasons will help the teacher determine features
such as format, length, level of detail required in answers and the time frame
for returning results to students.
Some of the reasons for testing are to:
✓ Monitor the progress of students so that the teacher can adjust
the course pace
✓ Motivate students
✓ Provide data for students' grades
✓ Challenge students to apply concepts learned
• Maintain consistency
Teachers should consider the maintenance of consistency between the
course goals, methods of teaching and the tests utilized to measure goal
achievement.
• Utilize testing methods which are appropriate to learning objectives
Methods should be appropriate to the learning outcomes. For
instance, multiple choice may be useful in demonstrating recall and memory,
but to demonstrate more independent analysis and synthesis, an essay or
open-ended problem solving may be used.
• Help students pr epar e
By clarifying the course goals as well as providing review materials, teachers
may help students prepare for the test. By doing so, the test will be allowed to
reinforce what the teacher wants his students to retain and learn.
Threats to Validity
Reliability
What is reliability? According to De Guzman, Estefania S. and Adamos, Joel L. (2015),
Assessment of Learning 1, Manila: Adriana Publishing Co., Inc., page 61, reliability refers to the
reproducibility and consistency in criteria and methods. If an assessment produces the same
results when given to students on two occasions, it is said to be reliable. Reliability pertains to the
obtained results of assessments and not to the test or any other assessment tools/instruments.
In addition to that, reliability is unlikely to turn out 100% because no two tests will consistently
produce the same results. Some environmental factors may affect reliability.
Likewise, according to Developing Reliable Student Assessments | Center for Teaching and
Learning. (n.d.). Retrieved September 15, 2018, from https://ctl.yale.edu/ReliableAssessments,
reliability also refers to how well a score represents a student's ability and ensures that
assessments accurately measure student knowledge. A full test or rubric cannot be described as
reliable or unreliable, because reliability refers to scores specifically. Through reliable scores,
students are able to grasp their level of development, and teachers are aided in improving their
teaching. A variety of methods are used to estimate score reliability, and instructors can make such
reliability methods transparent so as to motivate students to make an effort and to assure them of
accuracy.
The same source provided examples of reliability measures which are also considered as
the different types of reliability, and these are the following:
1. Inter-rater - Two separate individuals evaluate and score a test,
performance or essay given in a subject. Scores from each of the evaluators
are correlated, and the correlation coefficient is used as an estimate of
reliability. Another statistical tool, called Cohen's kappa, may be utilized,
wherein the amount of agreement which may occur between the two evaluators by
chance is taken into account.
2. Test-retest - On separate occasions, individuals take the same test, and the scores
obtained can be correlated by teachers using the correlation coefficient as the estimate
of reliability. This approach is sensitive to the amount of time and the degree of
learning between test administrations, because students learn from tests.
3. Par al1e1 fo rm s - The same gr oup of individ uals ar e given wit h tw o equiva)ent tests
which measu re same concepts, knowledge, abilities and skills and the scor es can be
correlated b y teacher s. Corr elation coefficien t w hich is the estimate o f reliability w m be
utilized but teachers shou)d know that d esigning tw o separate b ut identica) tests ma be
so difficult.
4. Split-half - A test is divided into two sets of items. Scores gained by the students on half
of the test are correlated with the scores of the said students on the other half of the test.
Splitting the test may be done in different ways, but teachers should be aware that this
method will influence the correlation coefficient.
5. Cronbach's Alpha - When analyzing multiple choice tests or Likert-type scales, this is
the most commonly reported measure of reliability. It is the mean of all possible split-
half combinations, or the average or central tendency when a test is split against itself.
It can be calculated by teachers using Excel or any other statistical software package.
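The split-half and Cronbach's alpha estimates above can be sketched in a few lines of Python. The quiz scores below are hypothetical, and the Spearman-Brown step used to project the half-test correlation to full test length is a standard correction not described in the source:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_students x n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    n_items = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1)      # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)  # variance of students' total scores
    return (n_items / (n_items - 1)) * (1 - item_vars.sum() / total_var)

def split_half(scores):
    """Correlate odd-numbered items with even-numbered items, then apply
    the Spearman-Brown correction for full test length (an assumption here)."""
    scores = np.asarray(scores, dtype=float)
    odd = scores[:, 0::2].sum(axis=1)
    even = scores[:, 1::2].sum(axis=1)
    r = np.corrcoef(odd, even)[0, 1]
    return 2 * r / (1 + r)

# Hypothetical item scores for five students on a six-item quiz (1 = correct)
quiz = [[1, 1, 1, 0, 1, 1],
        [1, 0, 1, 1, 0, 1],
        [0, 0, 1, 0, 0, 0],
        [1, 1, 1, 1, 1, 1],
        [0, 1, 0, 0, 1, 0]]
print(round(cronbach_alpha(quiz), 2))  # 0.73
print(round(split_half(quiz), 2))      # 0.93
```

As the source notes, how the test is split changes the coefficient; splitting by odd and even items is only one common choice.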
According to De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1.
Manila: Adriana Publishing Co., Inc., pp. 61-63, in terms of sources of reliability evidence, there
are five classes, and these are the following:
1. Evidence based on stability
Test-retest reliability is used to determine the stability of test results over time. It
assumes that there is no considerable change in the construct between the first and
second testing. Timing is critical, because characteristics may change if the time interval
is too long. A short gap between the testing sessions is not recommended either, because
students may still recall their responses from the first test.
Decision consistency displays how consistent the classification decisions are, not
how consistent the scores are. It is seen in situations where teachers or instructors
decide who will receive a PASS or FAIL mark, or who is considered to have attained mastery.
According to De Guzman, Estefania S., Adamos, Joel L. (2015) Assessment of Learning 1.
Manila: Adriana Publishing Co., Inc., page 65, well-constructed objective tests have better
reliability than performance assessments because of the judgmental scoring of the latter.
Depending on the raters, inconsistent scores may be obtained, perhaps because of inadequate
training of raters or inadequate specifications in the rubrics used for scoring. In addition to that,
there is a limited sampling of course content in a performance assessment. The reliability of
performance assessments may be raised by constraining the domain coverage or by structuring
the responses. By using analytic and task-specific rubrics completed with exemplars and the
training of raters, reliable scoring of performance assessments can be enhanced.
In improving oral examinations, increasing the number of questions, the number of
examiners, and the response time, and using a rubric or marking guide which contains the
standards and criteria, may be considered.
With regard to observations, observation instruments should be comprehensive enough to
adequately sample occurrences and non-occurrences of behavior, but still manageable to conduct.
Data from direct observation can be enhanced through inter-observer agreement and
intra-observer reliability. Inter-observer agreement refers to the consistency of observation data
gathered by multiple observers or teachers, while intra-observer reliability pertains to the
consistency of the data on behavior collected multiple times by a single observer or teacher.
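As a minimal illustration of inter-observer agreement, the share of observation intervals on which two records match can be computed directly; the interval data below are hypothetical:

```python
def percent_agreement(record_a, record_b):
    """Share of observation intervals on which two observers agree."""
    matches = sum(a == b for a, b in zip(record_a, record_b))
    return matches / len(record_a)

# Hypothetical interval records: 1 = behavior observed, 0 = not observed
observer_a = [1, 0, 1, 1, 0, 1, 0, 0]
observer_b = [1, 0, 1, 0, 0, 1, 0, 1]
print(percent_agreement(observer_a, observer_b))  # 0.75 (6 of 8 intervals agree)
```

The same function applied to two records made by one observer on different days would give an intra-observer figure instead.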
The same source identified some ways to improve the reliability of assessment
results, and these are the following:
1. Provide more time, more questions and more observations whenever practical, so as to
lengthen the assessment procedure
2. Assess all the important aspects of the target learning performance, so as to broaden
the scope of the procedure
3. Improve objectivity by utilizing a systematic and more formal procedure for scoring
student performance
4. Employ inter-rater reliability so as to utilize multiple markers
5. Combine results from several assessments, especially when making critical decisions
6. Provide sufficient time to students in completing the assessment procedure
7. Teach students how to perform their best by providing practice and training to
students and motivating them
8. Provide tasks that are neither too easy nor too difficult, and tailor the assessment to
each student's ability level when possible, so as to match the assessment difficulty to
the students' level of ability
9. Select assessment tasks which distinguish or discriminate the best from the least able
students, so as to differentiate among students
2. Grade item by item - If students are given multiple problems or essays, instructors
can grade the first problem/essay on each student's paper before grading the
second problem/essay.