
Malcolm X College Faculty Development Week 2008

Assessment Committee Packet


TABLE OF CONTENTS

Assessment Committee FDW 2008 Agenda
MXC Assessment Blog Info Sheet
MXC Mission / AC Mission / Gen Ed Philosophy
Student Learning Project Overview
Critical Thinking Definition and Component Breakdown
Student Learning Project Sample Info Sheet
Student Learning Project Score Sheet
Student Learning Project Rubric
Student Learning Project Timeline
Assessment Committee Calendar
Assessment Glossary

Faculty Development Week Fall 2008 ~ Assessment Committee Presentation
Tuesday August 12, 2008

10:00 - 10:05 Introduction of Assessment Committee Officers

10:05 - 10:20 2007 HLC Visit / Monitoring Report / HLC Assessment Academy (Quintanilla)

-- What was the HLC visit, and how did it affect assessment at MXC?

10:20-10:25 Introduction to Assessment Committee Blog (Owen)

10:25 – 10:35 Interactive Activity 1 (Reynolds)

-- What is assessment of student learning?


-- What is the difference between assessment and evaluation?

10:35 – 11:00 Student Learning Project Skit (Assessment Committee)

-- Project (Assessment Glossary, College Mission Statement, MXC Gen Ed Outcomes, Critical Thinking SLOs, sampled courses)
-- Pre/Post-Test Development (question measurable by rubric)
-- Rubric
-- Scoring methodology (pair faculty in departments)
-- Timeline (Assessment Calendar)
-- Incorporating tests into class schedule (e.g., as part of a midterm or in-class assignment)

11:00-11:05 SLP Recap (McDuffy)

11:05 – 11:40 Interactive Activity 2 (Boddie-Willis, Callon, and McDuffy)

-- What is critical thinking?


-- How do/would you incorporate critical thinking in your classroom?

12:00 – 1:15 Lunch

1:15 - 1:30 Mid-Day Regroup

1:30 – 2:40 Workshop (Assessment Committee)

-- Breakout according to courses being sampled
-- Develop question for pre/post tests (consider CT SLO and rubric)
-- Submit questions to Stephanie Owen

MXC ASSESSMENT BLOG

Web Address: http://mxcassessment.blogspot.com/

Purpose:

The MXC Assessment Blog is designed to keep the MXC community informed about the college's
ongoing assessment of student learning. The blog will serve as a resource for faculty, staff, students,
and community members interested in assessment at MXC. The blog was created and will be
maintained by Stephanie Owen (sowen2@ccc.edu) and contributing Assessment Committee
members.

Malcolm X College Mission Statement

Malcolm X College, a learning and assessment-centered community college, empowers students of
diverse backgrounds and abilities to achieve academic, career, and personal success.

Malcolm X College Assessment Committee Mission Statement

The Assessment Committee is the body of an institution of higher learning that promotes conscious
efforts to ensure accountability in effective learning and teaching.

Philosophy of General Education

Malcolm X College’s General Education curriculum empowers students to acquire the broad base of
knowledge necessary to understand their personal, moral, and ethical responsibilities to act as
leaders.

The learning experiences provided through general education are designed to build effective
communication skills, to strengthen critical thinking, to foster analytical inquiry, to inspire awareness
of history, to embrace diversity, cultural pride and identity, and to form a basis for responsible
citizenship.

To provide the broad educational base, Malcolm X College requires students in all degree programs
to take courses in communication, mathematics, biological sciences, humanities, and the physical
and social sciences.

By completing their general education requirements, students will be able to:

• Think and read critically so that they can solve problems using appropriate
information resources and reasoning processes.

• Read, write, speak, and listen effectively so that the expectations of appropriate audiences in
the academic, public, and private sectors are met.

• Demonstrate quantitative and technological literacy, especially computer literacy, for
interpreting data, reasoning, and problem-solving.

• Appreciate global diversity in gender, age, race, class, and culture, as well as differences in
physical abilities.

• Develop ethical values, life goals, and interpersonal skills that will prepare them for life-long
learning, employability, and effective citizenship.

Document Drafted at Assessment Committee Retreat


July 11, 2008

Session 1
Student Learning Project

In discussing our student learning project (SLP), the first issue to settle was whether to use a
standardized test (such as one offered by Insight Assessment) or to continue with earlier plans to
have each department and program develop its own test. In the end, participants converged on
the home-grown test option. Jane Reynolds developed a sample answer sheet for use in all the
departments and programs with the following categories: interpret evidence, identify arguments,
analysis/evaluation, and your opinion of ____ is _____. We next had to develop a general rubric
for everyone, and we adopted the rubric from the Insight Assessment website. All tests developed by
departments and programs must follow the rubric.

The basic outline of the project is:

- We will use the large, entry-level classes as decided upon in the spring: English 101,
Math 098-099, Social Science 101, Chemistry 100/121, Biology 121, and the
introductory classes decided upon by each of the career programs.

- We will use a pre-test/post-test format.

- The tests will consist of one question. Faculty from each department and program will
develop their test question during Faculty Development Week (FDW) on Tuesday,
August 12, with the assistance of the Assessment Committee retreat participants. We
are requesting that the IT department set up laptops at the site of FDW. (Michael
Callon is making the formal request.)

- The pre-tests will be given the first week of class.

- The tests will be administered by faculty teaching those classes, but all department and
program faculty members will participate in the scoring of the tests.

- Program directors and chairs are to keep track of faculty participation and report to the
Assessment Committee the number of faculty who participated.

- The answer sheet for students to use will have the following categories: interpret
evidence, identify arguments, analysis/evaluation, and your opinion of ____ is _____.

- The scoring sheet for the tests is as follows and is based on the answer sheet:
o Interpretation of facts 1 2 3 4
o Identify salient arguments 1 2 3 4
o Thoughtful analysis 1 2 3 4
o Evaluates alternatives 1 2 3 4
o Justifies or explains reasons 1 2 3 4
o Draws conclusions 1 2 3 4

- Faculty will grade the tests in pairs, each of them grading each of the tests in their batch;
in case of significant disagreement over scores, a third grader will be brought in (see the
scoring sketch after this outline). Departments and programs will submit both sets of
scores to Dean Javier.

- Each exam will have two score sheets and two rubrics – one from each grader.

- We will not conduct this assessment with the special session courses.

- Each department and program will hand over their exams, answer sheets and rubrics
with scores to Dean Javier.

- Scores must be turned in by September 8.

- There will be a chance for departments and programs to debrief during an Assessment
Committee meeting after administering and grading the pre-tests but before official
scores come back from Dean Javier. Dean Javier will create questions for departments
to answer in their meetings regarding the assessment and to use in debriefing the
Assessment Committee. The form will assess the faculty’s impression of test results and
of the process itself. Faculty will be able to submit questions to the Assessment
Committee for clarification. These preliminary results will be discussed at the regular
meeting on September 18.

- The post-tests must be given by Halloween.

- The post-test exams and rubrics must be submitted to Dean Javier by November 17.

- There will be a formal online survey for faculty at the end of the semester, as well as a
paper version.

- Scores will be disseminated in January when we return from Christmas break.

- Assessment Committee officers and Roundtable participants will meet with Dean Javier
the week of January 4.

- Results will be unveiled to the entire MXC community on January 15.

- By March 5, the departments will submit a report on what they will do in response to
their results – how they are going to adjust instruction, curriculum, etc.

- The Committee will submit a report to the Higher Learning Commission’s Academy for
the Assessment of Student Learning by April 1.
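
The pair-scoring step in the outline above can be illustrated with a short sketch. This is only an
informal illustration written in Python, not an official tool: the two-point threshold used to flag
"significant disagreement" and the averaging of the two graders' scores are assumptions, since the
outline does not define either.

# Sketch of the paired scoring and tie-breaking described above.
# Assumptions (not specified in the packet): a difference of two or more
# points on any criterion counts as "significant disagreement," and the
# two graders' scores are averaged to produce a combined score.

CRITERIA = [
    "Interpretation of facts",
    "Identify salient arguments",
    "Thoughtful analysis",
    "Evaluates alternatives",
    "Justifies or explains reasons",
    "Draws conclusions",
]

def score_exam(grader1, grader2, threshold=2):
    """grader1 and grader2 map each criterion to a score from 1 to 4."""
    combined = {}
    needs_third_grader = False
    for criterion in CRITERIA:
        s1, s2 = grader1[criterion], grader2[criterion]
        if abs(s1 - s2) >= threshold:
            needs_third_grader = True        # flag the exam for a third grader
        combined[criterion] = (s1 + s2) / 2  # assumed: average the two scores
    return combined, needs_third_grader

# Example: the graders agree except on one criterion.
g1 = dict(zip(CRITERIA, [3, 3, 2, 2, 3, 3]))
g2 = dict(zip(CRITERIA, [3, 4, 2, 2, 1, 3]))
combined, flag = score_exam(g1, g2)
print(combined)
print("Third grader needed:", flag)   # True, because of the 3 vs. 1 score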

Definition of Critical Thinking:

Critical thinking is the intellectually disciplined process of actively and skillfully conceptualizing,
applying, analyzing, synthesizing, and/or evaluating information gathered from, or generated
by, observation, experience, reflection, reasoning, or communication, as a guide to belief and action.
In its exemplary form, it is based on universal intellectual values that transcend subject matter
divisions: clarity, accuracy, precision, consistency, relevance, sound evidence, good reasons, depth,
breadth, and fairness.

Steps in Critical Thinking:

1. Identify a problem
2. Conceptualize the question and develop a thesis
3. Select a methodology and apply a process for investigation and data collection
4. Analyze the data and interpret its meaning
5. Recommend results, realize a new perspective, answer the question, and identify alternative options
6. Implement recommendations for change

Conceptualize
Identify an issue or problem
Recognize multiple perspectives on a given problem
Develop multiple perspectives on a given issue or problem
Define the context of a problem
Define key concepts related to problem or issue

Apply
Demonstrate a solution to a problem
Illustrate how to use an instrument
Choose appropriate procedures to solve a problem

Analyze
Compare concepts or information
Distinguish between two or more possible solutions
Interpret a situation or facts according to theory
Explain the causes and effects of the problem

Synthesize
Schematize data to clearly support an argument
Create an argument to support or refute conclusion
Construct an opinion
Weave together results and perspectives to support a hypothesis

Evaluate
Judge the quality of information or data
Choose between alternatives
Defend or justify a position or argument

Malcolm X College
Assessment Committee
Student Learning Plan at the Classroom Level

Division   Course                   Sections   Students per Class   Total Students Participating
1          Math 110                     9              30                      270
2          Math 118                     4              30                      120
3          Social Science 101           9              30                      270
4          English 101                 15              25                      375
5          Physical Science 101         3              30                       90
6          Chemistry 100/121           10              30                      300
7          Biology 121                 16              35                      560
8          College Success 121         12              30                      360
9          Nursing                      1              50                       50
10         Respiratory Care             1              30                       30
11         Radiography                  1              40                       40
12         Pharmacy Tech                1              20                       20
13         Physician Assistant          1              25                       25
14         Phlebotomy                   1              10                       10
15         Clin Lab Tech                1              10                       10
16         Surgical Technologist        1              15                       15
17         EMT                          1              60                       60
18         Paramedic                    1              20                       20
19         Mortuary Science             1              30                       30
20         Renal Technology             1              10                       10
21         Child Development            1              30                       30
22         Business                     1              15                       15
23         CIS                          1              25                       25
                                                       Total students         2,735

The sample includes all sections of the lower-level General Education courses
and one of the lower-level courses in each of the career programs.

Class sizes are estimates of anticipated enrollment for Fall 2008.
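
The participation totals in the table are simply sections multiplied by estimated class size, summed
across courses. A minimal Python sketch of that arithmetic, using the table's own estimates, follows.

# Recompute the estimated totals from the sampling table:
# participants per course = sections x estimated students per class.
courses = {
    "Math 110": (9, 30), "Math 118": (4, 30), "Social Science 101": (9, 30),
    "English 101": (15, 25), "Physical Science 101": (3, 30),
    "Chemistry 100/121": (10, 30), "Biology 121": (16, 35),
    "College Success 121": (12, 30), "Nursing": (1, 50),
    "Respiratory Care": (1, 30), "Radiography": (1, 40),
    "Pharmacy Tech": (1, 20), "Physician Assistant": (1, 25),
    "Phlebotomy": (1, 10), "Clin Lab Tech": (1, 10),
    "Surgical Technologist": (1, 15), "EMT": (1, 60), "Paramedic": (1, 20),
    "Mortuary Science": (1, 30), "Renal Technology": (1, 10),
    "Child Development": (1, 30), "Business": (1, 15), "CIS": (1, 25),
}
total = sum(sections * size for sections, size in courses.values())
print(total)  # 2735 estimated participants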

Malcolm X College Critical Thinking Assessment Answer Sheet

Course ID ____________ Section ID __________ Student ID _______________________

Instructor ID _________

1. Interpretation/Identification of Facts (Identify and interpret facts)

2. Main Argument(s) (Based on your interpretation of facts or information, please state your main
argument)

3. Thoughtful Analysis (Explain how the facts you identified support your argument(s))

4. Evaluate Alternatives (Identify and discuss other potential arguments)

5. Justification/Explanation of Reasons (Which argument is the best and why?)

6. Draw Conclusions (Finally, the answer is ____ and this means . . . )

Malcolm X College

Rubric for scoring Critical Thinking Evaluation for SLP July 2008

Course ID ____________ Section ID __________ Student ID _______________________

Instructor ID _________ Grader 1 ID _________

Criteria                                        4    3    2    1

1. Interpretation/Identification of facts
2. Argument
3. Thoughtful analysis
4. Evaluate Alternatives
5. Justification/Explanation of Reasons
6. Draw Conclusions
Malcolm X College

Rubric for scoring Critical Thinking Evaluation for SLP July 2008

Course ID ____________ Section ID __________ Student ID _______________________

Instructor ID _________ Grader 2 ID _________

Criteria                                        4    3    2    1

1. Interpretation/Identification of facts
2. Argument
3. Thoughtful analysis
4. Evaluate Alternatives
5. Justification/Explanation of Reasons
6. Draw Conclusions

Student Learning Project & Department Assessment Plan Timeline

KEY:
SLP – Student Learning Project
DAP – Department Assessment Plan
FDW – Faculty Development Week
ACN – Assessment Committee Newsletter
HLC – Higher Learning Commission

***

DUE DATES:
• Thursday, July 17 – DUE DATE FOR ALL ASSESSMENT RETREAT DOCUMENTS
TO CLAIRE!
• Tuesday, July 29 – Debriefing of the Critical Thinking Conference, last-minute updates for
FDW, and work on the skit (Room 3617, 10:00 a.m.)
• Tuesday, July 29 – Turn in all packet handouts to Akiza and Michael
• Tuesday, August 12 – Faculty Development Assessment Committee discussion (Student
Learning Project-SLP, Department Assessment Plan-DAP)
• Saturday, August 16 – Faculty Development Assessment Committee discussion with
Adjunct Faculty
• Monday-Saturday August 25-30 – (SLP) Faculty from selected courses conduct pre-test
• Thursday, August 28 – Assessment Committee Meeting update on Assessment Day
activities and plans
• Thursday, August 28 – (DAP) Discuss template for Dept. Assessment Plan
• Monday, September 8 – (SLP) Pre-tests turned in to Dean Javier
• Thursday, September 18 – (SLP) Preliminary discussion of pre-test results, with Dept. Chairs
reporting out during the Assessment Committee meeting (3-5 p.m.); will address three questions
posed by Committee members
• Thursday, September 18 – (DAP) Complete template for Department mission statement and
identify Department-level SLOs (?)
• Monday, September 22 – Assessment Committee Newsletter (ACN) 1st draft due
• Tuesday, September 30 – ASSESSMENT DAY
• Beginning of October – (SLP) Discuss pre-test results
• Wednesday, Oct. 1 – (HLC) Report due
• Friday, October 17 – (ACN) 2nd draft due
• Thursday, October 23 – (DAP) Develop standardized course-level SLOs
• Monday-Saturday October 27-November 1 – (SLP) Conduct post-test
• Monday-Friday, November 10-14 – (ACN) published and distributed
• Monday, November 17 – (SLP) Post-tests due to Dean Javier
• Thursday, January 8 – (SLP) Collect report from Dean Javier and discuss results in a small-group
Assessment Committee meeting
• Thursday, January 15 – (SLP) Unveil SLP results from Fall 2008
• Thursday, January 29 – (DAP) Dept. Assessment Plan due
• Thursday, March 5 – (SLP) Dept. report on changes made, or planned, based on
the Project results
• April – (HLC) Report due
• Wednesday-Friday, April 1-3 – (ACN) published and distributed
• Thursday, April 23 – (DAP) Dept. report on the results of their department assessment plan
ASSESSMENT COMMITTEE CALENDAR

(Monthly calendar pages from the original packet.)
Assessment Terms – Glossary

Accountability

Use of results for program continuance/discontinuance; the public reporting of student, program, or
institutional data to justify decisions or policies; using results to determine funding

Accreditation

A certification awarded by an external, recognized organization that the institution or program
meets certain requirements overall, or in a particular discipline

Action Research

School and classroom-based studies initiated and conducted by teachers and other school staff.

Alternative Assessment/ Assessment Alternatives

Describes alternatives to traditional, standardized, norm- or criterion-referenced paper-and-pencil
testing. An alternative assessment might require students to answer an open-ended
question, work out a solution to a problem, perform a demonstration of a skill, or in some way
produce work rather than select an answer from choices on a sheet of paper. Portfolios and
instructor observation of students are also alternative forms of assessment

Analytical skills

The ability to discover the underlying structure of an argument, a communication, a problem, or a
solution

Analytic Scoring

A type of rubric scoring that separates the whole into categories of criteria that are examined one at
a time. Student writing, for example, might be scored on the basis of grammar, organization, and
clarity of ideas. Useful as a diagnostic tool. An analytic scale is useful when there are several
dimensions on which the piece of work will be evaluated

Assessment

The systematic process of determining educational objectives, gathering, using, and analyzing
information about student learning outcomes to make decisions about programs, individual student
progress, or accountability

Authentic assessment

Assessment technique involving the gathering of data through systematic observation of a behavior
or process and evaluating that data based on a clearly articulated set of performance criteria to serve
as the basis for evaluative judgments

Classroom assessment

Informal measures of student learning obtained in a traditional classroom setting, such as the
Classroom Assessment Techniques (CATs) described by Thomas Angelo and K. Patricia Cross

Closing the loop

Using assessment results for program change and improvement

Co-curricular programs

Out-of-class activities, e.g., student affairs programs and activities

Cognitive development

Development explained through sequential stages in which individuals encounter problems or ideas
that cause cognitive conflicts, which require the individual to accommodate or change their way of
thinking so that it becomes more complex

Cohort

A group of study subjects, selected based on predetermined criteria, who are followed over a period
of time

Competency

The demonstration of the ability to perform a specific task or achieve a specified criterion; refers to
a defined domain of educational objectives

Control group

A group of subjects, matched to the experimental group, which does not receive the treatment of
interest

Course-embedded assessment

Collecting assessment information within the classroom because of the opportunity it provides
to use already in-place assignments and coursework for assessment purposes. This involves taking a
second look at materials generated in the classroom so that, in addition to providing a basis for
grading students, these materials allow faculty to evaluate their approaches to instruction and course
design

Cross-sectional

A study that measures a population at a specific point in time or over a short period of time; an
alternative to a longitudinal study

Cut score

A score which a student needs to achieve to demonstrate minimal competency

Dependent variable

A variable that is considered to be an effect or a variable that is predicted

Direct measures

Direct measures of student learning require students to display their knowledge and skills as they
respond to the instrument itself. Objective tests, essays, presentations, and classroom assignments all
meet this criterion

Evaluation

This term broadly covers all potential investigations, with formative or summative conclusions,
about institutional functioning. It may include assessment of learning, but it might also include
non-learning-centered investigations (e.g., satisfaction with recreational facilities)

Formative Assessment

Observations which allow one to determine the degree to which students know or are able to do a
given learning task, and which identify the part of the task that the student does not know or is
unable to do. Outcomes suggest future steps for teaching and learning

Independent variable

A variable that is considered to be a cause or variable that is used for prediction

Indicators

Measures for individuals or organizations that provide information about measurable traits,
situations, knowledge, skills, performances, resources, inputs, outputs

Indirect measures

Indirect methods such as surveys and interviews ask students to reflect on their learning rather than
to demonstrate it

In-house instruments/software

Non-proprietary instruments/software are tools developed by institutions for internal use, not
researched or purchased from an outside source. In-house assessment tools are sometimes
preferred because they are designed to exactly match an institutional purpose

Inter-rater reliability

The level of consistency among raters using a constructed response format
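
For the paired scoring used in the Student Learning Project, one simple (assumed) way to quantify
this consistency is percent agreement between the two graders across criteria; the Python sketch
below is an illustration, not a prescribed statistic.

# Percent agreement between two raters scoring the same exam on a 1-4 scale.
# Exact-match agreement is an assumption, not a requirement from the packet.
def percent_agreement(scores1, scores2):
    """scores1 and scores2 are equal-length lists of 1-4 ratings."""
    matches = sum(1 for a, b in zip(scores1, scores2) if a == b)
    return 100.0 * matches / len(scores1)

rater_a = [3, 3, 2, 2, 3, 3]
rater_b = [3, 4, 2, 2, 1, 3]
print(percent_agreement(rater_a, rater_b))  # about 66.7 (4 of 6 criteria match)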

Learning gain

A positive change in learning outcomes measured following instruction or educational experiences
is often referred to as a learning gain; the difference between pretest and posttest; longitudinal change
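
In the pre-test/post-test design used in the Student Learning Project, the gain is simply the post-test
score minus the pre-test score for each student; a minimal Python sketch with invented scores follows.

# Learning gain = posttest score - pretest score for each student.
# The scores below are invented for illustration only.
pretest  = {"student_1": 8, "student_2": 12, "student_3": 10}
posttest = {"student_1": 14, "student_2": 15, "student_3": 13}

gains = {s: posttest[s] - pretest[s] for s in pretest}
average_gain = sum(gains.values()) / len(gains)
print(gains)         # {'student_1': 6, 'student_2': 3, 'student_3': 3}
print(average_gain)  # 4.0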

Learning outcomes

Refers to the specific knowledge or skills that students actually develop through their college
experience

Mean

One of several ways of representing a group with a single, typical score. It is figured by adding up all
the individual scores in a group and dividing the sum by the number of people in the group. Can be
affected by extremely low or high scores

Measure (noun)

A standard procedure for quantifying a sample of behavior from a larger domain; often used
interchangeably with test and instrument

Measure (verb)

The process of collecting data using appropriate techniques

Measurement

The systematic investigation of people's attributes

Median

The point on a scale that divides a group into two equal subgroups. Another way to represent a
group's scores with a single, typical score. The median is not affected by low or high scores as is the
mean
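
A short Python sketch (with invented scores) shows the point made in the Mean and Median entries:
one extreme score pulls the mean but leaves the median largely unchanged.

# The mean is pulled by an extreme score; the median is not.
from statistics import mean, median

scores = [70, 72, 75, 78, 80]
scores_with_outlier = scores + [10]   # one extremely low score added

print(mean(scores), median(scores))                            # 75, 75
print(mean(scores_with_outlier), median(scores_with_outlier))  # about 64.2, 73.5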

Minimum competency

A level of knowledge, skill, or ability (usually demonstrated on a measure) that has been determined
to be the minimum required for successful use of that knowledge, skill, ability, or personal trait

Norm Group

A random group of students selected by a test developer to take a test to provide a range of scores
and establish the percentiles of performance for use in establishing scoring standards

Outcome measure

Instruments used for gathering information on student learning and development

Outcomes

Refers to the specific knowledge, skills, or developmental attributes that students actually develop
through their college experience; assessment results

Percentile rank

The percentage of examinees in the norm group who scored at or below the raw score for which the
percentile rank was calculated
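
Following the definition above, a raw score's percentile rank is the percentage of norm-group
examinees who scored at or below it; the Python sketch below uses an invented norm group.

# Percentile rank = percentage of norm-group examinees scoring at or below a raw score.
def percentile_rank(raw_score, norm_group_scores):
    at_or_below = sum(1 for s in norm_group_scores if s <= raw_score)
    return 100.0 * at_or_below / len(norm_group_scores)

norm_group = [55, 60, 62, 68, 70, 75, 77, 80, 88, 92]   # invented norm-group scores
print(percentile_rank(75, norm_group))  # 60.0 (6 of 10 scores are at or below 75)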

Pilot

A pilot often refers to a small, scaled-down study designed to test the validity of measures and
manipulations of a planned full-scale study. A pilot can also refer to the initial administration of new
assessment items/procedures with the intent of evaluating and revising the items/procedures for
future use

Posttest

The measurement of a dependent variable that occurs after an intervention, usually for the
purpose of comparing to a pretest measure of the same dependent variable

Pretest

The measurement of a dependent variable prior to an intervention, usually for the purpose of
comparing to a posttest measurement of the same dependent variable

Primary trait analysis

To ensure the success of using the grading process for assessment, Primary Trait Analysis is used to
help teachers develop their criteria for grading. The first step is to identify the factors or traits that
will be considered in scoring an assignment. For each trait, a three- to five-point scoring scale is
developed for use in scoring performances of students. An explicit statement that describes
performance at that level accompanies each number

Prior learning assessment

Techniques to assess student understanding and recall of material learned in previous, related
courses, so that faculty can teach accordingly. Information helps faculty determine the most
effective starting point for a given lesson and the most appropriate level at which to begin
instruction

Problem solving

Defining the problem, being able to obtain background knowledge, generating possible solutions,
identifying and evaluating constraints, choosing a solution, functioning within a problem solving
group, evaluating the process, and exhibiting problem solving dispositions

Proficiency

Performing in a given art, skill, or branch of learning with correctness and facility; achieving
competency on a predetermined standard

Random sample

A sample drawn from the population such that every member of the population has an equal
opportunity to be included in the sample

Range

The range is the distance between the highest and lowest score. Numerically, the range equals the
highest score minus the lowest score

Rater

A person who evaluates or judges student performance on an assessment against specific criteria

Rating scale

A series of items or statements that describe an aspect of a skill or a personal trait

Raw score

The measure prior to scaling

Reasoning

The process by which one is motivated to look for evidence to support or refute a statement
or proposition

Result

Outcomes or assessment data obtained about student learning or development; frequencies obtained
from performance indicators

Rubric

A scoring tool that lists the criteria for a piece of work, or "what counts" (for example, purpose,
organization, and mechanics are often what count in a piece of writing); it also articulates gradations
of quality for each criterion, from excellent to poor

Sample

Sub-group of persons/items/observations drawn from and meant to represent a larger population

Scale score

A derived score based on the raw score of a test which takes into account slight variations in the
difficulty of different forms of the same test

Significance

Refers to the likelihood that relationships observed in a sample could be attributed to sampling error
alone

Skills

Observable behaviors that demonstrate levels of competence (i.e., knowledge, comprehension,
application, analysis, synthesis, and evaluation)

Stakeholder- internal/external

Stakeholders are those who have a stake in the program to be evaluated or in the evaluation’s results.
Stakeholders can be internal or external to a program. Both types of stakeholders need to be
identified and considered when planning program evaluation, as each may have a different
perspective of the program and different expectations of the program and the evaluation.

Standard

A pre-determined criterion or expectation of a level of student learning; a passing score

Standards

The broadest of a family of terms referring to statements of expectations for student learning,
including content standards, performance standards, and benchmarks

Student learning

The acquisition of knowledge or behavior as a result of participation in programs and services

Summative evaluation

A sum total or final product measure of achievement at the end of an instructional unit or course of
study

Trend

A trend is a general direction or movement. In statistics, a trend is a statistically detectable change
over time

True score

An examinee's true score on a test is a measure without measurement error. It is also the mean of
the distribution of observed scores that would result if the examinee took the test an infinite number
of times. True score is the observed score minus error

Validity

The degree to which a test or other assessment measure measures what it is designed to measure

Value-added

The effects educational providers have had on students during their programs of study. The impact
of participating in higher education on student learning and development above that which would
have occurred through natural maturation, usually measured as longitudinal change or difference
between pretest and posttest; a comparison of the knowledge, skills, and developmental traits that
students bring to the educational process with the knowledge, skills, and developmental traits they
demonstrate upon completion of the educational process

Variable

Any quantity that can assume more than one state or numerical value

