
So What?

Using Outcome-Based Evaluation to Assess the Impact of Library Services


Rhea Joyce Rubin


for the
Massachusetts Board of Library
Commissioners
June 2004

Rubin for the Massachusetts Board of Library Commissioners, 2004


OUTCOME-BASED EVALUATION WORKSHOP
AGENDA

9:30 – 10:00    Registration and Coffee

10:00 – 12:00   Why Use Outcome-Based Evaluation?
                Concepts and Terminology
                Selecting Outcomes

12:00 – 12:30   Lunch Break

12:30 – 3:00    Selecting Indicators and Targets
                Writing an Outcome-Based Evaluation Plan

3:00            Adjourn


KEYS TO UNDERSTANDING
OUTCOME-BASED EVALUATION
Outcome = Impact on End User

Impact =

Changes in

Behavior
Attitude
Skills
Knowledge
Condition/ Status

Outcome-based evaluation = A user-centered approach to assessment of programs/services that are provided to address particular user needs and designed to achieve change for the user.

From "the user in the life of the library" to "the library in the life of the user."


THE VALUE OF OUTCOME-BASED EVALUATION

Useful as a planning tool; requires needs assessment

Clarifies the purpose of the program/ service

Keeps staff and stakeholders focused on goals by stressing "so what?" rather than process

Stimulates discussion of issues

Helps keep implementation on track; milestones are identified

Indicates when changes to program are needed

Energizes staff by demonstrating the real, human impact their work


produces and by stressing common purposes and goals

Gives insight into why and how library services are used; offers a new perspective on library services in the context of the user

Assists in fundraising & grant writing by providing statistics on results

Provides empirical evidence that what you are doing is what you
intended to do

Quantifies the anecdotes & success stories

Identifies effective programs/services

Demonstrates the library's contribution to solving community problems

Demonstrates accountability as required by IMLS under the Government Performance and Results Act of 1993.


INTRODUCTION TO TERMINOLOGY
Inputs:

Are resources devoted to or used by a program


Collections, materials, equipment and supplies
Staff and volunteer time and expertise, community partners
Needs assessment findings and other background information
Facilities, computers, online access
Also, constraints on programs such as laws, regulations, funding
requirements

Outputs:

Answer "How many?" (extensiveness)


Are measures of product volume (e.g., number of services or products provided) or evidence of service delivery (e.g., number of participants)
Are the results of inputs (resources) and activities (programs or
services)
Are from the staff perspective
Are objectively quantified by neutral observers

Outcomes:

Answer "So what?" (effectiveness)


Are measures of impact or benefit to end user, usually reported in
amount of change in skills, knowledge, attitude, behavior, or condition
Are also the results of inputs (resources) and activities (programs or
services)
Are from the participant's/customer's perspective
Are often quantified by participants' or others' perceptions (i.e., self-reports or anecdotal evidence)
May be measured through professional assessments (e.g., testing)


Are the success stories of outputs
Are best used in conjunction with output measurement
Make a project's expectations explicit
Present assumptions of cause and effect, not concrete scientific evidence


OUTCOME-BASED EVALUATION GLOSSARY


Note: Terms are listed in alphabetical order.
Activities Programs or services (such as tutoring, lectures, storytelling)
provided to the end user by the library. The specific jobs (such as recruiting
and training volunteers, publicizing a lecture series) are called tasks. Answers
the question: What does the library do?
Candidate Outcomes are potential outcomes that must be verified. Usually
the result of brainstorming and assumptions by library staff members,
candidate outcomes must be checked with participants and others before
being accepted as the correct measures of a program. (See Selecting Outcomes to Measure.)
Community Goals relate to the needs of the community and are beyond the
scope of the library alone; the library's program is usually one part of a
larger effort to meet a community goal. (For example: increase school
readiness among elementary school children, decrease juvenile delinquency
in teens, increase socialization by isolated elders.) Answers the question:
What does the community hope to achieve in response to an
identified community need? (Note: you are not being asked to define
community goals, but when you document why a project is needed, it
should make sense in terms of how the library is relevant to a community
achieving such a goal.)
Goals In evaluation, there are two types of goals. Both are developed in
response to a demonstrated need and are broad in scope and long-range. In
traditional evaluation, goals reflect the mission of the library. In
outcome-based evaluation, goals reflect the ultimate impacts desired for the
targeted users, and usually relate to both larger community goals and the
library's mission statement. Provides a compass for the program. Answers the
question: What does the library hope to achieve for the users?


Immediate outcomes See Intermediate outcomes.


Indicators Measurable characteristics or changes that indicate achievement
of the outcome. Behaviors that demonstrate changes in knowledge, skills, or
attitudes, or that imply such changes. The amount of expected change is part
of the indicator. (E.g., an increase of one reading level within one year;
ability to use a database independently; successful completion of a sample
citizenship exam.) Used to gauge success for the participant. Answers the
questions: How will we know that participants experienced the intended
outcome(s)? and What will we measure?
Initial outcomes See Intermediate Outcomes.
Inputs Resources (such as money, staff, volunteers, facilities, collections,
community partners) used to plan and provide a program or service. Also
constraints on a program such as laws, regulations, funding requirements.
Answers the question: What do we have to work with?
Interim outcomes See Intermediate Outcomes.
Intermediate outcomes Milestones in the life of a project, events which are
necessary for successful outcomes. Points at which project staff must
decide whether to continue current activities or to modify them in order
to achieve the desired outcomes. (E.g. In a family literacy project with
the desired outcome of increasing family reading at home, attendance at
library story times may be considered necessary. So the number of
participants signing up for the first story time may be a milestone in the
project.) Often users' satisfaction is an intermediate outcome, because
users must be satisfied with a service in order to use it repeatedly.
Sometimes called initial, immediate, short-term, or interim outcomes.
Answers the questions: What is the short-term benefit to the user from
this program/service? and What must the user do if s/he is to
achieve the long-range outcome?


Logic Model See Outcome-Based Evaluation Plan.


Measurement Verification of outcomes using the indicators in relation to the
selected targets. Data collection methods include interviews, pre- and
post-testing, professional observation, and self-administered surveys.
Outcome-Based Evaluation Plan Statement of library goal; description of
whom the program expects to affect; inputs; services/programs; outputs;
outcomes; indicators; targets; and data collection plan for the entire
program. Sometimes called a logic model.
Outcome Measure Statement Assertion of indicator + target + data
collection method for a specific outcome. E.g., "90% of program
participants will double the amount of time reading to their children, as
measured on a pre- and post-program questionnaire."
Outcomes Benefits or impacts to the end user which demonstrate the
effectiveness of the program or service. Usually are changes in knowledge,
skills, attitude, behavior, or condition. Answers the question: What
difference did our program make to the participant?
Outputs Products resulting from the inputs and activities of the library.
Volume of successful activities or evidence of service delivery. Objectively
quantified measures (such as number of books circulated, number of hours
devoted to homework assistance, or number of attendees at an author
reading). In traditional evaluation, projected outputs with a timeline
attached are referred to as objectives. Answers the question: How much
did the library do?
Short-term outcomes See Intermediate outcomes.
Targets Numerical standards or criteria of success for your program. The
expected proportion of participants achieving an outcome. (E.g., 75% of
participants report a specified effect six months after a program; 20% of
attendees at resume-writing workshops report getting job interviews.) Some
outcomes are defined qualitatively and do not have a specified target. Used
to gauge success of the


library's program. Answers the question: How will we know that our
program is a success?
Tasks The specific jobs (such as recruiting and training volunteers or
publicizing a lecture series) staff or volunteers must do to be able to provide
successful programs or services. Key steps needed to provide or manage
services.


GLOSSARY EXEMPLIFIED:
Information Literacy Program in a Public Library
Organizational Mission: [unique to each grantee]
Program Purpose: Train students to use electronic information competently
and independently.
Inputs: Bibliographic instruction librarian, contractual consultants,
collaboration with school librarians, computers, software, electronic
resources, print and non-print materials, computer lab, grant funds and local
match.
Program Services: Classes, ongoing support and mentoring, web tutorial.
Outputs: 100 flyers designed, produced, and distributed at 6 schools. Course
materials developed. 10 information literacy classes held, each of which is
attended by 10 students.
Immediate (Initial) Outcome (Milestone): Students can differentiate among
three types of electronic resources (licensed databases, OPACs, and freely
available web pages).
Indicator of Immediate Outcome: Students are able to correctly
distinguish among three kinds of resources, with at least 75% accuracy.
Target for Immediate Outcome: 80 of 100 (80%) of students who complete
the program will demonstrate the indicator.
Intermediate Outcome (Milestone): Students know how to find reliable,
objective, accurate and current information in different electronic resources.
Indicator of Intermediate Outcome: Students demonstrate their ability to
search an OPAC, the WWW, and licensed databases to find a specified piece
of information, with at least 75% accuracy.


Target for Intermediate Outcome: 80 of 100 (80%) of students who complete
the program will demonstrate the indicator.
Long-range Outcome: Students meet a personal or educational goal using
information they found in electronic resources.
Indicator of Long-range Outcome: Students report that they have met a
personal or educational goal using accurate and current information they
found in electronic resources.
Target for Long-range Outcome: 70 of 100 (70%) of students who complete
the program will demonstrate the indicator.


CHARACTERISTICS OF
PROGRAMS/SERVICES BEST SUITED
TO OBE

Designed in response to a demonstrated need
Contribute to a larger community goal and to the library's mission
Aim to impact people rather than produce or deliver a product
Focus on effectiveness, not efficiency
Concentrate on results, not process
Have a distinct beginning and end
Have stable program staff (including volunteers)
Have supportive administration and stakeholders
Have voluntary participation by users

Intend to change the behavior, attitude, skills, knowledge, or
condition of a specific group of people in response to a need.


SELECTING OUTCOMES TO MEASURE

Identify your intended participants first.

Gather ideas on positive (and negative) outcomes for your program.


Brainstorm candidate outcomes with staff. Ask: "Who benefits from
this service/program, and how?" Then: "In the best of
circumstances, what other benefits might there be for those
participants?" Note that talking to staff will not yield all the
outcomes and is only a first, easy step.
Interview or hold focus groups with current and past participants.
Ask: "If this program really helps you, how will you be better off?"
or "What do you expect to change for you as a result of this
program?"
Talk with staff and volunteers who work directly with your intended
participants at other programs at your library or elsewhere. Ask:
"What changes might we see in participants if our program is
successful?"
Talk with representatives of agencies that might be the next step
for former participants. Ask: "What will participants need to know
or be able to do in order to succeed in your program?"
Review agency or program records to discover what outcomes have
been noted informally. Ask staff and others for anecdotes and
testimonials they have heard.
Ask about unintended negative outcomes so you can avoid them (e.g.
cheating on summer reading records in order to get prizes).


If you already have a program idea, test it by brainstorming the if-then
chain of influences and benefits: if the program does such and such, we
can assume such and such change in the participants. For example, if the
proposed library program is homework assistance, think through these
steps:
If students have assistance with their homework, then they will
do better at their schoolwork. If they do better at school, then
they will get better grades and attend school more regularly.
If they get better grades and attend school more regularly, then
they are more likely to graduate. If they graduate, then they are
more likely to become employed.

If you do not yet have a program, but want to develop one to meet a given
goal, the if-then chain will work from intended results back to potential
services. To use the same example, if your community goal is higher
employment rate among young adults, consider:
If we want more young adults to be employed, then they must
be prepared with a high school education. If we want more young
adults to graduate from high school, then we need them to attend
school more regularly and to get better grades.
If we want them to do better in school, then we need to offer
assistance with their homework.

Be sure that the "if" and the "then" phrases relate to each other and are
realistic.

Select outcomes that are valuable for the participants, achievable by the
program, and related to the library's priorities.

Remember that some intermediate outcomes may be identical to outputs.


For example, the number of people attending a program or satisfied with a
service can be an output or an intermediate outcome, depending on the
situation.


The more immediate an outcome, the more influence a program has on its
achievement. The longer-term the outcome, the less direct influence a
program has on its achievement and the more likely that other forces
intervene.

Keep in mind that a program's outcomes must have an appropriate scope.
To determine whether an outcome is reasonable for your program, ask: "Can
our program alone have a significant influence on this outcome?" Of course,
the library cannot take all the credit or blame for a change in
participants' lives.

Outcome-based evaluation is more difficult if the program/service has
one-time or short-term contact with an individual; if follow-up is
impossible; if the goal is to prevent something negative (instead of to
produce something positive); or if the outcome is intangible (e.g.,
recreation).

If this is the library's first attempt at outcome-based evaluation, start
with:
Only one outcome per program
Outcomes for which you already collect indicator data
Short-term outcomes
Outcomes which are easy to understand and to report to
stakeholders
Areas in which you have experience but want to improve


CONTINUUM OF PROGRAM OUTCOMES


Note that changes in individuals (outcomes) usually occur in the order
below, from bottom to top. The most difficult changes are those at
the top of the list.

Condition/ Status
Behavior
Skills
Knowledge
Community connections/ Social networks
Attitudes/ Values
Perceptions/ Feelings
Satisfaction with service
Participation/ Use of service
Awareness of service

Rhea Joyce Rubin 2003. Based on work by Patton in Utilization-Focused Evaluation (1997) and Reisman &
Clegg in Outcomes For Success! (2000).


CANDIDATE OUTCOMES WORKSHEET


Users:

Community Goal:

User Need:

Library Goal:

New Service/Program:

Outputs:

Immediate or Intermediate Outcome(s):

Indicator(s):

Long-range Outcome(s):

Indicator(s):


SELECTING INDICATORS AND TARGETS


FOR OUTCOME-BASED EVALUATION

SELECTING INDICATORS:
How will we know that the participants experienced the expected benefit or
change? How can we gauge the outcome or the achievement for the
participant?
An indicator is a behavior that actually demonstrates the change, or that
implies a change. An indicator identifies a statistic that summarizes the
user's achievement of an outcome.
One to three indicators per outcome
Specific and unambiguous
Measurable
Observable and objective
Valid
Reliable
Relevant
Think of the story it tells

SETTING TARGETS:


How will we know if the program has met our expectations? What is our
standard for success for the program?
Numerical
Realistic and based on experience (baseline figures)
Valid
In the first year/cycle, collect data that you can use to set a target
next time. Meanwhile, use "improve" or "gain" without numerical
qualifiers.
Reasonable

AN EXAMPLE OF INDICATORS AND TARGETS:


In a job information program, an expected outcome might be that participants
learn of new job and career options.

An indicator might be that participants can list two more
appropriate job/career possibilities after the program than
before.
A target might be that 40% of participants list at least two
more career options.
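The arithmetic behind checking an indicator against a target can be sketched in code. This is a purely illustrative example: the function names, the required gain of two options, the 40% target, and the sample data are all invented for the sketch, not taken from the workbook.

```python
# Illustrative sketch of checking an indicator and a target.
# Indicator: a participant lists at least two more career options
# after the program than before.
# Target: at least 40% of participants meet the indicator.

def met_indicator(before: int, after: int, required_gain: int = 2) -> bool:
    """Did this participant show the indicator (a gain of >= required_gain)?"""
    return after - before >= required_gain

def target_met(results: list[tuple[int, int]], target_pct: float = 40.0) -> bool:
    """Did the program as a whole meet its target?"""
    hits = sum(1 for before, after in results if met_indicator(before, after))
    return 100.0 * hits / len(results) >= target_pct

# Hypothetical (before, after) counts of career options listed by 5 participants.
participants = [(2, 5), (3, 4), (1, 4), (4, 5), (2, 2)]
print(target_met(participants))  # 2 of 5 (40%) met the indicator, so True
```

In a real evaluation the before/after figures would come from the pre- and post-program data collection, not from a hard-coded list.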

DESIGNING A DATA COLLECTION PLAN


The most basic questions to consider when designing a data collection plan
are:
What method will best capture your indicators? Since indicators
show you the change in behavior (or a behavior that implies another
type of change), you need to find the most direct way to clearly
observe and accurately count the indicators.
What schedule will give you the most significant information?
Monthly? Annually? Note that because outcome-based evaluation is
about change, it usually requires pre- and post-tests.

Five issues to take into account:


Collecting data should not interfere with the delivery of the service.
Participation in the data collection activities (e.g. interview or
survey) must be voluntary and not required to receive the service.
Participants' privacy or confidentiality must be respected.
The benefits of the data collection activities must outweigh the
costs for both the program and the participants.
To minimize bias, the service provider should not serve as the
evaluator.

Four last essentials:


Data is collected from individuals, but the results are always
aggregated and reported as a group. The idea is to measure the
outcomes of your service, not to grade an individual's progress.
Outcome-based evaluation of LSTA projects is not scientific
research. We have neither a scientifically selected sample of users
nor a control group. Therefore, we can make only limited claims
about our results.


Results should not be compared to results of other programs unless


the programs and the evaluation were designed and managed
together.
The outcomes reflect the library's contribution to a goal. But
outcomes cannot be attributed to the library alone.
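The aggregation rule above can be sketched in code. This is illustrative only: the questionnaire fields, the "doubled reading time" indicator, and all figures are invented for the sketch.

```python
# Illustrative sketch: data is collected from individuals, but only the
# aggregated, group-level result is reported.

# Hypothetical pre/post questionnaire results: minutes per week spent
# reading to children, for three participants.
responses = [
    {"id": "p1", "pre": 30, "post": 70},
    {"id": "p2", "pre": 45, "post": 90},
    {"id": "p3", "pre": 60, "post": 75},
]

# Indicator: the participant at least doubled their weekly reading time.
doubled = sum(1 for r in responses if r["post"] >= 2 * r["pre"])
pct = round(100 * doubled / len(responses))

# Report only the group figure, never any individual's "grade".
print(f"{pct}% of participants doubled their weekly reading time")
```

With the sample data above, two of the three participants meet the indicator, so the reported group result is 67%.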


SELECTING DATA COLLECTION METHODS


The six most common data collection methods are:

Review of existing records


Surveys
Interviews
Self-report
Observation
Tests

When determining which data collection method to use, keep several factors
in mind.
What type of information is needed?
How standardized must the information be?
How valid (accurate) must the information be?
How reliable (consistent) must the information be?
Is this information already available somewhere? If so, who might
already have access to much, if not all, of this information?
Is an instrument (e.g. a test or a survey form) already available? If
so, can it be modified for use in your program?

Whenever possible, use the most direct method. For knowledge or
skills, this may be tests or observation. For attitude change, this may be
self-report or interview. For behavior, this may be records or tests.
Review of existing records


Existing records include internal (those of your own program or agency)


and external (those kept by others) such as school or employment
records
Internal program tracking mechanisms such as records, reports, or logs
can be used if they are consistently maintained and updated in a timely
manner and the information can be extracted easily from the system
Do official (external) records exist that track this indicator in a way
that is useful to you? If so, are the records reliable (i.e. tracked
similarly from one individual to another)?
Will it be possible to get the cooperation of the outside agency/
institution to get access to these official records?
Only use external records that collect the information you really need
Are they widely accepted as objective and credible? Be aware that
some records (e.g. program logs and attendance tallies) may not be kept
consistently and may not be directly related to achievement

Survey or questionnaire
Good when you need information from the participants' perspective
Be aware that any information reported by the participant may be
inaccurate
More systematic than anecdotal self-report (though open-ended survey
questions are also self-report)
Can be self-administered (form filled in by participant) or asked orally
(with answers recorded by someone else)
Easy to analyze results statistically
Watch your language! Don't use value-laden terms or overly personal
language
Will participants be able to understand the survey questions? Consider
age, educational level, and cultural background.
Will participants be able to accurately answer the questions? Consider
awareness, knowledge, and language skills.
Are the questions culturally appropriate?


Pilot test the survey to be sure it is easily understood and elicits the
information you need

Interview
Good when you need information from the participants' perspective
Must follow pre-defined questions and answers must be recorded
accurately
Similar to oral survey but allows for follow-up questions
Similar to self-report but more systematic
May be used as adjunct to survey
Labor intensive as someone must personally ask participants questions
either by phone or in person
Will the participant be comfortable being interviewed?
Do you have translators as needed?
Pilot test the interview questions to be sure they are easily understood
and that the interviewers understand when and how to ask follow-up
questions

Self-report
Are there objective pieces of information that can be used to assess
the indicator?
Will the participant be comfortable with self-report? What about
those who are not comfortable?
Will the reports be acceptable as objective and accurate?
Observation
Most appropriate when behavior change is clearly observable, i.e. a skill
or condition
Can indicator be measured without asking questions?
Is this indicator most accurately measured by a trained observer?


If so, do you have staff resources to observe events and behaviors? Or


can you train volunteers?
Be mindful of participants' privacy when using observation

Cognitive or Physical Standardized Tests


Most appropriate when the indicator is related to knowledge or skill
Most appropriate when testing is typically included in that context, e.g.
literacy or computer skills
Will you be able to use a pre-existing test or will you have to create
one?
Will the participant be comfortable with testing?
Widely accepted as credible
Easy to analyze results statistically
Pilot test the instrument to be sure it is easily understood and elicits
the information you need


DATA COLLECTION METHODS


METHOD                     | COST                         | AMOUNT OF TRAINING REQUIRED | COMPLETION TIME                  | RESPONSE RATE
Review of Existing Records | Low                          | None                        | Depends on amount of data needed | High if records are relevant
Written Survey             | Moderate                     | None                        | Moderate to Long                 | Depends on distribution*
Interview                  | Moderate to High             | Low to Moderate             | Long                             | Moderate to High
Self-Report                | Varies                       | Low to Moderate             | Varies                           | Varies
Rating by Trained Observer | Depends on cost of observers | Low to Moderate             | Short to Moderate                | High
Tests                      | Moderate to High             | Moderate                    | Varies                           | High
* Surveys can be distributed on-site, by mail, or by mail with telephone follow-up. How
the survey is distributed will affect all the collection characteristics on the chart above.
For example, a survey sent by mail and returned by the participant by mail is inexpensive
but slow with a low to medium response rate depending on whether multiple mailings
and/or telephone follow-ups are done. Surveys completed on-site are fast and have a high
response rate but may be more expensive because of on-site staff costs.


WRITING AN OUTCOME MEASURE


STATEMENT
Remember: outcomes are about the participants, not the program.
Begin with only one outcome statement per program/ service.
If you have selected a long-term outcome, you need to write at least
one intermediate outcome too.
The participant should be the subject of the sentence. Begin the
sentence with the participant (e.g., "Preschool children"), followed
by "will": who will show what?
Outcome verbs include: know, understand, attain, increase, improve,
decrease, reduce, expand. (Note that these are not activity-based.
Activity verbs will be used in the indicators.)
Check that the outcome is not an internal program operation (e.g.
recruiting and training staff) or an output (e.g. number of participants
served).
Next state the indicator(s) you expect to measure plus the data
collection method you will use.
Think SMART about your indicators: specific, measurable, attainable,
realistic, time-bound.
Check that you have not written an objective. An objective is a SMART
statement written for traditional evaluation plans; it expresses the
projected and quantifiable products of a program. It is a statement
from the library perspective.


Be sure it is, instead, a statement that describes what will happen to
participants as a result (at least partly) of your program. It should
express the "so what?" for your program.


PLANNING FOR OUTCOME-BASED


EVALUATION
Designate an OBE coordinator to design and monitor the OBE process.
Consider each of your library's programs/services. Is it aimed at
changing the lives of individual people? Is it focused more on human
impact than on internal operations? If so, it is appropriate for OBE.
Identify the population of potential participants. It is necessary to
know for whom you are planning the service before you can consider
outcomes. Whose life will be affected by the service?
Is there more than one intended group? Many times we can affect
people other than those we serve directly (e.g., a program designed
primarily for preschool children may affect parents as well).
Include potential participants and others in the planning (see Selecting
Outcomes).
Determine the program's inputs and outputs. Consider resources and
constraints.
Decide on program outcomes to measure. What impacts can you
reasonably expect? Are there intermediate outcomes to measure?
Choose the indicators. Are they observable and measurable?
Set a realistic numerical target for program success.
Develop a data collection plan (methods, timeline, and sources).
Pretest the indicators and data collection methods. Monitor the
process for time and staff demands and for ease or difficulty of
measurement. Tabulate and analyze the results, and then revise the
data collection plan as necessary.
Write your outcome-based evaluation plan: goals, inputs, activities,
outputs, outcomes, indicators, targets, and data collection methods.
Develop an action plan outlining what needs to be done, when, and by
whom.
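Once data collection is complete, comparing the tabulated results against the numerical target set in the plan is simple arithmetic. Here is a hypothetical sketch (the function name and all numbers are invented examples, not drawn from any real program):

```python
# Hypothetical sketch: did an indicator meet its numerical target?
# All figures below are invented for illustration.

def target_met(successes: int, participants: int, target_pct: float) -> bool:
    """Return True if the share of participants showing the indicator
    meets or exceeds the target percentage."""
    if participants == 0:
        return False  # no participants measured, so no target met
    return 100.0 * successes / participants >= target_pct

# e.g. target: 75% of participants will report improved skills;
# data: 48 of the 60 post-survey respondents report improvement.
print(target_met(48, 60, 75.0))  # True: 48/60 = 80%, above the 75% target
```

A result just under the target is a signal to revisit the indicator, the target, or the program itself, which is exactly the mid-course correction the planning steps above are meant to enable.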


A SELECTED BIBLIOGRAPHY
Abend, Jennifer and Charles R. McClure. Recent Views on Identifying
Impacts from Public Libraries. Public Library Quarterly 17 (3): 361-390, 1999.
Association of Specialized and Cooperative Library Agencies. Outcomes
Evaluation Issue. Interface 24(4):1-11, Winter 2002.
Bertot, John Carlo and Charles R. McClure. Outcomes Assessment in the
Networked Environment: Research Questions, Issues, Considerations, and
Moving Forward. Library Trends 51 (4): 590-613, Spring 2003.
Corporation for National and Community Service. AmeriCorps Program
Applicant Performance Measure Toolkit. February 14, 2003. On the web at
http://www.projectstar.org
Dervin, Brenda and Benson Fraser. How Libraries Help. Stockton, CA:
University of the Pacific, 1985.
Durrance, Joan and Karen E. Fisher. Determining How Libraries and
Librarians Help. Library Trends 51 (4): 541-570, Spring 2003.
Durrance, Joan C. and Karen E. Fisher-Pettigrew. How Libraries and
Librarians Help: Outcome Evaluation Toolkit. October 2002. On the web at
http://www.si.umich.edu/libhelp/toolkit
Durrance, Joan C. and Karen Pettigrew. How Libraries and Librarians Help.
Chicago: ALA Editions, forthcoming 2004.
Durrance, Joan C. and Karen E. Fisher-Pettigrew. Towards Developing
Measures of the Impact of Library and Information Services. Reference and
User Services Quarterly 42(1): 43-53, Fall 2002.
Hernon, Peter and Robert E. Dugan. An Action Plan for Outcomes Assessment
in Your Library. Chicago: ALA, 2002.


Hernon, Peter and Robert E. Dugan. Outcomes Assessment: Views and
Perspectives. Westport, CT: Libraries Unlimited, forthcoming 2004.
Hernon, Peter. Outcomes Are Key But Not the Whole Story. Journal of
Academic Librarianship. 28(1-2): 54-55, 2002.
Horn, Claudia B. Outcome-Based Evaluation for Literacy Programs. Syracuse,
NY: Literacy Volunteers of America, Inc., 2001.
Innovation Network, Inc. Helping Nonprofit Agencies Succeed. Their
Workstation is a free web-based interactive planning tool that helps you
write an outcomes-based evaluation plan while you are online. They
estimate 2 to 4 hours to write a plan, but it can be done in short steps;
all your information is stored for your future use. 1992. On the web at
http://www.innonet.org/workstation/workstation_handbook.pdf
Institute of Museum and Library Services. Frequently Asked OBE Questions.
Washington, DC: IMLS, rev. 2002. In print and on the web at
http://www.imls.gov/grants/current/crnt_outcomes.htm
Institute of Museum and Library Services. New Directives, New Directions:
Documenting Outcomes. Washington, DC: IMLS, rev. 2002. In print and on the
web at http://www.imls.gov/grants/current/crnt_obebasics.htm
Institute of Museum and Library Services. Perspectives on Outcome Based
Evaluation for Libraries and Museums. Washington, DC: IMLS, 2000.
Kellogg, W.K. Foundation. Evaluation Handbook. (See especially pages 28-46 on
OM.) Battle Creek, MI. 1998. Also available on the web at
www.wkkf.org/pubs/Pub770.pdf
Kellogg, W.K. Foundation. Guiding Program Direction with Logic Models. Battle
Creek, MI. 2003. In print and available on the web at
www.wkkf.org/pubs/tools/evaluation/LMDGsummary_00447_03674.pdf


Kellogg, W.K. Foundation. Logic Model Development Guide: Using Logic Models
to Bring Together Planning, Evaluation, and Action. Battle Creek, MI. 2001.
Also available on the web at www.wkkf.org/pubs/Pub1209.pdf
Lance, Keith Curry and others. Counting on Results: New Tools for
Outcome-Based Evaluation of Public Libraries. (Final report.) Washington, DC:
IMLS, 2002. On the web at www.lrs.org
Marshall, Joanne Gard. Determining Our Worth, Communicating Our Value.
Library Journal 125 (19):28-30, November 15, 2000.
Matthews, Joseph R. Measuring for Results: The Dimensions of Public Library
Effectiveness. Westport, CT: Libraries Unlimited, 2004.
McNamara, Carter. Basic Guide to Outcomes-Based Evaluation for Nonprofit
Organizations with Very Limited Resources. For the Management Assistance
Program for Nonprofits. On the web at
www.mapnp.org/library/evaluatn/outcomes.htm
Mika, Kristine L. Program Outcome Evaluation: A Step-by-Step Handbook.
Milwaukee: Families International, 1996.
Morley, Elaine and others. Outcome Measurement in Nonprofit
Organizations: Current Practices and Recommendations. Washington, DC: The
Independent Sector, 2001.
Organizational Research Services. Outcomes for Success 2000! Seattle, WA:
The Evaluation Forum, 2000.
Organizational Research Services. Managing the Transition to
Outcomes-Based Planning and Evaluation. Seattle, WA: The Evaluation Forum, 1998.
Patton, Michael Quinn. Qualitative Research and Evaluation Methods. 3rd
edition. See pages 151-159 and 518-534. Thousand Oaks, CA: Sage
Publications, 2002.


Patton, Michael Quinn. Utilization-Focused Evaluation. 3rd edition. See
especially pages 147-177. Thousand Oaks, CA: Sage Publications, 1997.
Poll, Roswitha. Measuring Impact and Outcome of Libraries. Performance
Measurement and Metrics 4 (1): 5-12, 2003.
Project Literacy Victoria. Outcome Measurement for a Community Literacy
Program. Victoria, British Columbia, 2001. On the web at
http://www.plv.bc.ca/outcomes/programoutcomes.pdf
Project STAR for the Corporation for National Service. Programming for
Impact Manual. On the web at http://www.projectstar.org
Reisman, Jane. A Field Guide to Outcome-Based Program Evaluation. Seattle,
WA: The Evaluation Forum, 1994.
Rubin, Rhea Joyce. Predicting Outcomes: Outcome Measurement as a Planning
Tool in Joan C. Durrance and Karen Pettigrew. How Libraries and Librarians
Help. Chicago: ALA Editions, forthcoming 2004.
Sadlon & Associates, Inc. Workbook: Outcome Measurement of Library
Programs. Tallahassee: State Library of Florida, 2000. On the web at
http://dlis.dos.state.fl.us/bld/research_office/bld/research.htm
Schalock, Robert L. Outcome-Based Evaluation. NY: Plenum Press, 1995.
Steffen, Nicolle O., Keith Curry Lance and Rochelle Logan. Outcome-Based
Evaluation and Demographics for the Counting on Results Project. Public
Libraries, 41(4): 222-228. July-August 2002.
Steffen, Nicolle O., Keith Curry Lance and Rochelle Logan. Time to Tell the
Whole Story: Outcome-Based Evaluation and the Counting on Results Project.
Public Libraries, 41(5): 271-280. September-October 2002.


Taylor-Powell, Ellen and others for the University of Wisconsin Extension.
Planning a Program Evaluation. A series of excellent short pamphlets,
1996-2002. In print and on the web at http://www.uwex.edu/ces/pdande/evaluation
United Way of America. Agency Experiences with Outcome Measurement:
Survey Findings. Alexandria, VA: United Way, 2000.
United Way of America. Measuring Program Outcomes: A Practical Approach.
Alexandria, VA: United Way, 1996.
United Way of America Outcome Measurement Network. On the web at
http://national.unitedway.org/outcomes
Williams, Harold S. and others. Outcome Funding: A New Approach to
Targeted Grantmaking. 3d. ed. Rensselaerville, NY: The Rensselaerville
Institute, 1995.


DATA COLLECTION AND ANALYSIS:
A SELECTIVE & DESCRIPTIVE BIBLIOGRAPHY
Bolan, Marc and others. How to Manage and Analyze Data for Outcome-Based
Evaluation. Seattle, WA: The Evaluation Forum, 2000. How to manage
program data with specifics on how to use Access and Excel for data
preparation and analysis. This clearly written book, which comes with a
diskette of fictitious program data for practice purposes, may be more than
you need to know if you are running a small program with few participants.
Bond, Sally, Sally Boyd, and Kathleen Rapp. Taking Stock: A Practical Guide to
Evaluating Your Own Programs. Chapel Hill, NC: Horizon Research Inc., 1997.
Although this slim volume (93 pages) covers more than data collection, the
sections on that topic are well-presented for novices.
Fink, Arlene and Jacqueline Kosecoff. How to Conduct Surveys: A Step-by-Step
Guide. 2d. ed. Thousand Oaks, CA: Sage Publications, 1998. A practical
and concise (111 pages) paperback guide.
Fink, Arlene and others. The Survey Kit. 2d ed. Thousand Oaks, CA: Sage
Publications, 2002. If Fink's guide, above, is not enough for you, this newly
updated ten-volume set should answer all your questions about written surveys
and interviews. Each volume (100 to 200 pages) focuses on a specific subject:
self-administered & mail surveys, in-person interviews, telephone interviews,
sampling, reliability & validity, data analysis, and reports. The first two
volumes, The Survey Handbook and How to Ask Survey Questions are
especially useful to new survey developers.
Glitz, Beryl. Focus Groups for Libraries. Chicago: Medical Library Association,
1999.
Henerson, Marlene E. and others. How to Measure Attitudes. Newbury Park,
CA: Sage Publications, 1987. Attitudes, unlike skills or knowledge, cannot be


observed directly; they are therefore more difficult to measure. This is an
excellent guide.
Herman, Joan and others. Program Evaluation Kit. Thousand Oaks, CA: Sage
Publications, 2d. ed. 1987. A nine volume set covering all facets of program
evaluation. Each paperback volume is medium length (96 to 192 pages) on one
of the following topics: how to focus an evaluation, how to design, how to use
qualitative methods, how to assess program implementation, how to measure
attitudes, how to measure performance tests, how to analyze data, how to
communicate evaluation findings. Each volume can stand alone and may be
purchased separately.
Institute of Museum and Library Services. Introduction to Outcome
Oriented Evaluation: Selected Resources. Washington, DC: IMLS, February,
2002. On the web at www.imls.gov/grants/current/crnt_bib.html Lists
materials on user surveys and other data collection tools and methods used by
museums, science programs, and other non-profit and educational agencies.
Also lists automated online survey tools.
Krueger, Richard A. and David L. Morgan, eds. The Focus Group Kit. Thousand
Oaks, CA: Sage Publications, 1998. A six volume set covering everything about
focus groups. Each volume is a short (100 pages), clear paperback on one of
focus groups. Each volume is a short (100 pages), clear paperback on one of
the following focus group topics: planning, developing questions, involving
community members, moderating, analyzing & reporting results. Each volume
can stand alone and may be purchased separately.
Peterson, Robert A. Constructing Effective Questionnaires. Thousand Oaks,
CA: Sage Publications, 1999. Another short (152 pages) manual for the new
evaluator.
Reisman, Jane and others. A Field Guide to Outcome-Based Program
Evaluation. Seattle, WA: The Evaluation Forum, 1994. After a 30-page
introduction to OM concepts, the next 75 pages of this book introduce data
collection methods with specifics on formats, sampling, and statistical
analysis for novices.


Reisman, Jane. How to Manage and Analyze Data for Outcomes-Based
Evaluation. Seattle, WA: The Evaluation Forum, 2000.
Salkind, Neil J. Statistics for People Who (Think They) Hate Statistics.
Thousand Oaks, CA: Sage Publications, 2000. As the title suggests, an
introduction to statistics for novices.
Simon, Judith Sharken. Conducting Successful Focus Groups: How to Get the
Information You Need to Make Smart Decisions. St. Paul, MN: Amherst H.
Wilder Foundation, 1999. A clear, concise, short (67 page) introduction to
how and when to use focus groups.
Zemke, Ron and Thomas Kramlinger. Figuring Things Out. Reading, MA:
Addison-Wesley Publishers, 1982. Written for trainers, this book has strong
introductions to focus groups, interviews, gathering information, surveys and
questionnaires, and sampling. I recommend those sections despite its 1982
publication date.
Zweizig, Douglas and others. The Tell It! Manual: The Complete Program for
Evaluating Library Performance. Chicago: ALA, 1996. Another book on a
broader topic that includes excellent chapters on data collection methods.
See Part III, which has 9 chapters (111 pages) on surveying, interviewing,
attitude measurement and more.
