1. Introduction
Voluntary sector organisations are being urged to provide more convincing evidence
of their impact, and with good reason: funders and commissioners want to be sure
that their investment achieves the maximum benefit, as do organisations themselves.
What is new to the sector, however, is that improved standards of evidence are
sometimes presented as synonymous with more scientific methods of assessing
outcomes and impact, specifically the use of comparison methods to reduce bias,
that is, to control for factors that could otherwise affect the result. In particular,
RCTs (randomised controlled trials, so called because participants are randomly
allocated between the two groups, as in a clinical trial) have long been regarded by
government bodies as the gold standard for assessing impact, because they offer a
way of reducing bias to a minimum.[1] This means that they provide better internal
validity than other methods, a greater level of certainty about cause-and-effect
relationships, and therefore greater confidence that the programme or project is
indeed the cause of the observed outcomes.
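The random allocation at the heart of an RCT can be sketched in a few lines of Python. The participant identifiers, group size and seed below are invented for illustration; the point is only the mechanism of an equal-chance split:

```python
import random

# Minimal sketch of random allocation in an RCT: every participant has the
# same chance of landing in either group, so both known and unknown
# characteristics tend to balance out between the two groups.
participants = ["P01", "P02", "P03", "P04", "P05", "P06", "P07", "P08"]

rng = random.Random(42)        # fixed seed so the allocation is reproducible
shuffled = participants[:]     # copy, leaving the original list untouched
rng.shuffle(shuffled)

half = len(shuffled) // 2
intervention_group = shuffled[:half]   # receives the programme
comparison_group = shuffled[half:]     # does not; provides the comparison
```

Because the two groups differ only by chance, an outcome difference between them can be attributed to the intervention with far more confidence than a simple before-and-after comparison would allow.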
1 Cupitt, S (May 2015) Randomised controlled trials: gold standard or fool's gold? The role of
experimental methods in voluntary sector impact assessment, NCVO Charities Evaluation Services.
2 See http://www.ces-vol.org.uk/Publications-Research/publications-free-downloads/you-projectoutcomes-download
3 For step-by-step guidance and examples for the CES Planning Triangle, see
http://www.ces-vol.org.uk/tools-and-resources/planning-for-monitoring-evaluation/ces-planning-triangle
4 See the CES introductory guide Making Connections: Using a theory of change to develop planning
and evaluation (CES).
5 Stern, E (May 2015) Impact Evaluation, prepared for the Big Lottery Fund, Bond, Comic Relief and
the Department for International Development, pages 12-14.
6 Stern, E (May 2015) Impact Evaluation, as above, page 12.
7 See Cupitt, S (May 2015) Randomised controlled trials: gold standard or fool's gold?, as above.
8 Stern, E (May 2015) Impact Evaluation, as above, page 12.
of causal evidence:[9]
1. Look for evidence for and against the suspected cause.
2. Look for evidence for and against other important possible causes.
Scriven, M (1976) Maximizing the power of causal investigations: the modus operandi method,
Evaluation Studies Review Annual, Vol. 1, pp. 101-118.
11 These key elements of contribution analysis are set out by Mayne, J, Contribution analysis: An
approach to exploring cause and effect, ILAC Brief 16,
http://www.cgiar-ilac.org/files/ILAC_Brief16_Contribution_Analysis_0.pdf
6.1
6.2
Overall, it will be helpful to check whether the results match what experts have
predicted or what might be expected from practice elsewhere. Data analysis should
also be able to reveal potential connections, such as:
- any link between the results achieved and the extent of participation (for example,
data showing that participants attending activity sessions regularly for more than
one year have more positive outcomes than those with irregular or short-term
attendance)
- how overall outcomes have been affected by the positive or negative influence of
individual participant characteristics external to the intervention (for example,
young people in a drug and alcohol programme with strong family support
achieving step changes more quickly than others).
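The first of these checks, comparing outcomes by extent of participation, can be sketched as a toy calculation in Python. All names, thresholds and scores here are invented for illustration:

```python
import statistics

# Hypothetical data: (months attended, sessions per month, outcome change score)
participants = [
    (14, 4, 7), (18, 3, 8), (13, 4, 6),   # regular attendance for over a year
    (6, 1, 3), (4, 2, 2), (15, 1, 4),     # irregular or short-term attendance
]

def attends_regularly(months, sessions_per_month):
    # Assumed definition: more than a year of involvement at a
    # frequency of at least three sessions a month.
    return months > 12 and sessions_per_month >= 3

regular_scores = [s for m, f, s in participants if attends_regularly(m, f)]
other_scores = [s for m, f, s in participants if not attends_regularly(m, f)]

# A positive gap is consistent with (though does not by itself prove) a link
# between sustained participation and more positive outcomes.
gap = statistics.mean(regular_scores) - statistics.mean(other_scores)
```

In real data the comparison would of course involve many more participants and a test of whether the gap could have arisen by chance, but the structure of the check is the same.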
The timing of change may also be important. Do changes relate to anticipated
timelines for change? If outcomes have been achieved earlier, for example, this
might be due to factors other than the intervention.
Case studies can be compared to check how results might be affected by differences
in project implementation or in individual participation levels. Can elements of
practice be linked to more positive or negative results? Different cases may be
analysed to identify how different components of the programme might produce
different specific outcomes.
6.3
overlap:[12]
Ruling out technical explanations: identifying and investigating possible ways in
which the results might reflect technical limitations in data collection and analysis
rather than actual causal relationships.
Following up exceptions: treating data that doesn't fit the expected pattern (such as
exceptionally positive change) as potentially important in offering clues to other
causal factors, and seeking to explain it.
Statistically controlling for extraneous variables: this can be done in a
non-experimental design; external factors that might affect the outcome should be
identified during evaluation design so that the necessary data can be collected and
controlled for when carrying out quantitative data analysis.
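The last of these, statistically controlling for an extraneous variable, can be sketched as a small ordinary least squares regression in plain Python. All variable names and figures below are invented, and the noise-free data are contrived so that the fit recovers the coefficients exactly:

```python
# Sketch: estimating the association between session attendance and an
# outcome score while statistically controlling for an extraneous variable
# (here, a hypothetical family-support indicator), via the model
#   outcome = b0 + b1 * attendance + b2 * family_support

def ols(X, y):
    """Ordinary least squares: solve the normal equations (X'X)b = X'y
    by Gaussian elimination with partial pivoting."""
    k = len(X[0])
    xtx = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    xty = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    a = [xtx[i] + [xty[i]] for i in range(k)]          # augmented matrix
    for col in range(k):
        pivot = max(range(col, k), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        for r in range(col + 1, k):
            f = a[r][col] / a[col][col]
            for c in range(col, k + 1):
                a[r][c] -= f * a[col][c]
    b = [0.0] * k
    for i in reversed(range(k)):
        b[i] = (a[i][k] - sum(a[i][j] * b[j] for j in range(i + 1, k))) / a[i][i]
    return b

# Invented data: sessions attended and a 0/1 family-support flag.
attendance = [10, 24, 8, 30, 16, 40, 12, 26]
support = [1, 0, 1, 1, 0, 1, 0, 0]
# Outcomes constructed from known coefficients (2.0, 0.5, 1.5), no noise.
outcome = [2.0 + 0.5 * a + 1.5 * s for a, s in zip(attendance, support)]

X = [[1.0, a, s] for a, s in zip(attendance, support)]  # column of 1s = intercept
b0, b1, b2 = ols(X, outcome)
# b1 is the attendance effect after controlling for family support.
```

In practice a statistics package would be used rather than hand-rolled algebra, but the principle is the same: include the extraneous variable as a regressor so that the coefficient of interest is estimated net of its influence.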
working should also be part of this approach, allowing decisions to be made about
how to work to greater benefit with agencies which contribute to desired outcomes
and how to counter barriers to positive change.
8. Further reading
Better Evaluation: Understand causes of outcomes and impacts (including a video from
Jane Davidson) http://betterevaluation.org/plan/understandcauses
Hind, J (2010) Additionality: A useful way to construct the counterfactual qualitatively,
Evaluation Journal of Australia, Vol.10, No. 1
http://www.fsnnetwork.org/sites/default/files/qualitative_approach_to_impact_evaluation.pdf
Interaction: Annex 7: A Range of Quantitative, Qualitative and Theory-based Approaches
for Defining the Counterfactual
http://www.interaction.org/annex-7-range-quantitative-qualitative-and-theory-based-approaches-defining-counterfactual
Mayne, J Contribution analysis: An approach to exploring cause and effect, ILAC Brief 16
http://www.cgiar-ilac.org/files/ILAC_Brief16_Contribution_Analysis_0.pdf
Mohr, LB (1999) The Qualitative Method of Impact Analysis, American Journal of
Evaluation, 20 (1)
Stern, E (May 2015) Impact Evaluation, prepared for the Big Lottery Fund, Bond, Comic
Relief and the Department for International Development.