Sally Cupitt with Anastasia Mihailidou
Charities Evaluation Services
March
Charities Evaluation Services would like to thank all survey respondents and
interview participants for giving their time and expertise to this research.
Bibliography
Appendices
Acronyms
ACRE  Action with Communities in Rural England
BERR  Department for Business, Enterprise and Regulatory Reform
BIG  Big Lottery Fund
BAMER  Black, Asian, minority ethnic and refugee
BVSC  Birmingham Voluntary Service Council
CES  Charities Evaluation Services
CVS  Council for Voluntary Service
ISO  International Organization for Standardization
LAA  Local Area Agreement
LSP  Local Strategic Partnership
OTS  Office of the Third Sector
NAO  National Audit Office
PCT  Primary Care Trust
RDA  Regional Development Agency
SROI  Social return on investment
TSO  Third sector organisation
VC  Volunteer Centre
VCO  Voluntary and community organisation
VCS  Voluntary and community sector
VE  Volunteering England
VIAT  Volunteering Impact Assessment Toolkit
Some definitions
Outputs
The activities, services and products provided by an organisation.

Difference made

Infrastructure

Outcomes
The direct changes made by an organisation's work.

Impact
Longer-term changes, as opposed to outcomes.
Contents

Executive summary  7
  Background to this research  7
  Research focus  7
  Findings  8
  Recommendations  9
1. Context  10
  1.1 Background to this work  10
  1.2 The importance of demonstrating difference in the current climate  11
  1.3 The importance of demonstrating difference in the future  13
2.1 Infrastructure: a different case?  15
2.2 Demonstrating difference well  20
3.  23
  3.1  23
  3.2  24
  3.3  26
4. Challenges of demonstrating difference  28
  4.1  28
  4.2  30
  4.3  34
6. Good practice  40
8. Recommendations  48
  8.1 Direct support to infrastructure organisations  48
  8.2 Developing resources  49
  8.3 Influencing the debate  50
  8.4 Further research  51
Bibliography  52
Appendix 1: Data collection  53
Appendix 2  56
Executive summary
Research focus
Findings
Respondents reported collecting quite a
lot of outcome and impact information,
much of it for their own use. A fifth felt
they were demonstrating the differences
made by their services very well, and
a further 40 per cent reported doing
so adequately. We found evidence of good practice too; for example, almost all survey respondents were doing some follow-up of their users.
Most respondents in this research did not feel they were collecting too much information for funders. For most of them, their funders primarily want output information; there is, however, evidence of a shift towards requiring outcomes data. In fact, survey respondents reported wanting substantial information on differences made for their own internal use significantly more than their funders require of them.
Knowledge and use of existing tools for demonstrating effectiveness were high among survey respondents, although they were slightly lower for specialist infrastructure organisations.
Recommendations
1. Context
NI 6, another national indicator of relevance to infrastructure organisations, is about the levels of volunteering locally.
Commissioning
Rob Macmillan reports finding references to this dating back to 1990 (personal communication).
2.1 Infrastructure: a different case?
Many of the problems with
demonstrating difference are not unique
to infrastructure organisations:
intended outcomes may be hard to
define or hard to measure
Defining infrastructure
Although one infrastructure interviewee felt that ChangeUp had been helpful in defining infrastructure, a shared definition of infrastructure, beyond the very broad, does not seem to exist. We found it hard to find a definition for this research that was both uncontroversial and practical in application. One interviewee noted that the varied membership of consortia across the country shows the different definitions used in practice.
www.navca.org.uk/localvs/lio/guidance/corefunctions.htm
www.volunteering.org.uk/WhatWeDo/Local+and+Regional/Volunteer+Centre+Quality+Accreditation.htm
Infrastructure organisations of course may also play a key role in galvanising them for change and
promoting good practice.
2.2 Demonstrating difference well
All the funders and the academic
interviewed were asked how they
would recognise an organisation
that demonstrated the difference it
makes well. The responses have been
aggregated below.
Most respondents in this research did not feel they were collecting too much
information for funders. For most of them, their funders primarily want
output information, but there is, as we would expect, evidence of a shift
towards outcomes.
Required by | Services delivered (8) | Types of users | Outcomes: direct changes made | Impacts: longer-term changes (7)
Infrastructure organisations | 84% (75) | 86% (75) | 81% (71) | 75% (57)
Funders | 86% (73) | 68% (58) | 63% (54) | 40% (33)

7. Impact was briefly defined in this survey as longer-term changes, as opposed to outcomes, which are direct changes you make. However, respondents may not have shared this view.
8. This is averaged across service types. The percentages take into account those not providing the service.
resources
(written or online, including newsletters)
brokerage
(such as providing volunteers)
providing networking or peer learning
lobbying, representation or
campaigning
training or events
4. Challenges of demonstrating difference

Overall, respondents did not find that the type of service affected the difficulty of demonstrating difference. A surprising number said that they could demonstrate the difference made by lobbying, representation or campaigning work well or adequately; we had assumed this would be harder for many infrastructure organisations. However, it is not known to what extent they really are focusing on differences made, as opposed to outputs. The chart below gives a detailed breakdown of how well infrastructure organisations think they are demonstrating difference across different service areas.
[Chart: Demonstrating difference: how easy it is. Legend: 'We demonstrate the difference this makes very well'; 'We demonstrate the difference this makes adequately'; 'We struggle with demonstrating the difference this makes'. Service areas shown: brokerage (eg providing volunteers); advice, support or consultancy; training or events; promoting/supporting partnership work; lobbying, representation or campaigning; providing networking/peer learning; resources (written or online, eg newsletters). Horizontal axis: percentage of survey respondents.]
Aggregating data
Collecting and reporting data at project
level can result in information that is hard
to aggregate and may be underused.
Aggregation was reported as a problem by a few respondents. An infrastructure interviewee described as a weakness in their work the failure to bring project-level information together internally to get a picture of the overall differences made by the organisation. A survey respondent noted that the mixture of tools and approaches used across teams was also a problem: 'we collect information but everyone uses such different formats that it is not helpful when put together'.
[Chart: Demonstrating difference: what survey respondents found challenging. Areas of difficulty shown: collecting information on impact (long-term change); finding good ways to collect information; collecting good-quality information; identifying indicators; clarifying what differences we hope to make; agreeing with frontline organisations what differences we are working towards; inadequate IT systems; agreeing with funders what differences we plan to achieve; lack of staff skills in data collection and/or analysis; using the findings; processing the information; analysing data. Horizontal axis: percentage of survey respondents.]
Lack of capacity
Information Technology
10. Ellis (2008) has noted that some TSOs make no distinction between evaluation and quality assurance.
Tools heard of
Knowledge of the listed tools was high,
with most respondents being aware
of some or all of them. While it was
not possible to do a full analysis of
the comparative findings,11 specialist
infrastructure organisations (either by
function or constituency) were aware of
a smaller range of tools than generalist
organisations.
Two survey respondents had either
heard of none or only one of the tools.
Both of these were arguably unusual organisations, perhaps outside the usual networks of capacity building;
one distributes donated goods to
other charities, the other works with
professional orchestras. It is worth
restating that respondents were people
who CES could access reasonably easily.
Tools used
11. This is due to the limitations of the survey tool we used when analysing qualitative data.
12. Or did not know whether their organisation was using them.
13. These figures are consistent with a recent survey of NAVCA members that found that 80% had achieved at least one quality standard, and a further 10% were working towards one (Escadale 2008).
Tools in use by infrastructure organisations

[Chart: Use and awareness of tools among survey respondents. Legend: 'I haven't heard of this'; 'I have heard of this but we don't use it'; 'We use this in our organisation to help demonstrate the difference we make'. Tools shown: PQASSO; CES approach; NAVCA's quality standards; NAVCA's Measuring Effectiveness toolkit; Volunteering Impact Assessment Toolkit; SROI (social return on investment); social accounting and audit; Economic Outcomes Tool; ACRE quality standard; bassac's ChangeCheck; nef's impact mapping; PERFORM. Horizontal axis: percentage of survey respondents.]
14. The true figure for multiple tool use is likely to be higher. The survey tool we used does not allow easy analysis of the qualitative data alongside the quantitative, so it has not been possible to analyse in depth the number using both listed and non-listed tools.
SROI
6. Good practice
St Helens CVS
While a few of the respondents in this research did not want any help in
demonstrating the differences made by their work, the majority outlined a
wide range of ways in which CES or similar organisations might be able to
help infrastructure organisations better demonstrate the difference
they make.
21. We cannot be sure that all respondents interpreted impact in the same way we would at CES; however, it was clear from the qualitative responses that at least some did.
8. Recommendations
Bibliography
Appendix 1
Data collection
Interview respondents
Twenty-eight interviews were carried
out, primarily by phone but a few face to
face. Nineteen of these interviews were
with infrastructure organisations, several
of which are partly or fully third tier.
Infrastructure interviewees represented a
wide range of local, national, large, small,
specialist and generic organisations.
Six funders were interviewed. They
included representatives from the Office
of the Third Sector (OTS), a Regional
Development Agency (RDA), the Big
Lottery (BIG), Capacitybuilders, a primary
care trust (PCT) and a local authority from
a different area. Despite attempts, it was
not possible to interview a charitable
trust. The National Audit Office (NAO), an
academic and an independent consultant
with experience in this area were also
interviewed.
Interviewees were chosen to represent
a range of funders and infrastructure
organisations. Individual contacts were
identified through a range of sources:
1. from previous Charities Evaluation
Services (CES) research where survey
respondents had been identified as
having particularly relevant things to say
22. We did not come across a good mapping of all infrastructure organisations against which we could compare characteristics.
23. This includes those survey respondents ticking 'other' who had a sub-regional focus.
24. This includes one respondent whose organisation covers England and Wales.
Survey respondents
25. Note that we cannot be certain that the respondents all shared the same understanding of commissioned contracts.
Appendix 2
Numbers of respondents collecting each type of information, for internal use and for funders, by service area. Each cell shows the number of respondents, with the percentage of those providing that service in brackets.

Service area | Services delivered (internal / funders) | Information on who you work with (internal / funders) | The outcomes of your work, the direct changes you make (internal / funders) | Impact, longer-term changes (internal / funders)
Training or events | 87 (90%) / 91 (90%) | 87 (90%) / 70 (69%) | 86 (89%) / 70 (69%) | 68 (70%) / 43 (43%)
Advice, support or consultancy | 85 (84%) / 85 (84%) | 88 (87%) / 76 (75%) | 85 (84%) / 72 (71%) | 65 (64%) / 40 (40%)
Resources (incl. written or online, eg newsletters) | 84 (88%) / 83 (90%) | 76 (79%) / 48 (52%) | 66 (69%) / 43 (47%) | 56 (58%) / 26 (28%)
Brokerage (eg, providing volunteers) | 49 (89%) / 50 (94%) | 50 (91%) / 39 (74%) | 48 (87%) / 38 (72%) | 36 (65%) / 24 (45%)
Providing networking / peer learning | 78 (80%) / 70 (83%) | 74 (84%) / 56 (67%) | 70 (80%) / 54 (64%) | 54 (61%) / 32 (38%)
Lobbying, representation or campaigning | 63 (80%) / 58 (77%) | 64 (81%) / 47 (63%) | 62 (78%) / 45 (60%) | 57 (72%) / 30 (40%)
Promoting / supporting partnership work | 80 (77%) / 76 (84%) | 86 (90%) / 69 (77%) | 80 (83%) / 55 (61%) | 66 (69%) / 38 (43%)
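As a rough illustration of how cells like '87 (90%)' in the table above can be read, the sketch below (ours, not from the report) pairs the number of respondents collecting a given type of information with that number expressed as a percentage of only those respondents providing the service, since, as the earlier footnote notes, the percentages take into account those not providing the service. The counts used in the example are hypothetical.

```python
# Illustrative sketch only (not code from the report): format one table
# cell as 'count (percentage-of-providers%)', where the percentage base
# is the number of respondents who actually provide that service.

def cell(collecting: int, providing: int) -> str:
    """Return a cell like '87 (90%)' for the given hypothetical counts."""
    percentage = round(100 * collecting / providing)
    return f"{collecting} ({percentage}%)"

# Hypothetical example: 87 of 97 providers collect this information.
print(cell(87, 97))  # -> 87 (90%)
```

Reading the cells this way explains why a service offered by fewer respondents, such as brokerage, can show a smaller count but a higher percentage than more widely offered services.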