
Demonstrating the difference

Sally Cupitt with Anastasia Mihailidou
Charities Evaluation Services
March 2009

This research was conducted as part of the National Performance Programme at Charities Evaluation Services (CES). The National Performance Programme is funded by Capacitybuilders' National Support Services programme and is led by CES in partnership with acevo, the New Economics Foundation, New Philanthropy Capital and Voice4Change England.

Improving Support is an initiative led by Capacitybuilders that brings together practical resources and learning to strengthen support services for third sector organisations.

Charities Evaluation Services is a company limited by guarantee.
Registered in England and Wales no. 2510318.
Registered charity no. 803602.
Registered office address: 4 Coldbath Square, London EC1R 5HL, UK.

© Charities Evaluation Services 2009
ISBN 978-0-9555404-9-3

Unless otherwise indicated, no part of this publication may be stored in a retrieval system or reproduced in any form without prior written permission from CES. CES will give sympathetic consideration to requests from small organisations for permission to reproduce this publication in whole or in part, but the terms upon which such reproduction may be permitted will remain at CES' discretion.

Charities Evaluation Services would like to thank all survey respondents and
interview participants for giving their time and expertise to this research.


Demonstrating the difference



Sally Cupitt with Anastasia Mihailidou


Charities Evaluation Services
March 2009


Acronyms

ACRE    Action with Communities in Rural England
BERR    Department for Business Enterprise and Regulatory Reform
BIG     Big Lottery Fund
BAMER   black, Asian, minority ethnic and refugee
BVSC    Centre for Voluntary Action in Birmingham
CES     Charities Evaluation Services
CVS     Council for Voluntary Service
ISO     International Organisation for Standardization
LAA     local area agreements
LSP     local strategic partnerships
OTS     Office of the Third Sector
NAO     National Audit Office
NAVCA   National Association for Voluntary and Community Action
NCVO    National Council for Voluntary Organisations
PCT     primary care trust
PQASSO  Practical Quality Assurance System for Small Organisations
RCC     Rural Community Council
RDA     Regional Development Agency
SROI    social return on investment
TSO     third sector organisation (includes charities, voluntary organisations, community groups and social enterprises)
VC      volunteer centre (also called volunteer development agency)
VCO     voluntary and community organisation
VCS     voluntary and community sector
VE      Volunteering England
VIAT    Volunteering Impact Assessment Toolkit

Throughout this research, 'voluntary and community sector' or 'organisation' does not include social enterprise; however, when referring to Jean Ellis' research (2008), social enterprise is included.

Some definitions

Outputs
The activities, services and products provided by an organisation.

Outcomes
The changes, benefits, learning or other effects that happen as a result of services and activities provided by an organisation.

Impact
Broader or longer term effects of an organisation's or project's outputs and outcomes.

Difference made
All the changes that result from infrastructure work, most usually the outcomes and impacts of a given intervention. This term encompasses both outcome and impact, in recognition of the continuing and unresolved debate on the definitions of outcomes and impact (Macmillan 2006, p11). 'Difference' includes both benefits (positive outcomes) and negative outcomes. 'Difference made' is similar to effectiveness, in that both have an outcomes focus. However, the definition of effectiveness is contested. It is often defined more widely and may bring in issues like quality and value for money (see Mistry 2007).

Infrastructure
For simplicity we are using the Home Office definition of infrastructure as:

The physical facilities, structures, systems, relationships, people, knowledge and skills that exist to support and develop, co-ordinate, represent and promote front line organisations, thus enabling them to deliver their missions more effectively. (Home Office, 2004, p17)

It is worth noting, however, that this is a government definition, not necessarily one agreed on by infrastructure organisations themselves. Also, it is a functional definition, which may encompass organisations fulfilling infrastructure functions but not defining themselves as such. Infrastructure organisations can be both second tier (those working primarily with frontline organisations) and third tier (those whose primary focus is working with other infrastructure organisations).

Not every respondent gave responses to every question. Unless otherwise stated, the percentages given are always as a percentage of those who answered that particular question, not of the whole respondent group. To make this clearer, absolute figures are also given with each percentage.

Where it is necessary to give the response rate for a particular question, it is written in parentheses, for example, (176). This means that 176 people answered that particular question and the percentage is based on this figure.

Contents

Executive summary 7
Background to this research 7
Research focus 7
Findings 8
Recommendations 9

1. Context 10
1.1 Background to this work 10
1.2 The importance of demonstrating difference in the current climate 11
1.3 The importance of demonstrating difference in the future 13

2. The nature of infrastructure 16
2.1 Infrastructure: a different case? 16
2.2 Demonstrating difference well 21

3. The information collected by infrastructure organisations 23
3.1 Amount of information 23
3.2 Type of information collected 24
3.3 Methods for collecting data 26

4. Challenges of demonstrating difference 28
4.1 How well they demonstrate difference 28
4.2 Where there were challenges 30
4.3 Challenges faced by funders 34

5. Tools in use by infrastructure organisations 35
5.1 Tools heard of and used 35
5.2 The growth of economic tools 39

6. Good practice 40
6.1 Case examples of projects with wider applicability 40
6.2 Two to watch: work in progress 41

7. What would help: respondents' perspectives 43
7.1 Changes to their organisation or its funding 43
7.2 Training, consultancy and skills 44
7.3 A common framework 44
7.4 A theory of change 45
7.5 Work on tools 46
7.6 Learning from others 46
7.7 Work with funders and commissioners 47

8. Recommendations 48
8.1 Direct support to infrastructure organisations 48
8.2 Developing resources 49
8.3 Influencing the debate 50
8.4 Further research 51

Appendix 1: About this report 52
Bibliography 53
Appendix 2: funders' and infrastructure organisations' information requirements 56

Tables and charts
Table 1: Information required by infrastructure organisations and their funders 24
Chart 1: Demonstrating difference: how easy it is 29
Chart 2: Demonstrating difference: what survey respondents found challenging 32
Chart 3: Use and awareness of tools among survey respondents 37

Executive summary

Background to this research

Demonstrating the Difference is being published at a time of considerable public interest in the value for money of infrastructure, and increasing demands for better evidence to guide future funding decisions. Voluntary and community sector (VCS) infrastructure has received high levels of investment in recent years through ChangeUp and the Big Lottery's BASIS programme, and this has led to calls for better evidence of the difference infrastructure makes.

Research into the effectiveness of infrastructure work in 2006 by Macmillan found only fragmented and insubstantial evidence, largely in project and programme evaluations. Although many individual services, projects and organisations can demonstrate that they make a difference, this evidence has not yet reached a critical mass to make the case for the effectiveness of infrastructure as a whole.

Infrastructure organisations need urgently to be able to demonstrate the differences made by their work. The climate in which we work is changing: there is public scrutiny of infrastructure support; there is likely to be pressure on funding which will bring performance demands; more income may well come from commissioning rather than grants; and a possible new governing party will bring change.

Charities Evaluation Services' National Performance Programme wanted to investigate how well infrastructure organisations are able to demonstrate the difference their work makes. By 'difference made' we mean all the changes, learning and benefits that result from work to support, develop, coordinate, represent and promote frontline voluntary and community groups. Our research asked what infrastructure organisations need to help them assess and demonstrate difference, what support is available to them, as well as what gaps exist. We also looked at whether the expectations they have of themselves, and the expectations their funders have of them, are realistic.

Research focus

Demonstrating the Difference asked:

What do infrastructure organisations need to demonstrate regarding their effectiveness, both for organisational development and accountability requirements? Do different types of infrastructure organisations (national, local, generalist and specialist) have different needs?

How might these requirements change in the future?

Are the expectations of stakeholders relevant and appropriate?

What can we learn from recent initiatives offering support to infrastructure organisations in demonstrating their effectiveness?

What support and tools are available of relevance to this sector? What is their nature and usage?

What support does the range of infrastructure organisations need to be able to meet their reporting requirements?

The findings in Demonstrating the Difference are drawn from desk research, 28 interviews and a survey of 119 infrastructure organisations. The interviews, which were largely by telephone, involved 19 infrastructure organisations, six funders, the National Audit Office, a leading academic, and an independent consultant with expertise in infrastructure support. Of the infrastructure organisations surveyed, 56 per cent had a local focus, 24 per cent regional and 20 per cent national. Sixty-nine per cent of the organisations completing the survey provide generalist infrastructure support.

Findings

Respondents reported collecting quite a lot of outcome and impact information, much of it for their own use. A fifth felt they were demonstrating the differences made by their services very well, and a further 40 per cent reported doing so adequately. We found evidence of good practice too; for example, almost all survey respondents were doing some follow up of their users.

Most respondents in this research did not feel they were collecting too much information for funders. For most of them, their funders primarily want output information, but there is, however, evidence of a shift towards requiring outcomes data. In fact, survey respondents reported wanting substantial information on differences made for their own internal use significantly more than their funders require of them.

Knowledge and use of existing tools for demonstrating effectiveness was high among survey respondents, although it was slightly lower for specialist infrastructure organisations. There was evidence that some infrastructure organisations have to adopt inappropriate tools to meet funding requirements. Inappropriate tools may not be a good fit to their organisational type, or not fit for purpose; many have adopted quality tools where they need a more evaluative approach to demonstrate differences made.

A large number of infrastructure organisations in this research reported that they could be demonstrating their difference better. Key challenges respondents reported with demonstrating difference were:

lack of shared understanding of infrastructure
the relevance and usefulness of information collected for funders
difficulties in collecting data and the IT needed to store and manipulate it
a lack of capacity to collect information
identifying indicators.

The research showed that a number of things would help improve the ability of many infrastructure organisations to demonstrate the difference they make. Many could benefit from increased clarity and agreement as to services delivered and intended outcomes. Support to infrastructure organisations on process oriented evaluation could be very helpful for complex work where outcomes may be hard to assess.

It would also be helpful to have a better understanding, by infrastructure organisations and their funders and commissioners, of when it is appropriate to demonstrate impact, as opposed to outcomes. Impact assessment is often harder than outcome assessment, and for many infrastructure organisations it may not be possible to demonstrate impact without additional resources. Further, it may not be appropriate in many cases to do so; impact assessment may be more effectively done by those with evaluation skills and an evaluation remit.

Recommendations

While a few of the respondents in this research did not want any help in demonstrating the differences made by their work, the majority outlined a wide range of things that might help them better demonstrate the difference they make.

This included direct support in the form of training, particularly in more specialised areas such as impact assessment, and aspects of data collection and management. This training could cover issues such as process evaluation, ways to clarify the link between activities, outputs and intended outcomes (a theory of change), and how to report against statutory indicators.

Demonstrating the Difference does not suggest that the sector develop more tools for infrastructure organisations, although the possibility in future of resources for impact assessment should be kept open. However, there are four resources that would be of use to infrastructure organisations that could be developed. They are:

a map of existing tools
a bank of outcomes and indicators
theories of change for infrastructure work
case studies of good practice.

Any new resources would require funding to support them. Some large initiatives have, in the past, lacked resources to adequately disseminate the tools, support their users and, crucially, do follow-up to evaluate their efficacy in practice. If new resources are to be developed, there must be adequate resources for ongoing promotion, support and evaluation.

Infrastructure organisations participating in this research also spoke of the benefit of opportunities to meet each other and share learning and good practice in demonstrating difference. As a starting point, it might be useful to get together some of those leading the way in this work.

It would be beneficial to consider working to inform funders and commissioners about:

the difference between impact and outcome and why this is difficult for infrastructure organisations
when it is appropriate to seek impact data
the limitations of an outcomes approach and the role of process in evaluating complex initiatives
the range and usage of evaluation and quality tools in the sector.

Further research could include:

looking at the needs of infrastructure organisations in communicating and reporting on their achievements and the differences they make
examining the extent to which evidence of difference made plays a part in the awarding of contracts.

1. Context

Infrastructure organisations need to be preparing themselves for change; the effects of a move to more commissioning, plus a possible new governing party, will bring challenges. One respondent commented that infrastructure organisations need to take action soon to demonstrate what they achieve: 'We don't want to be in a situation where we see it coming but fail to act.'

There is public scrutiny of the value for money of infrastructure funding, and infrastructure organisations are required to demonstrate how they perform against a range of government targets. This renewed interest in providing evidence of differences made by infrastructure organisations means they need urgently to be able to demonstrate the differences made by their work.

1.1 Background to this work

Calls for evidence
While there is evidence of the effectiveness of many individual services, projects and organisations, there is a lack of evidence for voluntary and community sector (VCS) infrastructure as a whole. This is at a time of considerable public interest in the value for money of infrastructure and calls for better evidence to guide future investment decisions. The high level of investment in infrastructure that has come through ChangeUp and the Big Lottery's BASIS funding, combined with the government's interest in the third sector taking on a wider role in the delivery of public services, has led to calls for better evidence of the difference made by infrastructure. A respondent from Capacitybuilders described such evidence as 'the holy grail'.

It is becoming increasingly important for infrastructure organisations, especially those funded through Regional Development Agencies (RDAs) and local authorities, to be able to demonstrate economic, social and environmental outcomes in relation to government targets. The Office of the Third Sector (OTS) also has a particular interest at the current time in the ability of the third sector to demonstrate difference made, shown for example by its three-year project, Measuring Social Value, which focuses on the dissemination of the social return on investment (SROI) approach within the third sector.

But a lack of it available
In 2008, Ellis published a major study into monitoring and evaluation in the UK third sector. She found that:

despite the increase in resources and training, there is still a huge constituency of organisations that are struggling with the basics of monitoring and evaluation and to understand outcomes approaches (p vi).

In 2006, Macmillan undertook a rapid evidence appraisal for the effectiveness of infrastructure work (Macmillan, 2006). What evidence he found was fragmented and insubstantial, mostly in the form of project and programme evaluations. In 2008, Macmillan reports that the situation has not substantially improved.1

1.2 The importance of demonstrating difference in the current climate

Perhaps unsurprisingly, all 28 interviewees said that demonstrating difference was important for infrastructure organisations in the current climate. Key reasons for its importance are outlined below.

For communicating with external stakeholders
One interviewee felt that demonstrating difference was important to enable infrastructure organisations to explain the purpose and role of infrastructure. This may be particularly important as infrastructure is arguably poorly understood (see section 3.1 below). Respondents noted that this is crucial to gaining funding, particularly when working with an unpopular client group or in an area of work less 'sexy' than some frontline work.

The current financial and policy climate
Support from BASIS and ChangeUp cannot be relied upon in the long term. The current recession is likely to lead to more competition for funding, in which infrastructure may struggle to compete with frontline services that have more immediate, visible impact. The need to demonstrate difference made may become more urgent in this climate. The OTS interviewee noted that:

All across government, times are extremely tight. Everybody is going to have to work harder to demonstrate why what they do represents good value for money. All over government ministers are looking for ways to increase efficiency.

The effects of the recession may in fact stimulate demand for infrastructure work, if frontline organisations need more support in making themselves more effective.

Infrastructure organisations may become more reliant on income generation by charging for their services. In this survey 44 per cent (52) of respondents said they received at least some of their income this way. This may bring with it new pressures to demonstrate difference made, perhaps in terms of demonstrating and marketing quality services to potential customers.

The relationship with the local public sector
Each local authority now needs to report against 198 national indicators as part of the new Local Government Performance Framework. National Indicator 7 (NI 7) is 'an environment for a thriving third sector', which:

will measure the contribution that local government and its partners make to the environment in which independent third sector organisations can operate successfully (OTS, 2008, p5).

While all local authorities will be assessed against all 198 indicators, some will also have chosen to make NI 7 explicit local priorities by choosing them within their Local Area Agreements.2

Baseline information against NI 7 was collected through the OTS National Survey of Third Sector Organisations, in September 2008, and will be repeated in 2010. It went to about 100,000 randomly selected charities, with a 47 per cent response rate. Among other things, voluntary and community organisations (VCOs) were asked about the support available from other (second tier or umbrella) third sector organisations.

Infrastructure organisations are to be judged, albeit indirectly, along with local authorities, against national indicators. For many this may not be about individual performance, rather a judgement about infrastructure generally. However, some local authorities may choose to judge their own funded infrastructure organisations against these indicators.

The voice of the local voluntary and community sectors
Infrastructure organisations have an important role in Compact development and relations. The relationship between local authorities and the VCS has arguably also become closer with the introduction of Local Area Agreements (LAAs) and Local Strategic Partnerships (LSPs). A local authority partnerships officer reported that the introduction of LSPs has highlighted the need for closer working relationships with the VCS at all levels across the local authority. The officer added that infrastructure organisations are now even more necessary for the local authority as a conduit for sharing information with the local VCS and getting information from them. While a Primary Care Trust (PCT) may not be representative of other sector funders, the PCT funder interviewed also saw this as an important role for infrastructure:

It's critical that we have organisations that can speak on behalf of the VCS and provide support for their development outside the public sector. The reason that's important is that we won't be able to engage with the VCS, or commission services from them, or develop them as a market, without that kind of support. They've got to be effective.

Good project management
For several interviewees it was essential to be able to demonstrate the difference they make as infrastructure organisations; otherwise, 'what's the point in existing?' Others felt that without evidence of the difference being made they would be 'operating in a vacuum', and at risk of mission drift.

Two interviewees mentioned that it was important to demonstrate differences achieved for staff and trustee motivation. Perhaps the need to demonstrate difference is greater for infrastructure organisations, where the outcomes are less visible to staff, and it is easier for them to get demotivated. Infrastructure organisations may also be at risk of criticism both from 'above', from funders and decision makers, and 'below', from frontline organisations. One explained that:

Of course it's about the funding but it's bigger than that, it's more about them being able to hold their heads up and say this is who we are and yes, we are making a difference. Unless they do decent outcomes evidence gathering they really don't know that for sure.

1.3 The importance of demonstrating difference in the future

Commissioning
In Ellis' research (2008), 49 per cent (86) of her infrastructure respondents had been commissioned to deliver services. Much of the data required of them was still around outputs, although there was a move towards outcomes reporting. Several respondents in this research felt that the move to commissioning would require more evidence of differences made. Where funding has traditionally required output information only, this may put some infrastructure organisations at a disadvantage. However, one interviewee commented that the move to commissioning might bring positive changes like a greater focus on beneficiaries.

One effect of commissioning may be the inappropriate adoption of tools by VCOs in order to obtain funding (see also section 5.2 below). When wanting reassurance that organisations are effective, commissioners may be more likely to specify a quality tool rather than evaluation of differences made, due to their lack of familiarity with evaluation tools and approaches. A respondent from the National Association for Voluntary and Community Action (NAVCA) reported that one of the effects of commissioning and procurement is that it is:

driving people towards inappropriate quality assurance, because of the dangle of money on the end of it. So to get through the pre-qualification questionnaire people are going to International Organisation for Standardization (ISO) 9000/9001 and it's not always the most appropriate system for them to be using... [they are] doing it to get more ticks.

An area for further research would be to examine the extent to which evidence of difference made plays a part in the awarding of commissioning contracts. In particular, it might be useful to look at recent cases where existing infrastructure providers have lost out in the commissioning process to new providers, to ascertain the extent to which evidence of differences achieved (or intended) played any part in the process.

A re-emphasis on local provision?
The next general election may bring with it a change of governing party. The Conservatives' green paper, A Stronger Society: Voluntary Action in the 21st Century (2008), is critical of national initiatives to invest in infrastructure like ChangeUp, and emphasises the idea of local provision and choice to the frontline:

Government must respect the fact that frontline voluntary organisations are best placed to decide what support they need, not politicians or their appointees. Wherever possible frontline voluntary organisations should be resourced and empowered to commission the support they need from Councils for Voluntary Service and other providers. (p43)

The notion of direct payments, possibly ring-fenced, has been mooted before in Building Blocks (Harker and Burkeman, 2007) and a Needham and Barclay report for local government (2004).3 This is a suggested alternative whereby money for capacity building is given direct to the frontline.

The phrase 'and other providers' may be significant; this leaves it open for non-VCS organisations, or those who would not normally be defined as infrastructure organisations, to be providers of infrastructure services. One funder commented that ChangeUp had assumed that VCS infrastructure was the key deliverer of capacity building but there are other deliverers, including the private sector.

One funder interviewed for Demonstrating the Difference felt that capacity building was different from infrastructure's role in voice and influence, which the respondent felt could be done only by VCS infrastructure. However, it is likely that direct payments would not work for representation and influence work; if they do become a reality, they might even lead to a reduction in representation work.

The green paper contains an implication that some of the larger strategic grants may remain: 'Resources will always be limited and it is better to provide significant support to some organisations (of various sizes), than insignificant support to all or most of them.' (p43) Also, it may be assumed that even if money was given to the frontline for purchasing services, there would still be a need for core funding for infrastructure.

One infrastructure interviewee felt that the Conservatives have a very different idea of infrastructure, and would therefore judge success differently: 'it doesn't mean that overnight we will stop having an impact, we will just be judged differently.' One interviewee noted that if direct payment for infrastructure work comes into reality, infrastructure will have to 'absolutely ensure that it is user-led'.

One interviewee felt that the Conservatives had misunderstood the purpose of infrastructure. Certainly, the green paper implies that capacity building for the voluntary sector for delivery of public services is literally about increasing capacity, rather than also improving quality.

Preparing for change
While making a collective case for infrastructure would be desirable, it may not be possible, either because the sector is too diverse or because it may take too much time. Helping infrastructure organisations demonstrate difference is likely to require an emphasis on individual organisations demonstrating their own worth. One respondent felt that although there might still be some benefits in making a collective case for infrastructure, the focus now should be on gearing individual organisations up for the hard times ahead:

I don't think demonstrating difference will address this need overall; even if all infrastructure organisations were 10 per cent better at demonstrating their difference, I don't think the collective case will be won, I think it's a losing battle already. What a demonstrating difference approach will do is gear up individual organisations to withstand the difficulties we face in the years ahead.

One organisation described how they are preparing for coming changes. The Centre for Voluntary Action in Birmingham (BVSC) reports that they are trying to become as financially independent as possible. They are trying to define their own intended impacts, and their ability to articulate this properly, arguing that if the environment changes around them, they will be able to stick to a clear vision of what they do rather than have to 'move the goalposts' again. In this way, they are securing their funding future.

1 Rob Macmillan, personal communication.
2 NI 6, another national indicator of relevance to infrastructure organisations, is about the levels of volunteering locally.
3 Rob Macmillan reports finding references to this dating back to 1990 (personal communication).

2. The nature of infrastructure

Infrastructure organisations are varied, meaning consensus as to their purpose may be hard to achieve, making it harder to demonstrate the difference they make. They could benefit from increased clarity and agreement as to services delivered and intended outcomes. Support to infrastructure organisations on process oriented evaluation could also be very helpful for complex work where outcomes may be hard to assess.

Better understanding, by infrastructure organisations and their funders and commissioners, of when it is appropriate to demonstrate impact, as opposed to outcomes, would be helpful. Impact assessment is often harder than outcomes assessment, and for many infrastructure organisations it may not be possible to demonstrate impact without additional resources. Further, it may not be appropriate in many cases to do so; impact assessment may be more effectively done by those with evaluation skills and an evaluation remit. Lastly, in order to demonstrate impact it is usually necessary to be collecting good information on your outcomes; those infrastructure organisations wishing to demonstrate impact need to get their outcomes work in order first.

2.1 Infrastructure: a different case?

Many of the problems with demonstrating difference are not unique to infrastructure organisations:

intended outcomes may be hard to define or hard to measure
outcomes of preventative work may be hard to capture
project timescales are often short but outcomes can take time to occur
the need to track ex-users
staff concerns over capacity to collect data.

However, there are some issues unique to infrastructure and others where the situation is arguably harder for them. These include:

lack of a shared understanding of infrastructure
multiple functions of infrastructure
the problems of impact
attribution and causality
lack of control at the frontline.

These issues are described below.

Lack of shared understanding of infrastructure
One of the difficulties in demonstrating the difference made by infrastructure organisations is a lack of shared understanding of the services provided by infrastructure, its intended outcomes and even the overall purpose of it. Even defining infrastructure organisations is hard, perhaps in part due to the confusion between defining the functions of infrastructure, and defining infrastructure organisations. Of course infrastructure encompasses a hugely varied group of organisations, but some clarity about services delivered and intended outcomes would be helpful.

Agreement about the purpose and outcomes of infrastructure
Infrastructure and frontline organisations share similarities. Both have beneficiaries and share many intended outcomes such as improvements in knowledge, confidence and skills. In some cases the beneficiaries of infrastructure organisations may be more accessible for data gathering than those of frontline organisations. However, most interviewees thought there was no agreement about the intended outcomes of infrastructure as a whole. Where there was agreement on intended outcomes of infrastructure as a sector, this was felt to be with high level outcomes: 'the disagreement starts when we look into the finer details'. However, in some subsectors, and perhaps particularly for local infrastructure organisations, there is greater agreement; for example the NAVCA Performance Standards, which were well consulted on, have brought some agreement to the purpose of, and services provided by, local infrastructure organisations.

Several respondents noted that the purpose of volunteering infrastructure in particular was not always agreed:

frontline organisations will often see our role purely as brokerage of volunteers, therefore that is the difference they are working towards, and not better volunteering.

To a certain extent this view may be shared by government, for whom indicators of success in volunteering have tended to be about numbers of volunteering opportunities and not necessarily about quality. It was noted by one infrastructure interviewee that Volunteering England has been helpful in separating the role that infrastructure plays in developing the quality of volunteering, from the role it plays in actually providing volunteers.

Defining infrastructure
Although one infrastructure interviewee felt that ChangeUp has been helpful in defining infrastructure, a shared definition of infrastructure, beyond the very broad, does not seem to exist. We found it hard to find a definition for this research that was both uncontroversial and practical in application. One interviewee noted that looking at the varied membership of consortia across the country shows the different definitions used in practice.

The definition of infrastructure is also fluid, in part dependent on funding streams. Three interview respondents noted how the availability of new funding streams had caused organisations to define themselves as infrastructure, in order to have 'a slice of the cake'.

Macmillan (2008) also found, when researching infrastructure in Durham, that many organisations claimed to be infrastructure, but in practice many had little capacity to offer support, or did so only informally or with very limited functions; it was not part of their core work. The situation is further complicated by the fact that many infrastructure organisations also do frontline work.

Interview respondents from BIG, OTS, Capacitybuilders and a PCT all felt that there was, to varying degrees, some agreement within their organisations about the purpose of infrastructure, although for some of these there were still improvements to be made. The local authority respondent suggested that there were likely to be different perceptions about the purpose of infrastructure across the authority and that there were variable levels of understanding about the VCS itself.

To a certain extent the lack of consensus may not matter. One infrastructure interviewee felt that there is sufficient broad consensus about the purpose of infrastructure, and added that diversity of view is a good thing. Another felt that something broad enough to include all of infrastructure might be meaningless.

At the project level, however, the lack of agreed outcomes becomes more problematic when trying to demonstrate difference. One interviewee noted that a lack of shared understanding of the outcomes of infrastructure makes it hard to demonstrate what differences have been made. They noted:

if an organisation needs support with the development of a strategic plan, how do you then measure the effectiveness of a strategic plan? What is a good strategic plan?

Howells (2008) notes that it is unclear whether infrastructure organisations should be looking for improved services from frontline organisations as a result of infrastructure support. She argues this might be problematic as the service delivery is usually not what was advised on; infrastructure may offer support on funding or good governance, for example.

The National Audit Office (NAO) interviewee noted that the lack of consensus matters to auditors: 'it's easier to evaluate something if it has clear objectives and a plan to deliver against that.'

Multiple functions of infrastructure
There was a strong sense among interviewees that the work of infrastructure is in some way different to frontline work. The work of infrastructure organisations was variously described as 'intangible', 'elusive' and 'nebulous'. However, some third tier infrastructure organisations have gone a long way towards clarifying the basic work of infrastructure organisations. For example, Volunteering England identifies six core functions of a volunteer development agency4 and NAVCA have identified five core functions of a local infrastructure organisation.5

However, within some broad areas there may be difficulties. Two interview respondents pointed out that the services offered by infrastructure organisations can vary widely, even when on paper they seem to offer the same thing. One gave the example of funding advice; some infrastructure organisations will provide information on funders, some will do the funding application, and some will support the frontline group in a more holistic way.

At the level of monitoring, this can make simple things difficult. Volunteering England pointed out that comparing output statistics across volunteer centres was difficult. There is a lack of shared understanding even on things like 'number of enquiries'. Is an enquiry the number of people attending a community event you speak at, or someone you speak to face to face for an hour, or both?

The need to define from within
There was also some fear from interviewees that unless infrastructure organisations develop appropriate standards, tools and frameworks from within the sector, they might be imposed. The interviewee from NAVCA reflected that in part the impetus for developing the NAVCA Performance Standards and PERFORM was that the sector felt that there was a chance that standards might be imposed for infrastructure if they did not do it for themselves.

One infrastructure interviewee argued strongly that it is important for the sector to become more confident in defining itself:

As an infrastructure sector we have been a bit led by the nose. The ChangeUp initiative tells us what infrastructure should look like; public and policy makers tell us what infrastructure should look like and actually what we should do as a sector is defining what good quality infrastructure should look like.

The problems of impact
Being a step removed from the frontline may mean that infrastructure organisations feel obliged to demonstrate impact to a greater extent than frontline organisations. Frontline organisations may find outcome information sufficient, whereas infrastructure organisations may also need impact data to tell the full story. Other writers have noted this problem (Wing, 2004, and Connolly and York, 2002), and many interviewees in this research described the problem of impact (although they may not have described it as such). The various issues involved are outlined below.

The timescale of infrastructure work
Infrastructure work often has intended impacts that take many years to achieve (see also Wing, 2004), and this lag time needs to be understood by those requesting information and those trying to collect it. While some outcomes may be demonstrable soon after an intervention, impacts may take a very long time to be seen. One survey respondent explained that:

The difference a project makes often isn't seen until five or six years later; funding is normally limited to three years; the real impact is never measured, only the immediate impact. Current funding structures mean a lot of the differences are never captured.

The need to be realistic
One interviewee was from an infrastructure organisation that has done a lot of work on outcomes and impact and disentangling the two. Outcomes that are 'so far down the line and so away from what we are doing' are not a realistic measure of the impact of infrastructure organisations. They gave an example of a health project they were involved in that aimed to reduce infant mortality. This aim would not be a realistic measure of their success; instead, they are looking into changes in frontline organisations.

Working with other organisations
Working with other organisations brings particular issues regarding impact. Some infrastructure organisations find it very hard to separate out their outcomes from those of their clients. It could help them to understand that their outcomes are seen as changes in the frontline organisations they support. By contrast, their impact includes changes for communities as a result of frontline work. By understanding this difference, infrastructure organisations could be supported to concentrate their efforts, where appropriate, on assessing their own outcomes and being confident when presenting this data.

When trying to demonstrate their impact, infrastructure organisations are often reliant on their beneficiaries to provide evidence of their own outcomes. Several respondents noted that those beneficiaries lack time and capacity too and were not always forthcoming with the data on their own outcomes. One survey respondent noted that 'consultation is not usually successful because of our limited capacity and our members' own capacity!'

It may be difficult to get precise data from frontline organisations about their client outcomes. However, infrastructure organisations could ask frontline organisations, in surveys or interviews, how services have been developed as a result of infrastructure support, what benefits there have been to clients and whether these changes can be linked to the intervention. This has been done before, for example in Ellis and Latif (2006) and by CES in evaluating its own training in evaluation and quality.

Attribution and causality
To what extent can infrastructure organisations show a causal link between what they did and changes that occurred, and how much of the change is attributable to their work? One funder commented that:

it's very difficult to track the actual impact and benefit [of infrastructure]... There is a hypothesis that [the benefit] then trickles down to frontline and then that in turn trickles down to Joe Bloggs in the street, but demonstrating a causal link between those three is, of course, difficult.

Another funder felt that causality was harder for infrastructure organisations because:

it's harder to come up with meaningful counterfactuals for infrastructure. It's easier with frontline to ask users what would have happened anyway; they have a more limited set of potential choices available to them.

Frontline organisations are also likely to seek support when they are already galvanised by the desire to change;6 it is hard to know how much change would have happened anyway (see Ellis and Latif, 2006).

Issues of attribution, deciding who or what caused a change, are difficult in any field. However, for infrastructure organisations, this may be especially tricky. One respondent argued that for infrastructure there may be 'more noise, more variables in play'. Levels of partnership working complicate the situation and make evaluation more difficult. One survey respondent commented that 'we can prove that we were active partners in partnership work but it's hard to prove the difference that our involvement had'.

Attributing differences achieved may be harder with particular infrastructure services, like voice or lobbying work and the production of resources (Howells, 2008), where there may be no clearly identifiable direct beneficiaries. This is also the case with much campaigning work. Such work may be harder to evaluate, as there may be no easily identifiable users to ask about outcomes achieved. The current focus on outcomes work may have inadvertently made demonstrating difference harder for such services. Bringing in a focus on process and intermediate outcomes may be helpful here.

Lack of control at the frontline
Ultimately an infrastructure organisation cannot control what a frontline organisation does with the support received. One infrastructure interviewee explained that:

When you are working in a supportive role, what matters is the delivery at the frontline. So my effectiveness should be judged according to the delivery at the frontline of those I support. The reason that is problematic is that I might be giving fantastic support... [But] you still get rubbish at the frontline. You are being judged on issues that you do not fully control. And the converse is also true: if there is really good delivery at the frontline, demonstrating that this comes down to your support is a really difficult thing to show.

What happens at the frontline is affected by a wide range of things, including other infrastructure organisations: 'there is very little we do in isolation'. The pace of change at the project level may also make it harder to pin down longer term effects.

While infrastructure organisations cannot control what happens at the frontline, they will still need to demonstrate some differences made; delivering quality services is unlikely to be sufficient to convince many funders. As described in the section on attribution above, looking both at processes and intermediate outcomes may be helpful here, to show how and why some interventions work and others do not.

Infrastructure organisations need to be able to set realistic targets about outcomes achieved with frontline organisations. As with frontline work, not all beneficiaries of infrastructure can be expected to implement changes as a result of support. It would be helpful to have a clear idea about how many frontline organisations can be expected to make changes as a result of support received. A competitive funding environment may of course mean that infrastructure organisations are reluctant to set low targets.

2.2 Demonstrating difference well

All the funders and the academic interviewed were asked how they would recognise an organisation that demonstrated the difference it makes well. The responses have been aggregated below.

While it cannot be assumed that a good infrastructure organisation necessarily demonstrates its difference well, one that does will:

have clear aims and objectives
have a clear theory of change, with clear outcomes
be able to show evidence on outputs and outcomes, as well as process information on issues like partnership working
have long term tracking systems
collect a mix of qualitative and quantitative information
learn from findings, review past work, ask difficult questions.

Two interview respondents also felt that an infrastructure organisation with a strong chief executive was more likely to be able to demonstrate the difference it makes well. One funder felt that this was particularly the case for voice or representation work. This is similar to the finding of Macmillan (2007), who found that, for the respondents in his research, well regarded infrastructure organisations were thought to be effective because they were perceived to have a strong chief executive.

4 www.volunteering.org.uk/WhatWeDo/Local+and+Regional/Volunteer+Centre+Quality+Accreditation.htm
5 www.navca.org.uk/localvs/lio/guidance/corefunctions.htm
6 Infrastructure organisations of course may also play a key role in galvanising them for change and promoting good practice.

The information collected by infrastructure organisations

3. The information collected by


infrastructure organisations

2
8
Bibliography

23

Appendices

A few respondents did make comments


about the high level of monitoring
required for their funders, not enough
resourcing for monitoring and evaluation,
or disproportionate demands in relation

Overall very few commented on having


to collect too much information for
funders. Most reported that meeting
funders requirements did not pose
any particular difficulty, perhaps in
part due to the requirements being
primarily about outputs. Interestingly, no
interviewees said they found reporting
on commissioned contracts any more
onerous. This echoes the findings of Ellis
(2008), in which 63 per cent (110) of the
infrastructure respondents to the survey
said they were doing about the right
amount of monitoring and evaluation, in
relation to their overall size and capacity,
10 per cent (18) too much and 27 per
cent (48) too little.

Infrastructure organisations in the survey


reported requiring a wide range of
information on their projects, on services
delivered, user types, outcomes and
impacts. They reported requiring a far
greater range of information about their
work for internal purposes than they
needed for their funders (see also 4.2
below).

Our minimal contract for advice work


requires us to meet MATRIX quality
standards. The onerous monitoring and
reporting requirements of virtually all
of our funders stretches our capacity
to deliver the services our community
needs.

3.1 Amount of information

There is evidence of good practice on


which to build, for example, almost all
respondents are doing some follow up of
users of their services.

to funding. Reporting on European


funding was mentioned by three
interviewees as very problematic, and
one mentioned corporate funders
as being much more demanding of
monitoring information. One survey
respondent noted that:

Respondents to the survey reported


wanting substantial information on
differences made for their own internal
use significantly more than their
funders require of them. While we
cannot be certain about how these
infrastructure organisations are defining
impact, it may be that some are asking
too much of themselves in terms of
impact data.

Most respondents in this research did not feel they were collecting too much
information for funders. For most of them, their funders primarily want
output information, but there is, as we would expect, evidence of a shift
towards outcomes.

3.2 Type of information


collected
While funders and infrastructure
organisations were almost equally
interested in information about outputs,
infrastructure organisations reported
that their funders were less interested in
information about user types, outcome
and impact.7 Barings, Yorkshire Forward
and BIG were mentioned by a few
respondents as funders who also wanted
outcome information.
Although funders wanted less information
than infrastructure organisations did

for themselves on differences made,


survey respondents still reported that
40 per cent of their funders wanted
impact information. However, almost
all infrastructure interview respondents
said their funders mostly wanted output
information only; for two respondents, this
was particularly the case with public funds.
The reason for the discrepancy between
interviewees and survey respondents is not
known, but it may be that some survey
respondents were not clear as to what
was meant by outcome and impact; with
interviewees it was possible to clarify this.

Table 1: Information required by infrastructure organisations and their


funders (as reported by infrastructure organisations)
Type of information

Required by

Services
delivered8

Types
of users

Impacts
longer term changes

Infrastructure
organisations

84% (75)

86% (75)

81% (71)

75% (57)

Funders

86% (73)

68% (58)

63% (54)

40% (33)

Local, regional and national


differences
Survey respondents from national
infrastructure organisations said they
needed impact data for internal use
more than regional or local infrastructure
organisations. It is not known whether
these national organisations are
understanding impact here as evidence
of changes for frontline clients or simply an
aggregation of infrastructure outcomes.

Across all services, 95 per cent (20) of


national organisations said they needed
impact data, as compared with 65 per
cent (15) of regional and 57 per cent
(54) of local. It must be noted that the
numbers of regional and national bodies
completing this question were low, so
the averages will not be very robust.

Impact was briefly defined in this survey as longer-term changes as opposed to outcomes which are direct changes
you make. However, respondents may not have shared this view.
8
This is averaged across service types. The percentages take into account those not providing the service.
7

24

Outcomes
direct changes made

Reporting on different service types

Survey respondents were asked to report on information required, by themselves or their funders, across the following infrastructure services:
advice, support or consultancy
training or events
resources (written or online, including newsletters)
brokerage (such as providing volunteers)
providing networking or peer learning
lobbying, representation or campaigning
promoting or supporting partnership work.

Across these different infrastructure service types, there were few differences in information required. Both funders and infrastructure organisations were slightly less interested in data on resources; most required information on resource outputs, but were less interested in who they went to and the differences made. There appeared to be almost no difference in the information required on lobbying work as opposed to other services. (For a detailed breakdown of information required, see appendix 2.)

What funders want

Most funder respondents also reported requiring primarily output data, or only having recently begun asking for information on differences made from their funded infrastructure organisations.

Of the six funders interviewed, only two (BIG and OTS) have been requesting outcomes information from infrastructure organisations for any length of time. BIG state that they are explicitly funding for longer term change: changes for communities. However, they report that at the project level, they are primarily looking for infrastructure organisations to report on outcomes in the frontline organisations they support.

Capacitybuilders have just made a public commitment to becoming an outcomes-based funder. The RDA interviewed reported rarely asking for outcomes information from their funded infrastructure organisations; they only started to bring in an outcomes approach to their work a couple of years ago.

The PCT interviewee reported that they had not implemented an outcomes approach but were planning to in the near future. They are very interested in impact but realise that it is hard to demonstrate. They hope to focus primarily on outcomes.

The local authority interviewee reported that for the first time this year they had moved towards an outcomes based monitoring framework from the main infrastructure organisation they fund. They noted that in part the impetus for this was the increased level of partnership working in funding; the infrastructure organisation is jointly funded by the PCT. The respondent noted that an outcomes focus can help clarify expectations in partnership work.

The role of evaluation

Three funders interviewed (the RDA, BIG and Capacitybuilders) noted that some of the work on impact, and to a certain extent outcomes, would be done by their own research or evaluation departments, not through project monitoring. For Capacitybuilders, this included the aggregation of results to bring together project level data into programme wide findings.

One interviewee from a very large national infrastructure organisation also argued that the role of assessing their impact (by CES' definition) was the responsibility of their research and communications departments. Of course, small or medium sized infrastructure organisations would be unlikely to have such specialist roles and may be at a disadvantage here.
The role of trust and good relationships
The relationship between funder and
funded, and informal feedback on
services delivered, were mentioned by
several respondents as important in
demonstrating the differences made by
infrastructure organisations.
Both the local authority funder and the
funder from a PCT said they found the
feedback from frontline organisations,
both directly and via the infrastructure
organisations themselves, an important
measure of the quality of infrastructure
support locally.
Infrastructure interviewees made similar
comments. One stated that they did
not need to evidence the difference
they make with the officers of the local
authorities, as they see and hear about
the benefits by working within the local
community. One talked of their funders
trusting them, another stated that the
strategic relationship they had with their
funders meant they were under less
pressure to evidence their work. It is likely
that this is more often the case with
grants than with contracts.
The PCT funder felt that the decision to fund was often not about evidence:

'There's not much point collecting detailed information if you are just going to carry on giving the grant anyway because you need the service. If it isn't used much one year you'd still want to commission it next year. Why count things that aren't going to affect your commissioning decisions?'
Even where infrastructure organisations
have a good relationship with the local
authority, they are still vulnerable if, for
example, the local authority changes its
leadership and/or contracting processes.

3.3 Methods for collecting data


Follow-up with users
Almost all survey respondents (92%,
97) reported doing some follow-up
activity with their users, of which about
half described this activity as formal.9
Regional and national organisations
were more likely to say they had formal
follow-up systems in place for capturing
differences made.
Twenty of those with formal systems
were using some form of questionnaire,
some after the intervention, some
implemented a few months later. Some
used an annual survey. One medium
sized CVS/Volunteer Centre described the
work they do:

'We have an Outcomes Star which we complete with organisations with whom we are doing capacity building work, to assess how they have progressed over several months. We collect information about representation work in an annual survey. We ask participants to complete evaluation forms after all training and events and we follow up training with a phone call several months later to see what difference it made.'

Six respondents reported doing follow-up interviews with users. One described a new service they were undertaking:

'For a new brokerage service which is targeted at volunteers with specific needs for extra support, there is a formal follow-up process with both the volunteers and the organisations with which they are placed. For the volunteer this process involves ongoing telephone contact over six to 12 months and regular meetings; for organisations follow-up contact is scheduled in as part of the referral and placement support.'

Of those capturing follow-up information informally, 15 said they collected information in an ad hoc way, usually when they met people as part of their work. While collecting information in this way is helpful if appropriately used and reported, it is of course possible that these workers were not recording it systematically enough to be useful.

9 Analysis of the data reveals some differences in interpretation of this term.

4. Challenges of demonstrating difference

A large number of infrastructure organisations in this research reported


that they could be demonstrating their difference better. Key challenges
respondents reported with demonstrating difference were:
lack of shared understanding of
infrastructure
the relevance and usefulness of
information collected for funders
difficulties in collecting data and the IT
needed to store and manipulate it
lack of capacity to collect information
identifying indicators.
It is worth remembering here that the
respondents in this research were all
self-selecting; they were also the ones
we could reach. As such, it is possible
that they are not representative of all
infrastructure organisations, and as a
group already have some interest in
evaluation and related issues.

4.1 How well they demonstrate difference
Level of difficulty
Most survey respondents felt they could
improve the way they collect data on
their services and the differences they
make. Across all service types, about 20
per cent of survey respondents felt they
demonstrated the differences made by
their services very well, about 40 per
cent said they did it adequately, and 40 per cent reported struggling with demonstrating difference.
National and regional organisations in
the survey were less likely to say they
had problems with demonstrating
differences across their services. Specialist
infrastructure organisations, in particular
those serving a specialist constituency,
were slightly more likely to say they
struggled with demonstrating differences
across their services.
This research did not ask respondents
about the quality of the data they collect.
However, Ellis (2008) found that at
least 70 per cent of the 90 funders she
surveyed found the quality of outcomes
data collected for them by VCOs to be
frequently or sometimes disappointing.
Service types
Survey respondents were asked how
well they demonstrated the differences
made by their services. Of the 105
respondents to this question, eight
said they demonstrated very well the
differences made by most or all of their
services, while about 40 per cent of
survey respondents said they struggled
with demonstrating the differences made
by half or more of their services.

Overall, respondents did not find the type of service affected the difficulty in demonstrating difference. A surprising number said that they could demonstrate the difference made by lobbying, representation or campaigning work well or adequately; we had assumed this would be harder for many infrastructure organisations. However, it is not known the extent to which they really are focusing on differences made, as opposed to outputs. Chart 1 below gives a detailed breakdown of how well infrastructure organisations think they are demonstrating difference across different service areas.

Chart 1: Demonstrating difference - how easy it is
[Bar chart showing the percentage of survey respondents reporting, for each service area (brokerage, eg providing volunteers; advice, support or consultancy; training or events; promoting/supporting partnership work; lobbying, representation or campaigning; providing networking/peer learning; resources, written or online, eg newsletters), whether they demonstrate the difference it makes very well, adequately, or struggle to demonstrate it.]

Survey respondents did report being


slightly better at demonstrating the
difference made by brokerage services
(for example providing volunteers) and
slightly worse at demonstrating the
differences made by resource provision.
Several interviewees said they felt that
evaluating voice and representation work
was hard, as was partnership working.
Macmillan (2006) found that within
the overall context of little information
about the effectiveness of infrastructure
services, there were particular gaps.
There was little evidence of effectiveness
in terms of promoting diversity, and little
about the role of linking policy makers
with the VCS.
Some of the community development work done by Action with Communities in Rural England's (ACRE) members
illustrates just how hard some
infrastructure work can be to evaluate. A
community consultation may show that
local people would like more affordable
housing. A rural housing enabler at the
local Rural Community Council (RCC)
may take on this work, and spend
several years lobbying, influencing and
encouraging a wide range of agencies
(such as the local authority, housing
department, housing developers) and
local people to get involved in the
project before the housing gets built. In
this case it can be hard to identify the
actual input by that worker, let alone
their share of the outcome achieved.
The achievements of that worker may
also not be recognised by the other
agencies involved. This work would be
hard even for an evaluator to evaluate;
by comparison, evaluating training or
other capacity building of individual
organisations may be much simpler.


4.2 Where there were challenges
Interview and survey respondents
identified a range of things that made
demonstrating the difference they make
harder:
difficulties with the nature of the
information collected
demonstrating impact
data collection difficulties
Information Technology
lack of capacity
lack of mapping of frontline VCS.
Challenges with the nature of the
information
Relevance of information collected
for funders
Several respondents commented that the information required by funders was not always relevant to them. One explained that 'we often have to let our funders know that we have information that is more pertinent'.
One infrastructure interviewee reported
that their local authority does not share
their understanding of their intended
outcomes. The local authority sees the
infrastructure organisation as providing
a service to them, but the infrastructure
organisation feels it provides a service
to its membership. So the data
collected regularly by the infrastructure
organisation on its outcomes is not
always useful to the local authority.
Two interviewees noted that different
approaches to outcomes could be
problematic. One local infrastructure
organisation reported that their own
outcomes are at a strategic level, which
meant they could not use them for a lottery bid which required project level outcomes. An infrastructure interviewee noted that the information they need internally is more holistic than the project level data they collect for funders.

Demonstrating impact and outcome

Things have improved over the last few years regarding outcome assessment. For example, both NAVCA and ACRE report that their members have got better at monitoring at least some outcomes. However, NAVCA added that although there is considerable good practice, there is variation throughout their membership; outcomes work is still quite new to many local infrastructure organisations. The NAVCA respondent also expressed regret that local infrastructure organisations' own work had not been a primary focus of the National Outcomes Dissemination Programme and National Outcomes Programme, but added that those local infrastructure organisations with Outcomes Champions do understand the situation better.

Aggregating data

Collecting and reporting data at project level can result in information that is hard to aggregate and may be underused. Aggregation was reported as a problem by a few respondents. An infrastructure interviewee described as a weakness in their work the failure to gather the project level information together internally, to get a picture of the overall differences made by the organisation. A survey respondent noted that the mixture of tools and approaches used across teams was also a problem: 'we collect information but everyone uses such different formats that it is not helpful when put together'.

Data collection challenges

Survey respondents identified a range of aspects of data collection they found difficult when demonstrating difference:
finding good ways to collect information (59%, 60)
collecting good quality information (56%, 57)
identifying indicators (47%, 48).

Strikingly, the most problematic area identified by participants was collecting data on impact, which was identified as a difficulty by 77 per cent (73) of respondents. For 31 per cent (32), clarifying intended differences, and agreeing these with frontline organisations, was also a problem. Chart 2 overleaf gives more detail about areas of difficulty faced by infrastructure organisations.

Chart 2: Demonstrating difference - what survey respondents found challenging
[Bar chart showing the percentage of survey respondents identifying each area of difficulty: collecting information on impact/long term change; finding good ways to collect information; collecting good quality information; identifying indicators; clarifying what differences we hope to make; agreeing with frontline organisations what differences we are working towards; inadequate IT systems; agreeing with funders what differences we plan to achieve; lack of staff skills in data collection and/or analysis; using the findings; processing the information; analysing data.]

Several felt that follow up was difficult due to the resource constraints:

'We are going to have to employ someone to do the follow up as it is too time consuming for our advisers to spend time collecting this information instead of providing the advice. One could say that the follow-up could lead to a new potential project and would give satisfaction in closing the loop but in reality it can mean five or six phone calls or emails to try and reach the people.'

For local infrastructure organisations in rural areas, physical access to respondents is harder due to the distances covered.

Qualitative versus quantitative

Several respondents, both in the survey and interviews, said that they felt their work was better captured qualitatively rather than quantitatively, especially around differences made. This may in part reflect a misunderstanding of the role of numbers; it is possible to monitor and quantify much outcomes data.

Information Technology

As with other areas of the VCS, IT is a problem for infrastructure organisations. Ellis (2008) found that, of the 179 infrastructure organisations in her survey, only nine per cent (16) had an IT based system for storing their data; the rest either had paper only or paper and IT systems. Of those using IT, the vast majority were using Microsoft Office packages rather than anything specifically tailored. One infrastructure interviewee in this research commented that they needed help in IT; NAVCA also reported this would be useful to their members.

Lack of capacity

Lack of time and resources was a common problem for survey respondents in demonstrating the difference they make. One explained that:

'It's at the top of our agenda, but we're a small team, with not a lot of funding and we are also facing funding issues in March; all this means that at a time when proving our impact is critical we are finding we don't have the time, resources or know-how to do it. We are trying though... we haven't given up yet!'

For certain subsectors, capacity may be a particular issue. Volunteering England reported that many of their members often only have one member of staff and funding can be very limited. bassac reported that for their members, their precarious existence means that demonstrating difference is hard. They are not adequately resourced and spend much of their time fire fighting.

Lack of mapping of frontline VCS

One interviewee argued strongly that lack of good mapping of the VCS makes it hard for infrastructure organisations to assess the effectiveness of their reach, or the extent to which they are reaching diverse client groups. This may be an issue for some infrastructure organisations; Harker and Burkeman (2007) found that some specialist frontline groups (such as lesbian, gay, bisexual and transgendered organisations) found it particularly hard to access infrastructure support. Macmillan (2006) also found that there was a particular lack of evidence that infrastructure organisations were effective in promoting greater diversity.

4.3 Challenges faced by funders


For most of the funders interviewed,
their own lines of accountability had
little effect on the data they were asking
for from infrastructure organisations.
However, one of the difficulties faced
by the RDA interviewed was fitting the
infrastructure evidence they were getting
through evaluation into what they need
to report upwards. The Department
for Business Enterprise and Regulatory
Reform (BERR) is the central government
department to which RDAs report. BERR
is currently undertaking a national study
into the effectiveness of RDAs, for which
a lot of economic data is required, as well
as a lot of information on counterfactuals
(what would have happened anyway).

Finding evidence from infrastructure work against these indicators is difficult.
It is likely that the funders and
commissioners of infrastructure
organisations, like those of the wider
VCS, are not getting the best out of the
data gathered for them by infrastructure
organisations. Ellis (2008) found that
funders frequently found the reporting
of outcomes inadequate, and often
over-simplistic and subjective. They
found the quality was variable and hard
to aggregate for programme wide
evaluations. The funders she interviewed
further reported that they were getting
more data than they could deal with,
often limited by their own systems (p viii).

5. Tools in use by infrastructure organisations

Knowledge of existing tools for demonstrating effectiveness was high among survey respondents, although it was slightly lower for specialist infrastructure organisations. Many were also using monitoring and evaluation tools (those listed in the survey or others), and there has been a marked increase in the use of economic tools. There was evidence that some infrastructure organisations have to adopt inappropriate tools to meet funding requirements. Inappropriate tools may not be a good fit to their organisational type, or not fit for purpose; many have adopted quality tools where they need a more evaluative approach to demonstrate differences made.

Background to the tools

For simplicity, this report uses the term 'tool' to include a range of tools, approaches and resources.

Survey respondents were given a list of 12 evaluation and quality tools, and asked to say whether they had heard of them or if they were using them. It was decided to include within this list two tools that focus on quality (PQASSO, the Practical Quality Assurance System for Small Organisations, and the ACRE quality standard) as we knew, from previous experience, that these would be mentioned anyway.10 However, although PQASSO and ACRE both have a results focus, they are generic standards without specified outcomes and outcomes areas, and as such do not require information on the extent to which outcomes are achieved. By contrast, the NAVCA Performance Standards, although described as a tool for quality, defines outcomes and outcomes indicators for local infrastructure organisations, against which they are required to produce evidence when applying for the Award.

10 Ellis (2008) has noted that some TSOs make no distinction between evaluation and quality assurance.

5.1 Tools heard of and used

Despite the high level of awareness, almost half the respondents were either using no tools or only using quality tools that do not clarify intended outcomes and indicators. Some respondents were not clear about the different purposes of quality and evaluation tools. While it is not necessary for an organisation to use any particular tools to demonstrate the difference they make, many organisations do find them useful.

Tools heard of

Knowledge of the listed tools was high, with most respondents being aware of some or all of them. While it was not possible to do a full analysis of the comparative findings,11 specialist infrastructure organisations (either by function or constituency) were aware of a smaller range of tools than generalist organisations.

Two survey respondents had either heard of none or only one of the tools. Both of these were arguably unusual organisations, perhaps out of the usual networks of capacity building; one distributes donated goods to other charities, the other works with professional orchestras. It is worth restating that respondents were people who CES could access reasonably easily.

Tools used

Survey respondents reported using a wide range of tools; this can be seen in Chart 3 overleaf. The most common tools used by the survey respondents were:
PQASSO (54%, 63)
CES' approach (37%, 43)
NAVCA's Performance Standards (23%, 27)
NAVCA's Measuring Effectiveness toolkit (18%, 21)
Volunteering England's Volunteering Impact Assessment Toolkit (VIAT) (12%, 14).
Only one organisation reported using PERFORM.

However, these figures hide important differences. Almost half of the survey respondents (44%, 51) reported using none of the stated tools or only using PQASSO (17 of the 51). Of those survey respondents not using any of the listed tools, or only using PQASSO, 22 were not using any other tools or systems.12

Sixteen of those not using any of the listed tools reported using their own internal systems, although these appeared to vary considerably in their sophistication. Many reported using other quality systems,13 indicating a common confusion as to the function of quality and evaluation tools. Non-listed tools mentioned more than once were:
Investors In People (11)
Matrix (6)
Volunteering England Quality Award (4)
ISO 9000/1 (3)
Rickter Scale (2)
Distance Travelled (2)
Community Legal Services Quality Mark (2).

It is not essential for an organisation to be using specific tools or approaches to demonstrate the difference it makes. However, some of those not using any of the listed tools indicated that they did not have alternatives that were working well. Those not using the listed tools gave the following reasons:
the tools are not appropriate (5)
confusion about what was appropriate (3)
lack of capacity (4)
lack of internal will to do so (1).

11 This is due to the limitations of the survey tool we used when analysing qualitative data.
12 Or did not know whether their organisation was using them.
13 These figures are consistent with a recent survey of NAVCA members that found that 80% had achieved at least one quality standard, and a further 10% were working towards one (Escadale 2008).

Chart 3: Use and awareness of tools among survey respondents
[Bar chart showing, for each tool (PQASSO; CES' approach; NAVCA's quality standards; NAVCA's Measuring Effectiveness toolkit; Volunteering Impact Assessment Toolkit; SROI, social return on investment; social accounting and audit; Economic Outcomes Tool; ACRE quality standard; bassac's Change Check; nef's impact mapping; PERFORM), the percentage of survey respondents saying 'I haven't heard of this', 'I have heard of this, but we don't use it', or 'We use this in our organisation to help demonstrate the difference we make'.]

Use of multiple data collection tools

Use of multiple tools is quite common. Of the survey respondents, 27 per cent (31) were using between three and six of the listed tools.14

To some extent the use of several tools makes sense; several tools fit together and support each other. For example, NAVCA reported in an interview for Ellis' 2008 research that it was very obvious that organisations working with outcomes found getting the NAVCA Quality Award much easier. That CES has had involvement in developing many of the tools means they have the CES approach in common and some are compatible.

14 The true figure of those using multiple tools is likely to be higher. The survey tool we used does not allow easy analysis of the qualitative data alongside the quantitative. So it has not been possible to analyse in depth the number using both listed and non listed tools.

Inappropriate use of tools

There was some evidence of infrastructure organisations feeling forced to adopt tools and systems (mostly quality systems) that were not appropriate to them, in order to get funding. The tools in question were all quality tools, which may reflect confusion in the sector and its funders about the difference between quality and evaluation.

One respondent felt that the imposition of inappropriate tools was particularly the case with commissioned contracts. One infrastructure interviewee reported on their situation:

'We have got a plethora of [quality systems], because different funders have required us to have different things. So Yorkshire Forward require us to have Customer First because we were providing business support and we had to have the Volunteering England quality accreditation for the volunteer centre, then we did the NAVCA one, then we found out we had to have ISO 9000, and something has to give. We have to make a decision as to which ones add most value in terms of achieving funding.'

The RDA interviewed reported that they use the Customer First quality system, which is specifically for organisations giving business support. They realise it does not fit infrastructure VCS very well, but feel that it is such a small part of their work they do not want to adopt a new system. However, a researcher involved with a CVS using Customer First reflected that it led to the CVS focusing too much on customer satisfaction data, and not on outcomes.

A respondent from ACRE explained that their standards are just for their (38) ACRE members. Because of this, a lot of funders would not have heard of them, which was problematic:

'More and more funders are asking for ISO 9000 which is not appropriate for the kind of organisations we work with, but because they don't know the different standards they are asking for systems they recognise. What [our members] report to us is that we can pay for an ISO consultant to put all the required systems in place, we hand over the £3,000 or £4,000 and we have got ISO 9000. But that does not make us a good quality organisation, that is buying something off the peg; if you have got the money you can buy it, and to our members that is not what performance improvement is about. It's not having the desired effect [of] making [an] organisation think about what it does and review how to make improvements and embed those improvements.'

5.2 The growth of economic tools

One interesting development is the growth in the use of economic tools, in particular social return on investment (SROI) and Defra's Economic Outcomes Tool. This is surprising in many ways, not least given that SROI UK has found that SROI lends itself less well to the work of infrastructure organisations.15 Although it may be hard to monetise some infrastructure work, certain infrastructure outcomes would be relatively simply and usefully monetised, like increased volunteer hours or funding gained.
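To illustrate what such monetisation might involve, consider a purely hypothetical sketch (the figures below are illustrative assumptions, not findings from this research). Suppose a volunteer centre's brokerage work generates 1,000 additional volunteer hours in a year, valued at a proxy wage of £6 per hour. An SROI-style calculation would discount the gross figure for deadweight (hours that would have been volunteered anyway, say 25 per cent) and for attribution to other agencies (say 20 per cent):

gross value = 1,000 hours x £6 = £6,000
value net of deadweight = £6,000 x (1 - 0.25) = £4,500
value attributable to the service = £4,500 x (1 - 0.20) = £3,600

Set against the cost of delivering the service, this is the kind of figure economic tools aim to produce; the difficulty for much infrastructure work lies in finding credible proxy values and discount percentages.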

SROI

In Ellis' research in 2008, for which data was collected primarily in 2007, no respondents said they were using SROI.16 In this study, nine of the survey respondents mentioned it. There was no obvious difference in the type of organisations using SROI and those not; they represented a range of organisational types and sizes.17 Another survey respondent, the London Community Recycling Network, reported that they had supported some network members in implementing SROI.

Economic Outcomes Tool (developed for Rural Community Councils by Defra)

This tool was developed in 2006 (Moseley et al). While an in-depth study of it was not possible within the timeframe of the research, it appears to be an adaptation of an SROI approach. Users are asked to clarify their outputs and assess their economic outcomes in key areas. Some attempt is made to deal with potential bias by addressing attribution, deadweight and displacement. The model then requires external validation.

In the survey, 22 organisations had heard of the Economic Outcomes Tool, and a further four were using it; of these, three were known to be RCCs.

15 Jeremy Nicholls, personal communication, November 2008.
16 They were not specifically asked about it as a tool, but none listed it when asked what tools they were using.
17 All were second (not third) tier. They comprised six local organisations, one regional and two national; three specialist and six generalist; three CVS, one VCS network and one social enterprise network.

6. Good practice

6.1 Case examples of projects with wider applicability

Voluntary Action Rotherham (VAR)18

VAR's Policy and Performance team have developed an outcomes framework, with indicators, which is available on their website;19 data collected against these indicators since 2006 and trends over the last few years are also available. It collects data on both capacity building and some voice and representation work. It could have wider applicability beyond VAR.

The framework was developed over six months, with a lot of stakeholder involvement. The VAR staff member who developed the outcomes framework had CES training20 and had 'critical friend' input from Sheffield Hallam University. VAR used PERFORM for the high level outcomes, added some of the lower level ones from the NAVCA Performance Standards, then added a few local ones of their own. They report that the two systems fit well together.

VAR collect data against the indicators using an annual member survey, combined with regular customer satisfaction surveys and a contact management database. The data collected goes to the senior management and then to teams, where it is used to improve services. The Board receive the data annually.

The framework is being used by one or two other organisations, and was looked at by the South Yorkshire ChangeUp consortium (see below) when developing their work.

VAR had a dedicated research officer in post from 2002 to 2008. This worker reported that the combination of having this internal resource and some people in strategic positions who recognised early on the importance of outcomes meant that VAR is relatively advanced in its work on outcomes. The research officer post is now redundant, although it has clearly left a legacy. It is as yet unclear what effect this lack of a dedicated resource may have on VAR's performance management.

18 This organisation was also a case study in Ellis 2008.
19 http://www.varotherham.org.uk/index.php?option=com_docman&task=doc_view&gid=280&Itemid=144
20 Advanced course in monitoring and evaluation.

Suffolk ACRE's Performance Management System

Suffolk ACRE has developed an IT system for One Voice Suffolk, an infrastructure consortium of 16 Suffolk agencies, using Capacitybuilders funding. This bespoke project monitoring system can be used by infrastructure organisations, although its applicability is wider. It is effectively a project management system, recording project aims and objectives, along with contacts, partner agencies, income, outputs, milestones and targets. It does not record outcomes data, but the system does allow users to input anticipated outcomes and plans for assessing them. It is not populated with indicators; each organisation inserts their own.

It has been sold to nine other RCCs, two regional consortia and Suffolk County Council. National ACRE is planning to use it to manage the Defra funding they distribute to RCCs. Using the software, RCCs will be able to report directly to ACRE on that funding.

Voluntary Action Sheffield (VAS)

Voluntary Action Sheffield received Capacitybuilders funding to draft a framework for the South Yorkshire ChangeUp Consortium, to help its members demonstrate effectiveness. Developed in 2008 by Communities and Organisations: Growth and Support (COGS), it is based on PERFORM and describes five high level outcomes, broken down into 25 intended outcomes, linked to indicators. The VAS framework would also have applicability beyond the consortium.

VAS are currently working on the methods for collecting information. They hope it will provide evidence across South Yorkshire about the effectiveness of infrastructure. VAS are working with all their member organisations to help them adopt it, and hope it will be in practice in about a year. They have had some interest from other consortia, but have not yet decided how it will be shared in future.

Both VAR and VAS are in South Yorkshire; it is not clear why there has been so much activity in the area. However, the local RDA (Yorkshire Forward) reports it has put £10 million into the area over several years, and this investment accounts for 90 per cent of the VCS infrastructure they fund overall. The interviewee from VAR felt that some of the local activity was an indirect effect of the South Yorkshire Quality Project. Based in VAS, this long-standing project offers training, mentoring and support to organisations in the implementation of appropriate quality assurance systems as a tool for organisational development.

6.2 Two to watch: work in progress

BVSC

BVSC is in year two of a three-year Baring Foundation funded project. It aims to look at how BVSC identifies and measures its impact. The project also aims to develop a framework that other organisations will be able to use.
The process undertaken started with the long term impacts they want to bring about, then worked back to the services that might achieve them. They report that breaking down the work in this way has helped each team to demonstrate how they contribute to the wider work of the organisation. That has had a direct positive effect on staff motivation, especially for the team working within the internal corporate services (such as finance and administration).


St Helens CVS

Nuts and bolts, a four-year project run by St Helens CVS, is in its second year. It
is funded by a four-year Basis grant from
BIG, and is looking at ways to improve the
effectiveness of infrastructure work across
Merseyside. One early example of an
outcome of this work is the centralisation
of client information across local
infrastructure organisations in Merseyside.
It includes a longitudinal study into the
effectiveness of infrastructure services
across Merseyside, with comparison
group, carried out by Liverpool University.

7. What would help: respondents' perspectives

While a few of the respondents in this research did not want any help in demonstrating the differences made by their work, the majority outlined a wide range of ways in which CES or similar organisations might be able to help infrastructure organisations better demonstrate the difference they make.

Respondents in both survey and interviews were asked what would help them better demonstrate the difference they make. A small number of respondents said they did not need any further help. The majority had suggestions, including:
changes to their organisation or its funding
training, consultancy and skills
streamlining, developing or mapping existing tools
developing a common framework for infrastructure work
developing a theory of change for infrastructure work
learning from others
working with funders and commissioners.
These options are described in more detail below.

7.1 Changes to their organisation or its funding

Survey respondents identified a range of things about their organisations or their funding that would help them better demonstrate the difference they are making. In common with other research findings (such as Ellis, 2008, and Heady and Keen, 2008), the biggest issue was time (69%, 70); one commented that 'we're so busy doing the work there's no time to evaluate it!'

Most respondents (68%, 69) said that more funding for monitoring and evaluation would be helpful. Ellis (2008) also found that, of the 179 infrastructure organisations in her survey, 26 per cent (44) did not get any funding for monitoring and evaluation from their funders, and only 11 per cent (14) reported getting all the costs of their monitoring and evaluation covered by funders.

Ellis (2008) found that only 17 per cent (30) of the infrastructure organisations in her survey had a specialist monitoring and evaluation post. For several respondents in this survey, a dedicated monitoring and evaluation worker would be helpful. One explained that:

'It's in everyone's contract and work plans but that means it isn't as systematic as it needs to be. [There are] some benefits in doing this but a more dedicated resource would be helpful.'

Respondents also felt they would


benefit from:
more appropriate reporting
requirements from their funders
(45%, 46)
making sure they only collect what
they need (30%, 31)
a greater commitment to
demonstrating difference, as an
organisation (28%, 29).
Two interview respondents said that they
would like more standardised reporting
to funders; several others described
difficulties with reporting in different
ways to multiple funders.

7.2 Training, consultancy and skills
Most survey respondents said that further
training would be helpful. Training
in impact (long term changes)21 was
requested by 57 per cent (58) of survey
respondents. Respondents also felt
other training might be helpful. They
mentioned training in:
data analysis (39%, 39)
data collection (36%, 36)
presenting and using
findings (32%, 32)
outcomes (30%, 30)
general monitoring and evaluation
(28%, 28).
A fifth (20%, 20) of survey respondents did not want further training. Three of these explained that they already had the skills:

'There is a fallacy among the great and the good that we in the community are helpless, uneducated and unskilled. In many cases, however, we who work at ground level are far better educated, experienced and equipped (though inadequately resourced) to deal with the needs of our communities than those who are making and implementing policy.'
One interviewee pointed out that, for them, the issue was not about how much data they collected, but the skills and time needed to be able to use it: 'it's the difference between data and information'. The NAVCA respondent suggested that short sessions for chief officers on outcomes might be helpful, and they are considering running some such sessions themselves.

This research was unable to focus on what organisations need to better communicate their achievements, although some respondents touched on this. One commented that infrastructure organisations might benefit from increased skills and confidence in presenting evidence of their achievements.

Many respondents (37%, 37) also mentioned the need for external support in the form of advice or consultancy, although not all were clear where to find it.

21 We cannot be sure that all respondents interpreted 'impact' in the same way we would at CES; however it was clear from the qualitative responses that at least some did.

7.3 A common framework

There were a number of requests for a common framework for infrastructure; three survey respondents specifically commented on this (it was not a tick-box option) and eight interviewees also felt this would be beneficial. It was clear there were slightly different interpretations of 'common framework'; some wanted a prescriptive evaluation framework, others just wanted some shared measures, and others talked of a shared methodology.

Four infrastructure interviewees said they would like to be able to benchmark their work against the achievements of others. Other respondents had concerns about the quality of some infrastructure work and felt that benchmarking would be a way to improve practice.

The NAO interviewee reflected that forming a collective voice might help infrastructure explain what it does. One infrastructure interviewee also argued that a common framework would have narrative value, to help infrastructure organisations explain why infrastructure exists to other stakeholders. They added that:

'It would be beneficial to have a common narrative framework against which [infrastructure organisations] can report and compare themselves. We'd like to be part of something like that, it would sharpen our practice, and I think that any serious infrastructure organisation would want to be. It might be scary but ... we get a lot of money and we ought to deserve it... infrastructure won't get better until it asks itself hard questions relative to performance.'

Frameworks for funders

Infrastructure specific frameworks might help funders too. The RDA interviewed noted that infrastructure organisations represent a very small subsection of the VCS work they fund, so their evaluation systems are not always well suited to their needs. That infrastructure is such a small part of their work may mean they cannot justify a separate evaluation framework for it.

By contrast, being a funder only of infrastructure, Capacitybuilders can afford to have specialised evaluation frameworks. They have different tiers of monitoring intervention and outcomes depending on the grants programme. These range from very high level outcomes to some very low level project outcomes.

7.4 A theory of change

Several respondents pointed to the lack of a theory of change for infrastructure work, and similarly Ellis (2008) found that it is difficult for VCOs generally to link activities to the changes achieved. Other evaluators like Connolly and York (2002) have noted the usefulness of such an approach; a logic model can 'help bring order to [evaluation] questions and articulate the underlying assumptions of capacity-building efforts' (p37).

One respondent felt that there is a lot of basic early work to be done to help infrastructure organisations articulate a simple theory of change, as not many are good at this. There might need to be a series of theories of change, which are subsector specific.

A good theory of change would be helpful to funders. One interviewee felt that funders may see the work done by frontline organisations, but not understand the infrastructure support behind it. The RDA interviewed commented that:

'If every organisation could have a reasonably coherent logic model and collect evidence against outputs and outcomes and impacts it would help us a lot; that's the only way you can put together a coherent picture of what organisations are hoping to achieve. It's a framework that can be applied in any situation.'

An explicit theory of change might also help communication with the beneficiaries of infrastructure work, by clarifying what changes would be expected to follow from support. One respondent noted that:

'Organisations may not associate changes made with advice received, especially if the survey is either too soon after advice (no changes made yet) or too long [after] (changes made but no longer recognise why).'
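To make the idea concrete, a purely illustrative chain (hypothetical, not drawn from this research) for a local infrastructure organisation might run: activity - a CVS runs funding advice surgeries; output - 40 frontline groups receive one-to-one advice in a year; outcome - some of those groups submit stronger funding bids and secure new grants; impact - more stable services for local beneficiaries. A theory of change makes each link in such a chain, and the assumptions behind it, explicit enough to be tested.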

7.5 Work on tools


More tools?
There are few tools specifically for
infrastructure organisations, and those
are often for specific membership
organisations. Using the CES definition
of impact, there are no tools that focus
specifically on assessing the impact of
infrastructure organisations. This may
be in part because there is no good
theory of change for the impact of
infrastructure work.
However, the majority of respondents did not feel there was a gap in the tools or approaches available to infrastructure organisations; there was a strong sense of 'tool fatigue'. One felt that

'the tools and approaches are there, what is lacking [are] the skills and confidence to use them and adapt them to meet individual needs.'
A few felt there were too many options and one described the choice as 'overwhelming'. One commented that

'the proliferation of tools isn't good, especially if different funders demand you use different tools.'

A few respondents identified gaps, six mentioning the need for a tool or tools that better assess changes made. Five felt that nothing quite worked for infrastructure, with one explaining that

'it is tricky to use the available tools to measure the effects of infrastructure organisations. We find a combination of approaches, and some inventiveness, is required.'

Several respondents felt that despite the plethora of tools, they had not yet found the silver bullet. One survey respondent summed up this ambivalence:

'I understand that there are now over 200 impact measurement tools in the sector... we haven't yet found one that really adds value to our evaluation.'
Mapping tools

A number of respondents, in the survey and interviews, felt that a map of the overwhelming range of tools would be helpful: a brief introduction to each tool, mapped one against another. This map could also give advice as to how useful each tool would be in demonstrating difference.

7.6 Learning from others


Peer learning
A desire for peer learning was expressed
by about half of the respondents
in the research. Almost half of the
survey respondents (42%, 42) said that networking or collective learning would be helpful; one explained that 'it's always really useful to see the measures, indicators and tools other organisations are using'. Six interviewees also said this would be helpful.
A third wanted peer support or mentors
(33%, 33); with a couple adding that
regionally based support would be useful.

According to one survey respondent, regional support may be provided soon within the CVS network.

Case study examples

Examples of where infrastructure organisations have demonstrated their difference well were asked for by a few of the interviewees; one wanted 'real examples of how others have demonstrated difference, a rounded story of the nuts and bolts stuff'.

However, it is likely that there are numerous good examples already in existence, beyond those identified in this research. A respondent at NAVCA, interviewed for Ellis' 2008 research, noted that they lacked the resources to really share the good work done by some of their members.

A subsectoral approach may be useful to generate new case studies. The ACRE interviewee reported that it would be helpful if someone could work with a small number of rural community councils to identify simple ways for them to overcome some of the difficulties they face.

7.7 Work with funders and commissioners

Several respondents in the survey and interviews felt that some funders and commissioners could benefit from increased understanding of how to demonstrate difference, the difficulties involved and the tools already in place. One survey respondent put this eloquently:

'There is a debate about whether any of the [tools listed in the survey] are well enough recognised by funders to make a significant difference externally to an LIO. For tendering, contracting and commissioning the perceived appropriate standard is ISO 9000 but the suitability and relevance of this to TSOs in general is not well explored. Commissioning and procurement officers do not appear to have sufficient information to recognise other systems more suitable to TSOs and more widely used in the third sector.'

If funders were clearer about how to demonstrate the differences made by infrastructure, and about the range of tools out there, they would also be well placed to offer support to those they give money to. Ellis (2008) describes how effective support from funders to VCOs can be in helping them demonstrate the difference they make, although this support is not given perhaps as often as it could be.

This work with funders could be done, at least in part, by infrastructure organisations themselves. Four infrastructure interviewees reported working with their funders, either formally or informally, to help the funders develop a more outcomes based approach.

8. Recommendations

Survey and interview respondents have identified a range of ways


infrastructure organisations might be better supported in demonstrating the
difference they make. This section discusses some of these options and
suggests how they might be taken forward.
8.1 Direct support to infrastructure organisations
Training
Respondents indicated a clear desire for
more training. However, as Ellis (2008)
found, more specialised training on
impact and aspects of data collection
and management was most needed. In
particular it would be useful if training
could cover the following ideas on
impact:
The need to be realistic
Infrastructure organisations may be
overstretching themselves in their
desire to collect impact as opposed to
outcomes information. There is a need
for well-informed dialogue between
infrastructure organisations and their
funders and other stakeholders as to
whether, and if so why, it is worthwhile
to attempt to capture and report on
impacts instead of outcomes.
Increasing understanding of impact
Infrastructure organisations would
benefit from a better understanding
of the differences between impact
and outcome, the issues involved in
assessing them, and when measuring
impact is appropriate. Making
distinctions about levels of change,
and where change is seen, would be
sufficient.


How to collect impact data


Where it is appropriate for
infrastructure organisations to seek
impact data, it might be helpful to
show infrastructure organisations
how to get information on impact
from frontline organisations without
relying on their data, which can be
problematic.
It would also be useful if training could
bring in the following aspects:
A focus on process
Some of the more complex work
done by infrastructure organisations
arguably does not respond so well to
an outcomes approach, and might be
evaluated using a focus on process.
Process evaluation often involves a
detailed investigation into the way in
which services are delivered, how and
under what circumstances, and to
whom. Organisations choosing not to
access the services of the infrastructure
organisation and ex-users would also
be relevant, as would service quality,
what is working well and what could
be improved.
Developing their theories of change
Infrastructure organisations would benefit from support in developing a theory of change for their own organisation, one that clarifies, at the minimum, the link between activities, outputs and intended outcomes and impacts.

Collecting data in useful ways
Training for infrastructure organisations could include discussion of how to collect project level data so that it can be more easily aggregated to give a picture of difference made by the whole organisation.

Links to national and local indicators
Some infrastructure organisations may need to provide evidence against national or local indicators, and it would be useful for them to see how to pull these requirements into their evaluation frameworks.

Supporting peer learning and mentoring

Infrastructure organisations would benefit from opportunities to meet each other and share learning and good practice in demonstrating difference. As a starting point, it might be useful to get together some of those leading the way in this work.

8.2 Developing resources

It is not suggested that the sector develop more tools for infrastructure organisations, although the possibility in future of resources for impact assessment should be kept open. However, there are four resources that would be of use to infrastructure organisations that could be developed. They are:
theories of change for infrastructure work
a bank of outcomes and indicators
a map of existing tools
case studies.

A theory of change for infrastructure work

Infrastructure organisations would benefit from a theory of change for their work. It is likely that there would need to be several subsector specific theories of change, and these could be presented as a range of examples. These would:
help infrastructure organisations clarify what they do, to themselves, their funders and other stakeholders
clarify how change happens
make monitoring more effective
be useful in cases where it is hard to demonstrate outcomes; infrastructure organisations could monitor identified processes or intermediate outcomes instead.

A bank of outcomes and indicators

A prescriptive framework for all infrastructure is unlikely to be either possible or desirable, but a bank of core outcomes and indicators might be helpful. It would be possible to create a list of outcomes, some of which at least would have relevance to most infrastructure organisations.

There has been a lot of good work in this area (the NAVCA Quality Award, VIAT and PERFORM, along with the more recent work by VAR and VAS) from which it would be possible to draw many outcomes and indicators for such a bank. Macmillan (2008) also identifies potential outcomes for infrastructure. Some additional work might need to be done to better capture the work of more specialist infrastructure organisations and those with a regional or national focus. It is important that this work complements other work being done on indicator banks in the sector.
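As a purely illustrative sketch of the kind of entry such a bank might contain (hypothetical, not drawn from this research): outcome - frontline organisations have improved fundraising skills; possible indicators - the number of supported organisations reporting increased confidence in preparing bids, and the number of successful bids submitted within 12 months of receiving support.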

A map of existing tools

There are many tools available for use by infrastructure organisations in demonstrating the difference they make, but many organisations are unclear as to their purpose, how they fit together and how to choose the right tool for them. A map of existing tools, focused on the needs of infrastructure organisations, would help them navigate the options. Some funders may also find this useful.
This map should clarify the functional
difference between quality and
evaluation tools, so that infrastructure
organisations are not relying on quality
tools to enable them to demonstrate the
difference they make.
Case studies
Infrastructure organisations would find
it helpful to be able to access a range
of case studies of other infrastructure
organisations who demonstrate the
difference they make well. It would be
useful to draw together a range of these
including those outlined in this study,
combined with some exemplars supplied
by third tier organisations such as NAVCA.
These could be made available through
training and online.
The need for adequate funding for
developing resources
There is little point developing new
resources without adequate funding
to support them. Some large initiatives
have, in the past, lacked resources
to adequately disseminate the tools,
support their users and crucially carry
out follow-up to evaluate their efficacy in practice. If such a resource is to be developed, there must be adequate resources for ongoing promotion, support and evaluation.

8.3 Influencing the debate


Work with funders and
commissioners
It would be beneficial to consider working
to inform funders and commissioners
about:
the difference between impact and
outcome and why this is difficult for
infrastructure organisations
when it is appropriate to seek impact
data
the limitations of an outcomes
approach and the role of process in
evaluating complex initiatives
the range and usage of evaluation and
quality tools in the sector (this might
be achieved by the mapping of tools,
above).
Work with other research and
evaluation bodies
If some infrastructure organisations are
to focus on outcomes instead of impact,
or indeed process instead of outcomes,
it would be helpful to have research to
support this. Ellis (2008) has pointed to the need for funded research evidence 'to establish the predictive links between preventative or intermediate and higher-level outcomes. Once the link has been shown, the TSO can produce data on intermediate outcomes [or indeed process], pointing to research evidence, and the probability that the final outcome they want will occur' (p45).

8.4 Further research

Several areas for further research include:
• looking at the needs of infrastructure organisations in communicating and reporting on their achievements and the differences they make
• examining the extent to which evidence of difference made plays a part in the awarding of commissioning contracts
• the use of economic tools by infrastructure organisations.

There may also be a need for better mapping of the VCS, which infrastructure organisations could then use to benchmark their own work and provide evidence of their own effectiveness in reaching a diverse audience.

Bibliography

1. Conservative Party (2008) A Stronger Society: Voluntary Action in the 21st Century. Responsibility Agenda Policy Green Paper No 5.
2. COGS (2005) PERFORM: Case Study Findings from the PERFORM Pilot Organisations.
3. Connolly P and York P (2002) 'Evaluating capacity-building efforts for nonprofit organizations', OD Practitioner, Vol 34, No 4.
4. Ellis J and Gregory T (2008) Accountability and Learning: Developing Monitoring and Evaluation in the Third Sector. CES.
5. Ellis J and Latif S (2006) Capacity Building Black and Minority Ethnic Voluntary and Community Organisations: An Evaluation of CEMVO's London Pilot Programme. Joseph Rowntree Foundation.
6. Escadale W (2008) Infrastructure for the Local Third Sector. NAVCA.
7. Harker A and Burkeman S (2007) Building Blocks: Developing Second-Tier Support for Frontline Groups. City Parochial Foundation.
8. Heady L and Keen S (2008) Turning the Tables in England: Putting English Charities in Control of Reporting. New Philanthropy Capital.
9. Home Office (2004) ChangeUp: Capacity Building and Infrastructure Framework for the Voluntary and Community Sector, executive summary. Home Office, p17.
10. Howells (2008) 'Infrastructure Bodies and Outcomes Policy', NAVCA Circulation, Dec 2008/Jan 2009.
11. Macmillan R (2008) A Shape of Things to Come: Reviewing County Durham's Voluntary and Community Sector Infrastructure. Report for the County Durham One Voice Network, May 2008, pp46-47.
12. Macmillan R with Batty E, Goudie R, Morgan G and Pearson S (2007) Building Effective Local VCS Infrastructure: The Characteristics of Successful Support for the Local Voluntary and Community Sector. CRESR.
13. Macmillan R (2006) A Rapid Evidence Assessment of the Benefits of the Voluntary and Community Sector Infrastructure. Centre for Regional Economic and Social Research, Sheffield Hallam University.
14. Mistry S (2007) How Does One Voluntary Organisation Engage With Multiple Stakeholder Views of Effectiveness? Voluntary Sector Working Paper 7, The Centre for Civil Society, London School of Economics and Political Science.
15. Moseley M et al (2006) Rural Community Value: Assessing the Impact of the Work of Rural Community Councils. DEFRA.
16. Needham J and Barclay J (2004) Voluntary Sector Infrastructure Organisations, the Availability of Funds in London Boroughs; Infrastructure for Black and Minority Ethnic Organisations in London and Mapping Voluntary and Community Sector Networks. Government Office for London.
17. Office of the Third Sector (2008) Briefing for Local Strategic Partnerships. NI 7: Environment for a Thriving Third Sector. Cabinet Office.
18. Wing (2004) 'Assessing the Effectiveness of Capacity-Building Initiatives: Seven Issues for the Field', Nonprofit and Voluntary Sector Quarterly, 33.

Appendix 1: About this report

The research questions

1. What do infrastructure organisations need to demonstrate regarding their effectiveness, both for organisational development and for accountability requirements? Do different types of infrastructure organisations (national, local, generalist and specialist) have different needs? How might these requirements change in the future?
2. Are the expectations of stakeholders relevant and appropriate?
3. What can we learn from recent initiatives offering support to infrastructure organisations in demonstrating their effectiveness?
4. What support does the range of infrastructure organisations need to be able to meet their reporting requirements?
5. What support and tools are available of relevance to this sector? What is their nature and usage?

Data collection

Data for this research was collected in the following ways:
• initial desk research
• scoping interviews with a small number of key people within the sector to clarify the remit of the research
• an online survey of organisations defining themselves as infrastructure organisations
• nineteen interviews with infrastructure organisations at all levels, including the providers of a range of tools
• nine interviews with key funders and policy makers.

Understanding the data

When interpreting the data, it is important to understand that all respondents were self-selecting; they have a particular interest in evaluation and related topics. They were also the people we had access to, who by definition are often those more engaged with evaluation issues. Those with the capacity to respond may of course be those more able to demonstrate the difference they make. Therefore the extent to which respondents can be said to be representative of infrastructure generally is not known.22

Not every respondent gave responses to every question. Unless otherwise stated, the percentages given are always a percentage of those who answered that particular question, not of the whole respondent group. To make this clearer, absolute figures are also given with each percentage.

Where it is necessary to give the response rate for a particular question, it is written in parentheses, for example, (176). This means that 176 people answered that particular question and the percentage is based on this figure.

Few differences were found between answers and organisational type. Where these were identified, they have been noted in the text.

Interview respondents

Twenty-eight interviews were carried out, primarily by phone but a few face to face. Nineteen of these interviews were with infrastructure organisations, several of which are partly or fully third tier. Infrastructure interviewees represented a wide range of local, national, large, small, specialist and generic organisations.

Six funders were interviewed. They included representatives from the Office of the Third Sector (OTS), a Regional Development Agency (RDA), the Big Lottery Fund (BIG), Capacitybuilders, a primary care trust (PCT) and a local authority from a different area. Despite attempts, it was not possible to interview a charitable trust. The National Audit Office (NAO), an academic and an independent consultant with experience in this area were also interviewed.

Interviewees were chosen to represent a range of funders and infrastructure organisations. Individual contacts were identified through a range of sources:
1. from previous Charities Evaluation Services (CES) research, where survey respondents had been identified as having particularly relevant things to say
2. from CES's knowledge of key players in this area
3. using a snowballing technique, where early interviewees recommended subsequent ones as key informants.

Survey respondents

119 people completed all or a substantive part of the survey. To ensure robustness of the data, 27 of the 146 people who logged onto the survey were excluded. For a few, this was because we were not convinced their organisation was fulfilling an infrastructure function; the rest were excluded because they had not completed the survey beyond giving brief details about their organisation.

Of those who completed the survey:
• Most were local (56%, 66), but there was good representation of regional (24%, 29)23 and national (20%, 24)24 organisations.
• Most (57%, 67) described themselves as both rural and urban; 32 per cent (38) as mostly urban and 11 per cent (13) as mostly rural.
• The majority had 15 staff or fewer: 29 per cent (34) had one to five paid staff and 32 per cent (38) had six to 15 paid staff.
• The majority (69%, 82) provided generalist infrastructure support to any community groups in their local area; of these, at least 50 were CVS.
• Almost a third (29%, 35) provided services to specific types of organisations (for example, BAMER organisations or housing organisations only).
• Just over a fifth (22%, 26) specialised in a limited number of specific services, such as training only.
• They had mixed funding, with most (64%, 76) having local authority funding. BIG, Capacitybuilders, central government, primary care trusts, trusts and foundations and earned income also accounted for a significant amount of their funding.
• Commissioned contracts25 were common; for 38 per cent (44) of respondents they accounted for over a quarter of their income, and for 15 per cent over half of their income.

22 We did not come across a good mapping of all infrastructure organisations against which we could compare characteristics.
23 This includes those survey respondents ticking 'other' who had a sub-regional focus.
24 This includes one respondent whose organisation covers England and Wales.
25 Note that we cannot be certain that the respondents all shared the same understanding of 'commissioned contracts'.

Appendix 2: Funders' and infrastructure organisations' information requirements

The table shows the number (and percentage) of survey respondents requiring each type of information, for internal purposes ('Internal') and for funders ('Funders'), for each service area they deliver.

Number requiring information on:

Service area                      Services delivered  Who you work with   Outcomes (direct    Impacts (longer-
                                                                          changes you make)   term changes)
                                  Internal  Funders   Internal  Funders   Internal  Funders   Internal  Funders

Training or events                87 (90%)  91 (90%)  87 (90%)  70 (69%)  86 (89%)  70 (69%)  68 (70%)  43 (43%)
Advice, support or consultancy    85 (84%)  85 (84%)  88 (87%)  76 (75%)  85 (84%)  72 (71%)  65 (64%)  40 (40%)
Resources (incl written or
online, eg newsletters)           84 (88%)  83 (90%)  76 (79%)  48 (52%)  66 (69%)  43 (47%)  56 (58%)  26 (28%)
Brokerage (eg providing
volunteers)                       49 (89%)  50 (94%)  50 (91%)  39 (74%)  48 (87%)  38 (72%)  36 (65%)  24 (45%)
Providing networking /
peer learning                     78 (80%)  70 (83%)  74 (84%)  56 (67%)  70 (80%)  54 (64%)  54 (61%)  32 (38%)
Lobbying, representation
or campaigning                    63 (80%)  58 (77%)  64 (81%)  47 (63%)  62 (78%)  45 (60%)  57 (72%)  30 (40%)
Promoting / supporting
partnership work                  80 (77%)  76 (84%)  86 (90%)  69 (77%)  80 (83%)  55 (61%)  66 (69%)  38 (43%)
