
Monitoring and evaluation in the third sector

Findings of a survey of third sector organisations carried out by Charities Evaluation Services in May and June 2007

Dr Jean Ellis
Lead Consultant
Charities Evaluation Services
July 2007

Contents

Introduction

1. Who completed the survey?
   1.1. Responses
   1.2. Funding (n=679)
   1.3. Organisational characteristics (n=510)
   1.4. Location and geographical remit (n=509)

2. What sort of reporting is being done?
   2.1. Internal and external evaluations (n=674)
   2.2. Funders' requirements (n=598)
   2.3. Outputs, outcomes and impacts (n=604)

3. What monitoring and evaluation approaches and methods are used?
   3.1. Methodologies (n=571)
   3.2. Involvement of users (n=565)
   3.3. Evaluation approaches (n=555)

4. Monitoring and evaluation capacity and resources
   4.1. Amount of monitoring and evaluation (n=571)
   4.2. Funding for monitoring and evaluation (n=461)
   4.3. Who is doing it? (n=533)
   4.4. Getting information about monitoring and evaluation (n=526)
   4.5. Training for monitoring and evaluation (n=526)
   4.6. Data management systems (n=525)

5. Benefits of monitoring and evaluation

6. Difficulties in carrying out monitoring and evaluation (n=500)

7. Qualitative responses

Introduction

This survey was carried out in May and June 2007 as part of a more comprehensive research study into the development of monitoring and evaluation in the third sector since Charities Evaluation Services was set up in 1990. The survey was widely distributed across the UK through sector-wide, sub-sector and regional networks. As well as this survey of third sector organisations, we gathered data from funders in a parallel survey. We also carried out 117 face-to-face and telephone interviews, reviewed a limited number of external evaluations and contextualised our field research with a wide review of relevant literature. The research and the resultant publications,[1] including a policy briefing, were supported by the City Bridge Trust, the Performance Hub, the City Parochial Foundation and Barclaycard.

[1] Ellis, J and Gregory, T (2008) Accountability and Learning: Developing monitoring and evaluation in the third sector. Research Report, CES; and Ellis, J (2008) Accountability and Learning: Developing monitoring and evaluation in the third sector. Research Briefing, CES.

1. Who completed the survey?


1.1. Responses
We had 682 responses in total from third sector organisations across the UK,
although response rates to different questions varied considerably.

1.2. Funding (n=679)

• Organisations had a wide mix of funding, with just over half (52%) in receipt of local authority funding, and 37% in receipt of central government funding.
• Exactly half of respondents were in receipt of funding from a trust or foundation, and just over one-third from Lottery funders.
• Nearly one-third received income from fees, charges and sales.
• 12% had only one funder, but most organisations had more than one funding source.
• Nearly half the organisations (48%) had been commissioned, with a contract to deliver services, by a public sector agency.

1.3. Organisational characteristics (n=510)

• 40% of the responding organisations were established over 20 years ago, 26% 4-10 years ago, 19% 11-20 years ago and 14% three years ago or less.
• Organisations with 6-25 employees were the largest single group of respondents (42%). Nearly one-third of respondents were small organisations, either with no employees (6%) or with 1-5 employees (26%). The remaining responses were from larger organisations.


• 44% of organisations defined themselves as community organisations, 11% as BME organisations, 9% as refugee and 6% as faith organisations. 11% classified themselves as rural.
• Over half (59%) were recorded as providing services, such as care or counselling, and almost as many (57%) as providing advice, information or advocacy. 29% of the organisations were local infrastructure organisations and 8% acted as a national infrastructure organisation. 21% were campaigning organisations.

1.4. Location and geographical remit (n=509)

• Over half (53%) of organisations had a local remit, 43% a regional or national one, and 4% worked internationally.
• The organisations were located within the following countries and English regions:

Country/region              Percentage of responses
London                      33.6
North West                  10.8
Yorkshire and the Humber     8.1
East of England              7.7
South West                   7.5
East Midlands                4.9
West Midlands                3.7
North East                   3.3
Wales                        5.9
Scotland                     5.1
N. Ireland                   1.2

2. What sort of reporting is being done?

2.1. Internal and external evaluations (n=674)

Nearly half the organisations produced self-evaluation reports, and over one-third had had an external evaluation in the past. Only 1.5% were recorded as not carrying out either monitoring or evaluation. 94% said that funders required monitoring information or reports, and 73% that funders required evaluation reports.

• 47% produced self-evaluation reports, often as well as, but distinct from, a monitoring report.[2]
• 38% of organisations had had an external evaluation in the past.
• 8% were currently having an external evaluation.
• 13% had an external evaluation planned.

[2] CES expects a self-evaluation report to contain a level of analysis and judgement in relation to evaluation questions or criteria, as distinct from a presentation of collated data.

• Respondents identifying themselves as projects rather than organisations were slightly less likely to produce self-evaluation reports (41%) or to have had an external evaluation in the past (28%). This may be because they were more newly established or smaller, or part of a larger organisation.
• Those respondents established three years ago or less were less likely to produce self-evaluation reports (32%) or to have had an external evaluation (12%). Organisations with no staff or with 1-5 staff less frequently reported having had an external evaluation (8% and 29% respectively).
• By contrast, larger organisations were more likely to report having had an external evaluation, possibly reflecting greater organisational complexity and greater availability of resources (51% of those with 26-100 staff; 58% of those with 101-400 staff; 60% of those with 401-1000 staff).

2.2. Funders' requirements (n=598)

Over three-quarters reported that different information was required by different funders.

• Half of the respondents reported that different information was required by some of their funders.
• Nearly 30% reported that all their funders wanted different information.
• Two-thirds of respondents reported that funders' monitoring and evaluation requirements had become more demanding over the last five years.
• The vast majority of respondents (83%) reported that they developed their monitoring and evaluation to meet their own requirements as well as those of their funders. The percentage rose to 93% for organisations set up three years ago or less.


2.3. Outputs, outcomes and impacts (n=604)

• A large majority of respondents were reporting on both outputs (90%) and outcomes (83%). Nearly half (49%) said that they reported on impacts, and 45.5% reported on value for money.
• Nearly all were reporting both quantitative (98%) and qualitative (95%) information.

3. What monitoring and evaluation approaches and methods are used?

3.1. Methodologies (n=571)

A range of methodologies was reported by respondents.

• Quantitative output monitoring was the most commonly reported (95%).
• Participant feedback forms and questionnaires or surveys were frequently used (81% and 83% respectively).
• Case files and other record keeping, and individual assessments, were used by approximately two-thirds of respondents, and interviews and focus groups by just over half.
• Audio-visual methods were less frequently used, but nevertheless reported by 18% of respondents.

3.2. Involvement of users (n=565)

In a significant minority of cases, users were proactively involved in the monitoring and evaluation process.

• Only 7% said that users were not involved in any way in monitoring and evaluation. Most organisations collected feedback from users, and 46% of organisations consulted them on findings.
• Users were also involved more proactively:
  o Helping to define outcomes and indicators: 40%
  o Helping to decide what to monitor and evaluate: 20%
  o Collecting information: 19%
  o Leading the evaluation: 8%

3.3. Evaluation approaches (n=555)

The largest number, 69% of respondents, identified evaluation of the achievement of aims and objectives as an approach they used.

• 10% were unable to identify any listed approaches and 20% were not sure what approaches were used.
• Apart from an aims and objectives model, other approaches identified are shown in the table below:

Type of approach        Percentage of responses
Case study enquiry      27
Logical framework       11
Social accounting
Balanced Scorecard
Theory of change
Appreciative enquiry

• Approaches were not used exclusively, and organisations frequently referred to several approaches being used within their organisation.
• There was no marked difference in the use of different approaches and models, whether respondents defined themselves primarily as using monitoring and evaluation to meet their own needs or as doing what was necessary to meet funders' requirements.
• Those organisations unable to identify with any of the approaches listed were more likely to be smaller organisations.

4. Monitoring and evaluation capacity and resources

4.1. Amount of monitoring and evaluation (n=571)

60% of organisations felt that they were doing the right amount of monitoring and evaluation, both for their size/capacity and in relation to their income.

• However, 12% of respondents felt they were doing too much in relation to their size and capacity, and 18% felt that they were doing too much in relation to their income.
• 29% of respondents felt that they were doing too little in relation to their size and capacity, and 23% felt that they were doing too little in relation to their income.

4.2. Funding for monitoring and evaluation (n=461)

• Just over 30% of organisations did not receive funding to cover monitoring and evaluation costs.
• For 13% of respondents overall, the amount of funding to cover monitoring and evaluation costs had increased, and for 6% it had decreased. For others it had stayed the same, or they were not sure.


4.3. Who is doing it? (n=533)

• Management posts were the most likely to have monitoring and evaluation written in as a formal part of their job responsibilities.
• In 10% of organisations, monitoring and evaluation was not a formal part of any job responsibilities.
• 16% had a specialist monitoring and evaluation post, while monitoring and evaluation was carried out by frontline staff or volunteers in nearly half of organisations (46%) and by administrative staff in one-third (34%).

4.4. Getting information about monitoring and evaluation (n=526)

• Training courses were the most frequently reported way that organisations got information about monitoring and evaluation (68%), although almost one-third had received no such training.
• Publications and the internet were also significant ways of getting information (56% and 51% respectively).
• Additional sources of information were as follows:

Source of information                   Percentage of responses
Funders                                 42
Consultants                             24
Local infrastructure organisation       24
Local authority or government agency    24
National infrastructure organisation    21
Head office or network                  14

• 5% did not identify any of the sources listed as providing information.
• 20% of respondent organisations provided information or support on monitoring and evaluation to other organisations. Those defining themselves as projects were a little more likely to provide this support (29%), as were organisations that started three years ago or less (33%).
• The 23 organisations without paid employees responding to this question drew on local infrastructure support in line with the mean (26%), but were less likely to have access to a national infrastructure organisation or head office (4% and 8% respectively). They were also less likely to use the internet or consultants as a source of information and support.
• Organisations with 6-25 staff were most likely to access support from a national infrastructure organisation (27%).

• Generally, larger organisations had greater access to all forms of support than smaller ones. The use of consultants was more likely to be found in these groups, as follows:

Staff size        Use of consultants
6-25 staff        21%
26-100 staff      31%
101-400 staff     38%
401-1000 staff    40%
1001+ staff       63%

• BME organisations cited training courses more frequently than the mean (79%), but use of the internet less frequently (42%). Refugee organisations used the internet and publications least (37% and 43%), but accessed other types of support well. 27% of refugee organisations provided monitoring and evaluation support to others.
• Faith organisations (with 46 respondents) were least likely to rely on a local authority for information and support, possibly reflecting their source of funding.
• Those identifying themselves as community organisations drew rather more on local infrastructure organisations (34%).
• Rural organisations reported levels of access above the mean for all sources of support except consultancy (11%).

4.5. Training for monitoring and evaluation (n=526)

Almost one-third of organisations had received no training in monitoring and evaluation.

• 69% of organisations had received training in monitoring and evaluation; 31% had received no training.
• Introduction to monitoring and evaluation and assessing/managing outcomes were the most frequently reported training topics, with just over half covering these topics as specific courses. For the rest, the topics were covered as part of broader training.
• The following regions recorded training as a means of getting information less frequently, and below the mean:

East of England           57%
South West                53%
South East                60%
Yorkshire and the Humber  63%

• The West Midlands and North East recorded training as a means of getting information considerably more frequently than the mean (84% and 88% respectively).

• A higher percentage of responses indicating that no training had been received was found for:

Those identifying as projects      41%
Local offices                      37%
Organisations with no staff        57%
Organisations with 1-5 staff       34%
Started three years ago or less    38%

• A relatively lower percentage of responses indicating that no training had been received was found for BME and refugee groups (20% and 13% respectively).

4.6. Data management systems (n=525)

• The large majority of respondents (80%) used a combination of paper-based and IT-based systems.
• 8% of respondents used a paper-based system only, while 12% used an IT-based system only. Of the 23 respondents with no staff, the percentage using a paper-based system only was highest (26%).
• A small percentage (6%) used quantitative packages other than Microsoft Office, and 2% used qualitative software packages.
• 37% used off-the-shelf IT systems, 36% used customised off-the-shelf systems, and 27% had a system designed and built for them.

5. Benefits of monitoring and evaluation

509 organisations answered a question about the main benefits derived from monitoring and evaluation. Organisations were most likely to value being clear about the benefits of what they do, learning about what was working well and improving the end result for beneficiaries. The responses were as follows:
Benefit described                                         Percentage of responses
Being clear about the benefits of what we do              57
Learning about what is working well/effective practices   55
Improving the end result for beneficiaries                47
Better services/strategic planning                        45
Improving the way we work                                 44
Telling others about our results                          41
Competing for funding and resourcing                      40
Improving reporting to funders                            37
Focusing on shared aims and objectives                    28
Improving our products and services                       27
Providing information for quality assessments             21
Improving our record keeping                              19

487 respondents described changes introduced as a result of monitoring and evaluation. Of these, the main changes introduced were as follows:

Changes introduced                                          Percentage of responses
Changes to the way things are done                          81
Changes to services or products                             65
Changes to what they expected to achieve                    51
Changes to the way they defined the beneficiary group       28
Changes to the policy and practice of other organisations   20

6. Difficulties in carrying out monitoring and evaluation (n=500)

• Three-quarters of the respondents identified insufficient time as causing difficulties.
• Other important practical organisational issues, identified by just over one-third of respondents, were having enough money (39%) and having the right skills (35%). 29% of respondents found difficulties with maintaining stakeholder participation.
• Those identifying as projects reported these difficulties slightly less frequently: 66% reported time as an issue, 33% money and 29% skills. 90% of local offices reported insufficient time as a problem. Organisations with no staff cited lack of money (60%) above the mean. For larger organisations, money and skills were less important, but time remained a major difficulty.
• The most frequently reported process difficulties were finding appropriate or manageable information collection methods (42%) and getting responses back (48%). Those more frequently having difficulty with finding appropriate information collection methods were those identified as BME groups (50%) or refugee groups (50%).
• Over one-third (39%) of respondents overall reported identifying outcomes and impacts as a difficulty; 26% found identifying indicators and 32% getting baseline information problematic. 55% of local offices reported identifying outcomes and impacts as a difficulty. Larger organisations also registered difficulties with identifying outcomes and impacts above the mean (54% of those with 101-400 staff and 47% of those with 401-1000 staff).

7. Qualitative responses

Responses also provided a substantial amount of qualitative data.

Have funders' requirements changed during the last five years?

There were strong feelings expressed on this question. There was a sense from some organisations that higher expectations were appropriate, and that organisations should be more accountable.

"More demanding but more relevant."

"Requirements are mostly the same. Expectations and interest have risen a lot."

However, full cost recovery was not always reflected in the grant, which meant that resources for monitoring and evaluation were tight. Others felt that too much time was spent "justifying ourselves".

"Funders' requirements have become much more demanding, to the point where the monitoring requirements of our funders are dictating how we do all of our work."

"All become more demanding; all want similar information but in different ways."

There was also some frustration at lack of clarity and moving goalposts:

"The requirements are subject to change each year as the DOH changes its priorities. Outcomes based monitoring is beginning to come in. More evaluation from service users is required."

"Public sector commissioners do not seem to be able to decide what they want."

This sense of changing requirements echoed through much of the survey.


What changes have been brought about by commissioning arrangements?

Some organisations reflected that requirements had become more prescriptive, while for others it was the increase in the number of contracts that had increased the workload.
"I now have between 20 and 30 different monitoring reports to complete each quarter. All our bids are based on full-cost recovery, but FCR (EC Financial Control Regulation) needs now to include a full-time monitoring officer."
"We have 10 Service Level Agreements, all similar, from different health trusts/localities, but each one requires different information and some are monthly, some quarterly and some annually. Very hard to keep track of."

"We are expected to do additional monitoring to meet local/central government targets. The impact of this is an increase in the time and resources we have to allocate to change data gathering mechanisms."

Specific funding streams had become more onerous, for example:

"DFID PPA [partnership programme agreements] for 6 years. This replaces Block Grants and is much more onerous in terms of reporting, monitoring, outcomes etc."

For some organisations this was a welcome move:

"A recent contract resulted in us needing to better evidence the learning outcomes that result from our training activities. Whilst this has been an effort to implement, we see benefits for our organisation and so have been happy to implement this change. There has been good support from the commissioning body to help us make these changes."

"The projects provide additional data we have not yet had time to assess, that will be of value in our work with policy makers and promoting best practice. We are currently seeking funding for a researcher who can concentrate on maximising the value of collected data. We have no spare staff capacity at present."

The bulk of comments were about the increased stringency, level and focus of the monitoring; a disproportionate amount of monitoring for one small part of the overall organisation's services; and a multiplicity of contracts or service level agreements from different health trusts or localities, each requiring similar but different information, and with different reporting schedules and timetables.

"We are often required to complete very detailed reports for relatively small contracts/funding, eg £1,000. For a voluntary organisation with no paid managers, this in itself can be threatening to our being able to deliver services, as a disproportionate amount of time is spent satisfying these requirements."


What systems, toolkits or guidance have you used?

This question was answered by 174 organisations. Of these, 44 mentioned CES:

• Training courses and publications
• Practical monitoring and evaluation
• Website information
• Outcomes training
• First Steps booklets
• Outcomes Champion training
• Triangle method

Other specific tools and resources mentioned were:

• NAVCA quality award
• SAN masterclasses
• NCVO sustainable funding outcomes case study
• Your Project and its Outcomes (BIG)
• LSC data and monitoring training
• SOLE
• nef proving and improving
• ABCD
• Sport England Toolkit
• Kirklees soft outcomes software
• CASE management software (CAB)
• BIG
• Arts Council England
• Performance Hub monitoring toolkit folder
• Volunteer England Impact Toolkit
• NCVO publications
• Outcomes Star
• Social Accounting Pack
• Guide to MSC
• USAID 10 Steps to Results
• King's Fund guidance (note: very little about funder guidance)
• Age Concern Evaluation Toolkit
• CORE (Clinical Outcomes for Routine Evaluation), a national system for monitoring counselling and psychotherapy.

Many of the respondents were using specialist support agencies, such as CES and ESS, for their publications and websites. Support from CES Outcomes Champions was mentioned. Often these resources were used together with other systems or toolkits relating to specific work areas. Examples quoted were NIACE, Sport England and the Arts Council. Other sources included BIG Lottery material; nef, and the use of social accounting and the methodologies in nef's Prove It publication; Volunteer England's Toolkit; the NAVCA outcomes-focused quality assurance system; and the NCB Evaluation Toolkit.

What are the benefits to organisations of monitoring and evaluation?

Organisations provided some illustration of the benefits gained from monitoring and evaluation:

"Clarity of definitions within funded programmes and criteria for assessing and prioritising. More flexible respite service and wider range of support and increased numbers of Young Carers supported. Revised quality surveys to obtain outcome-based feedback. Services focused to achieve each project's outcomes."

"More rigorous objectives. More consistent approach."

"More responsive to participants' interests and development needs in developing future programmes."

"Increased activity and numbers of volunteers and beneficiaries taking advantage of a wider scope of service provision through listening and responding to the needs of both."

"Adjustments to services provided. Changed priorities for funding."

"Everything is now outcomes-based. We have had to change the organisational mindset as a result."

"Monitoring and evaluation has enabled us to be more realistic about what we can achieve with different people and the actual costs involved in achieving the results required."
Ninety-nine respondents provided examples of changes that they had made as a result of monitoring and evaluation. Of these, nine described how they had used information to rethink or develop promotional material, to market their work differently or rebrand, and to obtain funding. Seven talked about using information either to change their own campaigning or public awareness approaches or to influence public policy.

Twelve organisations talked about the benefits of feeding information back into review and strategy:

"We are now much clearer about who our various services and initiatives are targeted at. Monitoring and evaluation has also enabled us to be more realistic about what we can achieve with different people and the actual costs involved in achieving the results required."

"Monitoring and evaluation findings are used continuously to review our programmes and the development of new services."

Twenty-three organisations gave information about developing services.
"The whole direction of the project has been changed over the past 4 years, as we have monitored and evaluated our work more effectively."

"We shed direct services as a result of monitoring and evaluating our capacity to maintain and develop them."

"Targeting of under-represented groups; developed outreach sessions."

"When targets are not being met, monitoring and evaluation allows us to know, and either adapt services to meet targets, or discuss changing targets with funders."

More efficient work practices and improved systems were also cited as significant changes.

"More service-user led; work more closely with beneficiaries; changes to campaigning and public awareness approaches."

"It provides ongoing enhancement to my advice as an independent adviser in wildlife conservation, especially by providing real-life examples which can be used in narrative form. Value from evaluation experience is most effectively fixed by the preparation of papers, tables, presentations etc."

"Everything is now outcome-based; we have had to change the organisational mindset as a result."

"All clients now have an individual outcomes profile and are tracked through the organisation. The organisation has defined core outcomes, and all funding and new projects are designed with measurable and achievable outcomes."


Charities Evaluation Services


4 Coldbath Square
London EC1R 5HL
020 7713 5722
www.ces-vol.org.uk
enquiries@ces-vol.org.uk
