
Monitoring and Evaluation Framework
Economic Development Department
October 2015
Table of Contents
1. Introduction
   1.1 Contextual Background
   1.2 Project Brief
   1.3 M&E Framework
2. Methodology
3. Legislative Framework
   3.1 Policy Framework for the Government-wide Monitoring and Evaluation System
   3.2 Provincial-Wide Monitoring and Evaluation Framework (PWMEF)
   3.3 Monitoring and Evaluation in the City of Cape Town Context
4. Governance
   4.1 Defining Monitoring
   4.2 Defining Evaluation
   4.3 Objectives of M&E
   4.4 Relationship between Monitoring and Evaluation
5. Economic Development Department
   5.1 Purpose and Service Mandate of the Economic Development Department
6. Putting the M&E Framework into Practice
   6.1 Phase 1: Planning
   6.2 Phase 2: Monitoring
   6.3 Phase 3: Evaluation
   6.4 Phase 4: Evaluation Report
   6.5 Phase 5: Communicating and Providing Feedback of M&E to Stakeholders
7. Conclusion
8. References
Annexure: Economic Development Department Logframe

1. Introduction

1.1 Contextual Background


The National Framework for Local Economic Development (2006) articulates that the most
thorough analysis reveals that all economic development takes place at the local level. It
further claims that the only way national economies will achieve the goals set for them, and
create a better life for all, is if local government influences the shape and direction of local
economies.

Communities expect to see development in their local areas, and this challenges local
government to provide proof of meaningful impact on community members’ lives. To meet this
challenge, local public and private sector actors must work together to create sustainable
local economies and to provide physical and documented evidence of such developments.

In an attempt to create sustainable local economies, the Economic Development
Department (EDD) was established with a mandate to facilitate local economic
development. Furthermore, an Economic Growth Strategy (EGS, 2013) was developed to
identify the actions the City of Cape Town should take to maximise the benefits for City of
Cape Town community members. The EGS is conceptualised with the rationale that local
development cannot be done in isolation. It therefore positions Cape Town within broader
international, national and regional economic trends and structures itself around the
following five strategies:

 Building a globally competitive city through institutional and regulatory changes
 Providing the right basic services, transport and ICT infrastructure
 Utilising work and skills programmes to promote growth that is inclusive
 Leveraging trade and sector development functions to maximum advantage
 Ensuring that growth is environmentally sustainable in the long term

The rationale for placing Cape Town in a broader economic context is to make Cape Town a
competitive city that is able to address the challenges it faces.

The need to provide evidence of development gives rise to the development and
implementation of a Monitoring and Evaluation (M&E) system, and this document constitutes
the framework for that system.

1.2. Project Brief
The EDD M&E project is aimed at facilitating the development of M&E mechanisms and
indicators for the Department as well as for its economic development projects. The M&E
project outcomes should assist in assessing whether the Economic Development Department
is achieving its intended objectives, what its areas of weakness are, and what its areas of
strength are.

Specifically, the assignment was commissioned to achieve the following substantive
objectives:

• Development of an overarching M&E framework for the Economic Development Department.
• Development of project-specific M&E mechanisms to ensure that projects realise their
objectives and advance broader EDD objectives.

1.3 M&E framework


This document serves as a ‘framework’ for improved monitoring and evaluation (M&E) within
the Economic Development Department. It provides the foundation for a common
understanding of key M&E principles and elements amongst all Economic Development staff
and stakeholders.

2. Methodology
The methodology used in developing the M&E framework is a combination of desktop
information collection, benchmarking and workshopping of the project with EDD staff.
The evolution of the project has been guided by the project management team (PMT),
which met on a weekly basis.

Below are the methodological tools/mechanisms that have been used to develop the
EDD M&E Framework:

• Literature Review: This formed the crucial base for the framework, as it entailed getting
background information from similar initiatives for benchmarking purposes.
• Review of all EDD documents: This exercise aimed at establishing the basis upon
which the programmes, projects and services were planned and implemented in the
Department.
• Consultation with EDD staff.

3. Legislative framework

3.1 Policy Framework for the Government-wide Monitoring and Evaluation System


The Policy Framework for the Government-wide Monitoring and Evaluation System (GWMES) is the
central point of reference for South African government institutions in terms of the monitoring
and evaluation principles, practices and standards to be used. The GWMES provides a
framework to which government agencies should subscribe when implementing systems
aimed at tracking the performance of government programmes. It further provides guidelines for
assembling and reporting information on the performance of programmes of government
departments and other public bodies, with the aim of improving governance.

The objectives of this policy framework include:

 Improved quality of performance information and analysis at programme level within
departments and municipalities (inputs, outputs and outcomes)
 Improved monitoring and evaluation of outcomes and impact across the whole of
government through, for example, the bi-monthly Government Programme of Action Report
and the Annual Country Progress Report based on the national indicators
 Sectoral and thematic evaluation reports
 Improved monitoring and evaluation of provincial outcomes and impact in relation
to Provincial Growth and Development Plans
 Projects to improve M&E performance in selected institutions across government
 Capacity-building initiatives to build capacity for M&E and foster a culture of
governance and decision-making which responds to M&E findings (Government-wide
Monitoring and Evaluation System, 2007:7)

The GWMES is based on the following principles:

• Monitoring and Evaluation should contribute to improved governance.

• Monitoring and Evaluation should be development oriented.

• Monitoring and Evaluation should be undertaken ethically and with integrity.

• Monitoring and Evaluation should be user-friendly and operationally effective.

• Monitoring and Evaluation should be methodologically sound.

3.2 Provincial-Wide Monitoring and Evaluation Framework (PWMEF)


This framework gives direction on how to collect, interpret, analyse and disseminate data
and information to key stakeholders in a way that adds value to the performance management
and decision-making processes of the Provincial Government (Department of the Premier,
2009:47).

In the Western Cape Province, the PWMES provides guidance on:


 The Development and implementation of Provincial-wide M&E policies, strategies and
programmes for M&E on implementation and results-based level.
 Compliance with the GWMES.
 Continuous provincial-wide M&E of the PSP and Provincial Strategies by focusing on
measuring the results on implementation and results-based levels.
This framework outlines 7 obligatory elements to ensure effective M&E systems at a
departmental level are successful. The key interdependent M&E elements are:

 Readiness Assessment and Stakeholder Engagements
 Overarching Frameworks for the PWMES
 Indicator Development Process
 Monitoring and Results Frameworks
 Data Management and Data Assessment
 Information Architecture
 PWMES Process – Planning to Implement and Sustain the PWMES

3.3 Monitoring and evaluation in the City of Cape Town Context


The City’s Performance Management Framework (Compliance) Policy (11 May 2011)
functions to give effect to the performance management system as prescribed by
legislation. It provides an overarching framework for the management of performance in the
City of Cape Town. This policy framework will provide the structure for the overall
management of performance within the City at both organisational and individual levels.

The policy prescribes that the performance management system must include the following
components:

[Figure: Components of the performance management system – adapted from the Performance Management Framework (Compliance) Policy]
4. Governance
Public institutions constantly strive for greater efficiency and effectiveness. Greater efficiency
and effectiveness come from:

 Compact strategic planning
 Performance management
 Analysis and identification of success factors contributing to service delivery, and
 Innovation

Monitoring and Evaluation is an important tool which enables users to evaluate the links
between:

 Strategic priority choices
 The use of resources to achieve these objectives
 The quality of the programmes designed to implement them, and
 The outcomes and impact of projects on clients and communities

Monitoring and Evaluation systems provide users with reliable evidence on which to base
their decisions about spending and budget priorities. They help to analyse and identify
how important challenges should be dealt with, identify lessons learned from programmes
and projects, and provide learning for future programme and project implementation.

4.1 Defining monitoring


Monitoring is the continuous function that uses the systematic collection of data on specified
indicators to provide management and the main stakeholders of an on-going development
intervention with indications of the extent of progress and the achievement of objectives
and progress in the use of allocated funds (Kusek & Rist 2004:12).

Four functions of monitoring:

 Compliance: Is the implementation process in line with legal and professional
standards?
 Auditing: Do allocated resources reach the intended beneficiaries?
 Accounting: Did the desired social and economic changes occur (over time)?
 Explanation: Are the outcomes of a policy caused by the policy, or by other factors?

4.2 Defining evaluation


Evaluation is the identification of relevant standards of merit and worth, followed by some
investigation into the performance of the evaluand, and then the systematic and objective
assessment of on-going or completed projects, programmes or policies, including their design,
implementation and results. The aim is to determine the relevance and fulfilment of
objectives, development efficiency, effectiveness, impact and sustainability (Kusek & Rist
2004:12).

Functions of evaluation:

 Reliable and valid information on policy performance and the satisfaction of needs and
values
 Clarification and critique of the values encapsulated in goals and objectives
 Support to other policy analysis tools, prescriptions and problem structuring

Both monitoring and evaluation are geared towards learning from what you are doing and
how you are doing it, by focusing on:

 Efficiency tells you whether the input into the work is appropriate in terms of the output.
 Effectiveness is a measure of the extent to which a development programme or
project achieves the specific objectives it set.
 Impact tells you whether or not your actions made a difference to the problem
situation you were trying to address (Kusek & Rist 2004:12).

From this it should be clear that monitoring and evaluation are best done when there has
been proper planning against which to assess progress and achievements.

4.3 Objectives of M&E:


 Assist with the identification and selection of programmes and projects that have a
good chance of succeeding
 Determine progress regarding selected social, economic, sectoral and national
development objectives
 Determine whether the project is implemented efficiently and reaches the
intended beneficiaries
 Make informed decisions about the allocation of funds
 Assess the impact on wider developmental objectives (Rabie, 2011:32-36)

4.4 Relationship between Monitoring and Evaluation


| Monitoring | Evaluation |
| --- | --- |
| Clarifies programme objectives | Analyses why intended results were or were not achieved |
| Links activities and their resources to objectives | Assesses specific causal contributions of activities to results |
| Translates objectives into performance indicators and sets targets | Examines the implementation process |
| Routinely collects data on these indicators and compares actual results with targets | Explores unintended results |
| Reports progress to managers and alerts them to problems | Provides lessons, highlights significant accomplishments or programme potential, and offers recommendations for improvement |

5. Economic Development Department

5.1 Purpose and service mandate of Economic Development Department


The Economic Development Department’s (EDD) programme builds upon the Economic
Growth Strategy of the City of Cape Town. Programmes are aimed at positioning the City of Cape Town
as a business-friendly destination by championing interventions that lead to inclusive
local economic development. The EDD aims to do this through the provision of professional
economic development services that are based on sound analytical research and expert
knowledge of economic development. It contributes to the City’s core business by leading,
advising, advocating, and facilitating the implementation of programmes and partnerships to
support the City’s economic development agenda (Economic Development Department
business plan).

The Economic Development Department’s ultimate goal is to create an economically
enabling environment in which investment can grow and jobs can be created.

It aims to achieve this by:

 Creating an enabling environment to attract investment that generates economic
growth and job creation
 Leveraging the City’s assets to drive economic growth and sustainable development
 Maximising the use of available funding and programmes for training and skills
development
 Providing and maintaining economic and social infrastructure to ensure infrastructure-led
growth and development (Economic Development Department business plan)

6. Putting the M&E Framework into practice


The chapter that follows addresses the question of how the M&E Framework should be
implemented in practice. The diagram below depicts the M&E cycle in the context of the
organisation.

Planning for M&E usually takes place concurrently with strategic planning and project
planning. The adoption of M&E takes place over five phases, if Kusek and Rist’s (2004) ten steps
to a results-based monitoring and evaluation system are grouped into phases. It is thus
suggested that, when plans are compiled and considered for the department, unit and projects,
officials should consider how it will be known whether the goals or outcomes have been
achieved, how actions will be kept on track, and how corrective action will be taken when
needed. These considerations will inform the M&E planning, which is essentially concerned
with tracking, assessing and reviewing performance and delivery. The five phases of M&E
implementation are demonstrated below.

Phase 1: Planning for M&E → Phase 2: Conducting monitoring → Phase 3: Conducting evaluations → Phase 4: Reporting on M&E findings → Phase 5: Communicating and providing feedback in respect of M&E and delivery

6.1 Phase 1: Planning


The first step of the M&E cycle is the planning process. The planning phase involves
establishing a shared theory of change, developing indicators, setting baselines, defining
targets, determining approaches to data collection and integrating the M&E planning into
business plans.

The theory of change will help managers to demonstrate a linear path of cause and effect
(Taplin and Clark, 2012). It will also position the departmental programme within a wider
analysis of how change will come about and help the Department articulate its
understanding of how it intends change to occur. It will also challenge the developers to
explore the intervention further by considering the wider systems in which the policy exists and
the environment and actors that influence it.

The development of the theory of change involves identifying what inputs are needed to
perform the specific activities required to produce certain outputs that will help the
Department achieve its outcomes and assist in reaching the City’s goals.

 Identify the inputs: the resources needed for the activities (indicators, baselines, targets)
 Identify the activities: the activities that will result in the desired outputs (indicators, baselines, targets)
 Identify the outputs: the deliverables viewed as necessary to achieve the outcomes (indicators, baselines, targets)
 Identify the outcomes: the outcomes required to achieve the impacts (indicators, baselines, targets)
 Identify the impact: the envisioned long-term goal (indicators, baselines, targets)

Guiding questions: Where are we at the moment? What do we want to achieve within each time period? How will we measure and analyse delivery against the defined targets? How will we know we have achieved our plans?

Steps 1 and 2 generally relate to strategic planning (as reflected within the IDP and EGS),
while Steps 3, 4 and 5 tend to align more with the City’s ‘business planning’ and annual
planning processes.

Step 1- Theory of change/ log frame

By nature, the theory of change process usually starts from impact and works back to inputs. The
identification of the impact basically involves identifying the envisaged long-term goals – what
the Department aims to change. These impacts then normally align with the IDP’s goals.

As many factors influence goals and impact – such as the policy environment, international
events, research, stakeholders and politicians – the identification of the envisioned impacts
should include:

 A status quo analysis, and research into the future vision for the City – with due
consideration of the challenges and opportunities faced, shaping forces, and dynamics
within other spheres of government;
 A problem analysis – to identify the gaps between the desired future and the current
state, and the causal steps to support a movement towards this future;
 Testing the future vision and the desired impacts through participatory processes;
 Refining the defined impacts on the basis of a thorough analysis of information
emerging from the research and stakeholder engagement process; and
 Recognising the need to review ‘impacts’ on a regular basis, ensuring continued
applicability in the context of a rapidly changing environment (Rabie, 2011: 120).

After the impact has been determined, it needs to be considered what we wish to achieve
by changing the situation. This involves the identification of outcomes that will contribute to the
achievement of the goals or desired impacts. Outcomes are usually positive, present-tense
statements of the changed state, identified through an inclusive, participatory process – with
the underlying logic and assumptions reviewed, debated and, through this process, jointly
owned by all stakeholders (Rabie, 2011: 120).

Next, project/programme managers should identify outputs that link to the outcomes.
Outputs are usually framed within the context of short- and medium-term delivery reflected in
documents such as the SDBIP, business plans, unit plans and individual performance assessments.
When developing outputs:
 Always prioritise outputs
 Focus on what should be delivered, achieved, provided and produced (stated in the
past tense)
Following the output development process, the activities, tasks and jobs that should be done
to deliver the outputs should be determined. These activities should always be stated in the
present tense, contain a verb, and should align to both inputs and outputs (Rabie, 2011: 120).

In this regard, inputs refer to the resources that we use to do the work. These usually include
human resources, financial resources, skills and consensus, amongst other things.

The next exercise will be bringing it all together in a log frame.

Step 2 - Indicators
In order to know whether and when we have achieved our planned impacts, outcomes, outputs,
activities and inputs, it is essential to identify indicators that will enable assessment of these
elements. Indicators are “the quantitative or qualitative variables that provide a simple and
reliable means to measure achievement, to reflect the changes connected to an
intervention, or to help assess the performance of an organisation against the stated
outcome” (Kusek & Rist, 2004:95).
Kusek and Rist further argue that indicators need to be:
 “Clear – precise and unambiguous
 Relevant – appropriate to the subject at hand
 Economic – available at a reasonable cost
 Adequate – provide a sufficient basis to assess performance
 Monitorable – amenable to independent validation” (Kusek & Rist, 2004:95).

Indicators should further encapsulate time, quality and quantity standards, be precise, be
responsive to programmes and be unaffected by change.

Step 3 - Baseline

The next step is to determine where exactly we are before the monitoring exercise begins.
This information can be collected in many ways.

 Conversations with concerned individuals
 Informal interviews
 Focus groups
 Direct census
 Field experiments
 Community interviews
 Participant observation
 Interviews
 Observations
 Panel surveys
 Field visits
 Review of official documents
 One-time surveys
 Surveys

Step 4 – Setting targets

Following Step 3, targets need to be set. Targets provide the planned value against
which an indicator will be measured at a specific time in the future. The target
should therefore encapsulate the specific number, time and location to be realised. Targets should be:

S – Simple, clear and understandable
M – Measurable, in terms of quantity and, where possible, quality, money and time
A – Achievable and agreed
R – Realistic – within the control of the responsible parties, but challenging
T – Timely – to reflect current priorities; assessable within the defined reporting period (Rabie,
2011:96).

Step 5- Means of Verification/ Data collection

Next it would be ideal to determine the mechanisms through which progress against defined
targets is to be assessed, for both evaluation and monitoring. The means of verification will
tell us where we should obtain the data necessary to prove the objectives defined by the
indicator have been reached. The different data collection sources are graphically
illustrated below (Rabie, 2011:97).

 Conversations with concerned individuals
 Review of official documents
 Participant observation
 Community interviews
 Field visits
 Key informant interviews
 Focus groups
 Questionnaires
 One-time surveys
 Census
 Field experiments

Step 6- Integration

Following the completion of Steps 1 to 5, the M&E planning process comes to a conclusion
through the integration of all elements (indicators, baselines, targets and data collection/
MoVs) into various levels of integrated M&E plans. On completion of this planning phase
potential for improved delivery is increased. A log frame should house all these different
elements.

Logical Framework

Impact
 Narrative summary: Long-term, population-level change. Can relate to a programme or organisation’s vision/mission statement.
 Indicators: Long-term effect on the incidence of the problem (e.g. reduction in mortality due to influenza-like illness) or the effects on the population at large (e.g. the population living longer/healthier).
 Means of verification: Where we should obtain the data necessary to prove that the objectives defined by the indicator have been reached.
 Risks/assumptions: Any external factors which may adversely affect the attainment of the stated objectives.

Outcome
 Narrative summary: Longer-term expected results related to changes in knowledge, attitude and behaviour; related to the programme goal.
 Indicators: Longer-term change in knowledge, attitude, behaviour, etc. Outcomes usually give an indication of whether programme goals are being achieved (e.g. a measure of change in quality of ….).
 Means of verification: Where we should obtain the data necessary to prove that the objectives defined by the indicator have been reached.
 Risks/assumptions: Any external factors which may adversely affect the attainment of the stated objectives.

Outputs
 Narrative summary: Immediate results from your activity, e.g. people trained, services provided.
 Indicators: Immediate results from your activity (e.g. number of people trained, number of trainings conducted).
 Means of verification: Where we should obtain the data necessary to prove that the objectives defined by the indicator have been reached.
 Risks/assumptions: Any external factors which may adversely affect the attainment of the stated objectives.

Activities
 Narrative summary: What you do to accomplish your objectives. What else do you do to accomplish these objectives? Are there any sub-objectives that should be measured?
 Indicators: What you do to accomplish your objectives (e.g. training).
 Means of verification: Where we should obtain the data necessary to prove that the objectives defined by the indicator have been reached.
 Risks/assumptions: Any external factors which may adversely affect the attainment of the stated objectives.

Inputs/Resources
 Narrative summary: Quantifiable resources going into your activities – the things you budget for.
 Indicators: Quantifiable resources going into your activities (e.g. number of training manuals, amount of money spent on the training workshop).
 Risks/assumptions: Any external factors which may adversely affect the attainment of the stated objectives.
This concludes the key steps covered in the planning phase.

6.2 Phase 2: Monitoring


Monitoring is defined by the OECD as a “continuous function that uses the systematic
collection of data on specified indicators to provide management and the main
stakeholders of an on-going development intervention with indications of the extent of
progress and the achievement of objectives and progress in the use of allocated funds.”
(Rabie, 2011:97).

The primary focus of monitoring is the gathering, collating, inspecting and analysing of
information, in the context of indicators and short, medium and long-term targets.

Monitoring takes place over four (4) steps:

Step 1: Confirmation of monitoring tools and systems

This step firstly involves identifying the most appropriate tools through which monitoring-
related (or evaluation-related) information will be gathered and analysed. The choice and
confirmation of monitoring tools is usually directly related to the level of the outcomes
approach being assessed, the audience, and the level of detail to be collected. Monitoring tools
are usually divided among three categories. Examples of monitoring tools across the categories
are illustrated below.

Monitoring tools should in most cases be supported by a monitoring system through which
indicators, baseline information, data and analysis can be stored, maintained and readily
accessed (Rabie, 2011:100). In setting this system up, developers should consider:
 What data will be collected? (i.e. source)
 How often will data be collected? (i.e. frequency)
 How will data be collected? (i.e. methodology)
 Who will collect the data?
 Who will report on the data?
 For whom is the data collected?
After the supporting monitoring system is established, a manager of the system should be
identified to ensure that the system is managed and maintained and that the data kept on it is credible.
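The sketch below illustrates, in a deliberately simplified way, how the answers to the questions above could be recorded alongside each monitoring data point so that source, frequency, method and responsibility stay attached to the data. The field names and values are assumptions made for the example, not an existing City system.

```python
from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class MonitoringRecord:
    """A single monitoring observation, tagged with how and by whom it was collected."""
    indicator: str     # what data is collected
    value: float       # the observed value
    observed_on: date
    source: str        # where the data comes from
    frequency: str     # how often it is collected, e.g. "quarterly"
    method: str        # how it is collected, e.g. "review of official documents"
    collected_by: str  # who collects the data
    reported_to: str   # for whom the data is collected / who reports on it

# Hypothetical entries, illustrative only.
records: List[MonitoringRecord] = [
    MonitoringRecord("Number of people trained", 120, date(2015, 9, 30),
                     source="Attendance registers", frequency="quarterly",
                     method="Review of official documents",
                     collected_by="Programme manager", reported_to="EDD quarterly review"),
]

for r in records:
    print(f"{r.observed_on} {r.indicator}: {r.value} ({r.frequency}, via {r.method})")
```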

Step 2: Gathering and collation of information


In this process, focus is to be placed on data that is relevant, accessible, timely,
understandable and accurate. Understanding how the data will be used impacts directly on
the nature of information collected.

Step 3: Analysis of information


The process of analysing information for monitoring purposes is demonstrated below.

1. Review the indicators identified for the monitoring or evaluation process;
2. Ensure data is collected with these indicators in mind (i.e. the data is relevant);
3. Establish a structure for the analysis – e.g. in terms of concerns, ideas or themes;
4. Organise the data within the context of this structure, in preparation for analysis;
5. Focus on patterns, varied forms of interpretation or trends;
6. Document the findings, and establish conclusions and recommendations.

Data analysis is applicable in respect of all aspects of M&E. Regular analysis of
implementation data as part of a monitoring process may assist in improving performance
during the delivery of outputs and associated activities, and also allows for the identification of
trends, challenges, risks and areas of success.
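As a simple illustration of this analysis step, the sketch below takes a small set of indicator readings and flags those falling short of their targets, which is one basic form of the trend- and risk-spotting described above. The indicator names, values and threshold are assumptions for the example only.

```python
from typing import Dict, List, Tuple

# Hypothetical quarterly readings: indicator -> (actual, target). Illustrative values only.
readings: Dict[str, Tuple[float, float]] = {
    "People trained": (120, 150),
    "Training sessions conducted": (12, 12),
    "Businesses assisted": (40, 80),
}

OFF_TRACK_THRESHOLD = 0.8  # assumed cut-off: below 80% of target counts as off track

def flag_off_track(data: Dict[str, Tuple[float, float]]) -> List[str]:
    """Return the indicators whose actual value is below the off-track threshold."""
    flagged = []
    for name, (actual, target) in data.items():
        achievement = actual / target if target else 1.0
        if achievement < OFF_TRACK_THRESHOLD:
            flagged.append(f"{name}: {achievement:.0%} of target")
    return flagged

for item in flag_off_track(readings):
    print("Off track ->", item)
# Off track -> Businesses assisted: 50% of target
```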

Step 4: Reporting on findings

The M&E Framework will only be of value if findings are reported on and, where necessary,
put into action. It should also be noted that there is a set of pre-defined reporting
mechanisms in place within the City, many of which are legislated, while others represent
good practice that has evolved within the City over time.

Monitoring reports, such as the quarterly review report, assist in building an understanding of
progress and delivery in the context of business plans and the SDBIP, thereby ensuring on-
going strategy-aligned implementation.

During this process, project/programme managers should decide on the types of reporting,
the audience, purpose, format and frequency – thereby ensuring that results are aligned with
their intended uses.

6.3 Phase 3: Evaluation


Evaluations are periodic and seek to establish what has been achieved in projects and programmes,
while trying to understand why. Evaluation focuses on outcomes and impacts and further
investigates monitoring information. It assesses overall performance, focusing on positive or
negative changes in beneficiary behaviour or status occurring as a result of an intervention.
The evaluation phase is described below in terms of its purposes, the types of evaluation, and the evaluation process.

Evaluation purposes range across efficiency, effectiveness, relevance, impact and sustainability. The
graphic below illustrates when these purposes are pursued.

[Figure: When different evaluation purposes are pursued – adapted from the DPME Evaluation Guideline]

There are six types of evaluation, described below.

Diagnostic evaluation
 Covers: Preparatory research to ascertain the current situation prior to an intervention and to inform intervention design. It identifies what is already known about the issues at hand, the problems and opportunities to be addressed, and their causes and consequences, including those that the intervention is unlikely to deliver, and so the likely effectiveness of different policy options. This enables you to draw up the theory of change before you design the intervention.
 Scheduling: At key stages prior to design or re-planning.
 Examples of evaluation methodology: Formal surveys; stakeholder analysis; secondary data – e.g. statistical analyses, interviews, focus groups, literature reviews.

Design evaluation
 Covers: Used to analyse the theory of change, inner logic and consistency of the programme, either before a programme starts or during implementation, to see whether the theory of change appears to be working. This is quick to do, uses only secondary information, and should be used for all new programmes. It should check that the outcomes chain culminates in impacts that address the main situation that gave rise to the intervention, even if the intervention won’t be held fully accountable for these ultimate outcomes. It also assesses the quality of the indicators and the assumptions.
 Scheduling: After an intervention has been designed, in the first year, and possibly later.
 Examples of evaluation methodology: Quantitative statistics (e.g. community survey, household survey); qualitative methods such as semi-structured and structured interviews, observation records, field notes and focus group transcripts.

Implementation evaluation
 Covers: Aims to evaluate whether an intervention’s operational mechanisms support achievement or not, and to understand why. Looks at activities, outputs and outcomes, the use of resources and the causal links. It builds on existing monitoring systems and is applied during programme operation to improve the efficiency and efficacy of operational processes. It also assesses the quality of the indicators and assumptions. This can be rapid, primarily using secondary data, or in-depth with extensive fieldwork.
 Scheduling: Once or several times during the intervention.
 Examples of evaluation methodology: Secondary data – e.g. statistical analyses, interviews, focus group discussions, direct observation, literature reviews; fieldwork – e.g. participant observation, data collection and survey research.

Impact evaluation
 Covers: Seeks to measure changes in outcomes and in the well-being of the target population that are attributable to a specific intervention. Its purpose is to inform high-level officials on the extent to which an intervention should be continued or not, and whether any modifications are needed. This kind of evaluation is implemented on a case-by-case basis.
 Scheduling: Designed early on, with the baseline implemented early and impact checked at key stages, e.g. 3/5 years.
 Examples of evaluation methodology: Quasi-experimental design with before-and-after comparisons of project and control populations; ex-post comparison of the project group and a non-equivalent control group.

Economic evaluation
 Covers: Considers whether the costs of a policy or programme have been outweighed by the benefits. Types of economic evaluation include cost-effectiveness analysis (CEA), which values the costs of implementing and delivering the policy and relates this amount to the total quantity of outcome generated, to produce a “cost per unit of outcome” estimate (e.g. cost per additional individual placed in employment); and cost-benefit analysis (CBA), which goes further than CEA in placing a monetary value on the changes in outcomes as well (e.g. the value of placing an additional individual in employment).
 Scheduling: At any stage.
 Examples of evaluation methodology: Cost-effectiveness analysis; cost-benefit analysis.

Evaluation synthesis
 Covers: Synthesising the results of a range of evaluations to generalise findings across government, e.g. for a function such as supply chain, a sector, or a cross-cutting issue such as capacity.
 Scheduling: After a number of evaluations are completed.
 Examples of evaluation methodology: Annual report on evaluation findings across the City, synthesising all evaluations.
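To illustrate the difference between CEA and CBA described above, the short sketch below works through the two calculations with hypothetical figures (programme cost, number of placements and an assumed monetary value per placement); none of these numbers come from an actual EDD programme.

```python
# Hypothetical figures for a work-placement programme (illustrative only).
programme_cost = 1_500_000.0    # total cost of implementing and delivering the programme (Rand)
people_placed = 300             # outcome: additional individuals placed in employment
value_per_placement = 8_000.0   # assumed monetary value of one placement (Rand)

# Cost-effectiveness analysis: cost per unit of outcome.
cost_per_placement = programme_cost / people_placed
print(f"CEA: cost per additional person placed = R{cost_per_placement:,.2f}")  # R5,000.00

# Cost-benefit analysis: place a monetary value on the outcome and compare it with the cost.
total_benefit = people_placed * value_per_placement
net_benefit = total_benefit - programme_cost
benefit_cost_ratio = total_benefit / programme_cost
print(f"CBA: net benefit = R{net_benefit:,.2f}, benefit-cost ratio = {benefit_cost_ratio:.2f}")
# CBA: net benefit = R900,000.00, benefit-cost ratio = 1.60
```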

The evaluation process would involve the following steps:

Step 1: Determine key indicators for the evaluation process.
Step 2: Collect information around the indicators.
Step 3: Develop a structure for your analysis, based on your intuitive understanding of emerging themes and concerns, and on where you suspect there have been variations from what you had hoped and/or expected.
Step 4: Go through your data, organising it under the themes and concerns.
Step 5: Identify patterns, trends and possible interpretations.
Step 6: Write up your findings and conclusions, and work out possible ways forward (recommendations).

Step 6 is specifically aligned to Phase 4.

6.4 Phase 4: Evaluation report

Evaluation reports are time-specific analyses of commitments delivered. They are
compiled to validate what was actually achieved in relation to the planned outcomes,
and they communicate why something is or is not happening. The Mid-Term
Performance Assessment Report and the City’s annual reports are examples of
evaluation reports.

6.5 Phase 5: Communicating and providing feedback of M&E to stakeholders

This phase involves communicating the M&E findings to the City’s wider range of
stakeholders. To maximise the value of this exercise, it needs to be determined
beforehand which stakeholders should be kept up to date, in what format, and
how frequently. Officials can consult the draft stakeholder engagement
framework to determine the above. A communication strategy aligned
to the M&E Plan will assist in ensuring follow-through in this regard. In this way, the
credibility of the system and those who manage it will be supported. Stakeholders
will also be afforded the opportunity to gain a real understanding and appreciation
of the efforts, achievements and challenges faced by the City.

The value of an M&E Framework is only fully realised when analysis,
evaluations and findings are applied back to operations to support performance
improvements. A flourishing M&E ethos requires that all officials understand and
actively participate in the M&E system. The information generated by the M&E
system can be used:

 “To demonstrate accountability—delivering on political promises made to
citizenry and other stakeholders
 To convince—using evidence from findings
 To educate—reporting findings to help organizational learning
 To explore and investigate—seeing what works, what does not, and why
 To document—recording and creating an institutional memory
 To involve—engaging stakeholders through a participatory process
 To gain support—demonstrating results to help gain support among
stakeholders
 To promote understanding—reporting results to enhance understanding of
projects, programs, and policies.” (Kusek & Rist, 2004: 130).

7. Conclusion
This M&E Framework is aimed at creating a shared understanding of the realm within
which M&E occurs, clarifying concepts, and exploring the reporting processes, systems and
tools of M&E. Good practice principles in relation to M&E systems have been
explored, along with accepted practices in terms of M&E implementation.
Ultimately, this framework is primarily concerned with ensuring a real focus on the
delivery of the Economic Development Department’s long-term outcomes – through short-
and medium-term planning, delivery, monitoring and evaluation mechanisms, and the
associated reporting.

If the M&E framework and system are implemented correctly, they will enable unit,
programme and project managers to:
 Have an on-going picture of progress
 Use resources efficiently
 Plan workflow
 Identify problems, solutions and opportunities
 Have archived records of events
 Motivate staff by illustrating purpose of work
 Establish baselines
 Provide information for decisions
 Review causes of a problem
 Assist in deciding amongst alternatives
 Build consensus on the causes and responses to a problem
 Identify unintended results

8. References
 City of Cape Town. 2013. Economic Growth Strategy. Available at:
https://www.capetown.gov.za/en/IDP/Documents/EconomicGrowthStrategy.pdf.
Accessed on 20 February 2015.
 City of Cape Town. 2011. Performance Management Framework (Compliance).
Available at:
https://www.capetown.gov.za/en/Policies/All%20Policies/Performance%20Managem
ent%20Framework%20(Compliance)%20Policy%20-
%20approved%20on%2011%20May%202011.pdf. Accessed on 09 May 2015.
 Department of the Premier. 2009. Provincial-wide Monitoring and Evaluation
Strategy. Available at: https://www.westerncape.gov.za/other/2010/4/the_provincial-
wide_monitoring_and_evaluation_system_provincial
wide_monitoring_and_evaluation_framework_2009.pdf. Accessed on 07 April 2015.
 Drucker, P. 1954. The Practice of Management. New York: Harper.
 Kusek, J.Z. & Rist, R.C. 2004. Ten steps to a results-based monitoring and
evaluation system. Washington D.C.: The World Bank.
 National Humanities Centre. 2005. Frederick Winslow Taylor, The Principles of
Scientific Management, 1911.
 Rabie, B. 2011. Improving the systematic evaluation of local economic development
results in South African local government. PhD dissertation. Stellenbosch University.
 Segone, M. (ed) 2008a. Bridging the gap. The role of monitoring and evaluation in
evidence-based policy making. UNICEF Evaluation Working Papers Issue # 12.
Romania: UNICEF.
 Taplin, D and Clark, H. 2012. Theory of Change Basics. Available at:
http://www.theoryofchange.org/library/publications/. Accessed on 12 May 2015.
 The Presidency. Republic of South Africa. 2007. Policy Framework for the
Government-wide monitoring and evaluation system. Pretoria: The Presidency.
 Vigoda, E. 2003. New Public Management. Available at:
http://poli.haifa.ac.il/~eranv/material_vigoda/NPM.pdf

Annexure: Economic Development Department logframe
